
Alan Turing and Neural Computation

(Caveat: for the sake of exposition, at times I articulate past views in slightly anachronistic terms; I do my best to capture the gist of what Alan Turing and others meant in terms that contemporary readers should find perspicuous.)

The computational theory of cognition says that cognition is largely explained by neural computations—computations carried out by the nervous system. A precursor of this theory may be seen as early as Thomas Hobbes’s remark that reasoning is computation. But the modern computational theory of cognition, as it is understood today, requires a rigorous mathematical understanding of computability.

For starters, the mathematical theory of computability includes the following: (1) a formalization of algorithmic computation, (2) a persuasive argument that such a formalization is adequate (this is known as the Church-Turing thesis), (3) a demonstration that there are digital computing systems—universal machines—that can perform any algorithmic computation so long as they have enough time and memory, and (4) a proof that many functions are not computable by algorithmic procedures. Alan Turing provided all of the above in his landmark paper on (what are now known as) Turing machines (Turing 1936–7).
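To make point (1) a bit more concrete, here is a minimal sketch of a Turing machine: a finite table of transitions acting on an unbounded tape. The simulator and the toy bit-flipping machine below are my own illustration in Python, not anything drawn from Turing's paper.

```python
# A minimal Turing machine simulator (illustrative sketch, not Turing's own
# formalism; the specific machine below is a made-up toy example).
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
    """transitions: (state, symbol) -> (new_state, new_symbol, move), move = -1 or +1.
    tape: dict from integer positions to symbols (unbounded in both directions)."""
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in transitions:      # no applicable rule: halt
            return tape, state
        state, new_symbol, move = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += move
    raise RuntimeError("step limit reached; in general, halting cannot be decided in advance")

# Toy machine: sweep right, flipping every bit, and halt on the first blank.
FLIP = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}

tape, _ = run_turing_machine(FLIP, dict(enumerate("1011")))
print("".join(tape[i] for i in sorted(tape)))  # prints 0100
```

A universal machine, in the sense of point (3), is essentially such a simulator whose transition table and input are themselves encoded on its tape; and point (4) shows up in the fact that no algorithm can decide, for every machine and input, whether the computation will ever halt.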

A few years later, Warren McCulloch and Walter Pitts picked up on Turing’s paper and placed it at the core of the computational theory of cognition. McCulloch was a visionary neurophysiologist, while Pitts was a gifted teenager who already had advanced training in the mathematical modeling of neural functions. Their joint paper (McCulloch and Pitts 1943) argued that neurons—the most important components of nervous systems—are equivalent to logic gates, that is, devices that yield a binary output in response to a few binary inputs. Their main reason for treating neurons as logic gates was the all-or-none law of nervous activity: most neurons either fire a signal—if their input reaches a threshold—or do not fire at all. In light of this, McCulloch and Pitts argued that networks of neurons perform, on digital inputs, logical operations like those performed by digital computers, and that such digital computations explain cognition. In subsequent years, McCulloch would describe their 1943 paper as treating brains as Turing machines.
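The gist of the neuron-as-logic-gate idea can be captured in a few lines. The threshold unit below is a simplified illustration of my own; McCulloch and Pitts’s calculus also handles inhibitory inputs and temporal delays, which are omitted here.

```python
# A McCulloch-Pitts-style threshold unit: binary inputs, all-or-none binary output.
# Simplified illustration (inhibition and time delays from the 1943 calculus omitted).
def mp_unit(inputs, threshold):
    """Fire (output 1) just in case the number of active inputs reaches the threshold."""
    return 1 if sum(inputs) >= threshold else 0

# With suitable thresholds, a single unit behaves like a logic gate:
AND = lambda x, y: mp_unit([x, y], threshold=2)
OR  = lambda x, y: mp_unit([x, y], threshold=1)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
```

With inhibition added, networks of such units can compute any Boolean function of their inputs, which is what licensed the analogy with digital circuitry.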

McCulloch and Pitts’s computational theory of cognition had a huge impact on an eclectic and interdisciplinary community interested in cognition, computation, and brain theory. This community included John von Neumann, who used (and acknowledged) McCulloch and Pitts’s neural network notation for digital circuits in the first paper on electronic, general-purpose, digital computer design (von Neumann 1945). Von Neumann’s paper was widely circulated among those, such as Alan Turing, who were designing digital computers. By 1946, Alan Turing was using von Neumann’s notation, which von Neumann took from and attributed to McCulloch and Pitts (Hodges 1983, 343). Whether or not Turing ever read McCulloch and Pitts’s paper, at the very least he knew about their theory by reading von Neumann’s paper.

But Turing knew that computation need not be digital. Alongside digital calculators and computers, at Turing’s time there were machines that solved systems of equations by integrating continuous functions. They were called “differential analyzers”; later, they became known as analog computers. Turing was familiar with them. Even if we agree with McCulloch and Pitts that neural computations explain cognition, we may or may not agree that neural computations are digital. Could neural computations be analog?

Alan Turing was one of the first people who raised and discussed this question. In his 1947 Lecture to the London Mathematical Society, he implied that brains are “digital computing machines” (Turing 1947, 111, 123). In his 1948 “Intelligent Machinery,” which remained unpublished at the time, he asserted that brains are probably “continuous controlling machines,” that is, information processors, such as the differential analyzer, which operate on continuous variables. Turing added that brains are also “very similar to much discrete machinery” (Turing 1948, 6). In his famous “Computing Machinery and Intelligence,” he implied that neurons are analogous to “parts of modern machines,” by which he meant digital computers (Turing 1950, 455). He also flatly denied that nervous systems are digital computers (“discrete-state machines”) and implied that they operate on continuous variables, like differential analyzers:

> The nervous system is certainly not a discrete-state machine. A small error in the information about the size of a nervous impulse impinging on a neuron, may make a large difference to the size of the outgoing impulse. It may be argued that, this being so, one cannot expect to be able to mimic the behaviour of the nervous system with a discrete-state system. It is true that a discrete-state machine must be different from a continuous machine. But if we adhere to the conditions of the imitation game, the interrogator will not be able to take any advantage of this difference. The situation can be made clearer if we consider some other simpler continuous machine. A differential analyser will do very well. (Turing 1950, 451)

This passage implies that the nervous system is a “continuous machine” somewhat analogous to a differential analyzer, i.e., an analog computer. It falls short of saying that the nervous system is an analog computer.

A parallel discussion occurred on the other side of the Atlantic. At the Seventh Macy Conference on Cybernetics (23–24 March 1950), neurophysiologist Ralph Gerard argued that neural computations are more analog than digital. His remarks were followed by a lively discussion among audience members, centered on whether nervous systems are more digital or analog. Participants included Warren McCulloch, Walter Pitts, John von Neumann, and Norbert Wiener, among many others (Gerard 1951).

The discussion at the Seventh Macy Conference made it abundantly clear, even to the participants themselves, that they were using the terms “digital” and “analog” in at least three different senses, roughly as follows:

(1) “Digital” for discrete or discontinuous variables and “analog” for continuous variables, at least with respect to variables that are relevant at a given level of analysis or organization of a physical system.
(2) “Digital” for digital computers and “analog” for differential analyzers, a.k.a. analog computers. Digital computers operate by manipulating strings of discrete states, while analog computers operate primarily by integrating continuous signals; the two types of computation require different types of components that must be organized in specific ways (Piccinini 2015). A rough sketch of this contrast appears right after this list.
(3) “Digital” for information that is “digitally coded,” or coded by using digital numerical representations, and “analog” for information that is “continuously coded,” or coded by using variables that covary with the represented variables (cf. Maley 2011).
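To give a feel for sense (2), here is a loose sketch of what “integrating continuous signals” amounts to. A real differential analyzer integrates a physical quantity directly; the step-by-step digital approximation and the example equation dx/dt = -x below are illustrative stand-ins of my own.

```python
# Sense (2), sketched loosely: an analog computer solves an equation such as
# dx/dt = -x by integrating a continuously varying quantity. Here that behavior
# is only approximated digitally, in small discrete steps (Euler's method).
def integrate(dx_dt, x0, t_end, dt=1e-3):
    """Approximate x(t_end) given dx/dt = f(x) and x(0) = x0."""
    x, t = x0, 0.0
    while t < t_end:
        x += dx_dt(x) * dt
        t += dt
    return x

print(integrate(lambda x: -x, x0=1.0, t_end=1.0))  # roughly exp(-1) ≈ 0.368
```

The irony that the approximation above is itself digital is part of the point: the two kinds of machine can approximate each other’s results, but they do so with very different components and operations.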

As some of the Macy participants recognized, nervous systems may be digital in one of the above senses but analog in another sense. For instance, action potentials are all-or-none, which may be considered digital in sense (1), even though a primary way nervous systems encode information is in terms of firing rates, which may be considered analog in sense (3). As to whether nervous systems are more similar to digital or analog computers, a prescient exchange between Julian Bigelow and Walter Pitts suggests that those options are not exhaustive:

> GERARD: That is a very good statement.
> PITTS: There is a third between the two, because they are not opposite. The digital and analogical sorts of devices have been defined quite independently and are not logical opposites. (Gerard 1951, 47–8)

As Bigelow and Pitts suggested, and as Turing seemed to be considering, nervous systems may be neither digital nor analog computers. Instead, nervous systems may be their own type of computing mechanism. There is compelling evidence for this conclusion.

As I said, analog computers operate primarily by integrating continuous signals; digital computers operate by manipulating strings of discrete states. In contrast, typical neural signals—firing rates—are graded (like continuous signals) but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals like those processed by analog computers nor strings of digits like those processed by digital computers. There is a lot more to say about this; my own view—echoing Turing and Pitts—is that neural computation is sui generis (for more details, see Piccinini and Bahar 2013, Piccinini 2020).
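To illustrate the last point, here is a toy sketch of a firing rate as a graded quantity built out of discrete spikes; the spike times and the size of the averaging window are made up for illustration.

```python
# A firing rate is a graded signal constituted by discrete, all-or-none events:
# each spike either occurs or it doesn't, yet the rate varies in a graded way.
# Spike times (in seconds) and window size are made up for illustration.
spike_times = [0.012, 0.055, 0.081, 0.102, 0.131, 0.340, 0.720]

def firing_rate(spikes, t, window=0.2):
    """Spikes per second within the window ending at time t."""
    return sum(t - window < s <= t for s in spikes) / window

for t in (0.1, 0.2, 0.5, 0.8):
    print(f"rate at t = {t:.1f} s: {firing_rate(spike_times, t):.0f} spikes/s")
```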




References

Gerard, R. W. (1951). “Some of the Problems Concerning Digital Notions in the Central Nervous System.” In H. von Foerster, M. Mead, and H. L. Teuber (eds.), Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems. Transactions of the Seventh Conference (11–57). New York: Macy Foundation.

Hodges, A. (1983). Alan Turing: The Enigma. New York: Simon & Schuster.

Maley, C. J. (2011). “Analog and Digital, Continuous and Discrete.” Philosophical Studies, 155 (1): 117–131.

McCulloch, W. S. and W. H. Pitts (1943). “A Logical Calculus of the Ideas Immanent in Nervous Activity.” Bulletin of Mathematical Biophysics 5: 115–133.

Piccinini, G. (2015). Physical Computation: A Mechanistic Account. Oxford: Oxford University Press.

Piccinini, G. (2020). Neurocognitive Mechanisms: Explaining Biological Cognition. Oxford: Oxford University Press.

Piccinini, G. and S. Bahar (2013). “Neural Computation and the Computational Theory of Cognition.” Cognitive Science 37: 453–88.

Turing, A. M. (1936–7). “On Computable Numbers, with an Application to the Entscheidungsproblem.” Proceedings of the London Mathematical Society 42(1): 230–65.

Turing, A. M. (1947). “Lecture to the London Mathematical Society on 20 February 1947.” In D. Ince (ed.), Mechanical Intelligence (87–105). Amsterdam: North-Holland.

Turing, A. M. (1948). “Intelligent Machinery.” In D. Ince (ed.), Mechanical Intelligence (87–106). Amsterdam: North-Holland.

Turing, A. M. (1950). “Computing Machinery and Intelligence.” Mind 59: 433–60.

von Neumann, J. (1945). “First Draft of a Report on the EDVAC.” Philadelphia, PA: Moore School of Electrical Engineering, University of Pennsylvania.


