Thoughts on the Computational Theory of Mind

I was just reading the Stanford Encyclopedia of Philosophy article on Searle’s Chinese Room, and I wanted to share this great paragraph:

A further related complication is that it is not clear that computers perform syntactic operations in quite the same sense that a human does—it is not clear that a computer understands syntax or syntactic operations. A computer does not know that it is manipulating 1’s and 0’s. A computer does not recognize that its binary data strings have a certain form, and thus that certain syntactic rules may be applied to them, unlike the man inside the Chinese Room. Inside a computer, there is nothing that literally reads input data, or that “knows” what symbols are. Instead, there are millions of transistors that change states. A sequence of voltages causes operations to be performed. We may choose to interpret these voltages as binary numerals and the voltage changes as syntactic operations, but a computer does not interpret its operations as syntactic or any other way. So perhaps a computer does not need to make the move from syntax to semantics that Searle objects to; it needs to move from complex causal connections to semantics. Furthermore, any causal system is describable as performing syntactic operations—if we interpret a light square as logical “0” and a dark square as logical “1”, then a kitchen toaster may be described as a device that rewrites logical “0”s as logical “1”s.
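The toaster point can be made concrete with a small sketch (illustrative only; the “device” and all the names below are invented for this example). The same sequence of physical state changes admits entirely different symbol readings, which suggests the “syntax” lives in the observer’s interpretation scheme rather than in the mechanism itself:

```python
# A "device" that just toggles physical states, like the toaster
# darkening bread squares. It knows nothing about symbols.
states = [0, 0, 1, 0, 1, 1]          # raw physical states (e.g. light/dark)
toggled = [1 - s for s in states]    # the causal process: each state flips

# Two arbitrary interpretation schemes imposed from outside the device:
as_binary = {0: "0", 1: "1"}
as_letters = {0: "A", 1: "B"}

reading_1 = "".join(as_binary[s] for s in toggled)   # read as binary digits
reading_2 = "".join(as_letters[s] for s in toggled)  # read as letters

# One and the same causal process; the "symbols" differ with the observer.
print(reading_1)  # → 110100
print(reading_2)  # → BBABAA
```

The device performs one causal operation; whether that operation counts as rewriting “0”s as “1”s, or “A”s as “B”s, is fixed only by the interpretation we lay over it.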

This seems right to me. The question, then, is not how we get from syntax to semantics, but whether the MIND IS A COMPUTER metaphor is even worthwhile once we concede that we don’t know how the computer actually “computes” or “knows” anything, as opposed to simply changing physically in accordance with voltages. Accordingly, if the computer does not “know” or “read” syntax except metaphorically, then what is going on when an organism “knows” the world? It seems unlikely that a messy biological brain is doing the same thing as a computer.

If computational cognitivism rests on a flawed analogy, then we need to reconsider why we abandoned behaviorist approaches to cognition. In other words, if the mind is not “computing” the world when it knows the world, then the most obvious alternative is to claim that knowledge is for behavior: when the animal mind “knows” the world, it knows it in terms of the possibilities and affordances of physical behavior. Perceptual cognition then becomes reactive, organic behavior rather than representational computation. The ontological bifurcation between syntax and semantics is replaced by a form of being-in-the-world, and the question becomes how our physical bodies resonate to the environment so as to achieve an optimal bodily grip.

But wait, did we not already learn our lesson about behaviorism? Psychology shifted into the Cognitive Revolution because some human behaviors cannot be explained in terms of a complex behavioral resonance. What are these behaviors? Introspection, internal workspaces with conscious content manipulation (working memory, visual sketchpads, phonological loops, etc.), narratization, advanced social cognition, conscious planning, episodic and autobiographical memory, executive impulse control and decision making, and so on. Can these epistemic actions be intelligibly explained in terms of a complex behaviorism? It seems unlikely. But the takeaway is that we need not explain the basic biological coping of “knowing” the world with the same explanatory framework we use to explain the more recent and more advanced epistemic actions of conscious content manipulation.

With this distinction between online coping and offline thinking, we can deal with many of the philosophical problems associated with theories of mind, including qualia, inverted spectra, the explanatory gap, and so on. But that is for another post.



Filed under Philosophy, Psychology

3 responses to “Thoughts on the Computational Theory of Mind”

  1. Victor Panzica

    The great mind-body question is: Does the body think? All of that hard wired neural functionality in the body which we call behavioral is in reality thinking or “thinking”. Computers may be computational but they still run on a hardware platform with built in firmware and machine languages. If you had written your post in Japanese it would have meant nothing to me but my English language based software WITH American syntax version had no problem “UNDER”standing IT.

  2. Doru

    To me the computational view implies some sort of Top-Down Assumption Driven Process. It means that consciousness is being derived from a phenomenological perspective outside of itself.
    A better angle of understanding would be a Ground-Up Knowledge Driven Evolution: instead of thinking of the brain’s modular instances as hardwired and autonomous, we have to adopt a new view and think of those modules as being in a state of dynamic equilibrium with one another and with the environment, including the body. By dynamic equilibrium I mean that new connections are being constantly re-made in response to environment changes.

  3. Gary Williams

    >>By dynamic equilibrium I mean that new connections are being constantly re-made in response to environment changes.

    Doru, this sounds right to me. Are you familiar with the work being done in dynamic systems theory or Gibsonian theory? Such frameworks essentially espouse what you just said, namely, that the brain dynamically resonates to the changing environment.
