Is Higher-order Theory Really Defunct?

Last year Ned Block published a paper in Analysis called “The higher-order approach to consciousness is defunct”. In it, he offers a very simple and compelling argument that is supposed to expose the incoherence of both Higher-order Thought theory (HOT) and Higher-order Perception theory (HOP). Block first distinguishes modest and ambitious versions of these theories. The modest view is simply an account of “Higher-order consciousness” as distinct from what-it-is-likeness, while the ambitious view is designed to be a theory of what-it-is-likeness itself. According to Block, the Higher-order view is as follows:

The higher order theory: A mental state is conscious if and only if the state is the object of a certain kind of representation arrived at non-inferentially.

Block’s argument against the ambitious view rests on the possibility of radical misrepresentation, something acknowledged by all HO theorists. More specifically, Block has in mind the possibility of a “targetless” Higher-order representation. Block formulates his argument in terms of HOT, but since I am more interested in HOP, I will recast it in those terms. Suppose that Jones has a HOP with the content “I am now having a red sensory experience” when in fact there is no first-order representation of redness. The HOP is in this case “empty”. But according to ambitious Higher-order theory, the HOP alone is sufficient for what-it-is-likeness, since it is the HOP that generates what-it-is-likeness. But notice how the Higher-order theory is formulated. A mental state is conscious IFF the state is the object of a HOP. But there is no first-order mental state! As Block says, “Thus, the sufficient condition and the necessary condition are incompatible in a situation in which there is only one non-self-referential higher order representation.” Block (rightly) thinks this is incoherent.

According to ambitious Higher-order theorists, the targetless HO representation is enough to generate what-it-is-likeness. But the theory seems to require there to be a first-order state, since it is designed to show how first-order states become conscious. So the HO theorist seems to be stuck: the theory is supposed to explain how first-order states become conscious, yet it is committed to the idea that HO representations all by themselves can generate what-it-is-likeness, completely independently of the existence of any first-order state.

To be honest, I actually think Block has a nice argument here. But this is because I have always thought the ambitious version of HO theory is confused (see my paper “What is it like to be nonconscious?”). I don’t think higher-order theory is a theory of the origin of what-it-is-likeness, but rather a theory of introspection. This is what William Lycan has supposedly claimed all along: that he is only offering a theory of introspective awareness. But wouldn’t it just be trivial to develop a “higher-order theory” of higher-order introspection? Well, it’s not trivial so long as we are trying to decide between HOP and HOT as an account of higher-order consciousness. Personally, I think HOP is better suited neurologically to explain higher-order introspective awareness.

But I am also skeptical of the very possibility of a truly targetless HOP. I just can’t make much neurological sense of such a possibility. Let’s assume an overly simplistic neural theory of introspection such that introspection is neurally realized in the frontal cortex. On this simplistic view, the frontal cortex is constantly receiving input from the other areas of the brain and introspecting upon that content. It seems to me that in order for there to be a truly targetless HOP, either the frontal cortex would have to be completely isolated from the rest of the brain, or the rest of the brain would have to be turned off. In the latter case, it seems like the person would simply be brain dead. And the former case seems just as unrealistic, since the idea of the frontal cortex having zero synaptic connections to any other area of the brain seems too incredible. So long as the rest of the brain is working, and there is at least one synaptic connection to the frontal cortex, the frontal cortex will have something to “work with” in performing its introspective monitoring function.

Consider Damasio’s theory of primal background feelings arising in the brain stem and other primitive circuitry. Presumably these kinds of first-order mental states can’t just be “turned off” without severely incapacitating the subject. And if these background feelings can make their way to the frontal cortex (as seems plausible), the introspective machinery will always have something to work with. So the case of a truly targetless HOP seems unrealistic to me. However, it seems more realistic to assume that radical misrepresentation of first-order states is possible. This seems like what’s going on when people are on psychedelic drugs or hallucinating. But it’s never the case that the frontal cortex is completely spinning in the void, without having any input from first-order systems. We can then reformulate the higher-order theory to coherently (and perhaps trivially) say “a mental state is the object of introspective awareness just when it is accompanied by a higher-order representation”. No surprises there. The only thing that’s left is just to develop a theoretical model of the evolutionary and ontogenetic origins of such introspective awareness (no easy feat, as Jaynes shows).

Where does this leave us then in terms of Block’s attack on HO theory? Well, I believe the attack is successful against ambitious HO views, since it seems entirely plausible to me that there is something-it-is-like for first-order sensorimotor systems to be operative. But so long as we are sufficiently modest in our ambitions about what HO theory can explain, HOP theory is on solid ground for making sense of our human powers of introspection. Where I disagree with Lycan, however, is that Lycan thinks the introspective machinery of HOP is simple enough to be shared by many nonhuman mammals. My own research has led me to conclude that the introspective machinery of HOP is unique to humans, and that such introspective machinery is what accounts for the great cognitive differences between humans and nonhuman animals. If HOP is a theory of higher-order consciousness, then I believe that HOP is also a theory of what makes humans cognitively unique. While there are likely simpler homologues of introspective machinery in other primates, it seems to me that human introspection is at a much higher level of sophistication. Following Julian Jaynes, I believe this sophistication stems from our linguistic mastery. More specifically, learning linguistic concepts related to psychological functions allows us to think about thinking. This linguistically mediated recursion seems to allow for an “intentional ascension” whereby we engage in truly metarepresentational cognition. This allows us to think about the fact that we are thinking about the fact that we are thinking, and so on.

So, I don’t think Higher-order theory is really defunct. It’s defunct as a theory of what-it-is-likeness, but that’s not really all that surprising, given that the usual criteria for ascribing what-it-is-likeness are cases where we think there is simple sensation going on. And it’s just absurd to suppose that sensation requires the existence of metarepresentation. So that alone gives us good reason to make a phenomenological distinction between what-it-is-like to be a simple sensing creature and what-it-is-like to be a creature with both sensation and the capacity for higher-order representation. Where I disagree with Block is his view that what-it-is-likeness is a property generated in neural systems, since I think there is good reason to ascribe phenomenality to creatures lacking nervous systems. And unlike Block, I also don’t think what-it-is-likeness generates an epistemic or explanatory gap once we understand what exactly it is we are referring to when we use such a term.
