The Curious Quest for the Neural Correlates of Consciousness: A Methodological Review and Critique

[Note: I wrote this for a course this semester]


“Consciousness is a fascinating but elusive phenomenon; it is impossible to specify what it is, what it does, or why it evolved. Nothing worth reading has been written on it.” – Stuart Sutherland, International Dictionary of Psychology

1. Introduction

For much of the twentieth century psychologists used the word “consciousness” only in muted tones and never as a topic of serious inquiry. There were good historical reasons for this moratorium, given that previous attempts to study consciousness relied primarily on introspection, intuition, and one’s prior metaphysical convictions. Since psychology at the turn of the century strove to turn itself into a more “objective” science under the gleam of stainless steel and the promise of rigor offered by behaviorism, consciousness seemed too subjective to ever lend itself to serious scientific study. After all, how do you measure something classically conceived as wholly private and ineffable?

The reticence to talk about unobservable mental events lessened dramatically as the “Cognitive Revolution” picked up steam. Since psychologists were no longer squeamish about explaining complex phenomena with theoretical constructs like mental representation, the study of consciousness could no longer be ignored for reasons of privacy alone. Nevertheless, philosophers were still ahead of the game for several decades, albeit embroiled in esoteric debates about mind-brain identity (Smart, 1959), whether there is something-it-is-like to be a bat (Nagel, 1974), whether the existence of qualia generates an “epistemic gap” for science (Jackson, 1982; Levine, 1983), or whether “zombies” are possible (Chalmers, 1996).

It wasn’t until the late eighties and early nineties that the scientific world became more interested (or less reluctant) in studying this philosophically refined notion of consciousness (Baars, 1988; Churchland, 1988). A seminal paper that kicked off what was to become a churning cottage industry was Francis Crick and Christof Koch’s article “Towards a neurobiological theory of consciousness” (1990), with their hypothesis that neurons generate consciousness through coherent semi-oscillations at frequencies in the 40-70 Hz range. It is a testament to Crick and Koch’s influence that their preferred methodology of looking for the “neuronal correlates of consciousness” (NCC) is now the dominant method of scientifically investigating consciousness; research articles, books, and edited volumes dedicated to NCCs are being produced at a rapid and growing pace (Metzinger, 2000; Tononi & Koch, 2008), even taking up shelf space in popular nonfiction (Koch, 2012).

In this review article I will briefly explore some popular methodological techniques used in the search for NCCs. For brevity’s sake, I will keep my analysis focused at the level of technique only and will not report on the massive but largely conflicting set of findings about where in the brain consciousness has been localized (e.g. whether it’s been correlated with early, late, or intermediate processing areas). A more general reason for not including the localization findings will be explored in the final section of the paper where I discuss some conceptual difficulties in making sense of a claim that consciousness “arises” in, say, either V1 or extra-striate cortex. I will diagnose this conceptual puzzle as resulting from difficulties in trying to study a phenomenon that is operationally defined with only a loose and ephemeral connection to physical reality.

2. What Are We Looking for When Hunting for NCCs?

I have already mentioned the philosophical notion of consciousness as involving a sense of there being “something-it-is-like” for an entity to exist. The essential idea is that it “feels like” something to look at a red sunset, taste the saltiness of a potato chip, or smell the rich aroma of freshly brewed coffee. The perceptual content of these experiences (redness, saltiness, etc.) is the primary target of the NCC paradigm. The quest is to find “the minimal set of neuronal events and mechanisms jointly sufficient for a specific conscious percept” (Koch, 2004, p. 16).

To understand the NCC paradigm it is important to distinguish between intransitive and transitive notions of consciousness. We use intransitive notions to talk about general states of arousal and wakefulness, such as when we say “Jones is conscious (as opposed to asleep)”. In contrast, we use the transitive notion when we say “Jones is conscious of the apple over on that table”. The search for NCCs is specifically a search for the correlates of transitive contents of consciousness. If I am staring at a computer screen that is shifting back and forth between red and blue, my conscious percept is correspondingly shifting from a red percept to a blue percept. The goal of the NCC paradigm would be to find the minimal set of neuronal events that correlates with the subjective shift from redness to blueness. Thus, the quest for NCCs is an attempt to find the neuronal mechanisms underlying the particular contents of our conscious experience. Having set up the overarching goal of the NCC paradigm, what methodological procedures are used to investigate conscious percepts?

3. A Brief Overview of Various Methods of Looking for NCCs


3.1 Masking, Priming, and Neuropsychology

One popular method of hunting for NCCs is the classic psychological paradigm of subliminal backward masking (Breitmeyer & Ogmen, 2000). Typically, a word (the target) is flashed on the computer screen for less than 50 ms. If the word is presented by itself, subjects typically report awareness of the word. However, if random geometric patterns (the mask) are flashed at the same retinal location immediately after the word is presented, then subjects will not report awareness of the target. Despite the lack of reports of awareness, priming experiments demonstrate that these words are nevertheless processed nonconsciously at orthographic, phonological, and even semantic levels (Koechlin et al., 1999).

Because researchers using this technique can precisely control the variables that influence whether or not the target crosses the threshold of conscious reportability, the masking paradigm has been widely used to investigate the neural correlates of consciousness (Dehaene et al., 2001; Naccache et al., 2002). By systematically manipulating factors that influence the generation of reports of awareness while simultaneously measuring neural activity (either with single-unit or imaging technologies), the hope would be to find the brain areas that make the most direct contribution to the generation of consciousness itself. Using an analogous logic, some researchers have used a neuropsychological approach to study patients with brain damage in order to discover the neuronal factors that determine whether or not a patient retains consciousness, as well as to learn about potential dissociations between attention and conscious awareness (Goodale & Milner, 1992; Kentridge et al., 1999; Weiskrantz et al., 1995).

Similar to the ongoing debates over how to interpret the classic Sperling experiments (Block, 2007; Lau & Rosenthal, 2011), it should be noted that there is considerable controversy about how to properly interpret experiments like these. I will return to the issue of interpretation in the final section, but for now let’s grant that the interpretation problem is tractable.

3.2 Binocular Rivalry, Visual Illusions, and Flash Suppression

Another popular method of investigating the neural correlates of consciousness involves the use of multi-stable percepts. The key concept here is that when presented with a stimulus that could be interpreted in one of multiple ways, subjects always report being aware of one interpretation at a time. That is, subjects tend to report that their conscious experience is “unitary” in nature (Bayne & Chalmers, 2003; Ramachandran & Hirstein, 1997). Moreover, if you present separate images to each eye, subjects tend to report that their perception shifts back and forth spontaneously between the competing images; in other words, their brain doesn’t fuse the two images into a single percept, but rather switches back and forth between coherent interpretations. By recording neural activity either directly with electrodes or indirectly with fMRI (Lumer & Rees, 1999) as the subjects are reporting shifts in perceptual content, researchers using this paradigm aim to discover the underlying neural correlates of the particular (transitive) contents of consciousness (Blake & Logothetis, 2002; Lumer et al., 1998). Because the perceptual content is reliably shifting back and forth, any alteration of neural activity associated with these shifts should tell us “where” in the brain consciousness arises.

Bistable stimuli are not the only class of interesting visual stimuli used to find NCCs. Another fascinating method involves studying the neural correlates of the perception of illusory contours (perceived but nonexistent edges) using either single-unit recordings in nonhuman animals (Sheth et al., 1996) or fMRI in humans (Mendola et al., 1999). Similar work has been done studying the neural correlates underlying illusory color effects (Morita et al., 2004). A variant of the binocular rivalry paradigm that enables more precise control over shifts in rivalrous perceptions is the flash suppression paradigm (Sheinberg & Logothetis, 1997), whereby if you present an image to one eye and then flash a second image to the other eye, the new image will suppress conscious awareness of the first image. Another variation is continuous flash suppression, whereby distractor images are continuously flashed to the other eye. This paradigm can suppress awareness of the first image for up to several minutes even though the image is continuously present to a single eye (Tsuchiya & Koch, 2005). By asking subjects to report precisely when they become aware of the original image, researchers can pinpoint NCCs.

3.3 Inattentional Blindness and Change Blindness

The study of inattentional blindness is another maturing paradigm for investigating NCCs. This phenomenon has been most vividly demonstrated in the well-known “Gorilla experiment” (Chabris & Simons, 2010; Simons & Chabris, 1999). In this set-up, subjects are asked to watch a video of a group of people passing a basketball back and forth and told to count the number of times the ball is passed. In the middle of the task, a graduate student in a gorilla suit conspicuously walks into the middle of the frame, stops, waves, and then walks off screen. When later asked if they noticed anything unusual about the video, a surprisingly large proportion of people (46% of 192) failed to report any awareness of the gorilla.

A more controlled paradigm for investigating inattentional blindness has been developed by Mack and Rock (1998) as a visual analogue to classic studies of dichotic listening (Treisman, 1964). In a typical set-up, observers are asked to judge which arm of a briefly presented large cross is longer. On the fourth trial, an unexpected object is presented at the same time as the cross, and later, subjects are asked to report if they noticed anything other than the cross. By comparing this trial to later trials where the subject expects the other object, researchers can determine the relative importance of attention for conscious awareness. This is in effect another way of experimentally manipulating variables that affect whether perceptual contents cross the threshold for conscious reportability.

A similar paradigm investigates the difficulties people have in consciously noticing large changes in perceptually salient features (Grimes, 1996). Known as “change blindness” (Simons & Rensink, 2005), this phenomenon is especially surprising given that researchers have found most people have “change blindness blindness” (Levin et al., 2000), i.e. an overconfidence in their meta-judgements about their ability to detect large changes in visual scenes. By investigating the neural correlates of when conscious awareness of change is reported, researchers aim to pin down the NCCs.

Some theorists conclude from these studies that attention is both necessary and sufficient for consciousness (Mack & Rock, 1998; Prinz, 2012). However, as I will show in the next section, we must be exceedingly cautious in drawing conclusions about the nature of consciousness from experimental paradigms that are open to several mutually inconsistent interpretations with no obvious (empirical) method for evaluating which interpretation is true.

4. Prospects for the NCC Project: Conceptual and Methodological Worries

There is a looming and potentially fatal problem at the heart of the quest for the neural correlates of consciousness: do we even have a clear and distinct idea of what we are looking for in the first place? Throughout this review I have been using the phrases “reports of consciousness” or “reports of awareness” as more or less synonymous with conscious awareness itself. However, the lack of an a priori definition, together with the fundamental fact that psychologists must rely on some kind of indirect behavioral report of consciousness in order to study it, generates a methodological puzzle sometimes called the “refrigerator light problem” (Bayne et al., 2009, pp. 561-562; Blackmore, 2002; Block, 2001; Schwitzgebel, 2007, 2011). In essence, the problem is this: can you infer from a lack of a report of conscious awareness that there was no conscious awareness? If so, what empirical fact justifies this inference?

A troubling possibility for the NCC paradigm is that no empirical study can determine the right answer to this question without first assuming a particular operational definition of awareness that is likely to be controversial (Goldman, 1993). Philosophers who debate how best to characterize consciousness disagree about whether it’s best to define consciousness as something that can exist independently of higher-order accessibility (Block, 1995, 2009; Dretske, 1993) or whether you can only be conscious if you are aware that you are conscious (Lau & Rosenthal, 2011; Rosenthal, 1997, 2005).

While some scientists have simply operationally defined consciousness as whatever it is that enables introspective reporting (Dehaene et al., 2006; Jack & Shallice, 2001) and then developed theoretical models accordingly (wholly ignoring thorny issues like inaccessible qualia), other scientists contend that “Until the problem is better understood, a more formal definition of consciousness is likely to be either misleading or overly restrictive, or both” (Koch, 2004, p. 12). Koch admits this move is evasive, but raises the analogy of defining genes: “So much is now known about genes that any simple definition is likely to be inadequate. Why should it be any easier to define something as elusive as consciousness?” (ibid.). However, this analogy is fatally flawed. Prior to the modern refinement of our understanding of what a gene is, scientists at least had a loose functional sense that, whatever genes ultimately are, they have something to do with transferring information from one generation to the next. This loose definition at least gave researchers a useful “discovery heuristic” for asking questions and proposing experiments.

But if the NCC paradigm is truly hunting for elusive notions like qualia, phenomenality, or experience, there is no analogous consensus on what sort of functional role qualia play that couldn’t be explained without appealing to conscious awareness (Chalmers, 1996; Flanagan & Polger, 1995). The impenetrable subjective privacy implicit in the concept of “experience” makes it devilishly hard to operationalize, let alone explain using a garden-variety scientific model. The usefulness of operational definitions in science is proportional to the extent they have a solid basis in physical reality. The worry about notions like qualia or what-it-is-likeness is that they are defined in a way that has no obvious conceptual connection to physical reality, and appeals to intuition to ostensively define the concept are famously prone to killing debate rather than encouraging it (Dennett, 1988; Dennett, 1995). This stands in contrast to the loose definition of genes as involving transfers of information between generations, since we can easily point to the physical instantiations of each generation in order to constrain possible research strategies. When hunting for correlates of “experience itself”, we have no analogous way to point out what we are trying to explain. We are forced to rely on our famously finicky capacity for introspection (Pereboom, 1994; Schwitzgebel, 2007). But if this phenomenon is supposed to be shared with creatures only capable of indirect behavioral reports like button pressing, we cannot just open their skulls and point out the locus where experience “all comes together” (Dennett, 1991).

It is important to note that I am not objecting to the use of operational definitions in principle. So long as our chain of operational definitions is linked to instrumental operations that have a well-known link to physical reality, we can “ground” more abstract operational definitions in chains of more concrete ones, allaying skeptical worries that our theoretical concepts are only too loosely connected to reality. Shared scientific knowledge about physical reality places constraints on what operational definitions will be most productive.

The worry about operationalizing a notion as vaguely defined as “consciousness” is that different theorists can wind up using radically different operational definitions in order to measure consciousness. For example, theoretical biologist Giulio Tononi has recently developed an operational definition for picking out consciousness that relies on the notion of integrated information (Tononi, 2008). Counter-intuitively, the definition of integrated information winds up attributing the lowest level of consciousness even to simple inorganic entities like photodiodes. Compare this stipulated definition with more neurobiologically focused ones (see Chalmers, 2000, for a catalogue), and we are left with the question of whether these competing “theories of consciousness” are explaining the same thing.

Why not let a thousand flowers bloom? Suppose we encourage a definitional pluralism whereby we let competing researchers use widely different operational definitions and simply let them compete in the marketplace of ideas. The problem, however, is that it’s unclear how we could measure “success” in the science of consciousness. Given the subjective privacy of what we are trying to explain, we would have no independent measure of construct validity other than the very stipulated criteria up for comparison. In other words, there is no “neutral” way of picking out the phenomena that doesn’t beg the question against competing operational definitions. And given that competing operational definitions of consciousness will lead rival theorists to give inconsistent interpretations of the same experimental results, we are left with the troubling possibility that debates in the science of consciousness are nothing more than verbal disputes.



5. Conclusion

Whether you think the search for NCCs has been a provisional success or doomed from the start depends largely on your prior assumptions about the nature of consciousness. If you think that consciousness represents a well-circumscribed natural kind itching for scientific explanation, then you will probably laud the experimental paradigms described above as indicators of slow but steady progress towards a more fundamental theory. However, if you are skeptical that the English term “consciousness” picks out any well-defined natural kind (Allport, 1988; Wilkes, 1988), then you might have reservations about the success of a program that is hunting for something that can only be found using operational definitions that are either untethered to physical reality or stipulated on an arbitrary basis (e.g. why should Tononi’s notion of integrated information be any more plausible a criterion than Rosenthal’s notion of higher-order thoughts?). In the end, the science of consciousness will succeed or fail in proportion to whether it can clearly delineate what it is trying to explain.




Allport, A. (1988). What concept of consciousness? In A. Marcel & E. Bisiach (Eds.), Consciousness in contemporary science (pp. 159-182). New York: Oxford University Press.

Baars, B. (1988). A Cognitive Theory of Consciousness. Cambridge: Cambridge University Press.

Bayne, T., & Chalmers, D. (2003). What is the unity of consciousness. In A. Cleeremans (Ed.), The unity of consciousness: Binding, integration, and dissociation (pp. 23-58). New York: Oxford University Press.

Bayne, T., Cleeremans, A., & Wilken, P. (Eds.). (2009). The Oxford Companion to Consciousness. New York: Oxford University Press.

Blackmore, S. (2002). There Is No Stream of Consciousness. What is all this? What is all this stuff around me; this stream of experiences that I seem to be having all the time? Journal of Consciousness Studies, 9(5-6), 17-28.

Blake, R., & Logothetis, N. K. (2002). Visual competition. Nature Reviews Neuroscience, 3(1), 13-21.

Block, N. (1995). A confusion about a function of consciousness. Behavioral and Brain Sciences, 18(2), 227–247.

Block, N. (2001). Paradox and cross purposes in recent work on consciousness. Cognition, 79(1-2), 197-219.

Block, N. (2007). Consciousness, accessibility, and the mesh between psychology and neuroscience. Behavioral and Brain Sciences, 30(5), 481-498.

Block, N. (2009). Comparing the Major Theories of Consciousness. In M. Gazzaniga (Ed.), The Cognitive Neurosciences IV. Cambridge, MA: MIT Press.

Breitmeyer, B. G., & Ogmen, H. (2000). Recent models and findings in visual backward masking: A comparison, review, and update. Attention, Perception, & Psychophysics, 62(8), 1572-1595.

Chabris, C., & Simons, D. (2010). The Invisible Gorilla. New York: Crown.

Chalmers, D. (1996). The Conscious Mind: In Search of a Fundamental Theory. Oxford: Oxford University Press.

Chalmers, D. J. (2000). What is a neural correlate of consciousness. In T. Metzinger (Ed.), Neural correlates of consciousness: Empirical and conceptual questions (pp. 17-39). Cambridge, MA: MIT Press.

Churchland, P. S. (1988). Reduction and the neurobiological basis of consciousness. In A. J. Marcel
& E. Bisiach (Eds.), Consciousness in Contemporary Science. New York: Oxford University Press.

Crick, F., & Koch, C. (1990). Towards a neurobiological theory of consciousness. Seminars in the Neurosciences, 2, 203.

Dehaene, S., Changeux, J.-P., Naccache, L., Sackur, J., & Sergent, C. (2006). Conscious, preconscious, and subliminal processing: a testable taxonomy. Trends in Cognitive Sciences, 10(5), 204-211.

Dehaene, S., Naccache, L., Cohen, L., Bihan, D. L., Mangin, J.-F., Poline, J.-B., et al. (2001). Cerebral mechanisms of word masking and unconscious repetition priming. Nature Neuroscience, 4(7), 752-758.

Dennett, D. (1991). Consciousness Explained. Boston: Little, Brown and Co.

Dennett, D. C. (1988). Quining Qualia. In A. J. Marcel & E. Bisiach (Eds.), Consciousness in Contemporary Science. New York: Oxford University Press.

Dennett, D. C. (1995). Commentary on Ned Block, “On a Confusion about a Function of Consciousness,”. Behavioral and Brain Sciences, 18(2), 252-253.

Dretske, F. (1993). Conscious Experience. Mind, 102(406), 263-283.

Flanagan, O., & Polger, T. (1995). Zombies and the function of consciousness. Journal of Consciousness Studies, 2(4), 313-321.

Goldman, A. I. (1993). Consciousness, Folk Psychology, and Cognitive Science. Consciousness and Cognition, 2(4), 364-382.

Goodale, M. A., & Milner, A. D. (1992). Separate visual pathways for perception and action. Trends in neurosciences, 15(1), 20-25.

Grimes, J. (1996). On the failure to detect changes in scenes across saccades. In K. Akins (Ed.), Perception. Vancouver Studies in Cognitive Science. New York: Oxford University Press.

Jack, A. I., & Shallice, T. (2001). Introspective physicalism as an approach to the science of consciousness. Cognition, 79(1–2), 161-196.

Jackson, F. (1982). Epiphenomenal qualia. The Philosophical Quarterly, 32(127), 127-136.

Kentridge, R. W., Heywood, C. A., & Weiskrantz, L. (1999). Attention without awareness in blindsight. Proceedings of the Royal Society of London. Series B: Biological Sciences, 266(1430), 1805-1811.

Koch, C. (2004). The Quest for Consciousness: A Neurobiological Approach. Englewood, CO: Roberts & Company Publishers.

Koch, C. (2012). Consciousness: Confessions of a Romantic Reductionist. Cambridge, MA: MIT Press.

Koechlin, E., Naccache, L., Block, E., & Dehaene, S. (1999). Primed numbers: Exploring the modularity of numerical representations with masked and unmasked semantic priming. Journal of Experimental Psychology: Human Perception and Performance, 25(6), 1882.

Lau, H., & Rosenthal, D. (2011). Empirical support for higher-order theories of conscious awareness. Trends in cognitive Sciences, 15(8), 365-373.

Levin, D. T., Momen, N., Drivdahl IV, S. B., & Simons, D. J. (2000). Change blindness blindness: The metacognitive error of overestimating change-detection ability. Visual Cognition, 7(1-3), 397-412.

Levine, J. (1983). Materialism and qualia: The explanatory gap. Pacific Philosophical Quarterly, 64(October), 354-361.

Lumer, E. D., Friston, K. J., & Rees, G. (1998). Neural Correlates of Perceptual Rivalry in the Human Brain. Science, 280(5371), 1930-1934.

Lumer, E. D., & Rees, G. (1999). Covariation of activity in visual and prefrontal cortex associated with subjective visual perception. Proceedings of the National Academy of Sciences, 96(4), 1669-1673.

Mack, A., & Rock, I. (1998). Inattentional Blindness. Cambridge, MA: MIT Press.

Mendola, J. D., Dale, A. M., Fischl, B., Liu, A. K., & Tootell, R. B. H. (1999). The representation of illusory and real contours in human cortical visual areas revealed by functional magnetic resonance imaging. The Journal of Neuroscience, 19(19), 8560-8572.

Metzinger, T. (Ed.). (2000). Neural correlates of consciousness: Empirical and conceptual questions. Cambridge, MA: MIT Press.

Morita, T., Kochiyama, T., Okada, T., Yonekura, Y., Matsumura, M., & Sadato, N. (2004). The neural substrates of conscious color perception demonstrated using fMRI. NeuroImage, 21(4), 1665-1673.

Naccache, L., Blandin, E., & Dehaene, S. (2002). Unconscious masked priming depends on temporal attention. Psychological Science, 13(5), 416-424.

Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435-450.

Pereboom, D. (1994). Bats, Brain Scientists, and the Limitations of Introspection. Philosophy and Phenomenological Research, 54(2), 315-329.

Prinz, J. (2012). The Conscious Brain. New York: Oxford University Press.

Ramachandran, V. S., & Hirstein, W. (1997). Three laws of qualia: What neurology tells us about the biological functions of consciousness. Journal of Consciousness Studies, 4(5-6), 5-6.

Rosenthal, D. M. (1997). A theory of consciousness. In N. Block, O. Flanagan, & G. Güzeldere (Eds.), The Nature of Consciousness: Philosophical Debates. Cambridge, MA: MIT Press.

Rosenthal, D. M. (2005). Consciousness and Mind. Oxford: Oxford University Press.

Schwitzgebel, E. (2007). Do you have constant tactile experience of your feet in your shoes? Or is experience limited to what’s in attention? Journal of Consciousness Studies, 14(3), 5-35.

Schwitzgebel, E. (2011). Perplexities of Consciousness. Cambridge, MA: MIT Press.

Sheinberg, D. L., & Logothetis, N. K. (1997). The role of temporal cortical areas in perceptual organization. Proceedings of the National Academy of Sciences, 94(7), 3408-3413.

Sheth, B. R., Sharma, J., Rao, S. C., & Sur, M. (1996). Orientation maps of subjective contours in visual cortex. Science, 274(5295), 2110-2115.

Simons, D. J., & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events. Perception, 28(9), 1059-1074.

Simons, D. J., & Rensink, R. A. (2005). Change blindness: Past, present, and future. Trends in cognitive Sciences, 9(1), 16-20.

Smart, J. J. C. (1959). Sensations and brain processes. The Philosophical Review, 68(2), 141-156.

Tononi, G. (2008). Consciousness as Integrated Information: A Provisional Manifesto. Biological Bulletin, 215(3), 216-242.

Tononi, G., & Koch, C. (2008). The neural correlates of consciousness. Annals of the New York Academy of Sciences, 1124(1), 239-261.

Treisman, A. (1964). Monitoring and storage of irrelevant messages in selective attention. Journal of Verbal Learning and Verbal Behavior, 3(6), 449-459.

Tsuchiya, N., & Koch, C. (2005). Continuous flash suppression reduces negative afterimages. Nature Neuroscience, 8(8), 1096-1101.

Weiskrantz, L., Barbur, J. L., & Sahraie, A. (1995). Parameters affecting conscious versus unconscious visual discrimination with damage to the visual cortex (V1). Proceedings of the National Academy of Sciences, 92(13), 6122-6126.

Wilkes, K. V. (1988). ——, yìshì, duh, um, and consciousness. In A. J. Marcel & E. Bisiach (Eds.), Consciousness in Contemporary Science. New York: Oxford University Press.


