15857 Personality and Control u1:Eliot Werner Publications 5/7/15 6:51 AM Page 15

CHAPTER 1

The Conscious Control of Behavior
Revisiting Gray’s Comparator Model

Philip J. Corr
Ezequiel Morsella

INTRODUCTION

This chapter was inspired by the authors’ admiration for Jeffrey Gray (1934–2004), a scientist who contributed much to our understanding of the mind/brain—including the study of the elusive relationship between consciousness1 and behavioral control, which is the focus of this chapter. One of us (PJC) had the good fortune of being a protégé of Gray; the other (EM) had the great pleasure not only of reading and benefiting from Gray’s theorizing, but of having once met him in New York to discuss his ideas for several hours—thanks to John Bargh, who generously arranged the meeting. During this wonderful conversation, which took place more than a decade ago, it became apparent to both the distinguished scientist and the young Ph.D. that Gray’s (1995) comparator model of conscious processing (presented in Behavioral and Brain Sciences) could explain more about consciousness and behavioral control than even envisioned by its author, which was already quite a bit—including disparate phenomena such as the contents of consciousness (Gray, 1995), the neuropsychology of anxiety (Gray, 1982a, 1982b; Gray & McNaughton, 2000), and the positive symptoms of acute schizophrenia (Gray, 1998; Gray, Feldon, Rawlins, Hemsley, & Smith, 1991). These extensions of Gray’s theory of the behavioral inhibition system are the focus of this chapter.

Personality and Control edited by Philip J. Corr, Małgorzata Fajkowska, Michael W. Eysenck, and Agata Wytykowska. Eliot Werner Publications, Clinton Corners, New York, 2015.
GRAY’S COMPARATOR MODEL

To appreciate the insights discussed on that day, now many years ago, it is important to understand Gray’s comparator model of consciousness. The model explains, among other things, the lateness of conscious processing (Gray, 2004; Libet, 2004; Velmans, 1991, 2000), error detection in behavioral control, and most importantly how some contents—but not others—are selected to enter consciousness. Perhaps no one explained the model better than Gray (2002) himself.

The essential computational function discharged by the comparator is to compare, non-consciously and quite generally, information currently received via all thalamocortical sensory pathways (up to the level of neocortical analysis) with a prediction as to what that information should be. The prediction is based jointly upon previous stimulus-stimulus and response-stimulus regularities (stored as memories) under circumstances similar to those operating now; the circumstances “operating now” are themselves defined by the output of the comparator at the preceding comparison process. In addition, the comparator takes account of the subject’s ongoing motor program, as what the world will be like in the next moment depends upon what the subject is doing in this one. These processes occur on a time base of the order of 100 ms from the termination of one process of comparison to termination of the next. The output from the comparison process selects a series of items in the neocortical description of the sensory world in the light of their novelty/familiarity and predictedness/unpredictedness (these concepts are not identical to one another). . . . The selection is biased towards items which are novel, either because they occur despite not being expected or because they fail to occur despite being expected; and towards items which are goals or sub-goals for an ongoing motor program.
The selected items are reactivated by feedback from the comparator system to those areas of the sensory neocortex (visual, auditory, somatosensory, etc.) in which they have just been non-consciously analysed. It is this reactivation by feedback from the comparator that selects these items for entry into consciousness. (pp. 4–5)

As Gray noted, similar ideas had been proposed before (e.g., by Jackendoff, 1987; Miller, Galanter, & Pribram, 1960; Neisser, 1967). However, until Gray’s own model, no “nuts and bolts” theory existed that contained as much specificity regarding both the component processes of consciousness (e.g., detecting, comparing, and matching) and its neuroanatomical substrates (see Gray, 1995, for hypotheses about the hippocampus and neocortex in conscious processing).

1. Here we are speaking of the most basic kind of consciousness. This kind of consciousness, also referred to as “sentience” (Pinker, 1997), “phenomenal state” (Tye, 1999), “qualia” (Gray, 2004), and subjective experience, has perhaps been best defined by the philosopher Thomas Nagel (1974), who proposed that an organism possesses subjective experiences if there is something it is like to be that organism—something it is like, for example, to be human and experience pain, love, or breathlessness. Similarly, Block (1995) claimed, “[T]he phenomenally conscious aspect of a state is what it is like to be in that state” (p. 227).

According to the model, unconscious motor programs (discussed below) lead to expressed action, which then leads to action effects—which are perceptual in nature—that are then compared with the anticipated action effects, which themselves are perceptual-like memories based on previous experience (Gray, 1995). The stages of processing in situations in which the comparator detects a mismatch could be conceptualized as follows.
Unconscious motor programs [Stage 1] → perceptual-like action effects [Stage 2] → comparator process [Stage 3] → mismatch detection [Stage 4] → entry into consciousness of mismatched, perceptual-like information and error signals (along with other goal-relevant information) [Stage 5].

As is clear in this sequence, consciousness occurs late, as when one withdraws one’s hand reflexively from a hot pot. In this case consciousness regarding the action is experienced only after the pain withdrawal action is already mediated successfully, albeit unconsciously, by the nervous system (Gray, 2004). According to Gray (2002), the pain (the quale that is a consequence of late error detection) influences not so much the nature of ongoing action at the moment (for the appropriate action to the situation already took place in an unconsciously mediated manner), but future actions transpiring in a similar context. In this way entry into consciousness influences future behavior in a manner that is not well understood (see treatment in Corr, 2011).

In this comparator framework, when outcomes from actions do not match expected outcomes, representations of the salient features of these unexpected outcomes enter consciousness—as, for example, when we learn that the pot was hotter than expected. This representation occurs also for actions that do not involve pain: any outcome mismatch has the potential to have its salient features represented in the contents of conscious awareness. For example, imagine the case in which a child intended to say something but then, unexpectedly and for the first time in its life, found itself coughing. The child becomes very much aware of this cough experience, long after the motor plans engendering the cough behavior have transpired. It was while discussing mechanisms such as these, during our conversation more than a decade ago, that something became clear.
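The five-stage mismatch sequence above lends itself to a toy computational sketch. The following Python fragment is purely illustrative: the function names, the dictionary representation of action effects, and the exact-match comparison rule are our own simplifications, not part of Gray's model.

```python
# Illustrative sketch only: a toy version of Gray's five-stage comparator
# cycle (motor program -> action effects -> comparison -> mismatch ->
# consciousness). All names and the exact-match rule are our own glosses.

def comparator_cycle(motor_program, world, expected_effects):
    """Run one comparator cycle; return the contents entering consciousness."""
    # Stage 1: the unconscious motor program is expressed as action.
    action = motor_program()
    # Stage 2: the action produces perceptual-like action effects.
    actual_effects = world(action)
    # Stages 3-4: compare actual vs. predicted effects; collect mismatches,
    # including expected effects that failed to occur despite being expected.
    mismatches = {k: v for k, v in actual_effects.items()
                  if expected_effects.get(k) != v}
    missing = {k: None for k in expected_effects if k not in actual_effects}
    mismatches.update(missing)
    # Stage 5: only mismatched (novel/unpredicted) items enter consciousness.
    return mismatches

# Toy example: grasping a pot expected to be warm, found to be painfully hot.
conscious = comparator_cycle(
    motor_program=lambda: "grasp pot",
    world=lambda act: {"touch": "painfully hot", "grip": "closed"},
    expected_effects={"touch": "warm", "grip": "closed"},
)
print(conscious)  # {'touch': 'painfully hot'}
```

The matched effect (the closed grip) never enters the returned set, mirroring the model's claim that well-predicted outcomes are processed without entering consciousness.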
When the sequence of the comparator is reversed, such that the stages flow from 5 to 1 rather than from 1 to 5, the model resembles ideomotor theory (Greenwald, 1970; Harleß, 1861; Hommel, 2009; Hommel, Müsseler, Aschersleben, & Prinz, 2001; James, 1890/1950; Lotze, 1852), a historic approach illuminating how behavior can be controlled voluntarily. Interestingly, the ideomotor approach developed independently of comparator frameworks, but the two have much in common—as we will now discuss.

IDEOMOTOR APPROACHES TO BEHAVIORAL CONTROL

In ideomotor approaches one’s conscious knowledge regarding action production and control is limited to the perceptual consequences of expressed action (or action effects). From this standpoint motor control—which specifies the muscles that should be activated at a given time in order to express an action (e.g., flexing a finger)—is largely unconscious (see evidence in Fecteau, Chua, Franks, & Enns, 2001; Goodale & Milner, 2004; Grossberg, 1999; Heath, Neely, Yakimishyn, & Binsted, 2008; Jeannerod, 2006; Liu, Chua, & Enns, 2008; Rosenbaum, 2002; Rossetti, 2001). In this way, before an act the mind is occupied with perception-like representations of what that act is to be. As William James stated, “In perfectly simple voluntary acts there is nothing else in the mind but the kinesthetic idea . . . of what the act is to be” (James, 1890/1950, p. 771). These action-generated perceptual effects include bodily states (e.g., a flexed finger) or remote effects in the external world, such as the change in position of a lever (Hommel, 1998; Hommel & Elsner, 2009; Jordan, 2009). Harleß (1861) referred to these perceptual consequences of a given action as the Effektbild (i.e., the picture of the effect).
Motor Programming as an Unconscious Process

From the perspective of ideomotor theory, one is unconscious of efference generation to the muscles. According to a minority of theorists, one is conscious of the efference to the muscles (what Wundt called the feeling of innervation; see James, 1890/1950). Although this efference was believed to be responsible for action outcomes (see a review in Sheerer, 1984), Wundt himself later abandoned the feeling-of-innervation hypothesis (Klein, 1970). Following the controversy James (1890/1950) concluded, “There is no introspective evidence of the feeling of innervation” (p. 775).

In everyday acts such as grasping a handle, one is unconscious of the efference that is sent to the muscles. This efference dictates which fibers should be activated at which time. Highly flexible and “online” adjustments are made unconsciously during an act such as grasping a fruit (Rosenbaum, 2002). Because the spatial relationship between the objects of the world and one’s body is seldom fixed (e.g., a fruit is sometimes at left or right), each time an action is performed, new motor programs must be generated unconsciously to deal with the peculiarities of each setting (Rosenbaum, 2002). One is unconscious of these complicated programs (see compelling evidence in Johnson & Haggard, 2005) but—as noted by James—is often aware of their proprioceptive and perceptual consequences (e.g., perceiving the hand grasping; Gottlieb & Mazzoni, 2004; Gray, 2004).2

2. See Berti and Pia (2006) for a review of motor awareness and its disorders.

An influential case study revealing the unconscious nature of motor control was reported by Milner and Goodale (1995). In this case study Patient D. F., following a brain lesion, displayed a striking dissociation between action control and conscious perception. Patient D. F.
suffered from a kind of visual form agnosia and was incapable of, for example, reporting the orientation of a tilted slot. However, this patient could nonetheless insert an object into the slot, much as one deposits a letter into a mailbox. This is not an isolated case. Other patients with lesions in the perception pathway (the ventral-visual system; Goodale & Milner, 2004) cannot identify (recognize) objects but are still able to reach for them and manipulate them when prompted to do so. From such observations Milner and Goodale (1995) proposed that conscious perception and action control are dissociable systems in the brain. In support of this conclusion, it is documented that there are patients who—because of a brain lesion—may be able to correctly identify an object (e.g., an object held up to them by an experimenter), but may be unable to reach for it correctly based on its spatial orientation (e.g., whether the orientation is horizontal or vertical). Thus one group exhibits appropriate action tendencies toward an object in the absence of consciousness about that object (i.e., action without perception), while the other group is conscious of the object but cannot act appropriately toward it (i.e., perception without action).

Dissociations between action control and consciousness are found not only in neurological populations, but also in neurologically intact populations. First, such a dissociation is observed in how neurologically intact subjects respond to visual illusions (Wraga, Creem, & Proffitt, 2000).
When responding motorically to such illusions, although subjects’ conscious self-reports reflect the illusion that one circle appears larger than another in the Ebbinghaus/Titchener illusion, the manual behavior of subjects toward the visual objects responsible for the illusion is accurate and does not reflect what subjects report.3

3. For arguments against the notion of perception-action dissociations, see Cooper, Sterling, Bacon, and Bridgeman (2012), Franz, Gegenfurtner, Bülthoff, and Fahle (2000), and Jeannerod (2003). Stöttinger and Perner (2006) conclusively demonstrated the dissociation using an illusion (the diagonal illusion) that is free of the kinds of limitations found in previous experiments.

In support of these conclusions stemming from research on illusions, there are many findings revealing that one can be unconscious of the adjustments that are made “online” as one performs a motor act (Fecteau et al., 2001; Fourneret & Jeannerod, 1998; Heath et al., 2008; Liu, Chua, & Enns, 2008; Rossetti, 2001). Second, the dissociation is supported by research demonstrating not only the unconscious guidance of motor control, but the unconscious learning of motor sequences. In these experiments (see review in Taylor & Ivry, 2013), subjects are trained to perform a series of key presses with their fingers, much as piano players play a sequence of keys to perform a song. Unbeknownst to subjects, some sequences are repeated more times than other sequences. The subjects demonstrate a performance benefit for these repeated sequences, even though they are unaware that these sequences were repeated. This type of effect has been construed as a case of implicit procedural learning. Implicit motor learning is also evidenced in certain forms of amnesia in which a patient, such as the famous Patient H. M.
(Milner, 1966), shows a performance benefit from extensive rehearsal even though the patient cannot remember—and is thus unconscious of—the rehearsal episodes that led to the performance benefit.

Perceptual Representations of Action Consequences Can Direct Future Action

Ideomotor theory also proposes that when these perceptual-like representations are activated in the future, they automatically activate the unconscious motor programs responsible for enacting the action that led to them. For example, when holding in mind the image of flexing one’s finger, the image of the action activates the motor programs that would give rise to the action. In short, activation of the (perceptual-like) representation of action effects leads to the automatic expression of the associated action. According to James (1890/1950), this form of ideomotor action must always take place—unless, that is, one simultaneously has activated in mind the perceptual consequences of an incompatible action. From this standpoint mere thoughts of action effects produce impulses that, if not curbed or controlled by “acts of express fiat” (i.e., the representation of incompatible action effects), result in the performance of those actions. Thus James emphasized that the image of the sensorial effects of an action leads to the corresponding action. Of importance in this framework is that there is no central homunculus preferring to realize one action effect over another: the process is effortless, automatic, and without any knowledge of the motor programs involved. Rather, activation of the representations of action effects leads to those actions, unless there is also the activation of representations of incompatible action effects. To take one example, when one imagines one’s finger moving but decides not to move the finger, it is only because—when imagining the former—one also had activated the idea of not moving the finger, which is an incompatible idea.
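James's rule above (an activated action-effect image produces its action unless an incompatible effect image is simultaneously active) can be sketched as a toy program. The representation names, the pairwise incompatibility list, and the lookup table mapping effect images to motor programs are hypothetical simplifications of our own, not part of the historical theory.

```python
# Illustrative sketch only: the Jamesian ideomotor rule that an activated
# action-effect representation triggers its motor program automatically,
# unless an incompatible effect representation is also active ("express fiat").

def ideomotor_step(active_effect_images, incompatible_pairs, effect_to_motor):
    """Return the actions expressed given the currently active effect images."""
    expressed = []
    for image in active_effect_images:
        vetoed = any(
            {image, other} in incompatible_pairs
            for other in active_effect_images if other != image
        )
        if not vetoed:
            # The unconscious motor program fires automatically; no homunculus
            # chooses between effect images.
            expressed.append(effect_to_motor[image])
    return expressed

effect_to_motor = {"finger flexed": "flex finger",
                   "finger still": "hold finger still"}
incompatible = [{"finger flexed", "finger still"}]

# Imagining only the flexed finger: the action is expressed automatically.
print(ideomotor_step(["finger flexed"], incompatible, effect_to_motor))
# Also holding the incompatible "finger still" idea in mind vetoes the impulse.
print(ideomotor_step(["finger flexed", "finger still"], incompatible, effect_to_motor))
```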
In this way voluntary action can be guided by the activation of the perceptual-like representations of action effects. In some situations this guidance is intentional and is accompanied by the sense of agency (Moore, Wegner, & Haggard, 2009), especially when action outcomes match one’s action goals. According to ideomotor theory, voluntary action control requires memory of previous action effects. The process unfolds as follows.

Activation of conscious, perceptual-like representations of action effects [Stage 1] → activation of unconscious motor programs [Stage 2] → perceptual action effects [Stage 3] → comparator process [Stage 4] → entry into consciousness of, say, mismatching perceptual consequences [Stage 5].

One can appreciate that this resembles the reversed sequence of Gray’s comparator model, which begins not with the conscious action effects, but with the unconscious motor program. (For treatments of how these ideas are related to social cognition, see Johnson & Shiffrar, 2013; Jordan, 2009.) It was implicit in Gray’s model that (always automatic) actions were elicited, or afforded, by stimuli; however, it was never made explicit how such stimuli trigger these actions. Ideomotor theory provides an account of this process and highlights the recursive interplay of automatic processes and controlled (often conscious) processes. Neither process is in exclusive control of behavior; rather, they are joint causal partners in an experience-action system of coordination.

Ideomotor Theory and Mirror Neuron Approaches

In line with ideomotor accounts, contemporary research on mirror neurons (see review in Rizzolatti, Sinigaglia, & Anderson, 2008) suggests that there is overlap in the neural networks involved in (a) the perception of actions (e.g., the perception of actions by others) and (b) the execution of one’s own actions.
It is through such overlap that one can learn to perform actions based on imitation (Rizzolatti et al., 2008). From the perspective of research on mirror neurons, perceptual processing is an inextricable part of action control (Iacoboni, 2005; Jordan, 2009; Miall, 2003).4 From this standpoint voluntary action can be guided by perceptual representations not only of one’s own behaviors, but also of the observed behaviors of others. Consistent with both ideomotor and mirror neuron accounts, Desmurget et al. (2009) concluded in their brain stimulation study (on awake patients undergoing brain surgery for the treatment of epilepsy) that action intentions in perceptual regions may be processed in terms of the perceptual consequences of the intended action (see review of convergent evidence in Jordan, 2009; Miall, 2003). Complementing these findings is research on the role of reafference in action control, which reveals that reafference to perceptual areas of the brain, such as the parietal cortex (Berti & Pia, 2006; Chambon, Wenke, Fleming, Prinz, & Haggard, 2013; Iacoboni, 2005; Miall, 2003), is essential to the control of intentional action.

In summary, theorizing falling under the rubrics of ideomotor, common code, or mirror neuron research supports the counterintuitive hypothesis that perceptual representations about the external world—including those about the behaviors of others—can be a major influence on behavioral control (see discussion in Jordan, 2009). As such, mirror neuron research supplies theoretically important empirical support for the ideomotor theories advanced by William James and others. It is intriguing to see how they can easily be incorporated into Gray’s neuropsychological model of behavioral control and consciousness.
4. It is worth mentioning that this is consistent with contemporary ideomotor models, which propose that perceptual action effects and action codes share the same representational format—hence the description of these accounts as “common code” theories of perception and action (Hommel, 2009).

SUBJECTIVE ASPECTS OF SKINNER’S THREE-TERM CONTINGENCY

Ideomotor theory, Gray’s neuropsychological work, and more recent insights from mirror neuron research highlight the importance of the stimulus-response processes favored by Skinner and other radical behaviorists; their work can now be extended to understanding the machinery hidden in their black boxes. Neglected in traditional ideomotor accounts is the mechanism by which currently experienced favorable outcomes from action production increase the likelihood that only some behaviors are expressed in the future. As far as we know, and in accordance with Loewenstein (1996),5 operant conditioning remains the best mechanistic model to explain this phenomenon—especially when considering Skinner’s (1953) three-term contingency description of operant learning.

From this point of view, the traditional circumstance under which operant conditioning takes place involves three different terms. The first term involves the discriminative stimulus (SD), which is the stimulus that signifies the appropriate context for expressing the operant. In a standard operant conditioning experiment, a lever may be the discriminative stimulus. In everyday life a traffic light signaling green may be a discriminative stimulus. The SD is considered the first term of the three-term contingency. Faced with this SD the organism issues the operant behavior, or response (R). It is important to note in this framework that the operant is not a simple response.
When a rat depresses a lever, the action can be accomplished by the leg or the snout. Similarly, in a game of soccer the operant may be getting the ball into the goal, one way or another. This can be accomplished through several means (e.g., pushing the ball into the goal with the left leg, the right leg, or instead by a header). That the same action goal (which Gray referred to simply as “goals”) can be accomplished through several motoric means is called motor equivalence (Lashley, 1942). The operant (R), learned through trial and error, is the second term of the three-term contingency.

The third and last term of the three-term contingency is the most relevant to our question regarding how behavioral outcomes could reinforce (i.e., increase the likelihood of) some behaviors over others in a specific context (i.e., when faced with a particular SD). This is the outcome variable, or O.

5. Why one choice of action over another is selected in the future remains mysterious. It seems that, regardless of mentalistic or decision-making dynamics, past behavior is still the most reliable predictor of future behavior. Speaking of decision-making approaches following the cognitive revolution, Loewenstein (1996) concludes, “Another area in which the decision making perspective falls short is its treatment of motivation and effort. In the decision paradigm there is no qualitative distinction between choosing, say one car over another, or ‘deciding’ to pick up one’s pace in the last mile of a marathon; both are simply decisions. Years after the decline of behaviorism, behaviorists still offer the most coherent theoretical perspective on motivation and the most sophisticated and comprehensive program of research” (p. 287).

Figure 1. Conscious and unconscious aspects of Skinner’s three-term contingency.

The outcome is either a
reinforcer, which increases the likelihood of a certain operant in the presence of a given SD (e.g., the presentation of something positive or removal of something negative), or a punisher, which decreases the likelihood of a certain operant in the presence of a given SD (e.g., the presentation of something negative or removal of something positive). Skinner (1953) explains that, in the three-term contingency (SD → R → O), it is the outcome term (O) that determines the strength of the association between SD and R. If the outcome is a reinforcer, then the association is strengthened, making R more likely in the presence of SD. If the outcome is a punisher, then the association between SD and R is weakened, such that R is now less likely to occur when SD is presented.

According to Gray, because SD is part of the perceptual world, it is a conscious representation—as is, importantly, the action effect and outcome term (O). However, the motor aspects of R are unconscious. This is consistent with both Gray (1995, 2004) and ideomotor theory. Figure 1 diagrams schematically that which is conscious and unconscious in the three-term contingency and reveals that what falls within consciousness can be described, in terms of its neural processing, as afference or reafference (Sherrington, 1906).

Neural Correlates of the Subjective Aspects of the Three-Term Contingency

Gray (1995, 2004) was more concerned with the nature of the different component processes of the comparator model than with the actual neural substrates of these processes. Speaking of alternative models, Gray (2002) concludes:

Where I stress the hippocampal system, more recent views tend to emphasise the prefrontal, anterior cingulate and/or parietal cortex. The precise anatomical localisation of the computations, however, does not bear upon the issues raised. . . .
What does bear upon these issues is the emphasis in all these models upon the interaction of top-down (contextual) and bottom-up (perceptual) processing as giving rise to the contents of consciousness. (p. 6)

Regarding the conscious aspects of behavioral control in the three-term contingency, it seems that much of the control-related processing in frontal cortex may be unconscious. Consistent with this view and with ideomotor frameworks, it seems—as mentioned above—that one does not have direct, conscious access to motor programs or other kinds of efference generators (Grossberg, 1999; Morsella & Bargh, 2010; Rosenbaum, 2002), including those for language (Levelt, 1989), emotional systems (e.g., the amygdala; Anderson & Phelps, 2002; Öhman, Carlsson, Lundqvist, & Ingvar, 2007), or executive control (Crick, 1995; Suhler & Churchland, 2009). The notion that efference generation is largely unconscious illuminates why, when speaking, one does not always know exactly which words one will utter next (Levelt, 1989; Slevc & Ferreira, 2006).

Neural Correlates of the Perceptual Dimensions of the Control of Action

Regarding conscious awareness of action effects, there is evidence implicating posterior perceptual regions (e.g., parietal areas), rather than frontal areas, as being the key regions responsible for conscious states (see review in Godwin, Gazzaley, & Morsella, 2013).6 In addition, in a study with seven patients undergoing awake brain surgery, direct electrical stimulation of parietal areas of the brain gave rise to the subjectively experienced will (an “urge”) to perform an action. Interestingly, increased activation made subjects believe that they actually executed the corresponding action (e.g., flexing a finger), even though no action was performed (Desmurget et al., 2009; Desmurget & Sirigu, 2010).
Activating frontal motor areas (e.g., in premotor areas) resulted in the performance of the actual action, but surprisingly subjects believed that they did not perform any action (see also Fried et al., 1991). “Stimulation of the premotor region triggered overt mouth and contralateral limb movements. Yet, patients firmly denied that they had moved” (Desmurget et al., 2009, p. 811). These observations are consistent with the age-old Sensorium Hypothesis, first proposed by the great Johannes Müller and then advocated, in one fashion or another, by others (Godwin et al., 2013; Gray, 2004; James, 1890/1950; Müller, 1843). The Sensorium Hypothesis is that action/motor processes are largely unconscious (Goodale & Milner, 2004; Gray, 2004; Grossberg, 1999) and that the contents of consciousness are influenced primarily by perceptual-based (as opposed to action-based) events and processes, which is in direct agreement with Gray (1995) and ideomotor theory.

6. Relevant to this hypothesis is research on the phenomenon of sensory neglect (cf. Graziano, 2001; Heilman, Watson, & Valenstein, 2003).

Consistent with these perspectives, Desmurget et al. (2009) concluded in their brain stimulation study that action intentions in perceptual regions may be processed in terms of the perceptual consequences of the intended action (see reviews of convergent evidence in Jordan, 2009; Miall, 2003). Complementing these findings is research on the role of reafference in action control. This research reveals that a key component of the control of intentional action is reafference to perceptual areas of the brain, such as parietal cortex (Berti & Pia, 2006; Chambon et al., 2013; Iacoboni, 2005; Miall, 2003).
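Very loosely, the reafference-based monitoring described here can be sketched as a comparison between predicted and actual feedback. This is a sketch under our own assumptions: the function names, the single efference-copy prediction, and the binary match rule are illustrative simplifications, not claims about the cited studies.

```python
# Loose sketch of reafference-based action monitoring: an efference copy of
# the motor command yields a predicted reafference, which is compared with
# the actual reafference fed back to perceptual areas. A match supports the
# sense of agency; a mismatch flags the feedback as externally caused.

def monitor_action(motor_command, forward_model, reafference):
    """Compare predicted and actual reafference for one movement."""
    predicted = forward_model(motor_command)  # derived from the efference copy
    return {
        "predicted": predicted,
        "actual": reafference,
        "sense_of_agency": predicted == reafference,
    }

# Toy forward model: flexing a finger should be felt as "finger flexed."
forward_model = {"flex finger": "finger flexed"}.get

print(monitor_action("flex finger", forward_model, "finger flexed"))
print(monitor_action("flex finger", forward_model, "finger moved by experimenter"))
```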
Accordingly, it has been proposed that what characterizes conscious content in neural processing is the notion of perceptual afference (information arising from the world that affects sensory-perceptual systems) or perceptual reafference, such as the proprioceptive information generated during action production.7

Finally, consistent with Gray (1995), the conscious contents (e.g., urges and perceptual representations) of behavioral control are similar to—or perhaps one and the same as—the contents occupying the “buffers” in working memory (WM), a large-scale mechanism that is used to sustain the activation of content-based representations in mind (e.g., for information manipulation) and is intimately related to both consciousness and action production (Baddeley, 2007; Fuster, 2003). Recent developments reveal that WM is intimately related to both action control and consciousness (LeDoux, 2008), as is evident in the title and contents of a treatise on WM—Working Memory, Thought, and Action (Baddeley, 2007). Indeed, perhaps no mental operation is as reliably coupled with conscious processing as WM (LeDoux, 2008). When trying to hold in mind action-related information, a person’s consciousness is consumed by this goal (James, 1890/1950). For instance, when holding a to-be-dialed telephone number in mind (or when gargling with mouthwash for thirty seconds), action-related mental imagery occupies one’s consciousness during the delayed action phase. Similarly, before making an important toast (or, more dramatically, making the toast in an unmastered language), a person has conscious imagery regarding the words to be uttered—much as when an actor rehearses lines for an upcoming scene. In this way, before an act the mind is occupied with perception-like representations of what that act is to be—again, as James (1890/1950) stated, “In perfectly simple voluntary acts there is nothing else in the mind but the kinesthetic idea . . . of what the act is to be” (p. 771).
Thus voluntary action control often occupies both WM and perceptual consciousness. In conclusion, there are several contemporary accounts that are consistent with Gray (1995) and with the age-old hypothesis that the urges associated with intentional action should involve regions of the brain that have historically been associated with perceptual processing. Faced with these insights, one may ask, "What is it about the sensorium?" Proposals have been made regarding why consciousness is associated with Müller's sensorium but not with his motorium. For example, according to one framework about the microarchitecture of cognition (Grossberg, 1999), motor programming involves a neural process called inhibitory matching, which is unconscious and does not involve resonant states (according to Grossberg, all conscious states are resonant states, but not all resonant states are conscious states), whereas perceptual detection often involves excitatory matching, which can be conscious.

7. Sherrington (1906) aptly referred to these two similar kinds of information as exafference, when the source of the information stems from the external world, and reafference, when the source is feedback from overt actions. There is also similar feedback from the activation of internal action plans (e.g., information arising from "corollary discharges" or "efference copies" of our own action plans; Chambon et al., 2013; Christensen et al., 2007; Jordan, 2009; Miall, 2003; Obhi, Planetta, & Scantlebury, 2009).
THE PRIMARY ROLE OF CONSCIOUS PROCESSING

According to several frameworks, the primary function of conscious processing is to integrate processes that would otherwise be unintegrated (Baars, 1988, 2002; Boly et al., 2011; Clark, 2002; Damasio, 1989; Dehaene & Naccache, 2001; Del Cul, Baillet, & Dehaene, 2007; Doesburg, Green, McDonald, & Ward, 2009; Freeman, 1991; Koch, 2004; Llinás & Ribary, 2001; Ortinski & Meador, 2004; Sergent & Dehaene, 2004; Tononi & Edelman, 1988; Uhlhaas et al., 2009; Varela, Lachaux, Rodriguez, & Martinerie, 2001; Zeki & Bartels, 1999). These accounts have fallen under the integration consensus (Morsella, 2005). Evidence for the integration consensus stems from both perception-based and action-based research. Regarding the former, it has been demonstrated that the neural correlates of conscious perceptual representations involve a wider network of brain regions than the neural correlates of unconscious representations (Dehaene & Naccache, 2001; Del Cul et al., 2007). Regarding the latter, the neural correlates of consciously mediated actions involve a more extensive network of regions than the neural correlates of unconsciously mediated actions (Kern, Jaradeh, Arndorfer, & Shaker, 2001; McKay, Evans, Frackowiak, & Corfield, 2003; Ortinski & Meador, 2004).8

8. It has been proposed that it might be more parsimonious to hypothesize that consciousness is not for this form of integration but instead for suppression or for arbitrary stimulus-response mappings, but there are problems with these accounts (see Poehlman, Jantz, & Morsella, 2012).

The "Broadcasting" of Conscious Contents

Germane to Gray (1995), according to the integration consensus, conscious contents are available to various systems, as if the contents were somehow "broadcast." It has been proposed that, for contents to have such communicability, the contents of consciousness must be communicable (Fodor, 1983). For communicability to occur successfully, representations must be in a format that is understood by multiple systems, especially systems involved in behavioral control. Some theorists have proposed that this format must be perceptual in nature, since most brain systems evolved to be sensitive to perceptual-like representations (Morsella, Lanska, Berger, & Gazzaley, 2009). These representations provide information about what Gestalt psychologists described as the "distal" object (Koffka, 1922).9 In terms of neural processing, the representations rely on afference from the external world as well as on perceptuo-semantic knowledge (Most, Scholl, Clifford, & Simons, 2005).

Regarding these perceptual-like representations, one must consider that multiple systems in the brain respond in various ways to the same perceptual stimulus. In the processing of emotion-related stimuli, for example, LeDoux (1996) proposes that the same perceptual afference is processed by a "quick and dirty" subcortical pathway and by a slower, more accurate cortical pathway. In either case it is perceptual afference that is capable of activating analysis by systems that, most likely, evolved at different times and follow distinct operating principles. Independent of these considerations, Fodor (1983) proposed that the most communicable kind of representation in the brain is that of the perceptual kind. Figuratively speaking, perceptual-like information is the common currency or lingua franca of the brain.

The Simulacrum of the World in Consciousness

The idea that consciousness represents a model of the external world, and one's place and inclinations within that world, is not new and has become uncontroversial (Hesslow, 2002; Merker, 2007; Yates, 1985).
However, it should be noted that the representations making up this simulacrum capture only a small subset of what is really "out there." This subset includes objects and other physical information that are of concern (Frijda, 1986) to the organism. We humans, for example, do not represent ultraviolet radiation in our conscious simulacrum. This is of little consequence because such energies are not of terrible concern to human welfare, though detection of such energies is essential for other species. It is also important to add that this simulacrum serves to afford adaptive action (Morsella, Montemayor, Hubbard, & Zarolia, 2010) and is not in the business of accurately representing the external world. Such an evolutionary-based perspective on the nature of the conscious simulacrum begins with the assumption that most mental phenomena are primarily concerned with how the organism should behave at one moment in time. (This was the functionalist approach adopted by William James and others.) As beautifully explained in Gray (2004), the nature of the isomorphism to the world remains unclear with respect to many representational processes. What of the outside world is represented by a "mood"? What does the aversive feeling of holding one's breath represent? What does the pungent flavor of hydrogen peroxide represent? This nasty chemical differs molecularly from water only by the addition of a single oxygen atom, but few would perceive it as "water with a little too much oxygen." Instead the toxic chemical is perceived (or represented) as something that "tastes bad" and should be violently expelled from the body.

9. See research on how perceptual analysis reaches the stage of processing known as "objecthood" in Goodhew, Dux, Lipp, and Visser (2012).
Similarly, in the real (physical) world out there, the colors blue and red are the same kind of thing (electromagnetic radiation) occurring at different frequencies, but no one perceives the color red as a lower-frequency version of the color blue. Rather, color perception is intimately associated with action (and not with the way the world is): it evolved for selecting fruits and detecting camouflaged prey (Morsella et al., 2010). Some representations in vision (e.g., the spatial layout of a garden) do seem isomorphic to what is out there in the real world. In such cases it happens that representing space as accurately as possible does lead to the most adaptive response. But representing how things are is not the primary goal of the conscious simulacrum. Thus representational accuracy is secondary to the adaptive guidance of a response.

CONSCIOUSNESS AND ENCAPSULATION

To summarize the foregoing conclusions, there are several independent accounts—based on different considerations—proposing that conscious representations should be of a perceptual-like nature, which is in complete accord with Gray (1995, 2004). These representations possess an interesting property: encapsulation. Visual illusions such as the Ebbinghaus/Titchener illusion reveal how conscious percepts can be encapsulated (Fodor, 1983), meaning that the representations cannot be affected by beliefs or other conscious contents (e.g., motivation). Even though one knows that the two central circles in the Ebbinghaus/Titchener illusion are of exactly the same size, one cannot help but perceive them as having different diameters. It has been argued that such encapsulation is adaptive (Firestone & Scholl, 2014). One argument is that if perception could be "corrupted" by beliefs and desires, it would lose its value as a system for negotiating a real, external world.
That perceptual processes are encapsulated and independent of voluntary processing is also evident in the phenomenon of earworms (e.g., when one cannot "get a song out of one's head") and in certain forms of psychopathology (e.g., when a patient knows that a percept is a hallucination but the abnormal percept persists in consciousness). Such encapsulation occurs not only in perceptual processing, but also in action control—as in the case of action-related urges, which are triggered in a predictable and insuppressible manner by certain stimuli. For example, when one holds one's breath while underwater or runs barefoot across hot sand, one cannot help but consciously experience the inclination to inhale or to avoid touching the hot sand, respectively (Morsella, 2005). These urges arise despite one's beliefs (e.g., that holding one's breath underwater is a good thing) and desires. The conscious strife triggered by the external stimuli cannot be turned off voluntarily (Morsella, 2005; Öhman & Mineka, 2001). In these cases the externally activated action-related urges are encapsulated from voluntary control. In this way, although inclinations triggered by external stimuli can be behaviorally suppressed, they often cannot be mentally suppressed (Bargh & Morsella, 2008). One can think of many examples in which externally triggered conscious contents are more difficult to control than is overt behavior (cf. Bargh & Morsella, 2010).

Higher-Level Processes Stemming from Encapsulation

Because of encapsulation, suppressed actions and their resultant inclinations can function like internalized reflexes (Vygotsky, 1962), which is consistent with Sherrington's (1941) definition of pain as "the psychical adjunct of an imperative protective reflex" (p. 286).
These internalized reflexes can be co-opted to play an essential, evaluative role in the high-level mental operation of mental simulation. As strategists and engineers know, short of performing an action, the best way of knowing the consequences of a course of action is to simulate it: flight simulators train novice pilots, and the laser scope on a rifle simulates the destination of a shot. One obvious value of simulation is that knowledge of an action outcome is learned without the risks of performing the action (Barsalou, 1999).10 Importantly, the outcome of simulation must be evaluated. Because of encapsulation, most knowledge regarding what is favorable or not is already built into the organism: it is possessed by the very processes that, in the absence of suppression, control behavior directly (Bargh & Morsella, 2008). The encapsulated inclinations respond to simulacra as if they were responding to real, external stimuli. Changes in consciousness, in response to the simulacra constructed voluntarily within our minds, lead to statements such as "I would rather not do or even imagine doing that." From this point one immediately has a sense of whether a simulated bodily action outcome (e.g., an approach-approach situation) is desirable, although such a judgment must take many considerations into account. Accordingly, research has shown that, faced with options, people can experience inexplicable "gut feelings" (or somatic markers; Tranel & Damasio, 1985) reflecting the inclinations of agents whose inner workings and learning histories are opaque to awareness (Öhman & Mineka, 2001).

10. Indeed, some theories propose that the function of explicit, conscious memory is to simulate potential future actions (Schacter & Addis, 2007).

The Three-Term Contingency Redux

Returning to our three-term contingency, that which enters consciousness (the SD and outcome) is not the kind of nervous event that is directly associated with efference generation; instead it is the kind of event that resembles perceptual processing. Gray (1995, 2004) argues that conscious awareness comprises what, in everyday life, we refer to as perception. Regarding why this kind of processing can be associated with consciousness, whereas so many other kinds of processes cannot, Gray (2002) states, "I have no serious idea how such an 'entry into consciousness' actually occurs, but then neither does anyone else; this is the nub of the Hard Problem" (p. 5). Here Gray is referring to the so-called "hard problem of consciousness": How does consciousness arise from physical brain processes? It remains a mystery why this form of perception-related processing in the brain can bring with it subjectivity. Explaining why this is so is one of the greatest puzzles in science, one that has been tackled by some of the greatest scientific minds—including Nobel Laureates Leon Cooper, Francis Crick, Gerald Edelman, Eric Kandel, and Charles Sherrington. At this stage of understanding, the field possesses not even an inkling regarding how physical events in the brain (or anywhere else) can give rise to subjectivity of any kind (Godwin et al., 2013). As philosophers such as Karl Popper long ago noted, physical-objective systems are closed, neither needing nor able to accommodate the subjective material of the mind, especially the contents of the conscious mind that are so central to psychology. The type of approach epitomized by Gray (2004), as well as by the other theorists noted above, of creeping up on this hard problem may be starting to bear fruit. There does seem to be a convergence of theory and data, of which the mirror neuron work is one recent example, that can trace its origins at least back to the ideomotor theory of William James.
More creeping will be needed, but our prey—the understanding of the function and form of the conscious sensorium—may well be within sight, albeit indistinctly, if not within immediate grasp. This is especially the case regarding the outcome term of the three-term contingency, to which we now turn.

Although behaviorism avoided mention of mentalistic variables, consciousness is inevitably encountered when examining the three-term contingency. The outcome term, for example, is said to be a reinforcer if it increases the future likelihood of a behavior. This definition has been criticized for being circular: that which renders something a reinforcer is that it increases the likelihood of a behavior. When one asks, "Why does a reinforcer increase the likelihood of behavior?" the answer is "Because it is a reinforcer." When one then asks, "Why is it a reinforcer?" the answer is "Because it increases the likelihood of a behavior." This line of reasoning is clearly circular, and it seems that something else must be at play for something to be a reinforcer. In everyday life one would argue that this extra something may be pleasure, joy, relief, or some other kind of positive feeling. It is this kind of mentalistic variable that may play a role in operant conditioning and even in decision making (Loewenstein, 1996). Consider, for example, Hull's (1943) law of least work (or law of least effort), which states that, given two means to reach some end, an organism tends to select the means associated with the least effort/aversiveness (see recent treatments in Botvinick, 2007). In this case the negative affect associated with effort (Morsella, Feinberg, Cigarchi, Newton, & Williams, 2011) is one of the variables in the calculation regarding which course of action to take, as Loewenstein (2007) states:
Humans have the capacity, perhaps uniquely, to deliberate about their own behavior and to make trade-offs between near-term and long-term rewards. Such deliberations require consciousness, but consciousness is not enough. To make trade-offs between rewards at different points in time, there has to be something to trade off. The subjective sensations of affective states provide that thing; they allow us to make conscious trade-offs between, for example, the immediate pleasure of indulging in dessert and being thin, or between smoking a cigarette and enjoying better health. We may not make such trade-offs optimally, but were it not for the subjective feelings associated with affective states, we would have no basis for making them at all. (p. 409)

As Loewenstein (2007) further notes, it is an undeniable aspect of conscious life that some states are preferred over others. The pleasure of drinking when thirsty is more positive than enduring pain. As quotidian as these examples are, they remain mysterious. Indeed, this is the very basis of the reinforcement-based theory of personality for which Gray is perhaps most famous (for reviews of this literature, see Corr, 2008; Corr & McNaughton, 2012). But how can a physical system prefer some states over others? That is, how can something be an affinity-based system in which the system prefers—and strives to be in—some states over others? One may say that the northern pole of magnet A prefers to be adjacent to the southern pole of another magnet, but few would propose that magnet A subjectively prefers such a situation. That some physical systems, such as the nervous system, have inclinations of this kind remains outside our current explanatory scope (Shallice, 1972). Chomsky (1988) adds that unlike machines, which are compelled to act in one way or another, we humans can also be merely inclined to act in a certain way.
It is this peculiar state of being inclined to act one way, but not acting overtly, that currently remains unexplained from a mechanistic point of view. Gray (2005; Gray, Williams, Nunn, & Baron-Cohen, 1997) astutely points out that because of such undeniable mentalistic variables (positive and negative subjective states), a strict functionalist account of nervous function, in which understanding is based solely on the association between objective variables (e.g., neural activity and behavior), without invoking the physical processes underlying consciousness (e.g., Dennett, 1991), does not provide a complete picture of nervous function.

Gray's Insights About Consciousness from Synesthesia

To make this argument, Gray (2005) entertains the phenomenon of synesthesia, in which sensory qualities from one modality (e.g., color) are experienced when perceiving stimuli from another modality (e.g., sound). For example, a synesthete may reliably experience the color red when hearing a high-pitched sound or when seeing the letter A (Gray et al., 1997). In synesthesia two people may experience different qualia toward the same object, even though the overt behavior of both people toward that object may be the same. For example, when perceiving an apple, John may have the experience of Rachel's blue, and Rachel may experience what John experiences as red. Yet both Rachel and John refer to the apple as red. According to Gray et al. (1997), this provides evidence against a strict interpretation of functionalism (e.g., Dennett, 1991) in which consciousness is directly tied to overt behavior. Regarding the mystery of consciousness, one may argue that the real mystery is not so much the existence of an affinity-based system, but rather the subjectivity that is associated with the inclinations of such a system.
From this standpoint subjectivity is the unsolved puzzle regarding not only inclinations, but all brain processing—including color perception, music perception, and other conscious states (the very states that make human life worthwhile). As mentioned in note 1, an organism possesses subjectivity (or basic consciousness) if there is something it is like to be that organism. One may argue that the real puzzle is not how a physical thing could prefer to be in one state versus another, but how such a preference could be experienced subjectively, which is part of a larger question: How could anything ever have a subjective experience of any kind? Nevertheless, it seems that the three-term contingency—our best conceptual account explaining how favorable outcomes can increase the likelihood of a given R in the presence of a given SD—requires mention of mentalistic states in order to explain everyday operant conditioning in humans. Why should there be any central states in operant conditioning when all that needs to occur for instrumental learning is for the connection between SD and R to be strengthened?

Function of Central States

The case for central states was made long ago by Neal Miller (1959), who claimed that central states render the nervous system more efficient in terms of its many connections. This proposal becomes obvious in the following scenario. Imagine a simplified nervous system that experiences only two inclinations: to approach and to avoid. Now consider that in the simplified environment of this organism there are eight discriminative stimuli, four of which (such as food) should be approached and four of which (such as noxious stimuli) should be avoided (Figure 2, top left). In addition to these eight discriminative stimuli, there are several different potential motor responses, some for approach and some for avoidance (Figure 2, top right).
Miller reasoned that it would be inefficient, in terms of processing speed and wiring, for there to be direct connections between all SDs and all potential Rs. Instead Miller proposed that it would be more efficient for the inputs and outputs to be connected to a central state (Figure 2, bottom), one for the state of "approach" and one for the state of "avoid." These two states obviously resemble positive and negative affect, respectively (Frijda, 1986). This would be even more true in an actual nervous system, in which there is a larger set of discriminative stimuli and potential responses. Which particular R is selected may depend on contextual details (e.g., a rat freezing, fleeing, or attacking, depending on the context). Once the central states are established, then, depending on context, the appropriate action can be selected. In some contexts an organism should freeze when under threat; in other circumstances an organism should flee (Corr, 2011, 2013). The view that central states may serve such a functional role in the nervous system, and that these states may involve consciousness, whose contents are often "projected" onto an apparently external world (Merker, 2007), is consistent with the aforementioned integration consensus about the function of consciousness. According to the consensus, conscious states integrate information processes that would otherwise be independent.
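Miller's efficiency argument lends itself to a quick back-of-the-envelope calculation. The sketch below is illustrative only: the eight discriminative stimuli (four per inclination) come from the scenario above, while the figure of four responses per inclination is an assumption, since the text says only "several."

```python
# Sketch of Neal Miller's (1959) wiring argument. The stimulus counts follow
# the scenario in the text; the response counts are assumed for illustration.

N_STIMULI_PER_STATE = 4    # four approachable and four avoidable stimuli
N_RESPONSES_PER_STATE = 4  # assumption: responses per inclination

def direct_wiring() -> int:
    # Figure 2, top: every approach stimulus is wired to every approach
    # response, and likewise for avoidance (connections grow multiplicatively).
    return 2 * (N_STIMULI_PER_STATE * N_RESPONSES_PER_STATE)

def central_state_wiring() -> int:
    # Figure 2, bottom: each stimulus feeds its central state ("approach" or
    # "avoid"), and each state feeds its responses (connections grow additively).
    return 2 * (N_STIMULI_PER_STATE + N_RESPONSES_PER_STATE)

print(direct_wiring())         # connections without central states
print(central_state_wiring())  # connections with central states
```

With m stimuli and k responses per inclination, direct wiring costs 2mk connections while the hubbed layout costs 2(m + k), so the saving widens as the repertoire grows, which is the sense in which the argument "would be even more true in an actual nervous system."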
The consensus is consistent with evidence from neurology, neuroscience, and psychology (see Zarolia, Tomory, Rosen, & Morsella, this volume) showing that (a) consciously mediated actions involve more information integration than unconsciously mediated actions; (b) conscious states involve a wider network of brain activations than unconscious states; and (c) conscious perceptual information processing involves the integrating, or binding, of more kinds of information than unconscious perceptual processing (see a review in Godwin et al., 2013). The information involved in the conscious state is available to multiple systems, much as information broadcast on television is available to many viewers, who can act toward the information as they like, depending on their interests. In other words, the systems that have access to the information must evaluate the information and respond to it according to their concerns (Frijda, 1986; Morsella, 2005). Sometimes conscious content can cause systems to provide additional content (which can then too become conscious) or to generate action plans, which can influence behavior directly or indirectly—as in consciously experienced inclinations. It has been proposed that the integration involving consciousness is intimately related to the skeletal muscle effector system (Morsella, 2005), which, by no accident, has been called voluntary muscle.

Figure 2. Schematic of Neal Miller's theorizing that direct connections between discriminative stimuli and responses (top) yield a framework that is less efficient than one invoking central states (bottom).
Interestingly, this effector has been associated with operant conditioning more than any other effector system.11 With all this in mind, we will attempt to synthesize the various frameworks that we have discussed—Gray's comparator model, ideomotor theory, the three-term contingency, Miller's central states, and the integration consensus about conscious processing. In doing so, it becomes clear that these frameworks have much in common.

11. Many kinds of information in the nervous system can be integrated unconsciously. Unconscious integrations can involve smooth muscle, as in the pupillary reflex (see evidence in Morsella, Gray, Krieger, & Bargh, 2009), and intersensory processing. For example, the McGurk effect (McGurk & MacDonald, 1976) involves unconscious interactions between visual and auditory processes: an observer views a speaker mouthing ba while presented with the sound ga. Surprisingly, the observer is unaware of any intersensory interaction, perceiving only da. Similar consciously impenetrable interactions are exemplified in countless other intersensory phenomena (see Morsella, 2005, Appendix A), including the popular ventriloquism effect, in which visual and auditory inputs regarding the source of a sound interact unconsciously (cf. Vroomen & de Gelder, 2003). It appears that the information that requires conscious integration is intimately related to skeletal muscle action, or skeletomotor control (Morsella, Gray et al., 2009; Morsella, Wilson et al., 2009).

A New Synthesis

It is a fairly straightforward process to integrate all of these frameworks into one overarching framework. Let us begin by revisiting the sequence of stages outlined in the comparator model. As mentioned above, in some ways the sequence is the mirror image of that of ideomotor models. In this framework the sequence is as follows: unconscious motor programs [Stage 1] → conscious action effects [Stage 2], which are perceptual-like and can include both afference and reafference → comparator process [Stage 3] → entry into consciousness of, say, mismatching perceptual consequences or an error signal, as seen in the experience of pain [Stage 4].

We will present the sequence again but combine it with the sequence of ideomotor models, such that the actor can willfully repeat an expressed action. This would occur, for example, if, while dancing or playing the drums, one exhibited a strange and unintentional move that led to a favorable outcome, an outcome that should be repeated. In addition, we will add elements of the three-term contingency—specifically, the outcome variable. The resultant sequence is as follows: unconscious motor programs [Stage 1] → conscious action effects [Stage 2] → comparator process [Stage 3] → entry into consciousness of, say, positive affect (outcome) [Stage 4] → activation of the representation of conscious action effects [Stage 5] → unconscious motor programs [Stage 6] → conscious action effects [Stage 7] → comparator process [Stage 8] → entry into consciousness of, say, positive affect (outcome) from repeating the action successfully [Stage 9].

There are several features of the model that are worthy of some reflection. First, it is no accident that Stages 2, 4, 5, 7, and 9—the stages in which information must be evaluated by diverse systems in the brain—involve consciousness, which is consistent with the integration consensus. Second, the conscious states of these stages resemble the central states to which Miller (1959) alluded, especially the outcome variables (e.g., positive or negative affect). Third, consistent with Morsella (2005), the action-related conscious states in Stages 5–9 influence behavior only through skeletal muscle.
Fourth, one can appreciate that efference generation is unconscious and that the actor has conscious access only to the representations of action effects, which are perceptual-like (Gray, 2004), are experienced after action production (as in the comparator model), and can be experienced before action production (as in ideomotor control). Regarding the limited information to which the actor has access, James (1890/1950) proposes that, in behavioral control, all the will can do is pay attention to the representation of one action effect versus another. It is by this allotment of attention that the actor can, through ideomotor mechanisms, influence behavior intentionally: activation (through attention) of the representation of a given action effect will lead to the expression of that action through unconscious motor control. This, of course, fails to occur if there is simultaneous activation of a representation of an incompatible action effect (James, 1890/1950; Lotze, 1852). In this way behavioral control occurs only through mental control, involving perceptual-like representations (Gray, 1995). As a pleasing by-product of this analysis, our model assigns a causal role to attention—viz., to recruit processing resources toward salient areas of the phenomenal field and, by so doing, to appropriate the automatic actions those areas afford. It is perhaps no surprise that we "pay attention" to what we consider to be important.

CONCLUSION

In this chapter we revisited Gray's (1995) pioneering comparator model of consciousness, which focuses on the control of behavior and the contents of consciousness. We then combined this model with ideomotor theory, which complements the comparator model in several respects.
For instance, it explains how the architecture of something like the comparator model could illuminate the mechanisms underlying the intentional control of behavior. We also revisited the integration consensus and approaches in operant conditioning, which explain the function of consciousness and how outcomes can influence future behaviors, respectively. Both the comparator model and ideomotor theory posit that it is perceptual-like content that is conscious. The integration consensus explains why this is so: it is because this is the content that is the most communicable, the most capable of being detected and processed by multiple systems (Bargh & Morsella, 2010; Fodor, 1983). What no account to date has been able to explain is why subjectivity must be part of this process. At this stage of understanding, we propose that more knowledge about the hardware limitations of the nervous system may reveal why something as strange as consciousness was selected in evolution to perform an integrative—albeit circumscribed—role. From this viewpoint, intrapsychic conflict is not something that an engineer would ever program into a von Neumann computer, yet in the course of evolution natural selection may have favored it as a solution for biological systems having slow processing units (i.e., neurons; Livnat & Pippenger, 2006). Perhaps, given the constraints and limits of biological function, consciousness is likewise a clever solution to the challenges faced by the nervous system.

REFERENCES

Anderson, A. K., & Phelps, E. A. (2002). Is the human amygdala critical for the subjective experience of emotion? Evidence of intact dispositional affect in patients with amygdala lesions. Journal of Cognitive Neuroscience, 14, 709–720.

Baars, B. J. (1988). A cognitive theory of consciousness. Cambridge, UK: Cambridge University Press.

Baars, B. J. (2002). The conscious access hypothesis: Origins and recent evidence. Trends in Cognitive Sciences, 6, 47–52.

Baddeley, A. D.
(2007). Working memory, thought, and action. Oxford, UK: Oxford University Press. Bargh, J. A., & Morsella, E. (2008). The unconscious mind. Perspectives on Psychological Science, 3, 73–79. Bargh, J. A., & Morsella, E. (2010). Unconscious behavioral guidance systems. In C. R. Agnew, D. E. Carlston, W. G. Graziano, & J. R. Kelly, (Eds.), Then a miracle occurs: Focusing on behavior in social psychological theory and research (pp. 89–118). New York: Oxford University Press. Barsalou, L. W. (1999). Perceptual symbol systems. Behavioral and Brain Sciences, 22, 577–609. Berti, A., & Pia, L. (2006). Understanding motor awareness through normal and pathological behavior. Current Directions in Psychological Science, 15, 245–250. 15857 Personality and Control u1:Eliot Werner Publications The Conscious Control of Behavior 5/7/15 6:51 AM Page 37 37 Block, N. (1995). On a confusion about a function of consciousness. Behavioral and Brain Sciences, 18, 227–287. Boly, M., Garrido, M. I., Gosseries, O., Bruno, M-A., Boveroux, P., Schnakers, C., et al. (2011). Preserved feedforward but impaired top-down processes in the vegetative state. Science, 332, 858–862. Botvinick, M. (2007). Conflict monitoring and decision making: Reconciling two perspectives on anterior cingulate function. Cognitive, Affective, and Behavioral Neuroscience, 7, 356–366. Chambon, V., Wenke, D., Fleming, S. M., Prinz, W., & Haggard, P. (2013). An online neural substrate for sense of agency. Cerebral Cortex, 23, 1031–1037. Chomsky, N. (1988). Language and problems of knowledge: The Managua lectures. Cambridge, MA: MIT Press. Christensen, M. S., Lundbye-Jensen, J., Geertsen, S. S., Petersen, T. H., Paulson, O. B., & Nielsen, J. B. (2007). Premotor cortex modulates somatosensory cortex during voluntary movements without proprioceptive feedback. Nature Neuroscience, 10, 417–419 Clark, A. (2002). Is seeing all it seems? Action, reason and the grand illusion. Journal of Consciousness Studies, 9, 181–202. Cooper, A. 
D., Sterling, C. P., Bacon, M. P. & Bridgeman, B. (2012). Does action affect perception or memory? Vision Research, 62, 235–240. Corr, P. J. (2008). The reinforcement sensitivity theory (RST): Introduction. In P. J. Corr (Ed.), The reinforcement sensitivity theory of personality (pp. 1–43). Cambridge, UK: Cambridge University Press. Corr, P. J. (2011). Anxiety: Splitting the phenomenological atom. Personality and Individual Differences, 50, 889–897. Corr, P. J. (2013). Approach and avoidance behavior: Multiple systems and their interactions. Emotion Review, 5, 285–290. Corr, P. J., & McNaughton, N. (2012). Neuroscience and approach/avoidance personality traits: A two stage (valuation–motivation) approach. Neuroscience and Biobehavioral Reviews, 36, 2339–2354. Crick, F. (1995). The astonishing hypothesis: The scientific search for the soul. New York: Touchstone. Damasio, A. R. (1989). Time-locked multiregional retroactivation: A systems-level proposal for the neural substrates of recall and recognition. Cognition, 33, 25–62. Dehaene, S., & Naccache, L. (2001). Towards a cognitive neuroscience of consciousness: Basic evidence and a workspace framework. Cognition, 79, 1–37. Del Cul, A., Baillet, S., & Dehaene, S. (2007). Brain dynamics underlying the nonlinear threshold for access to consciousness. PLoS Biology, 5, e260. Dennett, D. C. (1991). Consciousness explained. Boston: Little, Brown. Desmurget, M., Reilly, K. T., Richard, N., Szathmari, A., Mottolese, C., & Sirigu, A. (2009). Movement intention after parietal cortex stimulation in humans. Science, 324, 811–813. Desmurget, M., & Sirigu, A. (2010). A parietal-premotor network for movement intention and motor awareness. Trends in Cognitive Sciences, 13, 411–419. Doesburg, S. M., Green, J. L., McDonald, J. J., & Ward, L. M. (2009). Rhythms of consciousness: Binocular rivalry reveals large-scale oscillatory network dynamics mediating visual perception. PLoS, 4, 1–14. Fecteau, J. H., Chua, R., Franks, I., & Enns, J. 
T. (2001). Visual awareness and the online modification of action. Canadian Journal of Experimental Psychology, 55, 104–110. 15857 Personality and Control u1:Eliot Werner Publications 38 5/7/15 6:51 AM Page 38 Philip J. Corr and Ezequiel Morsella Firestone, C., & Scholl, B. J. (2014). “Top-down” effects where none should be found: The El Greco fallacy in perception research. Psychological Science, 25, 38–46. Fodor, J. A. (1983). Modularity of mind: An essay on faculty psychology. Cambridge, MA: MIT Press. Fourneret, P., & Jeannerod, M. (1998). Limited conscious monitoring of motor performance in normal subjects. Neuropsychologia, 36, 1133–1140. Franz, V. H., Gegenfurtner, K. R., Bülthoff, H. H., & Fahle, M. (2000). Grasping visual illusions: No evidence for a dissociation between perception and action. Psychological Science, 11, 20–25. Freeman, W. J. (1991). The physiology of perception. Scientific American, February, 78–85. Fried, I., Katz, A., McCarthy, G., Sass, K. J., Williamson, P., Spencer, S. S., & Spencer, D. D. (1991). Functional organization of human supplementary motor cortex studied by electrical stimulation. Journal of Neuroscience, 11, 3656–3666. Frijda, N. H. (1986). The emotions. Cambridge, UK: Cambridge University Press. Fuster, J. M. (2003). Cortex and mind: Unifying cognition. New York: Oxford University Press. Godwin, C. A., Gazzaley, A., & Morsella, E. (2013). Homing in on the brain mechanisms linked to consciousness: Buffer of the perception-and-action interface. In A. Pereira and D. Lehmann (Eds.), The unity of mind, brain and world: Current perspectives on a science of consciousness (pp. 43–76). Cambridge, UK: Cambridge University Press. Goodale, M., & Milner, D. (2004). Sight unseen: An exploration of conscious and unconscious vision. New York: Oxford University Press. Goodhew, S. C., Dux, P. E., Lipp, O. V., & Visser, T. A. W. (2012). Understanding recovery from object substitution masking. Cognition, 122, 405–415. 
Gottlieb, J., & Mazzoni, P. (2004). Neuroscience: Action, illusion, and perception. Science, 303, 317–318.
Gray, J. A. (1982a). The neuropsychology of anxiety: An enquiry into the functions of the septo-hippocampal system. Oxford, UK: Oxford University Press.
Gray, J. A. (1982b). Précis of The Neuropsychology of Anxiety: An Enquiry into the Functions of the Septo-Hippocampal System. Behavioral and Brain Sciences, 5, 469–484.
Gray, J. A. (1995). The contents of consciousness: A neuropsychological conjecture. Behavioral and Brain Sciences, 18, 659–676.
Gray, J. A. (1998). Integrating schizophrenia. Schizophrenia Bulletin, 24, 249–266.
Gray, J. A. (2002). The sound of one hand clapping. Psyche, 8, 8–11. http://psyche.cs.monash.edu.au/v8/psyche-8-11-gray.html.
Gray, J. A. (2004). Consciousness: Creeping up on the hard problem. New York: Oxford University Press.
Gray, J. A. (2005). Synesthesia: A window on the hard problem of consciousness. In L. C. Robertson & N. Sagiv (Eds.), Synesthesia: Perspectives from cognitive neuroscience (pp. 127–146). New York: Oxford University Press.
Gray, J. A., Feldon, J., Rawlins, J. N. P., Hemsley, D. R., & Smith, A. D. (1991). The neuropsychology of schizophrenia. Behavioral and Brain Sciences, 14, 1–20.
Gray, J. A., & McNaughton, N. (2000). The neuropsychology of anxiety: An enquiry into the functions of the septo-hippocampal system (2nd ed.). Oxford, UK: Oxford University Press.
Gray, J. A., Williams, S. C. R., Nunn, J., & Baron-Cohen, S. (1997). Possible implications of synaesthesia for the hard question of consciousness. In S. Baron-Cohen & J. E. Harrison (Eds.), Synaesthesia: Classic and contemporary readings (pp. 173–181). Oxford, UK: Blackwell.
Graziano, M. S. A. (2001). Awareness of space. Nature, 411, 903–904.
Greenwald, A. G. (1970). Sensory feedback mechanisms in performance control: With special reference to the ideomotor mechanism. Psychological Review, 77, 73–99.
Grossberg, S. (1999). The link between brain learning, attention, and consciousness. Consciousness and Cognition, 8, 1–44.
Harleß, E. (1861). Der Apparat des Willens [The apparatus of the will]. Zeitschrift für Philosophie und philosophische Kritik, 38, 499–507.
Heath, M., Neely, K. A., Yakimishyn, J., & Binsted, G. (2008). Visuomotor memory is independent of conscious awareness of target features. Experimental Brain Research, 188, 517–527.
Heilman, K. M., Watson, R. T., & Valenstein, E. (2003). Neglect: Clinical and anatomic issues. In T. E. Feinberg & M. J. Farah (Eds.), Behavioral neurology and neuropsychology (2nd ed., pp. 303–311). New York: McGraw-Hill.
Hesslow, G. (2002). Conscious thought as simulation of behavior and perception. Trends in Cognitive Sciences, 6, 242–247.
Hommel, B. (1998). Perceiving one's own actions—and what it leads to. In J. S. Jordan (Ed.), Systems theories and a priori aspects of perception (pp. 143–179). Amsterdam: Elsevier/North-Holland.
Hommel, B. (2009). Action control according to TEC (theory of event coding). Psychological Research, 73, 512–526.
Hommel, B., & Elsner, B. (2009). Acquisition, representation, and control of action. In E. Morsella, J. A. Bargh, & P. M. Gollwitzer (Eds.), Oxford handbook of human action (pp. 371–398). New York: Oxford University Press.
Hommel, B., Müsseler, J., Aschersleben, G., & Prinz, W. (2001). The theory of event coding: A framework for perception and action planning. Behavioral and Brain Sciences, 24, 849–937.
Hull, C. L. (1943). Principles of behavior. New York: Appleton-Century-Crofts.
Iacoboni, M. (2005). Understanding others: Imitation, language, and empathy. In S. Hurley & N. Chater (Eds.), Perspectives on imitation: From mirror neurons to memes (pp. 77–99). Cambridge, MA: MIT Press.
James, W. (1950). The principles of psychology. New York: Dover. (Original work published 1890)
Jackendoff, R. (1987). Consciousness and the computational mind. Cambridge, MA: MIT Press.
Jeannerod, M. (2003). Simulation of action as a unifying concept for motor cognition. In S. H. Johnson-Frey (Ed.), Taking action: Cognitive neuroscience perspectives on intentional acts. Cambridge, MA: MIT Press.
Jeannerod, M. (2006). Motor cognition: What action tells the self. New York: Oxford University Press.
Johnson, H., & Haggard, P. (2005). Motor awareness without perceptual awareness. Neuropsychologia, 43, 227–237.
Johnson, K. L., & Shiffrar, M. (2013). People watching: Social, perceptual, and neurophysiological studies of body perception. New York: Oxford University Press.
Jordan, J. S. (2009). Forward-looking aspects of perception-action coupling as a basis for embodied communication. Discourse Processes, 46, 127–144.
Kern, M. K., Jaradeh, S., Arndorfer, R. C., & Shaker, R. (2001). Cerebral cortical representation of reflexive and volitional swallowing in humans. American Journal of Physiology: Gastrointestinal and Liver Physiology, 280, G354–G360.
Klein, D. B. (1970). A history of scientific psychology: Its origins and philosophical backgrounds. New York: Basic Books.
Koch, C. (2004). The quest for consciousness: A neurobiological approach. Englewood, CO: Roberts & Company.
Koffka, K. (1922). Perception: An introduction to the Gestalt-Theorie. Psychological Bulletin, 19, 531–585.
Lashley, K. S. (1942). The problem of cerebral organization in vision. In H. Kluver (Ed.), Visual mechanisms (Biological Symposia, Vol. 7, pp. 301–322). Lancaster, PA: Cattell Press.
LeDoux, J. E. (1996). The emotional brain: The mysterious underpinnings of emotional life. New York: Simon & Schuster.
LeDoux, J. E. (2008). Emotional colouration of consciousness: How feelings come about. In L. W. Weiskrantz & M. Davies (Eds.), Frontiers of consciousness (pp. 69–130). Oxford, UK: Oxford University Press.
Levelt, W. J. M. (1989). Speaking: From intention to articulation. Cambridge, MA: MIT Press.
Libet, B. (2004). Mind time: The temporal factor in consciousness. Cambridge, MA: Harvard University Press.
Liu, G., Chua, R., & Enns, J. T. (2008). Attention for perception and action: Task interference for action planning, but not for online control. Experimental Brain Research, 185, 709–717.
Livnat, A., & Pippenger, N. (2006). An optimal brain can be composed of conflicting agents. Proceedings of the National Academy of Sciences, 103, 3198–3202.
Llinás, R. R., & Ribary, U. (2001). Consciousness and the brain: The thalamocortical dialogue in health and disease. Annals of the New York Academy of Sciences, 929, 166–175.
Loewenstein, G. (1996). Out of control: Visceral influences on behavior. Organizational Behavior and Human Decision Processes, 65, 272–292.
Loewenstein, G. (2007). Defining affect. Social Science Information, 46, 405–410.
Lotze, R. H. (1852). Medizinische Psychologie oder Physiologie der Seele [Medical psychology or physiology of the soul]. Leipzig, Germany: Weidmann'sche Buchhandlung.
McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264, 746–748.
McKay, L. C., Evans, K. C., Frackowiak, R. S. J., & Corfield, D. R. (2003). Neural correlates of voluntary breathing in humans determined using functional magnetic resonance imaging. Journal of Applied Physiology, 95, 1170–1178.
Merker, B. (2007). Consciousness without a cerebral cortex: A challenge for neuroscience and medicine. Behavioral and Brain Sciences, 30, 63–134.
Miall, R. C. (2003). Connecting mirror neuron and forward models. NeuroReport, 14, 1–3.
Miller, G. A., Galanter, E., & Pribram, K. H. (1960). Plans and the structure of behavior. New York: Henry Holt.
Miller, N. E. (1959). Liberalization of basic S-R concepts: Extensions to conflict behavior, motivation and social learning. In S. Koch (Ed.), Psychology: A study of a science (Vol. 2, pp. 196–292). New York: McGraw-Hill.
Milner, B. (1966). Amnesia following operation on the temporal lobes. In C. W. M. Whitty & O. L. Zangwill (Eds.), Amnesia (pp. 109–133). London: Butterworths.
Milner, A. D., & Goodale, M. (1995). The visual brain in action. New York: Oxford University Press.
Moore, J. W., Wegner, D. M., & Haggard, P. (2009). Modulating the sense of agency with external cues. Consciousness and Cognition, 18, 1056–1064.
Morsella, E. (2005). The function of phenomenal states: Supramodular interaction theory. Psychological Review, 112, 1000–1021.
Morsella, E., & Bargh, J. A. (2010). What is an output? Psychological Inquiry, 21, 354–370.
Morsella, E., Feinberg, G. H., Cigarchi, S., Newton, J. W., & Williams, L. E. (2011). Sources of avoidance motivation: Valence effects from physical effort and mental rotation. Motivation and Emotion, 35, 296–305.
Morsella, E., Gray, J. R., Krieger, S. C., & Bargh, J. A. (2009). The essence of conscious conflict: Subjective effects of sustaining incompatible intentions. Emotion, 9, 717–728.
Morsella, E., Lanska, M., Berger, C. C., & Gazzaley, A. (2009). Indirect cognitive control through top-down activation of perceptual symbols. European Journal of Social Psychology, 39, 1173–1177.
Morsella, E., Montemayor, C., Hubbard, J., & Zarolia, P. (2010). Conceptual knowledge: Grounded in sensorimotor states, or a disembodied deus ex machina? Behavioral and Brain Sciences, 33, 455–456.
Morsella, E., Wilson, L. E., Berger, C. C., Honhongva, M., Gazzaley, A., & Bargh, J. A. (2009). Subjective aspects of cognitive control at different stages of processing. Attention, Perception, and Psychophysics, 71, 1807–1824.
Most, S. B., Scholl, B. J., Clifford, E., & Simons, D. J. (2005). What you see is what you set: Sustained inattentional blindness and the capture of awareness. Psychological Review, 112, 217–242.
Müller, J. (1843). Elements of physiology. Philadelphia: Lea & Blanchard.
Nagel, T. (1974). What is it like to be a bat? Philosophical Review, 83, 435–450.
Neisser, U. (1967). Cognitive psychology. New York: Appleton-Century-Crofts.
Obhi, S., Planetta, P., & Scantlebury, J. (2009). On the signals underlying conscious awareness of action. Cognition, 110, 65–73.
Öhman, A., Carlsson, K., Lundqvist, D., & Ingvar, M. (2007). On the unconscious subcortical origin of human fear. Physiology and Behavior, 92, 180–185.
Öhman, A., & Mineka, S. (2001). Fears, phobias, and preparedness: Toward an evolved module of fear and fear learning. Psychological Review, 108, 483–522.
Ortinski, P., & Meador, K. J. (2004). Neuronal mechanisms of conscious awareness. Neurological Review, 61, 1017–1020.
Pinker, S. (1997). How the mind works. New York: Norton.
Poehlman, T. A., Jantz, T. K., & Morsella, E. (2012). Adaptive skeletal muscle action requires anticipation and "conscious broadcasting." Frontiers in Psychology, 3, 369.
Rizzolatti, G., Sinigaglia, C., & Anderson, F. (2008). Mirrors in the brain: How our minds share actions, emotions, and experience. New York: Oxford University Press.
Rosenbaum, D. A. (2002). Motor control. In H. Pashler (Series Ed.) & S. Yantis (Vol. Ed.), Stevens' handbook of experimental psychology: Vol. 1. Sensation and perception (3rd ed., pp. 315–339). New York: Wiley.
Rossetti, Y. (2001). Implicit perception in action: Short-lived motor representation of space. In P. G. Grossenbacher (Ed.), Finding consciousness in the brain: A neurocognitive approach (pp. 133–181). Amsterdam: Benjamins.
Schacter, D. L., & Addis, D. R. (2007). The cognitive neuroscience of constructive memory: Remembering the past and imagining the future. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 362, 773–786.
Sergent, C., & Dehaene, S. (2004). Is consciousness a gradual phenomenon? Evidence for an all-or-none bifurcation during the attentional blink. Psychological Science, 15, 720–728.
Shallice, T. (1972). Dual functions of consciousness. Psychological Review, 79, 383–393.
Sheerer, E. (1984). Motor theories of cognitive structure: A historical review. In W. Prinz & A. F. Sanders (Eds.), Cognition and motor processes. Berlin: Springer-Verlag.
Sherrington, C. S. (1906). The integrative action of the nervous system. New Haven, CT: Yale University Press.
Sherrington, C. S. (1941). Man on his nature. New York: Macmillan.
Skinner, B. F. (1953). Science and human behavior. New York: Macmillan.
Slevc, L. R., & Ferreira, V. S. (2006). Halting in single word production: A test of the perceptual loop theory of speech monitoring. Journal of Memory and Language, 54, 515–540.
Stöttinger, E., & Perner, J. (2006). Dissociating size representation for action and for conscious judgment: Grasping visual illusions without apparent obstacles. Consciousness and Cognition, 15, 269–284.
Suhler, C. L., & Churchland, P. S. (2009). Control: Conscious and otherwise. Trends in Cognitive Sciences, 13, 341–347.
Taylor, J. A., & Ivry, R. B. (2013). Implicit and explicit processes in motor learning. In W. Prinz, M. Beisert, & A. Herwig (Eds.), Action science (pp. 63–87). Cambridge, MA: MIT Press.
Tononi, G., & Edelman, G. M. (1998). Consciousness and complexity. Science, 282, 1846–1851.
Tranel, D., & Damasio, A. R. (1985). Knowledge without awareness: An autonomic index of facial recognition by prosopagnosics. Science, 228, 1453–1454.
Tye, M. (1999). Phenomenal consciousness: The explanatory gap as cognitive illusion. Mind, 108, 705–725.
Uhlhaas, P. J., Pipa, G., Lima, B., Melloni, L., Neuenschwander, S., Nikolic, D., et al. (2009). Neural synchrony in cortical networks: History, concept and current status. Frontiers in Integrative Neuroscience, 3, 17.
Varela, F., Lachaux, J. P., Rodriguez, E., & Martinerie, J. (2001). The brainweb: Phase synchronization and large-scale integration. Nature Reviews Neuroscience, 2, 229–239.
Velmans, M. (1991). Is human information processing conscious? Behavioral and Brain Sciences, 14, 651–669.
Velmans, M. (2000). Understanding consciousness. London: Routledge.
Vroomen, J., & de Gelder, B. (2003). Visual motion influences the contingent auditory motion aftereffect. Psychological Science, 14, 357–361.
Vygotsky, L. S. (1962). Thought and language (E. Hanfmann & G. Vakar, Trans.). Cambridge, MA: MIT Press.
Wraga, M., Creem, S. H., & Proffitt, D. R. (2000). Perception-action dissociations of a walkable Müller–Lyer configuration. Psychological Science, 11, 239–243.
Yates, J. (1985). The content of awareness is a model of the world. Psychological Review, 92, 249–284.
Zeki, S., & Bartels, A. (1999). Toward a theory of visual consciousness. Consciousness and Cognition, 8, 225–259.