There’s Something about Jerry

Here is how I described Jerry in the earlier post:

Given that we think that there could be unconscious beliefs, consider the following super-scientist, Jerry. Imagine that Jerry has been raised in a special room, much like Mary and Gary, but instead of never seeing red (Mary) or never having a desire (Gary), Jerry has never had a conscious belief. He has had plenty of unconscious beliefs, but none of them have been conscious. Let us imagine that we have finally discovered the difference between conscious and unconscious beliefs and that we have fitted Jerry with a special implant that keeps all of his beliefs unconscious, no matter how much he introspects. Let us also imagine that this device is selective enough so that it wipes out only the beliefs, and so Jerry has plenty of other conscious experiences. He consciously sees red, has pain, wants food, fears that he will be let out of his room one day, wonders what the molecular structure of Einsteinium is, etc.

Now imagine that one of Jerry’s occurrent, unconscious, beliefs suddenly becomes a conscious belief. For the first time in Jerry’s life he has a conscious belief.

Now, I can use Jerry as a way of motivating the intuition behind my HOT implies PAM argument. Let’s call ‘T1’ the time just before Jerry’s belief becomes conscious and ‘T2’ the moment when his belief becomes conscious. According to Rosenthal there is no difference in what it is like for Jerry. What it is like for Jerry at T1 is exactly the same as what it is like for him at T2 even though at T2 he has a conscious mental state he did not have before. 

Now, in the case of a pain we get a very different story. If the pain is unconscious at T1 then there is, according to Rosenthal, nothing that it is like for Jerry to have that pain but at T2 there is something that it is like for Jerry; it is painful for him.  Does this seem right to you?

It doesn’t to me, but this is just an intuition. Luckily, I have an argument which supports the intuition. Rosenthal claims that when we are conscious of ourselves as being in an intentional state (a mental state with intentional properties) there isn’t anything that it is like for us to have that intentional state, but when we are conscious of ourselves as being in a qualitative state (a state with qualitative properties) then there is something that it is like for us to have the qualitative state. But a qualitative property for Rosenthal is just a property that plays a certain functional role for the creature. It is the property in virtue of which the creature is conscious of the physical property that the mental property is homomorphic to. So, the mental qualitative property ‘red’ is the property in virtue of which the creature is conscious of physical red. When we are conscious of ourselves as being in a state with that kind of property it will be like seeing red for us.

So, what then is a belief for Rosenthal? It is a mental state that consists of two parts: a distinctive mental attitude (in this case, an ‘assertive’ one) that is held towards some propositional (a.k.a. intentional) content. So my (occurrent) belief that it is Sunday is composed of an assertive mental attitude towards the intentional content ‘today is Sunday’. Mental states are mental because they make us conscious of something, so what does this make me conscious of? It makes me conscious of the fact, proposition, state of affairs, or whatever you want to call it, that the intentional content of the belief represents. So what reason, flowing from the theory itself rather than from independent intuitions about what SHOULD be the case, dictates that there should be something that it is like for Jerry in one case (the qualitative one) and nothing that it is like for Jerry in the other (the cognitive one)?

Remember, what drew us to the higher-order theory in the first place was a desire to explain qualitative consciousness in a way that is compatible with physicalism and at the same time is philosophically non-mysterious. The purported explanation, viz. that we are conscious of ourselves in a subjectively unmediated way as being in those states, now appears to be inadequate. So to retain the explanatory power of the theory we need to say that there being something that it is like for an organism to have a mental state just is that organism being conscious of itself in a subjectively unmediated way as being in that mental state. Why is there something that it is like? Because we are conscious of ourselves as being in that state. This is the only way that the theory can deliver on its promise of explaining consciousness.


9 thoughts on “There’s Something about Jerry”

  1. Hmmm… I’m beginning to wonder if you and David are using the term “state there’s something it’s like for the subject to be in” differently.

    He means: conscious qualitative state.

    You mean: conscious state.

    It’s analytic for him that there isn’t anything it’s like for one to have conscious beliefs, because beliefs are by definition not qualitative states. (Of course, read “a platitude of folk psychology” for “analytic,” but you get the picture! 😉 )

    So, the qualitative states are conscious because we are conscious of them, and there is something it’s like, because they are conscious qualitative states.

    If the meaning of what it’s like changes to cover all conscious states, his view is ready for that: they are conscious because we are conscious of them, and there’s something it’s like because they are conscious states.

    Note that on both readings, there’s a difference between Jerry before and after: before, no conscious beliefs; after, conscious beliefs. It’s just that folk convention applies the what it’s like label differently.

    The real explanatory question is what accounts for the difference, from the subject’s point of view, between conscious qualitative states and conscious beliefs. And there Rosenthal appeals to the content of the HOT. One makes us aware of being in a state with such-and-such sensory quality (homomorphically construed–not that there’s anything wrong with that 😉 ); the other makes us aware of being in a state of believing that such-and-such is the case. Isn’t that enough to explain what needs to be explained?

    Btw, can Jerry talk? If so, wonder what happens with the expressing/reporting argument and the idea that all verbally expressed thoughts are conscious. Not sure it’s relevant, but it’s interesting in light of Rosenthal’s defense of his view.

    (Houston is cool, but I need some NCDC! Listening to your music posts gave me a flashback buzz…)

  2. Hey Josh!!!!

    I just wrote a REALLY long response that somehow got erased and vanished into cyber air!!!!! Drats!!!

    Let me summarize the points I was making. You say,

    One makes us aware of being in a state with such-and-such sensory quality (homomorphically construed–not that there’s anything wrong with that ); the other makes us aware of being in a state of believing that such-and-such is the case. Isn’t that enough to explain what needs to be explained?

    I say that this is what needs to be explained! This hasn’t explained anything. I want to know why being conscious of a sensory quality makes it like having that sensory quality for me but being conscious of believing that p doesn’t make it like believing that p for me. We need an answer to this question that does not beg the question as to whether or not beliefs are qualitative. I want to know why, according to the HOT theory, there is this HUGE difference. How could it be the case that such a huge difference would be due to the content of the HOT? The only answer I can see is that there is something special about qualitative properties and the HOT just makes us conscious of that. But this is to admit that the higher-order theory can’t explain qualitative consciousness. Is there another explanation?

    Re the talking thing. I think that Jerry could talk. He should be able to say ‘it’s raining’, ‘I wish it would rain’, ‘I hope it will rain’, all the usual stuff. He even should be able to say ‘I think it’s raining’, as he will no doubt notice that he says ‘it’s raining’ on occasion, and no doubt infer that this is because he thinks it is raining, in just the way Rosenthal predicts. This, normally, would lead to Jerry being able to automatically token ‘I think it’s raining’ every time he does in fact think that it is, which will result in a subjectively immediate non-inferential application of the thought, and so to the belief being conscious, but we are imagining that we have contrived to block this further process. So he can think ‘I think it’s raining’ and say ‘I think it’s raining’ without reporting his mental state…or would he count as reporting in this case?

    Re NC/DC…whenever we are in the same city again we should try and play some music!!!!

  3. Hey man, I’m back! Sorry about the long delay between posts.

    I was thinking that Rosenthal does not think there’s a massive difference between conscious sensory states and conscious beliefs, at least none beyond what can be attributed to the way we are conscious of ourselves in HOTs. It’s just that folk convention–or more likely, philosophical convention–calls one of these conscious experiences “a state there’s something it’s like to be in, for the subject” and does not use this label for the other.

    Focusing just on the conscious sensory state case, do you think the HOT theory can’t explain why there’s something it’s like for the subject? Why isn’t “the subject is conscious of herself as being in that state by way of HOT” enough?

    If there’s an acceptable answer to that one, then why isn’t the idea that we ascribe different properties to ourselves enough to explain any difference between the sensory and the belief case?

    So, the difference is in what properties we ascribe to ourselves, and we can mark that difference, if we choose to use the term in this way, by saying there’s something it’s like in one case, but not in the other? There’s a difference, his theory has a story about that difference, and the rest is a matter of what we call things. He wants to have a more limited use of the “what it’s like” phrase than you, but that’s just verbal, as they say.

    Now, if we accept that the phrase should have broader application, to all conscious experiences, then he can say: there is something it’s like whenever we’re conscious of ourselves as being in mental states, in a seemingly unmediated way. He resists this, I think, because he thinks that’s not how Nagel and Block used the terms, and, more importantly, he wants to maintain a strong distinction between sensory and intentional states, so he has a clear line about sensory qualities. Recall his rejection of homomorphisms for intentional content, e.g.

    But maybe you don’t think the view explains conscious sensory states in the first place–why? My feeling is that if he gets that, he can do the rest.

    (My own feeling is that most intuitions about conscious beliefs are wrapped up with verbal imagery–conscious beliefs are often accompanied by conscious sensory states of verbal imagery. But I don’t think introspection can settle this issue–reminds me of the old imageless thought debate.)

  4. No worries…take your time geezer 🙂

    Seriously though, thanks for the comments, they are helpful (though wrong 😉 )

    You say,

    I was thinking that Rosenthal does not think there’s a massive difference between conscious sensory states and conscious beliefs, at least none beyond what can be attributed to the way we are conscious of ourselves in HOTs. It’s just that folk convention–or more likely, philosophical convention–calls one of these conscious experiences “a state there’s something it’s like to be in, for the subject” and does not use this label for the other.

    I guess I don’t understand this. It is an empirical question whether or not there is something that it is like for a creature to have a conscious belief…isn’t it? It may be a matter of convention what we call it, or whether we call it conscious or not (i.e. the Block/Rosenthal debate) but it isn’t simply a matter of convention whether there is or isn’t something that it is like to have conscious pains, or beliefs. And there is evidence that we have a folk-notion of belief that counts it as a qualitative state. We say ‘I feel strongly that abortion is morally permissible’. As Goldman showed a while back, it is quite easy to get people to rank how strongly they feel on a scale from ‘doubt’ to ‘certain’. Why isn’t this evidence that they are ranking them according to qualitative character? This can’t be just an issue of a label! It is the whole difference between me and the rock!!

    You ask

    Focusing just on the conscious sensory state case, do you think the HOT theory can’t explain why there’s something it’s like for the subject? Why isn’t “the subject is conscious of herself as being in that state by way of HOT” enough?

    I do think that they are able to explain this and I think it amounts to just what you said; it is because one is conscious of oneself as seeing red that there is something that it is like for one to consciously see red…that is, on condition that there is some parity for beliefs and other cognitive mental attitudes…I just can’t see what reason there is that the same shouldn’t be true for beliefs…it is because I am conscious of myself as believing p that there is something that it is like for me to consciously believe that p…I want to know what the relevant difference is here…you go on to say,

    So, the difference is in what properties we ascribe to ourselves, and we can mark that difference, if we choose to use the term in this way, by saying there’s something it’s like in one case, but not in the other?

    Yes, you say there is a difference in the properties that are attributed…but what IS the difference that matters here? It can’t just be that there is something special about qualitative properties!! What we want is an explanation…what we get is an utter failure to deliver!! (you know, it just occurred to me that this is akin to Byrne’s complaint in his paper…hmmm, I’ll have to re-read that paper…but if I remember correctly he ends up complaining that the HOT theory doesn’t help us to understand how phenomenal consciousness could be physically non-mysterious…this objection goes away if one sees that the theory is committed to qualitative conscious beliefs. We are able to give the explanation)

    You go on to say,

    There’s a difference, his theory has a story about that difference, and the rest is a matter of what we call things. He wants to have a more limited use of the “what it’s like” phrase than you, but that’s just verbal, as they say.

    What’s the difference and what’s the story again? All I have heard is that when one is conscious of oneself as being in a qualitative state some magic happens and then it is suddenly painful for you to have that headache that you have had all day…but that’s OK because we all know that qualitative properties are magic like that…but that is to give up the whole theory!!!

    Now, what you may be after here is Rosenthal’s distinction between ‘thick’ and ‘thin’ concepts of what it is like from his ‘how many kinds of consciousness’ paper on Block. In the thin conception we would be happy to say that there is something that it is like to be a table; it’s like being a table. But no one, not even Galen, thinks that there is something that it is like FOR the table to be a table. That is the thick conception of what it is like. So, Rosenthal can say that there is a sense in which there is something that it is like to have an unconscious pain…but there is nothing that it is like FOR the creature to have the unconscious pain. This allows Rosenthal to argue that it is a merely verbal issue between him and Block over the status of the unconscious pains…But I agree with this distinction. I am happy to grant the importance of the distinction, but I have been very careful to be talking about what it is like FOR a creature. I don’t mean to be talking about the thin conception. I want to know why, according to the theory, there is something that it is like for me to have a conscious pain but nothing that it is like for me to have a conscious belief.

    This is getting long, and I am afraid that I will ‘erase’ it somehow, but towards the end you say “Recall his rejection of homomorphisms for intentional content, e.g.”

    That’s a good point. I argue that the homomorphisms aren’t found among the contents of beliefs, but among the kinds of mental attitudes…so believing that p is more like doubting that p than it is like fearing that p and so on…Vendler’s book, when looked at in the right way, actually lays out the homomorphisms between mental attitudes and illocutionary forces in a straightforward way…this is especially nice, as Rosenthal is heavily inspired by Vendler…

    Finally, finally, I agree with you about the mental imagery…that’s why I think so many people think that it is the content that is qualitative instead of the mental attitudes themselves. Can you really imagine having a belief but not FEELING that it was true?

    Anyways…you never told me whether or not you are going to Tucson!!

  5. Hmmm… You don’t write short responses, eh? 🙂

    Is it an empirical question whether there’s something it’s like to be a creature? Well, leaving aside the fact that everything is an empirical question, maybe not. It depends on how this idea is being used. Here’s what I’m thinking with Rosenthal: he takes it as Nagel’s stipulative definition (perhaps grabbing something in folk psych) that what it’s like is only connected with sensory qualities. (See the bat and its echolocating qualia, etc.) So, unless intentional states have sensory quality, they are not states there is something it is like for the creature to be in, by definition. So, do intentional states have sensory qualities? That may be empirical, but the tying of “what it’s like” to sensory qualities is definitional, on this line of thought.

    So, here’s the argument:

    1. Only states with sensory quality are states there is something it is like for the creature to be in. (By definition, Nagel 1974)

    2. Intentional states are not states with sensory qualities. (Obvious? Empirical? False?)

    3. Therefore, intentional states are not states there is something it is like for the creature to be in.

    I was thinking you actually agree with David on 2, so your disagreement was about 1. Hence, your disagreement is about the definition of what it’s like.

    David of course may be wrong about the definition (I think most people in phil mind would think he’s off here). Also, it may be that intentional states do have sensory qualities. But I’m less sure about that. I think your line about attitude is pretty good, and captures the Goldman stuff well. But if sensory qualities are defined by position in a quality space, I wonder if the similarities and differences can be reliably marked off. So maybe it’s something different. Here’s where I was thinking about his rejection of homomorphisms for intentional content stuff. He has some line about how there is no reliable family of things in the world to map the mental stuff to, or something like that. He may run that at your attitude line.

    About the difference in experience being accounted for in difference of property attributed to oneself (rather than to difference in the first-order state itself):

    Why isn’t it enough to say that in the sensory case, I attribute a state with such-and-such similarities and differences, arranged in such-and-such a mental space (space*) centered on me? The attribution of sensory qualities and mental spatial qualities, centered on the subject, makes it seem to me that I am in a sensory state. With intentional states, I do not attribute to myself this kind of thing; rather, I attribute to myself the property of believing that such-and-such is the case. No similarities and differences, no “egocentric” spatial stuff. So it will seem very different to me. One might even say, while there is something it is like to be me in the first case, in the second, there isn’t. I mean, a computer could do the second thing!

    Still not sure about Tucson–depends on cash flow, which is low right now. I just bought a HOUSE!!!!! (Yes, it does have a garage… I see a garage-based instantiation of the Neural Correlates of David Chalmers coming on. ;).

  6. Josh, is it an empirical question whether or not everything is an empirical question?

    See, I can write short comments! 🙂 But seriously I gotta eat lunch and then I’ll reply to your comment….

    Ha! I got back before you could return the geezer comment to me! 🙂

    By the way, congrats on getting a house!!! I look forward to someday rocking out in the Garage!!!!!

    I agree with you about 1. And you are right that I reject that beliefs have sensory qualities (they have cognitive qualities according to me). But, I also claim, all of this is beside the point that I want to make. You can’t just stipulate that your theory is in line with the way you think reality is. So, for example, when Einstein first realized that his equations predicted an expanding universe he manipulated the equations (by adding the so-called cosmological constant) so that the equations lined up with what he thought was true…which turned out to be false…what Rosenthal needs, I claim, is a story that flows from the theory itself which predicts that there would be this difference. He can’t just say ‘well, obviously that’s how it works, so that’s why there is this difference in the effects of the HOT’…He has a story like this in response to the traditional challenge of the rock. Why does my thought about my pain make it conscious but my thought about the rock not make it conscious? It is because the theory does not say that just any old thought is enough. It says that we need a certain kind of thought, one that is to the effect that oneself is in a particular mental state; and the claim that only this kind of thought will do itself has an explanation. This is the only kind of thought that makes us conscious of things in the right way (i.e. as present). Another way of making the point is to say that the rock is not a mental state, so the theory doesn’t apply to it. So too with thoughts about the state of my liver. There are no properties that those states have that serve to make me conscious of what is happening there, though if there were, and I were conscious of them, then the theory would apply. What is the analogous consideration here?

    You suggest that

    in the sensory [case], I attribute a state with such-and-such similarities and differences, arranged in such-and-such a mental space (space*) centered on me. The attribution of sensory qualities and mental spatial qualities, centered on the subject, makes it seem to me that I am in a sensory state. With intentional states, I do not attribute to myself this kind of thing; rather, I attribute to myself the property of believing that such-and-such is the case. No similarities and differences, no “egocentric” spatial stuff. So it will seem very different to me.

    I could agree with everything that you say here. The two cases will seem very different to the creature that has them. But you need more than ‘will seem very different’ you need ‘will not seem like anything at all’, for remember that there is NO difference in what it is like for Jerry (according to Rosenthal and you) when his unconscious belief becomes conscious. So I say of course they seem very different. One is like seeing blue the other like believing that 2+2=4, duh! Nothing has been said as to why we should expect that in one case there will be something that it is like for me while in the other there won’t…very mysterious!!!

    You conclude with,

    One might even say, while there is something it is like to be me in the first case, in the second, there isn’t. I mean, a computer could do the second thing!

    Really!! Why couldn’t a computer do the first? Also, I don’t mean to be talking about whether there is something that it is like to be you or not, I have already agreed that there is in a weak sense no matter what. I mean to be talking about, as does Rosenthal, whether there is something that it is like FOR you to be in these states or not.

  8. Hey, I forgot to mention…I agree that there is nothing in the world that the mental attitude of belief is homomorphic to; what I argue is that it is homomorphic to assertive illocutionary force. So, the mental attitude of belief is more like the mental attitude of doubt than it is like the mental attitude of fear, in a way that is homomorphic to the relations between the illocutionary forces of utterances.

  9. […] under: Consciousness — Richard Brown @ 2:20 pm In the comments on There’s Something About Jerry Josh and I have been having a nice discussion of Rosenthal’s objection to my HOT implies PAM […]
