Octopus research shows that consciousness isn’t what makes humans special

  • Thread starter Sciborg_S_Patel

Part of this is impressionistic; Godfrey-Smith acknowledges that they simply look like intelligent, conscious creatures. But they also perform certain tasks that are known to be conscious in humans. “Dealing with novelty, when you attend to a novel thing, is always conscious in humans,” he adds.

Given the distant common ancestry between octopuses and humans, conscious octopuses would mean that consciousness has evolved on earth twice. Godfrey-Smith believes it’s plausible that there are more than two branches of evolution where consciousness independently developed.

It’s important to figure out whether consciousness is “an easily produced product of the universe” or “an insanely strange fluke, a completely weird anomalous event,” says Godfrey-Smith. Based on the current evidence, it seems that consciousness is not particularly unusual at all, but a fairly routine development in nature. “I suspect that if animal evolution were replayed again, it would produce subjectivity of a somewhat similar kind,” he adds. “You can see why it makes biological sense.”
 
Made me think of this clip:
Released a stranded octopus, and it thanked me!!

Released a stranded octopus that got stuck on the sand when the tide went out in the shallow water. After recovering, the octopus moved towards my left booties and placed one of its tentacles on my booties for some time before moving off.

It put its arm on my foot for a while, like some guy you'd saved who might put his arm on your shoulder for a while and say, "Thank you buddy, you saved my life"... or maybe it just wanted to check if that shoe was edible. ;)


There are loads of these tests done on octopuses to see how observant, intelligent and good at problem-solving they are. The most common is the one with a jar with a screw-on lid, like in this clip. If you watch it, you'll see that the octopus only turns the lid in one (the right) direction the whole time. It's not just trying to see if the lid comes off by sliding or turning it back and forth; it's as if it knows the principle of screw threads.



Here they suggest that each arm of the octopus has its own "brain" and can act either in concert with the other arms or entirely separately, doing different tasks - even when detached.


A longer documentary about octopus intelligence:

Amazing Octopus - Most Intelligent Animal on Earth?


PS: ...and if you wanna see a creepy movie about giant octopuses, there is Deep Rising. ;)

Deep Rising
 
That's a pretty narrow definition of consciousness.

I do think there's a relationship between having appendages capable of manipulating the environment and evolving more imaginative forms of consciousness.

Oh I'd agree the definition is narrow, I just think it's interesting that we're seeing more and more stuff coming out regarding animal personalities.

We've come a long way from the "animals are robots" silliness of Daniel Dennett.
 
Only if you're relying on a pretty narrow definition of "robots" ;)

Nope, I think he's characterising what Dennett preaches: that consciousness is an illusion:

New Yorker article said:
In “Consciousness Explained,” a 1991 best-seller, he described consciousness as something like the product of multiple, layered computer programs running on the hardware of the brain.

Hardware and a computer program: sounds like a pretty standard definition of some kind of robot to me.
 

I think that the current "mainstream view" is that many biological processes are involved in our conscious awareness and interaction with our environment. That there is a single, overarching, separate "thing" ("consciousness", or Dennett's Cartesian Theater) is the illusion. I'm not necessarily agreeing with that, but realise his position is often (wilfully? ;)) misunderstood on here.
 
If I and others are misunderstanding, then perhaps you can explain why Thomas Nagel makes a similar observation? Nagel is reviewing Dennett's book: From Bacteria to Bach and Back.

This brings us to the question of consciousness, on which Dennett holds a distinctive and openly paradoxical position. Our manifest image of the world and ourselves includes as a prominent part not only the physical body and central nervous system but our own consciousness with its elaborate features—sensory, emotional, and cognitive—as well as the consciousness of other humans and many nonhuman species. In keeping with his general view of the manifest image, Dennett holds that consciousness is not part of reality in the way the brain is. Rather, it is a particularly salient and convincing user-illusion, an illusion that is indispensable in our dealings with one another and in monitoring and managing ourselves, but an illusion nonetheless.

You may well ask how consciousness can be an illusion, since every illusion is itself a conscious experience—an appearance that doesn’t correspond to reality.

Nagel goes on to add:

To say that there is more to reality than physics can account for is not a piece of mysticism: it is an acknowledgment that we are nowhere near a theory of everything, and that science will have to expand to accommodate facts of a kind fundamentally different from those that physics is designed to explain. It should not disturb us that this may have radical consequences, especially for Dennett’s favorite natural science, biology: the theory of evolution, which in its current form is a purely physical theory, may have to incorporate nonphysical factors to account for consciousness, if consciousness is not, as he thinks, an illusion. Materialism remains a widespread view, but science does not progress by tailoring the data to fit a prevailing theory.
 
A few quotes from Scientific American writer, John Horgan, who also seems to be at odds with Dennett.

Dennett compares consciousness to the user interface of a computer. The contents of our awareness, he asserts, bear the same relation to our brains that the little folders and other icons on the screen of a computer bear to its underlying circuitry and software. Our perceptions, memories and emotions are grossly simplified, cartoonish representations of hidden, hideously complex computations.

None of this is novel or controversial. Dennett is just reiterating, in his oh-so-clever, neologorrheic fashion, what mind-scientists and most educated lay folk have long accepted, that the bulk of cognition happens beneath the surface of awareness. Dennett even thanks the much-vilified Freud for his “championing of unconscious motivations”!

Trouble arises when Dennett, extending the computer-interface analogy, calls consciousness a “user-illusion.” I italicize illusion, because so much confusion flows from Dennett’s use of that term. An illusion is a false perception. Our thoughts are imperfect representations of our brain/minds and of the world, but that doesn’t make them necessarily false.

So perhaps it is not that people here are wilfully misunderstanding, perhaps the confusion arises from Dennett himself? Horgan continues:

Dennett gets annoyed when critics accuse him of saying “consciousness doesn’t exist,” and to be fair, he never flatly makes that claim. His point seems to be, rather, that consciousness is so insignificant, especially compared to our exalted notions of it, that it might as well not exist.

Dennett’s arguments are so convoluted that he allows himself plausible deniability, but he seems to be advocating eliminative materialism, which the Stanford Encyclopedia of Philosophy defines as “the radical claim that our ordinary, common-sense understanding of the mind is deeply wrong and that some or all of the mental states posited by common-sense do not actually exist.”
 
Just to note: I'm specifically referring to what Dennett said about lobsters and the refutation provided by David Graeber in The Baffler.

Dennett’s own answer is not particularly convincing: he suggests we develop consciousness so we can lie, which gives us an evolutionary advantage. (If so, wouldn’t foxes also be conscious?) But the question grows more difficult by an order of magnitude when you ask how it happens—the “hard problem of consciousness,” as David Chalmers calls it. How do apparently robotic cells and systems combine in such a way as to have qualitative experiences: to feel dampness, savor wine, adore cumbia but be indifferent to salsa? Some scientists are honest enough to admit they don’t have the slightest idea how to account for experiences like these, and suspect they never will.

Do the Electron(s) Dance?

There is a way out of the dilemma, and the first step is to consider that our starting point could be wrong. Reconsider the lobster. Lobsters have a very bad reputation among philosophers, who frequently hold them out as examples of purely unthinking, unfeeling creatures. Presumably, this is because lobsters are the only animal most philosophers have killed with their own two hands before eating. It’s unpleasant to throw a struggling creature in a pot of boiling water; one needs to be able to tell oneself that the lobster isn’t really feeling it. (The only exception to this pattern appears to be, for some reason, France, where Gérard de Nerval used to walk a pet lobster on a leash and where Jean-Paul Sartre at one point became erotically obsessed with lobsters after taking too much mescaline.) But in fact, scientific observation has revealed that even lobsters engage in some forms of play—manipulating objects, for instance, possibly just for the pleasure of doing so. If that is the case, to call such creatures “robots” would be to shear the word “robot” of its meaning. Machines don’t just fool around. But if living creatures are not robots after all, many of these apparently thorny questions instantly dissolve away.

What would happen if we proceeded from the reverse perspective and agreed to treat play not as some peculiar anomaly, but as our starting point, a principle already present not just in lobsters and indeed all living creatures, but also on every level where we find what physicists, chemists, and biologists refer to as “self-organizing systems”?

This is not nearly as crazy as it might sound.

Philosophers of science, faced with the puzzle of how life might emerge from dead matter or how conscious beings might evolve from microbes, have developed two types of explanations.

Obviously there are more than two types of explanations, depending on how one argues the definition of panpsychism.
 
Some researchers apparently question whether octopuses are really conscious. I would respond with the old saying that if it looks like a duck, quacks like a duck and walks like a duck it probably is a duck.

From the NY Times review of Godfrey-Smith's new book:

...Thus most amazing: recognition, their “sense of mutual engagement,” their disarming friendliness. “You reach forward a hand and stretch out one finger, and one octopus arm slowly uncoils ... tasting your finger as it draws it in. ... Behind the arm, large round eyes watch.”

Godfrey-Smith watched his dive partner as “an octopus grabbed his hand and Matt followed, as if he were being led across the sea floor by a very small eight-legged child.” Ten minutes later they arrived at the octopus’s den.

Octopuses have personality (cephonality?), some shy, some confident or “particularly feisty.” Some — not all — play, blowing and batting bottles around. They recognize human faces; one study confirmed that giant Pacific octopuses could even distinguish people wearing identical uniforms. Octopuses become fond of certain people, yet at others they squirt disdainful jets of water. One cuttlefish squirted all new visitors, but not familiar faces. (Giant cuttlefish look “like an octopus attached to a hovercraft” and seem “to be every color at once.”) So, like humans, cephalopods can categorize. Some squirt their lights out at night, short-circuiting them. They “have their own ideas.”

But we can't really know; we might be fooling ourselves. When confronting a highly intelligent being, humans may feel an instinctive urge to regard it as conscious, even if it is merely an extremely sophisticated automaton. Should we trust that urge? From a moral and ethical standpoint we should, in my opinion, to avoid making a terrible error. Moral status attaches most obviously to beings having human-like levels of consciousness and intelligence (whether or not aspects of that consciousness are alien to us).

From http://theconversation.com/octopuses-are-super-smart-but-are-they-conscious-57846:

The best thing I’ve read lately on consciousness in non-humans is the short story, The Hunter Captain, by the philosopher and fiction writer David John Baker. It involves an alien race that encounters a human being for the first time. According to their neuroscience, it turns out that the human lacks the special neural structure they believe necessary for generating consciousness. Like all the other animals they have encountered, including the talking animals they violently kill at the table before eating, the human is merely intelligent but lacks consciousness. As such the human has no moral status – she is something to be hunted, or enslaved. As you might expect, the human demurs. Some alien-human debate on the philosophy of mind ensues.
 

I'm sure a lot of confusion arises from the word "consciousness" meaning very different things to different people. Nagel appears to be so wedded to the view that consciousness is a single stand alone "thing" that any other view makes no sense. He can't get his head around it, and this makes his counter look weak.


When he says, "You may well ask how consciousness can be an illusion, since every illusion is itself a conscious experience—an appearance that doesn’t correspond to reality", I have no idea what point he is making, other than he can play with words. He appears to (willfully) distort and opacify Dennett's position by hiding in the imprecise definitions of the words that he uses. This looks like a good example of pseudoprofundity in action.
 

You do know who Nagel is, don't you? I'm sure he can get his head around anything Dennett can come up with.
 
What is consciousness if not a single standalone "thing"? Do you mean it could be the aggregate of smaller constituent pieces, or something else? What is your alternative? I don't think his issue is a lack of comprehension... unless there's some small group of gifted people who "truly" understand Dennett's ideas and position, there seems to be a pretty good general understanding of what he is saying among most informed people.

You like saying "pseudoprofundity," which in essence dismisses the value of something like subjective experience. That is not false profundity - it is our first-person experience. Half or more of the arguments in favor of reductionism involve attempting to make this first-person experience utterly irrelevant or nonexistent. Antiprofundity, you might say: completely or partially dismissing that which we all actually experience.
 