Animals and conscious thought

I'm not sure that it's true that ants have no forethought.

Leaf-cutter ants display forethought in their cultivation of fungus. For example, if the fungus turns out to be of poor quality (maybe it doesn't taste right to them), the colony will try a different type of leaf. Some leaves contain natural toxins that also harm the fungus. If the colony discovers that a type of leaf harms its fungus cultivar, it will not collect that leaf again.
Is that true forethought and decision making, or more of an instinctual action? I don't know if that can be answered...lol.

Well, what is instinct? It is an inherent behavior, right?

So, one would think that if it were instinct, they'd already know which leaves are bad and not collect them. But a specific colony actually learns after collecting a bad leaf once. It's not preset, since they collected the leaf unaware and only then discovered that it was a bad leaf.

So, in essence, they are able to adaptively monitor the condition of the fungus, retain some type of memory of which leaf they collected, and make the connection that this leaf, out of the millions of other leaf types, is related to the condition of the fungus.

Sure, maybe they don't think it out in their little ant brains like a human does, but they don't need to. It follows, though, that such adaptive behavior requires some type of memory, the ability to identify leaves, and the ability to connect them to the state of the fungus, possibly in combinations that have never been tried before, which makes pure instinct no help; it's trial and error. They also have some mechanism that forbids them from collecting that leaf again after the initial mistake, which is avoidance of a future consequence. To avoid collecting the leaf, they have to identify it, which means they somehow have to remember it.
Well, I concede that there is some decision making involved, albeit on a different level...I still tend to view them as more of a programmed animal, which I suppose we are as well, just in a more complex way. So let me ask you: do you think ants have moments of self-realization and recognize themselves as individuals? What is the least complex animal that would have such self-realization and actualization? And just for fun: would a computer, if it were to reach the level of self-realization and develop what we would call “feelings,” also have what we would call a “soul,” or be a participant in the consciousness that humans share?

I think I'll have to refer to the philosophical-zombie hypothesis on this one. We don't truly know that anyone or anything else recognizes itself as an individual. I don't really know that even you recognize that, even if you claim to, because one can always say that it is instinct or emulation.

How do I really know that you do? I base it on feelings, not categories or hypotheses. I can't get inside your head and prove that you're really there; technically, you could be a very good robot. I feel that such a good robot is not possible, but there's no way I can actually and definitively prove, 100%, that you are not one.
So, semantics aside...a robot could technically reach self-realization, then? Does it take part in a universal collective consciousness when it is “shut off”? Or is it just a complex program that can emulate feelings perfectly and nothing more?

Well, since I don't believe in an actual soul, I believe we are an aggregate, a collection of conditions. It would then follow that anything which attains a similar or sufficient collection of conditions, one that includes and supports consciousness, will end up being a conscious being.

So yes, hypothetically speaking, if that robot is made of an aggregate that supports consciousness, it would be a conscious robot. I don't know that one will ever be made, but it stands to reason that if one aggregate can have consciousness, then another sufficient aggregate can too.
That is just so interesting to me! Sorry, got a little excited there...I suppose we will just have to wait until we die to find out for sure...if we still have enough reasoning left at that point, that is.
 
Aren't we all biorobots?
We are, of course...I was implying any creature that moves and reacts purely on instinct...your reptilian brain, so to speak.
 
We're all of lesser awareness, on some "scale," relative to others, if we're honest. So to say an animal is less conscious or more says more about the limits of our own consciousness than about theirs.

Shed your skin.