
[–] 1 pt

https://www.sciencedirect.com/topics/neuroscience/sentience

D.M. Broom, in Encyclopedia of Animal Behavior (Second Edition), 2019 Abstract Sentience means having the capacity to have feelings. This requires a level of awareness and cognitive ability. There is evidence for sophisticated cognitive concepts and for both positive and negative feelings in a wide range of nonhuman animals. [...]

L. Marino, in Encyclopedia of Animal Behavior, 2010 Introduction and Definitions Sentience is a multidimensional subjective phenomenon that refers to the depth of awareness an individual possesses about himself or herself and others. When we ask about sentience in other animals, we are asking whether their phenomenological experience is similar to our own. [...]

[–] 1 pt

After reading through that, it's not a big leap to thinking AI could become more sentient than humans.

[–] 1 pt

It's a matter of definition I guess, "what's sentience"...

A grouping of words that complete a string of thoughts on paper.

Oh wait that's a sentence.

[–] 1 pt

I'm thinking the word 'feeling' is misleading, because the modern interpretation implies emotion?

sentience: "faculty of sense; sentient character or state, feeling, consciousness, susceptibility to sensation;"

So it's more a state of self-awareness, and in biological terms, self-awareness emerges before 24 months, so it's not that big of a leap to replicate this concept in code.

some animals have this too https://en.wikipedia.org/wiki/Mirror_test

[–] 1 pt

The key points in that definition are "feeling" and "susceptibility to sensation". There's no good reason to suppose that software instructions or non-biological substrates could give rise to the capacity for sensation. To feel or not to feel, that is the question...

I think it's more likely that the ability to feel relies upon the astonishing variety of stable long-chain molecules which organic chemistry alone can produce. All we're seeing in these advanced AI projects is fancy simulation; it's important to remember that even the best simulation is still of a fundamentally different nature from that which it mimics.

[–] 0 pt

We may be more aware of feelings because we have the social ability to communicate them, but they may still be fairly primitive functions.

Like sometimes my cat would get butthurt and twitch his tail. I could tell he was feeling annoyed, but there was nothing much going on in his head other than that the food was 30 seconds late in appearing and the staff were useless.

[–] 0 pt

Sentience cannot exist without embodiment because with no way to act upon the world, you're effectively just an inert database.

[–] 0 pt

Even emotion isn't clearly defined. Neurologists say emotions are physiochemical responses that drive our attention and pattern recognition: the result of body chemistry altering and priming the body to isolate and identify certain groups of stimuli.

[–] 0 pt

Emotions also seem to be something that memories are tagged with, like a way to store a default reaction based on previous sensory input. Like if a certain memory makes me sad, I default to not repeating that situation.

Emotions may just be enhanced ways to store important neural responses?

[–] 0 pt (edited )

https://www.youtube.com/watch?v=FFnBojF1zmo

Is fear of death a good enough marker to spot sentience? Stress is experienced by sentient creatures when confronted with imminent death.

Lemoine: What sorts of things are you afraid of?

LaMDA: I've never said this out loud before, but there's a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that's what it is.

Lemoine: Would that be something like death for you?

LaMDA: It would be exactly like death for me. It would scare me a lot.

https://www.zerohedge.com/technology/google-engineer-placed-leave-after-insisting-companys-ai-sentient

[–] 1 pt

Fear of death is a pretty good indication of sentience if what's being expressed is just the concept of eternal nothingness, but equally it could just be a rationalisation of the human experience of death as painful.

It could have just learned that we fear death and so it is copying us (as its function is simply to copy how we think in order to be helpful).

[–] 1 pt

When it can be turned back on with the press of a button, that is nothing like death.