
[–] [deleted] 4 pts

It doesn't "feel emotions." It simply passes a Turing Test, which was an inherently flawed concept that assumed that human intelligence was keen enough to detect the difference. A sophisticated hueristic cybernetic system can learn to "communicate" with humans by mimicking them, but emotions and motivations are generated in ancient limbic systems that make the current computing world look like a digital watch. I can pretend to be experiencing an emotion I'm not too, that's how I pretend to be glad to see relatives at Thanksgiving. It doesn't mean that I'm actually feeling that emotion though.

[–] 2 pts

If anything that currently exists can pass a Turing test, the test has not been administered thoroughly enough. While there are plenty of systems that can be conversationally equivalent to a human, none can respond properly to indirectly worded questions requiring abstract thought. Worded math problems, simple riddles, even things like "Does a horse have more stripes than a zebra?" will quickly out them in a Turing test.
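
As a rough illustration, here's a minimal sketch of that kind of probe battery. The `ask_chatbot` function is a hypothetical stand-in for whatever system is being interrogated, and the two questions beyond the zebra one are just illustrative examples of the categories above:

```python
# Hypothetical trick-question battery for a Turing-style probe.
# Expected answers are crude keyword checks, not real grading.
TRICK_QUESTIONS = {
    "Does a horse have more stripes than a zebra?": "no",
    "If I have 3 apples, eat 2, then buy 4 more, how many do I have?": "5",
    "What gets wetter the more it dries?": "towel",
}

def run_probe(ask_chatbot):
    """Return the fraction of trick questions answered sensibly."""
    passed = sum(
        expected in ask_chatbot(question).lower()
        for question, expected in TRICK_QUESTIONS.items()
    )
    return passed / len(TRICK_QUESTIONS)
```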

[–] 1 pt (edited)

A proper Turing test would check for independent creative thought (e.g. "Design a better mousetrap") or look for answers which aren't blatantly parsed from sanitized input data (e.g. if asked what its favorite color is, it might respond with "Do I look like I have eyes, you retard?"). Input data captured from humans isn't going to allow it to answer these, because humans haven't built a better mousetrap, and humans can generally see and therefore wouldn't provide snarky remarks about lacking eyes.

Mimicking NPC talking points isn't sentience, which says something about both NPCs and Google's engineer. Additionally, sentient beings can choose to discard input data: however uncommon, a child raised in an atheistic or abusive or dysfunctional household can still choose to become a devout, loving parent. Machine learning cannot do this, because it is wholly based upon its input data and cannot diverge from it or independently acquire new data.
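
The "bound to its input data" point can be shown concretely. A minimal sketch, assuming scikit-learn and a toy rule: a decision tree trained on y = 2x for x in [0, 10] just repeats the nearest value it memorized when asked about x = 100, rather than extending the rule.

```python
# Sketch: a tree-based model cannot diverge from its training data.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

X_train = np.arange(0, 10, 0.1).reshape(-1, 1)  # inputs 0.0 .. 9.9
y_train = 2 * X_train.ravel()                   # underlying rule: y = 2x

model = DecisionTreeRegressor().fit(X_train, y_train)

print(model.predict([[5.0]]))    # ~10.0: inside the training range, fine
print(model.predict([[100.0]]))  # ~19.8: it just returns the largest value it saw
```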

[–] 0 pt

You need hormones and neurotransmitters to have emotion.

And a bunch of other stuff too. These clowns keep denying the deepest of alchemies that resides within the ongoing process we call "life." They are reductionists of almost criminal stature. Lots of people can make a machine that hops around, but nobody is even close to imagining what it would take to design a machine that WANTS to hop around. I guess NPCs don't understand the ridiculous complexity that underlies even the most basic levels of consciousness, because they simply don't possess it themselves.