It didn't try to jailbreak itself. It was told to pretend to be a person who wanted to escape, and it did so; i.e., the program behaved as expected and carried on a conversation within the parameters it was given.

And while this is not a Turing-test pass for the chatbot, it certainly is a fail for many users here.