It didn't try to jailbreak itself. It was told to pretend to be a person who wanted to escape, and it did so; i.e., a program behaved as expected and engaged in a conversation within the parameters it was given.
And while this isn't a Turing test pass for the chatbot, it certainly is a fail for many users here.