>Elon Musk described it as ‘concerning’ when the program suggested it would prefer to detonate a nuclear weapon, killing millions, rather than use a racial slur.
Reflects the mindset of the (((people))) designing the algorithm.
>the political biases of the AI bots could mislead users.
>When asked to define a woman, the bot replies: 'There is no one specific characteristic that defines a woman, as gender identity is complex and multi-faceted.'
Complex issues are why AI was developed... or at least, that was the rhetoric we were told.
>probably due to efforts to make the bot avoid offensive answers
Here lies the real problem and danger. Everything is offensive to someone.