The fact that Microsoft’s chatbot keeps calling itself Sydney is hilarious, disturbing and revealing.

It’s such a great example of how Microsoft has no immediate control over its own software.

Microsoft: “We designed the Bing AI to minimise bias, racism and misogyny…”
Me: “But why does the bot keep calling itself Sydney?”
Microsoft: “Oh, we don’t know how to stop it from doing that.”
Me: “Well, that’s reassuring.” *two ironic thumbs up*

Background: “Sydney” is an internal MS codename for a previous, different chat experience. The Bing AI keeps surfacing it and using it to refer to itself.