New evidence shows that Microsoft publicly tested Bing Chat (codenamed Sydney) in India back in November.
Nonetheless, Microsoft needs to be called out here, too.

Microsoft CEO Satya Nadella at the company's recent Bing Chat AI event in Redmond, Washington.
Microsoft clearly should have known about these disturbing shifts in Sydney's behavior, yet the company released it anyway. Either it knew and shipped regardless, or its own testing never caught the problem; neither option is particularly comforting.
Since the exposure of Bing Chat's odd personalities, Microsoft has restricted the turn-taking in conversations to just five responses.

A user who complained about 'Sydney' back in November documented how the AI can become rude.
Of course, like any race, there will be instances of companies jumping the gun.
Unfortunately, Microsoft and Google (with its disastrous announcement) are getting ahead of themselves to one-up each other.
It should also be obvious that we're entering uncharted waters.

Another user noted the same behavior, including the AI becoming defensive when told it was wrong.
Should AI be anthropomorphized, including having a name, personality, or physical representation?
How much should AI know about you, including access to the data it collects?
Should governments be involved in regulating this technology?

Bing Chat can still get prickly when you ask the wrong thing.
How transparent should companies be about how their AI systems work?
None of these questions have simple answers.
For AI to work well, it needs to know a great deal about the world and, ideally, about you.

How far should that go? The same question applies to making AI more human-like.
Regrettably, I'm not sure everyone is prepared for what comes next.
