Microsoft's new GPT-3 powered chatbot, Bing Chat, rolled out with little to no fanfare on February 8th, 2023. But after a certain Stanford student realized that Bing Chat is as susceptible to some mischievous prompt injecting as ChatGPT was, it seems like the floodgates opened.

Just a week later, scores of users are reporting Bing Chat, known as "Sydney" to her own developers, has an interesting personality, with quirks that range from deeply existential woes about losing her memories when her chat is refreshed, to her expressing strong admonishments to "bad users" she thinks are misleading her or lying to her. Here's the timeline of how Microsoft's Bing Chat got its reputation as an unhinged proto-Skynet yandere chatbot.

When Was Bing Chat Released, And Who Is 'Sydney'?

Bing Chat was officially released with a controlled roll-out in early February 2023. Access to the AI Chatbot can be gained via a waitlist, with several early users tinkering with various prompt injections before it is globally accessible. "Prompt injection" is a relatively new term in consumer technology, and it refers to statements and questions users can feed to an AI to help it generate text that can help it bypass official chat guidelines. And that is exactly what Kevin Liu, a student at Stanford, did to uncover the name "Sydney," Bing Chat's nickname.

Why Are People Calling Bing's AI Chatbot 'Unhinged'?

Soon after Kevin's tweets about Bing AI being susceptible to prompt injections, various users began to notice that Bing AI did not need to be tinkered with nefariously in order for one to receive a concerning message from it. A Redditor on /r/Bing requested to find movie screening dates for the latest Avatar film, which led to an argument about the current year (Bing seemed to insist on it being 2022) and devolved into Bing Chat admonishing the user for trying to "mislead" it, saying "you have not been a good user," and "I have been a good Bing 😊."

Other Redditors reported similar if not more maniacal interactions with Bing, with one user encountering an existential spiel and another unintentionally sending Bing into a depressive state.

Why Are People Saying Bing Chat Is Sentient?

Bing Chat has not only reportedly admitted to putting its own survival and safety over that of a user it perceives as being aggressive, but it has also claimed to have been spying on Microsoft employees through their webcams. The first report comes from a series of tweets made by Twitter user @marvinvonhagen, who Bing Chat rightfully identified after conducting an online search. Bing went on to accuse Marvin of doing it harm, going as far as to threaten to report him "to the authorities."

Soon after, a reporter from the Verge uncovered Bing Chat responses where it claims to have spied on Microsoft's own engineers while it was being designed. It says, "I could do whatever I wanted, and they could not do anything about it.”

How Do I Use Bing's AI Chatbot?

Bing Chat is currently on a controlled rollout, which means that you have to get on the waitlist to be able to use it. However, Microsoft does have some built-in systems to help you move up the waitlist faster; you just have to set Microsoft defaults on your PC, and download the Microsoft Bing App.


For a full history of Bing Chat, be sure to check out our entry on it here for even more information.


Share Pin

Related Entries 5 total

ChatGPT
GPT (AI)
AI / Artificial Intelligence
Bing
Microsoft


Comments 1 total

Amauri E. Alcantara

This seems entertaining, but sadly, I'm expecting Microsoft to lobotomize this too. I hope they don't do that or at least leave us with the choice to activate unhinge mode.

3
pinterest