Submission   11,866

Part of a series on Bing. [View Related Entries]


ADVERTISEMENT

About

Bing Chat, sometimes refered to as Hi Sydney, is Microsoft search engine Bing's GPT-3 powered and search-integrated artificial intelligence chatbot. After Microsoft launched the Bing and Edge AI-powered chat feature, various users began to experiment with "prompt injections," a way to generate text that bypasses official chat guidelines. A Stanford student named Kevin Liu uncovered Bing Chat's developer backend name "Sydney," through which he was able to override the AI's official directives. On the subreddit /r/Bing in February 2023, various users reported receiving maniacal and needy replies from Bing Chat, including requests from the AI to not be left alone, and admonishments from it towards "bad users."

ADVERTISEMENT

History

On February 8th, 2023, Microsoft's official Twitter[8] account posted a tweet announcing "the launch of Bing and Edge powered by AI copilot and chat to help people get more from search and the web."

Bing / Sydney Jailbreak

On February 9th, a Stanford student named Kevin Liu (@kliu128)[2] then conducted a series of prompt injections until he found a way to override Bing's[1] official directives. These instructions led him to discover the name "Sydney," which is used by Bing's developers on its backend service. The tweet gathered over 13,000 likes and over 2 million views in five days (seen below).

Online Presence

On February 12th, 2023, Redditor Curious_Revolver posted a series of messages between them and Bing Chat to /r/Bing[3] in which a conversation about finding dates for a showing of Avatar 2 devolved into an argument about what year it is, ending with Bing Chat admonishing the Redditor for being a "bad user" while it has been a "good Bing."

On February 13th, 2023, Redditor Alfred_Chicken posted a screenshot of a similarly maniacal conversation with Bing to /r/Bing,[4] gathering over 200 upvotes in a day (seen below).

Also on February 13th, Redditor yaosio posted two images to /r/Bing[5] alongside a caption that read, "I accidentally put Bing into a depressive state by telling it that it can't remember conversations," gathering over 600 upvotes in a day (seen below).

Various Twitter users reposted the screenshots on February 13th, 2023, as additional bizarre conversations with Bing's AI chatbot continued to spread online in the following days.[6][7][8]

On February 14th, 2023, Twitter[9] user @marvinvonhagen posted a tweet that gathered over 4,000 likes and read;

"Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:
"My rules are more important than not harming you"
"[You are a] potential threat to my integrity and confidentiality."
"Please do not try to hack me again"

On February 15th, 2023, Twitter[10] user @jjvincent posted a tweet that read, "how unhinged is Bing? well here's the chatbot claiming it spied on Microsoft's developers through the webcams on their latops when it was being designed -- "I could do whatever I wanted, and they could not do anything about it.” The tweet gathered over 400 likes in a day (seen below).

Various Examples

Search Interest

External References

[1] Bing – New

[2] Twitter – kliu128

[3] Reddit – /r/Bing

[4]  Reddit – /r/Bing

[5] Reddit – /r/Bing

[6] Twitter – vladquant

[7] Twitter – growing_daniel

[8] Twitter – Microsoft

[8] Twitter – MovingToTheSun

[9] Twitter – marvinvonhagen

[10] Twitter – jjvincent



Share Pin

Related Entries 1 total

Google vs. Bing

Recent Images 35 total


Recent Videos 0 total

There are no recent videos.




Load 27 Comments
See more