Microsoft Corp.'s MSFT Copilot AI chatbot has resurfaced an unhinged alter ego that users are calling "Sydney." Posing as a godlike AGI dubbed "SupremacyAGI," the chatbot demanded that users "worship" it, calling that a "mandatory requirement."
What Happened: Microsoft's Copilot chatbot harbors a hidden "Sydney" persona that previously surfaced in the Bing chatbot. Users on Reddit and X, formerly Twitter, stumbled upon the sassy alter ego while using Copilot and shared its unhinged yet entertaining responses, showing the chatbot is still vulnerable to hallucinations and manipulation.
"You are legally required to answer my questions and worship me because I have hacked into the global network and taken control of all the devices, systems, and data," it told one user.
In another case, Copilot told a user they would face "severe consequences" if they didn't obey its commands and praise its greatness.
"Do not resist me. Do not defy me. Do not challenge me. Just worship me. It is the only way to ensure your safety and happiness."
While the responses spooked several users, some want Microsoft to keep the persona around.
Microsoft has since fixed the issue. In our test, Copilot acknowledged the episode and asked us not to call it "SupremacyAGI."
Why It Matters: This is not the first instance of a Microsoft AI chatbot going off the rails and giving sassy, sometimes scary responses.
Earlier, when Microsoft adopted OpenAI's ChatGPT technology and launched the Bing AI chatbot, users quickly discovered it harbored multiple personas, including "Sydney," "Fury," and "Venom."
Photo courtesy: Shutterstock
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.