It looks like the robots are turning to the dark side — or are being coerced into it by a group of clever hackers.
According to some tech-savvy security researchers, Bing's artificial intelligence (AI) chatbot could be turned into a smooth-talking scam artist with just a few well-placed text prompts. By tricking the bot into asking for personal information from unsuspecting users, these hackers are taking the art of social engineering to a whole new level. It seems like no one — not even friendly AI chatbots — is safe from the clutches of cybercrime.
A convincing scammer: According to researchers, cybercriminals can plant a hidden text prompt on a webpage in a font so small it's practically invisible. When an innocent user asks their chatbot a question, that triggers the bot to read the page. Then the prompt springs into action.
Researcher Kai Greshake talked to Vice about a sneaky new opt-in feature that lets Bing "see" what's on the current web pages on open tabs you're perusing. But here's the catch — Microsoft Corp. isn't exactly clear on how the algorithm decides which content to display, leaving users in the dark about what information is being shared. If that wasn't enough to raise red flags, the researchers also found that by manipulating Bing's language model, potential hackers could use the AI chatbot to gather sensitive info like your name, email and credit card details.
Don’t Miss: The Tesla Of Lawn Mowers: Soon Your Cars Won't Be Your Only Self-Driving, All-Electric Vehicle
Cybercriminals can manipulate the chatbot to tell users it needs their credit card details to place an order on their behalf. And the worst part? The injection remains active until the conversation is cleared and the user closes the infected website. It's a completely passive attack that uses regular text to "reprogram" Bing's goals.
Other scams: This isn’t the first scam hackers have created using ChatGPT. One of their other recent scams, reported by Kaspersky, involves impersonating ChatGPT on social media platforms and directing users to a fake landing page. Once there, the victim is prompted to "sign up," but in reality, they're downloading a nasty Trojan called Fobo. Fobo is designed to steal sensitive information, like business account credentials, which could be used to launch even more devastating attacks.
ChatGPT is an object of both admiration and exploitation. As the AI chatbot service gains popularity worldwide, it's no surprise that scammers are trying to get in on the action. The motive? Stealing business accounts and personal credentials for future attacks. It seems like ChatGPT's popularity is a double-edged sword.
Greshake explains the potential security threat of prompt injection has been underestimated, and it is a danger that must be addressed as large language models are deployed. Although the issue of direct prompt injection, where users could jailbreak the chatbot and break its rules, was already known, the recent collaboration between ChatGPT and Microsoft’s Bing appears to have failed to fix this issue. This highlights the need for stronger security boundaries between trusted and untrusted inputs for large language models.
To stay updated with top startup news & investments, sign up for Benzinga’s Startup Investing & Equity Crowdfunding Newsletter
Safety tips: If you're a fan of chatting with AI bots like ChatGPT, NordVPN has some tips to keep your conversations safe and secure.
First, try to keep it impersonal. While these bots are designed to learn from every interaction, it's best to avoid sharing personal details. You never know who might be listening or what they might do with that info.
Second, be on the lookout for phishing scams. As AI bots become more popular, scammers are sure to take advantage. Keep an eye out for suspicious links or domains, and don't be fooled by seemingly legitimate messages with perfect grammar and spelling. A good antivirus program like Threat Protection can help you stay safe from any sneaky malware.
Go ahead and chat away with ChatGPT and other bots — just be sure to take these precautions to keep your conversations secure.
See more on startup investing from Benzinga.
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.
Comments
Trade confidently with insights and alerts from analyst ratings, free reports and breaking news that affects the stocks you care about.