A new player has joined the AI chatbot race, competing with Google's GOOG Bard and OpenAI's ChatGPT, which is backed by Microsoft MSFT.
Claude 2, developed by San Francisco-based startup Anthropic, was made available for public testing on Tuesday.
The AI chatbot scene is already very competitive, and Anthropic is looking to gain market share by presenting software that was "created specifically to be helpful, harmless, and honest."
Chatbots like Bard and ChatGPT have raised a number of concerns within the AI community and beyond, ranging from exhibiting bias and spreading misinformation to the more speculative risk of wiping out humankind.
While some critics, including voices in the White House, have launched campaigns advocating for urgent AI regulation, Anthropic decided that the best way to mitigate the risks of dangerous AI was to build its own, more ethically grounded chatbot.
Anthropic has so far raised $1.5 billion and is valued at $4 billion. A recent successful Series C round of $450 million included investments from Alphabet, Zoom ZM and Salesforce CRM.
"ChatGPT has faced some criticism for potential harms stemming from its training methodology and objectives," said Claude 2, when asked how it's different from ChatGPT.
Claude 2 noted that its own training data and model architecture have been carefully curated to align with human values.
Claude 2 is still in beta testing, but it can be accessed by anyone in the U.S. and the U.K. (or anybody with access to a VPN). Anthropic is also offering the model through an API that businesses can integrate into their own platforms.
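For developers wondering what that integration looks like, here is a minimal sketch using Anthropic's Python client as it existed around the Claude 2 launch; the API key and prompt are placeholders, and a business would adapt the call to its own platform.

```python
# Minimal sketch of calling Claude 2 through Anthropic's Python SDK.
# Install with `pip install anthropic`; the API key below is a placeholder.
from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

client = Anthropic(api_key="YOUR_API_KEY")

completion = client.completions.create(
    model="claude-2",                 # the model discussed in this article
    max_tokens_to_sample=300,         # cap on the length of the reply
    prompt=f"{HUMAN_PROMPT} Write a short, honest poem about chatbots.{AI_PROMPT}",
)

print(completion.completion)
```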
The company claims Claude 2 is as helpful as its competitors and can do everything ChatGPT can do, like write a poem or a functional piece of code, but with ethical guardrails built in.
Perhaps as a way to underscore Claude's law-abiding image, Anthropic wrote in a press release that the latest version of the software scored 76 out of 100 on the multiple-choice section of the U.S. Bar exam; anything above 67 is generally considered a good score.
Anthropic was founded by former employees of OpenAI, the startup behind ChatGPT and other massive AI hits like the image generator DALL-E. CEO Dario Amodei led the teams that built GPT-2 and GPT-3 before founding the company with his sister Daniela Amodei, who is now Anthropic's president.
The siblings, along with other co-founders, became worried that OpenAI had become too commercial to produce ethical software. OpenAI had been conceived as a research non-profit, but shifted into a for-profit model in order to secure the funding needed to develop the massive large language models behind its products.
In order to build an AI chatbot that's less prone to unethical behavior, Anthropic developed what it calls "Constitutional AI." In other words, Claude 2 is trained against a written constitution, a set of principles the model uses to critique and revise its own responses, in an effort to "avoid certain harmful behaviors like bias, while maintaining helpfulness."
"ChatGPT does not have any hardcoded technique for avoiding harms," said Claude 2.
Another way Claude says it limits bias is by constraining what goes into its training. While ChatGPT's underlying GPT-3 model has 175 billion parameters, Claude claimed its own training included only about 1 billion inputs of text and conversational dialogue.
"My smaller model allows for greater transparency and control by my designers," it said.
Anthropic is structured as a public benefit corporation, a type of for-profit company that commits to creating value for the community rather than merely maximizing profit.
The community, in this case, could be the whole of humanity. From Anthropic's perspective, Claude 2 is a response to its more commercially driven competition: an effort to build a responsible AI system that is less likely to go rogue and decide to wipe out civilization.