News

Now, like clockwork, a new group of authors has launched a suit against Microsoft, alleging that the company used their books without permission to train its Megatron AI model, per Reuters.
A lawyer representing Universal Music Group, Concord, and ABKCO in a lawsuit over Anthropic's alleged misuse of their lyrics to train its chatbot Claude told U.S. Magistrate Judge ...
Artificial intelligence startup Anthropic says its new AI model can work for nearly seven hours continuously, further signaling AI’s growing presence in the workplace.
Anthropic says Claude Opus 4 is its most powerful model and the best coding model in the world, while Sonnet 4 replaces Sonnet 3.7 in the chatbot.
The internet freaked out after Anthropic revealed that Claude attempts to report “immoral” activity to authorities under certain conditions. But it’s not something users are likely to encounter.
Amazon said on Monday that it’s investing up to $4 billion into the artificial intelligence company Anthropic in exchange for partial ownership and Anthropic’s greater use of Amazon Web ...
Anthropic’s Claude 4 models arrive as the company looks to substantially grow revenue. Reportedly, the outfit, founded by ex-OpenAI researchers, aims to notch $12 billion in earnings in 2027.
Anthropic's annualized revenue, or its total projected earnings over the course of the year assuming its current rate of income continues, was close to $1 billion in December.
OpenAI rival Anthropic is releasing a powerful new generative AI model called Claude 3.5 Sonnet. But it’s more an incremental step than a monumental leap forward.
When Anthropic's older Claude model played Pokemon Red, it spent “dozens of hours” stuck in one city and had trouble identifying non-player characters.