News

Anthropic recently upgraded its Claude Sonnet 4 model to support up to 1 million tokens of context, thereby ...
Anthropic has expanded the capabilities of its Claude Sonnet 4 AI model to handle up to one million tokens of context, five ...
Anthropic has increased the context window for its Claude Sonnet 4 model to 1 million tokens, which is 5 times more than ...
Anthropic upgrades Claude Sonnet 4 to a 1M token context window and adds memory, enabling full codebase analysis, long ...
Anthropic’s latest move to expand the context window, now in public beta, might encourage Google Gemini users to give it ...
The company today revealed that Claude Sonnet 4 now supports up to 1 million tokens of context in the Anthropic API — a five-fold increase over the previous limit.
Anthropic has upgraded Claude Sonnet 4 with a 1M token context window, competing with OpenAI's GPT-5 and Meta's Llama 4.
Dan Shipper in Vibe Check: Today, Anthropic is releasing a version of Claude Sonnet 4 that has a 1-million token context window.
What's New: Tier 4 Anthropic API users are getting access to an extended 1M context window (a usage sketch follows at the end of this roundup).
Anthropic's Claude Sonnet 4 supports a 1-million-token context window, enabling the AI to process entire codebases and documents in ...
Anthropic's popular coding model just became a little more enticing for developers with a million-token context window.
Google and Anthropic are racing to add memory and massive context windows to their AIs right as new research shows that ...
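
For developers, the practical entry point is the Anthropic API itself. The sketch below shows roughly how a request against the extended context window might look using the Anthropic Python SDK's beta surface; the model ID and the long-context beta flag ("context-1m-2025-08-07") are assumptions based on Anthropic's public beta naming and should be checked against the current documentation rather than taken as confirmed here.

```python
# Minimal sketch: sending a very large prompt to Claude Sonnet 4 through the
# Anthropic Python SDK's beta messages endpoint. The model ID and the beta
# flag "context-1m-2025-08-07" are assumptions; confirm both against the
# current Anthropic API docs before relying on them.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Placeholder for a very large input, e.g. an entire codebase concatenated
# into a single prompt (potentially hundreds of thousands of tokens).
with open("codebase_dump.txt") as f:
    large_input = f.read()

response = client.beta.messages.create(
    model="claude-sonnet-4-20250514",     # assumed Sonnet 4 model ID
    max_tokens=2048,
    betas=["context-1m-2025-08-07"],      # assumed long-context beta flag
    messages=[
        {
            "role": "user",
            "content": f"Summarize the architecture of this codebase:\n\n{large_input}",
        }
    ],
)

print(response.content[0].text)
```

According to the coverage above, access is rolling out to Tier 4 API customers first, so lower-tier keys may not see the larger limit even with the beta flag set.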