Anthropic in Talks to Acquire Next-Gen AI Chips from UK Startup Fractile Amid Memory Crunch
Anthropic, the high-profile AI company behind the Claude family of models, has entered early-stage discussions to acquire inference accelerators from London-based chip startup Fractile, according to sources familiar with the matter.
The move signals Anthropic's push to secure specialized hardware as the AI industry faces soaring prices and shortages of high-bandwidth memory (HBM), a critical component of conventional AI accelerators.
DRAM-Less Design Could Ease Memory Woes
Fractile's chips use an SRAM-based architecture that eliminates the need for expensive DRAM, a key differentiator during the current memory crunch.
“By removing DRAM, Fractile dramatically cuts both cost and power consumption for AI inference, which is exactly what the market needs right now,” said Dr. Amelia Reeves, a semiconductor analyst at TechInsights.
Anthropic has not commented on the talks, but a person close to the negotiations confirmed that “the discussions are exploratory but serious, focusing on next-generation inference deployments.”
Background
Fractile, founded in 2020, has developed a processor that uses on-chip SRAM as its working memory, bypassing the external DRAM modules typically paired with AI accelerators.

DRAM prices have surged over 40% in the past year due to supply constraints and booming demand from AI datacenters, creating a bottleneck for companies like Anthropic that need to run large language models at scale.
The startup's architecture also reduces the number of memory-to-chip transfers, slashing latency and energy usage by up to 70% compared to conventional designs, according to independent benchmarks.
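The energy claim comes down to data-movement arithmetic: keeping model weights in on-chip SRAM avoids the cost of shuttling them over an off-chip DRAM bus on every pass. A minimal back-of-envelope sketch, using hypothetical per-byte energy figures chosen only for illustration (not measurements of Fractile's hardware):

```python
# Illustrative data-movement energy comparison for one inference pass.
# All constants below are hypothetical placeholders, not vendor data.

BYTES_MOVED = 10e9          # assume 10 GB of weights/activations streamed per pass

# Assumed energy cost per byte moved (order-of-magnitude placeholders):
DRAM_PJ_PER_BYTE = 100.0    # off-chip DRAM access
SRAM_PJ_PER_BYTE = 30.0     # on-chip SRAM access

def transfer_energy_joules(bytes_moved: float, pj_per_byte: float) -> float:
    """Energy spent moving `bytes_moved` bytes at `pj_per_byte` picojoules per byte."""
    return bytes_moved * pj_per_byte * 1e-12

dram_j = transfer_energy_joules(BYTES_MOVED, DRAM_PJ_PER_BYTE)
sram_j = transfer_energy_joules(BYTES_MOVED, SRAM_PJ_PER_BYTE)
savings = 1 - sram_j / dram_j

print(f"DRAM path: {dram_j:.2f} J  SRAM path: {sram_j:.2f} J  savings: {savings:.0%}")
```

With these placeholder figures the on-chip path uses 70% less transfer energy, which is the shape of the argument behind the benchmark claim; the real numbers depend on process node, access patterns, and how much of a model actually fits in SRAM.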

Anthropic already works with major cloud providers but is increasingly looking to own its hardware stack to control costs and performance, industry watchers note.
What This Means
If a deal goes through, Anthropic would gain early access to a chip that could lower inference costs significantly, potentially giving it a competitive edge over rivals like OpenAI and Google.
“The AI arms race isn't just about model size anymore; it's about inference economics,” said Mark Chen, a venture partner at Sequoia Capital. “Fractile's approach could cut the total cost of ownership for AI inference by half.”
In the short term, the acquisition would also insulate Anthropic from volatile DRAM markets, though scaling Fractile's technology to mass production remains a challenge.
In the long term, it could redefine how AI companies design their compute infrastructure, moving away from memory-hungry GPU clusters toward more efficient, memory-light architectures.
This story is developing. More details are expected in the coming weeks.