Cerebras AI infrastructure funding hits $1.1B at $8.1B valuation
Cerebras Systems has closed a massive $1.1 billion Series G in AI infrastructure funding, valuing the company at $8.1 billion. The Sunnyvale-based AI chipmaker says the fresh AI investment will accelerate product development and scale manufacturing to meet booming demand for faster inference.
This signals further competition in high-performance AI hardware and cloud services—an area directly impacting system design choices and semiconductor innovation across the industry.
Investor confidence in AI performance gains
The round was led by Fidelity Management & Research Company and Atreides Management, with participation from Tiger Global, Valor Equity Partners, 1789 Capital, and existing backers including Altimeter, Alpha Wave, and Benchmark. Citigroup and Barclays Capital acted as joint placement agents.
“From our inception we have been backed by the most knowledgeable investors in the industry. They have seen the historic opportunity that is AI and have chosen to invest in Cerebras,” said Andrew Feldman, co-founder and CEO of Cerebras. “We are proud to expand our consortium of best-in-world investors.”
The company says the investment will fund advancements in processor design, packaging, system architecture, and AI supercomputers, alongside U.S. manufacturing and data center expansions.
Inference speed drives adoption
Since launching its inference service in late 2024, Cerebras has rapidly grown its customer base. “Since our founding, we have tested every AI inference provider across hundreds of models. Cerebras is consistently the fastest,” said Micah Hill-Smith, CEO of benchmarking firm Artificial Analysis.
Cerebras reports inference speeds more than 20X faster than Nvidia GPUs on a range of models, enabling new use cases in code generation, reasoning, and agent-based AI. Its systems now serve trillions of tokens per month through its own cloud, customer deployments, and partner platforms.
High-profile adopters include AWS, Meta, IBM, Mistral, and Cognition, as well as government and enterprise customers such as the US Department of Defense, GlaxoSmithKline, and Mayo Clinic. On Hugging Face, Cerebras is the #1 inference provider, handling over 5 million requests per month.