Cerebras Systems did not wake up one morning and decide to chase artificial intelligence. This story starts earlier, back when Andrew Feldman, Gary Lauterbach, Michael James, Sean Lie, and Jean-Philippe Fricker were still asking uncomfortable questions at SeaMicro. The kind of questions that make rooms go quiet. Lauterbach asked one that never stopped echoing: why should a processor tuned to push pixels be trusted to think? That question did not age. It multiplied, scaled, and eventually demanded silicon big enough to answer back.
Fast forward to January 2026: Cerebras Systems signs a multi-year computing agreement with OpenAI valued north of $10 billion. Not venture capital. Not a splashy round. Infrastructure. Real power, real capacity, real consequence. Up to 750 megawatts of inference capacity through 2028. Low-latency workloads that do not tolerate hesitation. OpenAI did not come shopping for novelty. They came looking for something that works when the clock is running and the answers matter.
Andrew Feldman sells inevitability, not hype. Sean Lie builds hardware the way elite software engineers wish hardware behaved. Michael James thinks in systems, not slides. Jean-Philippe Fricker obsesses over integration because performance dies in the seams. Gary Lauterbach asked the question and never let the answer off the hook. Together they built silicon that refuses to wait on off-chip memory, refuses to gossip across clusters, and refuses to pretend latency is a rounding error.
The Wafer-Scale Engine-3 is not branding. It is a full 300 mm wafer carrying roughly four trillion transistors, more than 900,000 compute cores, and 44 GB of on-chip SRAM sitting exactly where the math lives. One chip. One fabric. One conversation moving at nanosecond speed. When Cerebras talks about inference, they are talking about answers arriving while the question still feels warm.
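The SRAM claim is not just marketing texture; it can be put in numbers. Autoregressive decoding is largely memory-bandwidth bound, because each generated token requires streaming roughly all model weights through the compute units once. A minimal Python sketch of that lower bound, using illustrative figures (the 70B-parameter model size and the HBM bandwidth are assumptions for the sake of the arithmetic; 21 PB/s is Cerebras's published aggregate on-chip bandwidth for the WSE-3):

```python
# Back-of-envelope: per-token latency floor when weight streaming dominates.
# Each decoded token reads (approximately) every weight once, so the
# memory system's bandwidth sets a hard lower bound on latency.

def ms_per_token(model_bytes: float, bandwidth_bytes_per_s: float) -> float:
    """Bandwidth-bound lower limit on latency per generated token, in ms."""
    return model_bytes / bandwidth_bytes_per_s * 1e3

# Illustrative assumptions, not vendor benchmarks:
MODEL_BYTES = 70e9 * 2    # 70B parameters at 2 bytes each (FP16/BF16) = 140 GB
HBM_BW      = 3.35e12     # ~3.35 TB/s, an H100-class HBM3 figure
SRAM_BW     = 21e15       # 21 PB/s, Cerebras's published WSE-3 on-chip figure

print(f"Off-chip HBM : {ms_per_token(MODEL_BYTES, HBM_BW):8.3f} ms/token")
print(f"On-chip SRAM : {ms_per_token(MODEL_BYTES, SRAM_BW):8.4f} ms/token")
```

The gap is roughly four orders of magnitude in the bandwidth term alone, which is the arithmetic behind "answers arriving while the question still feels warm." Real systems layer many other effects on top (compute limits, batching, interconnect), so treat this strictly as a first-order sketch.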
The OpenAI agreement matters because it breaks an old whisper about concentration risk. G42 accounted for 87% of revenue in the first half of 2024. That chapter funded the build. OpenAI changes the balance sheet and the narrative. Blue-chip demand. Multi-year visibility. A compute buyer who pressure-tests claims for a living. This is what de-risking looks like when it shows up wearing numbers instead of optimism.
Cerebras Systems now stands where infrastructure becomes destiny. Datacenters across North America and Europe. U.S. manufacturing scaling fast. An IPO window that no longer feels theoretical. This is not about replacing GPUs or winning arguments online. This is about inference at scale, delivered fast enough to matter, built by people who decided long ago that general purpose was never the point.