RadixArk Raises $100M in Seed Funding to Build Open AI Infrastructure Platform
Every few years, infrastructure stops being plumbing and starts acting like power. Not the loud, chest-thumping kind either. The quiet kind. The kind that decides who ships, who scales, who burns venture dollars like a bonfire in a wind tunnel, and who suddenly looks like a genius because their models actually run when the GPUs start screaming at 3 a.m.

That is the lane RadixArk just walked into with steel-toe boots and a $100M seed round at a $400M valuation.
Palo Alto keeps producing AI companies like Vegas pumps oxygen into casinos, but RadixArk is different because co-founders Ying Sheng (CEO) and Banghua Zhu (CTO) are not selling AI perfume to people who still think prompts are strategy. They are building the roads, tunnels, pressure systems, and reinforcement learning machinery underneath frontier models. The part nobody notices until it breaks. Then suddenly every executive on Earth starts sweating through quarter-zips and pretending they always cared about inference efficiency.
And this is not a two-person science project operating out of a WeWork with dry-erase markers and caffeine tremors. The company is already assembling serious infrastructure muscle around the core vision. Mingyi Lu, Head of Product and maintainer of SGLang, is helping shape the product layer around one of the most important open-source inference engines in AI today, while Samson T., Head of Finance & Operations, is helping turn frontier-scale ambition into something operationally dangerous in the best way possible. Quiet operators matter. Especially in infrastructure. Especially when the burn rate of one bad architecture decision can look like a small nation’s GDP.
Accel led the round, with Spark Capital co-leading, alongside NVentures, AMD, MediaTek, Walden Catalyst Ventures, HOF Capital, LDV Partners, Salience Capital, A&E Investments, and WTT Investment. Then you look at the individual investors and the room starts reading like an AI infrastructure Avengers cast list: Igor Babuschkin, John Schulman, Soumith Chintala, Thomas Wolf, Olivier Pomel, Robert Nishihara, William Fedus, Logan Kilpatrick, Eric Zelikman, Hock Tan, and Lip-Bu Tan. That is not tourist money. That is operator money. People who know where the pressure points live because they have personally been punched in the face by them.
The part that deserves real attention is SGLang. Ying Sheng, Banghua Zhu, and the broader team helped build what became one of the open-source standards for serving large language models at scale: hundreds of thousands of GPUs, trillions of tokens daily. Read that slowly, because most people throw around "scale" like a guy at the gym adding plates nobody asked for. This is actual scale. Industrial scale. The kind where milliseconds become budget lines and latency becomes a boardroom conversation.
Now RadixArk is extending that foundation into a full-stack platform spanning training, fine-tuning, reinforcement learning, and inference. They are also building Miles for large-scale RL workloads, which matters because the future AI race is not just about bigger models. It is about smarter systems, tighter orchestration, lower costs, and infrastructure that does not collapse under its own ambition like a folding table at a family barbecue.
There is also a deeper lesson sitting underneath this round. Open source is no longer the side quest. The smartest infrastructure companies understand that developers adopt ecosystems before enterprises sign contracts. SGLang earned trust in the wild first, and that credibility compounds differently. You cannot fake community adoption with branding decks and catered launch parties featuring cold shrimp and jargon stitched together by consultants billing hourly.
RadixArk is betting frontier AI infrastructure should be open, scalable, and dramatically more accessible. If they pull this off, the real flex will not be the valuation. It will be becoming the invisible engine underneath the next generation of AI companies while everyone else argues online about prompts like it is open mic night at a data center.