Arcee’s U.S.-made, open source Trinity Large and 10T checkpoint offer rare look at raw model intelligence
Carl Franzen
January 30, 2026
Credit: VentureBeat made with Flux-1 on fal.ai

San Francisco-based AI lab Arcee made waves last year as one of the only U.S. companies to train large language models (LLMs) from scratch and release them to the public under open or partially open source licenses, enabling developers, solo entrepreneurs, and even medium-to-large enterprises to use the powerful AI models for free and customize them at will.

Now Arcee is back this week with the release of its largest, most performant open language model to date: Trinity Large, a 400-billion-parameter mixture-of-experts (MoE) model, available now in preview.

Alongside the flagship release, Arcee is shipping a “raw” checkpoint model, Trinity-Large-TrueBase, that allows researchers to study what a 400B sparse…