Arcee aims to reboot U.S. open source AI with new Trinity models released under Apache 2.0
Featured | Carl Franzen | December 2, 2025
Credit: VentureBeat made with Flux 2-Pro using FAL.ai

For much of 2025, the frontier of open-weight language models has been defined not in Silicon Valley or New York City, but in Beijing and Hangzhou. Chinese research labs including Alibaba's Qwen, DeepSeek, Moonshot and Baidu have rapidly set the pace in developing large-scale, open Mixture-of-Experts (MoE) models, often with permissive licenses and leading benchmark performance. While OpenAI fielded its own open source, general-purpose LLMs this summer as well, gpt-oss-20B and 120B, their uptake has been slowed by the many alternatives that perform as well or better. Now, one small U.S. company is pushing back.

Today, Arcee AI announced the release of Trinity Mini and Trinity Nano Preview, the…