Arcee aims to reboot U.S. open source AI with new Trinity models released under Apache 2.0
For much of 2025, the frontier of open-weight language models has been defined not in Silicon Valley or New York City, but in Beijing and Hangzhou.
Chinese research labs including Alibaba's Qwen, DeepSeek, Moonshot and Baidu have rapidly set the pace in developing large-scale, open Mixture-of-Experts (MoE) models — often with permissive licenses and leading benchmark performance. While OpenAI released its own open-source, general-purpose LLMs this summer (gpt-oss-20B and gpt-oss-120B), their uptake has been slowed by the many alternatives that perform as well or better.
Now, one small U.S. company is pushing back.
Today, Arcee AI announced the release of Trinity Mini and Trinity Nano Preview, the first two models in its new “Trinity” family—an open-weight MoE model suite fully trained in the United States.
Users can try the former directly for themselves in a chatbot format on Arcee's new website, chat.arcee.ai, and ...
Copyright of this story solely belongs to venturebeat.

