OpenAI is about to launch a powerful new open-source AI model, and it could drop within hours.
The leak: since-deleted repos named yofo-deepcurrent/gpt-oss-120b and yofo-wildflower/gpt-oss-20b, spotted under accounts that include OpenAI team members. The “gpt-oss” tag reads as “GPT Open-Source Software,” and the multiple sizes suggest a full lineup is ready to go.
The biggest model packs 120 billion parameters and uses a Mixture of Experts (MoE) design. Instead of one monolithic network, it contains 128 specialist “experts,” and a router activates just four of them per token, so only a small fraction of those 120 billion parameters does any work on a given query. That is how it can be huge in capacity yet fast at inference.
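For intuition, here is a minimal sketch of top-k expert routing in PyTorch. The expert count (128) and active-expert count (4) match the leaked specs; the layer dimensions, class name, and everything else are illustrative assumptions, not OpenAI’s actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Toy Mixture-of-Experts layer: a learned router scores all experts,
    keeps the top k per token, and mixes their outputs with the softmax
    of those scores. 128 experts / top-4 mirror the leaked specs; the
    dimensions are made up for this demo."""

    def __init__(self, d_model=64, d_ff=256, num_experts=128, k=4):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                        # x: (tokens, d_model)
        scores = self.router(x)                  # (tokens, num_experts)
        weights, idx = scores.topk(self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # renormalize over the 4 winners
        out = torch.zeros_like(x)
        for slot in range(self.k):               # only k experts run per token
            for e in idx[:, slot].unique():
                mask = idx[:, slot] == e         # tokens whose slot-th pick is e
                out[mask] += weights[mask, slot:slot+1] * self.experts[int(e)](x[mask])
        return out

moe = TopKMoE()
print(moe(torch.randn(8, 64)).shape)             # torch.Size([8, 64])
```

The trade-off this buys: all 128 experts sit in memory, but each token pays the compute cost of only four, which is why MoE models can scale parameter count without scaling per-token latency in step.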
That puts OpenAI’s new open-source model in direct competition with Mistral AI’s Mixtral and Meta’s Llama lineup. It also features a large vocabulary for better multilingual handling and Sliding Window Attention to keep long texts manageable.
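The sliding-window idea itself is simple: each token attends only to a fixed window of recent tokens instead of the entire history, so attention cost grows linearly with sequence length rather than quadratically. Below is a minimal sketch of the mask, assuming PyTorch; the window size is hypothetical, since the leak doesn’t reveal the real value.

```python
import torch

def sliding_window_mask(seq_len: int, window: int) -> torch.Tensor:
    """Causal sliding-window attention mask: position i may attend only
    to positions max(0, i - window + 1) .. i. The window size here is
    illustrative, not a leaked spec."""
    i = torch.arange(seq_len).unsqueeze(1)   # query positions (column)
    j = torch.arange(seq_len).unsqueeze(0)   # key positions (row)
    return (j <= i) & (j > i - window)       # True where attention is allowed

print(sliding_window_mask(6, 3).int())
# tensor([[1, 0, 0, 0, 0, 0],
#         [1, 1, 0, 0, 0, 0],
#         [1, 1, 1, 0, 0, 0],
#         [0, 1, 1, 1, 0, 0],
#         [0, 0, 1, 1, 1, 0],
#         [0, 0, 0, 1, 1, 1]])
```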
This move looks like a direct answer to criticism about OpenAI becoming more closed: a play to win back developers and researchers while stepping up against open-source rivals.
No official word from OpenAI yet, but the leak looks solid, backed by code and config files. A 120-billion-parameter open-source MoE model from OpenAI would be a landmark release, and it looks imminent.