Nvidia has unveiled Nemotron 3 Super, a 120-billion-parameter AI model with 12 billion active parameters, built on a hybrid mixture-of-experts (MoE) architecture: only a subset of the model's expert sub-networks is activated for each token, which keeps per-token compute well below what the total parameter count suggests. Trained solely on synthetic data, it offers a 1-million-token context window aimed at long-running, multi-agent workflows. Nvidia claims up to 5x higher throughput and up to 2x higher accuracy, positioning the model for companies building autonomous AI agents efficiently. Nemotron 3 Super is available through platforms such as Hugging Face and Nvidia's NeMo.
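To see why a 120B-parameter MoE model can run with only 12B active parameters, consider a toy sketch of top-k expert routing. This is an illustrative example only, not Nemotron's actual implementation; the expert counts, dimensions, and routing scheme below are hypothetical, chosen so the active/total ratio mirrors the 12B/120B figure.

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts = 10   # total experts in the layer (hypothetical)
top_k = 1        # experts activated per token (hypothetical)
d_model = 8      # toy hidden size

# Each expert is a simple feed-forward weight matrix; the router scores
# every expert for a given token and only the top-k are actually run.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route one token vector through its top-k experts."""
    logits = x @ router
    chosen = np.argsort(logits)[-top_k:]   # indices of the top-k experts
    gates = np.exp(logits[chosen])
    gates /= gates.sum()                   # softmax over the chosen experts
    # Only the chosen experts' weights participate in the computation.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, chosen)), chosen

x = rng.standard_normal(d_model)
y, used = moe_forward(x)

total_params = n_experts * d_model * d_model
active_params = top_k * d_model * d_model
print(f"active / total expert params: {active_params}/{total_params}")
```

With 1 of 10 experts active per token, only a tenth of the expert parameters do work on any given token, the same proportion as 12B active out of 120B total; real MoE models typically route to several experts per token and add shared (always-active) layers on top.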