  1. PrimeIntellect/INTELLECT-3 · Hugging Face

    2 days ago · Trained with prime-rl and verifiers; environments released on the Environments Hub. INTELLECT-3 is a 106B …

  2. INTELLECT-3: Prime Intellect's 106B MoE Model Trained End-to ...

    10 hours ago · Prime Intellect just released INTELLECT-3, a 106B-parameter Mixture-of-Experts (MoE) model that uses only 12B active parameters at inference time. This model is trained end-to-end …

  3. INTELLECT-3: A 100B+ MoE trained with large-scale RL

    2 days ago · INTELLECT-3 is a 106B-parameter Mixture-of-Experts model trained with both SFT and RL on top of the GLM-4.5-Air base model. It achieves state-of-the-art performance for its size across …

  4. INTELLECT-3: The new 106B MoE model that revolutionizes ...

    14 hours ago · Discover INTELLECT-3, a 106B-parameter Mixture-of-Experts model trained with SFT + RL on GLM-4.5-Air and built entirely with open-source tools.

  5. Prime Intellect Unveils 106 Billion Parameter INTELLECT-3 AI ...

    Prime Intellect has announced the release of INTELLECT-3, a new 100B+ Mixture-of-Experts (MoE) model, marking a significant advancement in the field of large-scale artificial intelligence.

  6. Prime Intellect: INTELLECT-3 – Performance Metrics | OpenRouter

    3 days ago · See performance metrics across providers for Prime Intellect: INTELLECT-3 - INTELLECT-3 is a 106B-parameter Mixture-of-Experts model (12B active) post-trained from GLM-4.5-Air-Base …

  7. Prime Intellect debuts INTELLECT-3, an RL-trained 106B ...

    1 day ago · Prime Intellect debuts INTELLECT-3, an RL-trained 106B-parameter open-source MoE model it claims outperforms larger models across math, code, science, and reasoning — Today, we …