Solar Open

Solar Open is Upstage's flagship 102B-parameter large language model, trained entirely from scratch and released under the Solar-Apache License 2.0 (see LICENSE). Built on a Mixture-of-Experts (MoE) architecture, it delivers enterprise-grade performance in reasoning, instruction following, and agentic capabilities, all while prioritizing transparency and customization for the open-source community.

Highlights

  • MoE Architecture (102B / 12B): Built on a Mixture-of-Experts architecture with 102B total and 12B active parameters, combining the knowledge depth of a massive model with the inference speed and cost efficiency of a much smaller one.
  • Massive Training Scale: Pre-trained on 19.7 trillion tokens, ensuring broad knowledge coverage and robust reasoning capabilities across various domains.

Model Overview

  • Model Name: Solar Open 100B
  • Hugging Face ID: Upstage/Solar-Open-100B
  • Architecture: Mixture-of-Experts (MoE)
    • Total Parameters: 102.6B
    • Active Parameters: 12B (per token)
    • Experts: 129 total (128 routed + 1 shared); the top 8 routed experts are activated per token (see the illustrative routing sketch below)
  • Pre-training Tokens: 19.7 Trillion
  • Context Length: 128k
  • Training Hardware: NVIDIA B200 GPUs
  • License: Solar-Apache License 2.0 (See LICENSE)
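
For intuition on the active-parameter figure, here is a minimal, illustrative routing sketch in PyTorch (toy dimensions, not the actual Solar Open code): the router scores each token against the 128 routed experts, keeps the top 8, and always adds the shared expert, so only a small slice of the 102.6B total parameters runs for any given token.

import torch
import torch.nn.functional as F

# Toy top-8-of-128 routing with one always-on shared expert.
# Dimensions are illustrative only; this is not the Solar Open implementation.
hidden_dim, ffn_dim = 64, 256
num_routed, top_k = 128, 8

def make_expert():
    return torch.nn.Sequential(
        torch.nn.Linear(hidden_dim, ffn_dim),
        torch.nn.SiLU(),
        torch.nn.Linear(ffn_dim, hidden_dim),
    )

routed_experts = torch.nn.ModuleList([make_expert() for _ in range(num_routed)])
shared_expert = make_expert()
router = torch.nn.Linear(hidden_dim, num_routed)

def moe_forward(x: torch.Tensor) -> torch.Tensor:
    """x: (tokens, hidden_dim) -> (tokens, hidden_dim)."""
    weights, idx = router(x).topk(top_k, dim=-1)   # score all 128 routed experts, keep the best 8
    weights = F.softmax(weights, dim=-1)           # normalize over the selected 8
    routed = torch.stack([
        sum(w * routed_experts[int(e)](tok) for w, e in zip(weights[t], idx[t]))
        for t, tok in enumerate(x)                 # naive per-token dispatch
    ])
    return shared_expert(x) + routed               # shared expert is always active

y = moe_forward(torch.randn(4, hidden_dim))
print(y.shape)  # torch.Size([4, 64])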

Performance

Detailed benchmarks and performance metrics will be updated upon the official release on December 31, 2025.

Quickstart

Python code snippets and usage examples will be available upon the official release on December 31, 2025.
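
Until then, a minimal loading sketch, assuming the checkpoint follows the standard Hugging Face transformers interface (AutoTokenizer / AutoModelForCausalLM) and uses the model ID listed in the overview above; exact arguments may differ at release.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Upstage/Solar-Open-100B"  # Hugging Face ID from the Model Overview

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # assumed precision; adjust to your hardware
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain what a Mixture-of-Experts model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))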

Public API Access

The official API service for Solar Open is scheduled to launch publicly on December 31, 2025.

  • Access: Upstage Console (Available starting Dec 31, 2025)
  • Documentation: Upstage Console (Available starting Dec 31, 2025)
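
As a rough sketch of what a request may look like, assuming the service exposes an OpenAI-compatible chat completions endpoint; the base URL and model identifier below are placeholders, not confirmed values, so check the official documentation once it is published.

from openai import OpenAI

client = OpenAI(
    api_key="UPSTAGE_API_KEY",             # placeholder; use the key issued in the Upstage Console
    base_url="https://api.upstage.ai/v1",  # assumed endpoint, not yet confirmed
)

response = client.chat.completions.create(
    model="solar-open-100b",               # assumed model identifier
    messages=[{"role": "user", "content": "Hello, Solar Open!"}],
)
print(response.choices[0].message.content)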

Citation

If you use Solar Open in your research, please cite:

@misc{solar-open-2025,
    title={Solar Open: Scaling Upstage's LLM Capabilities with MoE},
    author={Upstage AI},
    year={2025},
    url={https://huggingface.co/Upstage/Solar-Open-100B}
}