Model Use Cases

Doubao 1.5 Pro (32K) is a sparse Mixture-of-Experts (MoE) LLM that combines strong reasoning with multilingual support within a standard 32K-token context window, making it suited to complex tasks.

Try Doubao 1.5 Pro (32K) on Siray.ai

Key Features

  • MoE Architecture: Utilizes a Mixture-of-Experts design for efficient scaling of reasoning and knowledge access.
  • High Reasoning: Strong performance in logical tasks, coding, and mathematical problem-solving within the 32K context.
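To illustrate the routing idea behind a sparse MoE layer, here is a minimal sketch of top-k expert selection. All names and sizes are hypothetical for illustration only; Doubao 1.5 Pro's actual expert count, gating function, and architecture are not public.

```python
def top_k_route(gate_scores, k=2):
    """Pick the k highest-scoring experts and normalize their weights.

    In a sparse MoE layer, only these k experts run for the token,
    which is what makes scaling total parameters cheap at inference.
    """
    ranked = sorted(range(len(gate_scores)),
                    key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:k]
    total = sum(gate_scores[i] for i in chosen)
    return [(i, gate_scores[i] / total) for i in chosen]

# A token's gate scores over 4 hypothetical experts; only 2 run per token.
weights = top_k_route([0.1, 0.5, 0.3, 0.1], k=2)
# → [(1, 0.625), (2, 0.375)]
```

Because each token activates only a small subset of experts, compute per token stays near that of a much smaller dense model while total capacity grows with the expert count.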

Get Started with the API
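As a starting point, the sketch below builds a chat-completion-style request body. The endpoint URL, model identifier, and field names are assumptions modeled on common OpenAI-style APIs, not verified against Siray.ai; consult the official API reference for the real schema and authentication details.

```python
import json

# Assumed endpoint and model id -- placeholders, not verified.
API_URL = "https://api.siray.ai/v1/chat/completions"

payload = {
    "model": "doubao-1.5-pro-32k",  # assumed model identifier
    "messages": [
        {"role": "user",
         "content": "Summarize the advantages of sparse MoE models."}
    ],
    "max_tokens": 512,
}

# Serialize the request body; sending it would look something like:
# requests.post(API_URL, headers={"Authorization": f"Bearer {API_KEY}"}, data=body)
body = json.dumps(payload)
```

Keep prompts well under the 32K-token window to leave room for the model's response.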