
AI21 Jamba 1.5 Large

flagship · AI21 Labs · released 2024-08-20 · text
currently routing · 4.2k rpm

Context: 256K tokens
Input: — / 1M
Output: — / 1M
Speed: — t/s
License: open
ABOUT

AI21 Labs' Jamba 1.5 Large is a powerful SSM-Transformer hybrid language model that interleaves Mamba state-space layers with Transformer attention layers, combining the efficiency of state-space models with the expressiveness of attention. With a 256K context window, it excels at long-document understanding and generation tasks while remaining competitive with pure Transformer models of similar size.

The model supports multiple languages and demonstrates strong performance on reasoning, coding, and mathematical benchmarks. Its hybrid architecture enables faster inference and lower memory usage compared to conventional Transformers, making it cost-effective for production deployments.
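The memory claim can be illustrated with a rough back-of-the-envelope comparison: a Transformer's KV cache grows linearly with sequence length, while a Mamba-style SSM layer keeps a fixed-size recurrent state. The layer counts and dimensions below are illustrative assumptions, not Jamba's published configuration:

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_elem=2):
    """Per-sequence KV-cache size for attention layers (keys + values)."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

def ssm_state_bytes(n_layers, d_model, d_state, bytes_per_elem=2):
    """Per-sequence recurrent-state size for SSM (Mamba-style) layers.

    Note there is no seq_len argument: the state is fixed-size, which is
    where the long-context memory savings come from.
    """
    return n_layers * d_model * d_state * bytes_per_elem

# Illustrative numbers only -- not Jamba 1.5 Large's actual configuration.
for seq_len in (4_096, 65_536, 262_144):
    kv = kv_cache_bytes(n_layers=32, n_kv_heads=8, head_dim=128, seq_len=seq_len)
    ssm = ssm_state_bytes(n_layers=32, d_model=4096, d_state=16)
    print(f"{seq_len:>7} tokens: KV cache {kv / 2**20:8.1f} MiB, "
          f"SSM state {ssm / 2**20:5.1f} MiB")
```

At 256K tokens the attention cache in this toy configuration is tens of gigabytes per sequence, while the SSM state stays at a few megabytes; a hybrid pays the cache cost only for its attention layers.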

Jamba 1.5 Large is part of AI21's flagship model family, designed for enterprise applications requiring both high quality and efficiency.

BENCHMARKS
Artificial Analysis Intelligence Index: 9

Providers for AI21 Jamba 1.5 Large

1 route · sorted by uptime

ClosedRouter routes requests to the providers best able to handle your prompt size and parameters, with automatic fallbacks to maximize uptime.
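The fallback behavior described above can be sketched as a loop over candidate providers in priority order, falling through to the next provider on failure. The provider functions and the ProviderError type here are hypothetical, for illustration only; they are not ClosedRouter's actual API:

```python
class ProviderError(Exception):
    """Raised when a provider fails or times out (illustrative)."""

def route_with_fallback(providers, request):
    """Try each provider in priority order (e.g. sorted by uptime);
    return the first successful response, recording each failure."""
    errors = []
    for provider in providers:
        try:
            return provider(request)
        except ProviderError as exc:
            errors.append((getattr(provider, "__name__", str(provider)), exc))
    raise RuntimeError(f"all providers failed: {errors}")

# Hypothetical providers, already sorted by 30-day uptime.
def flaky_provider(request):
    raise ProviderError("upstream timeout")

def stable_provider(request):
    return {"output": f"completion for {request!r}"}

result = route_with_fallback([flaky_provider, stable_provider], "Hello")
# result comes from stable_provider, after flaky_provider fails over.
```

With only one active route for this model (see the table below the header), there is nothing to fall back to; the loop simply succeeds or raises.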

Provider    Context    Quant    Uptime · 30d
—           —          bf16     0.00%