BGE Small EN v1.5
BGE Small EN v1.5 is a compact English text embedding model from the Beijing Academy of Artificial Intelligence (BAAI), optimized for generating high-quality vector representations of text for semantic search, clustering, and retrieval tasks. Despite its small size, it delivers competitive embedding quality, making it ideal for applications with latency or cost constraints.
The v1.5 update improves upon the original BGE models with better training data and techniques, achieving stronger performance on the MTEB (Massive Text Embedding Benchmark) leaderboard. It supports up to 512 tokens and produces 384-dimensional embeddings.
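Retrieval with an embedding model like this typically means ranking documents by the cosine similarity of their 384-dimensional vectors to a query vector. Below is a minimal sketch of that ranking step using NumPy with random stand-in vectors; in practice the vectors would come from the model itself (for example via an embedding library that loads `BAAI/bge-small-en-v1.5` — an assumed integration, not shown here).

```python
import numpy as np

DIM = 384  # embedding dimension produced by BGE Small EN v1.5

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec: np.ndarray, doc_vecs: np.ndarray, top_k: int = 3) -> list[int]:
    """Return indices of the top_k document vectors most similar to the query."""
    scores = [cosine_similarity(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)[:top_k]

# Stand-in vectors; real embeddings would come from the model.
rng = np.random.default_rng(0)
docs = rng.normal(size=(5, DIM))
query = docs[2] + 0.01 * rng.normal(size=DIM)  # near-duplicate of document 2

print(retrieve(query, docs, top_k=1))  # → [2]
```

Cosine similarity is the standard choice here because embedding quality is defined by vector direction rather than magnitude, so near-duplicate texts score close to 1 while unrelated texts score near 0.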
BGE Small EN v1.5 is widely used in production RAG (Retrieval-Augmented Generation) systems where speed and efficiency matter, offering an excellent balance between quality and resource usage.
Providers for BGE Small EN v1.5
1 route · sorted by uptime
The router directs requests to the providers best able to handle your prompt size and parameters, with automatic fallbacks to maximize uptime.