Mixtral 8x7B

Mixtral 8x7B Instruct v0.1, a sparse Mixture-of-Experts (MoE) model from Mistral AI, fine-tuned for instruction following.
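Because the model is instruction-tuned, prompts are typically wrapped in Mistral's [INST] template before generation. Below is a minimal sketch of the single-turn format; `format_prompt` is a hypothetical helper for illustration, and the authoritative template ships with the checkpoint's tokenizer.

```python
# Sketch of the Mistral-style instruct template (single turn).
# format_prompt is a hypothetical helper, not part of any library;
# verify the exact template against the model's tokenizer config.
def format_prompt(user_message: str) -> str:
    """Wrap a user message in the [INST] ... [/INST] instruct format."""
    return f"<s>[INST] {user_message} [/INST]"

print(format_prompt("Summarize the benefits of sparse MoE models."))
```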

Pricing

Serverless Pricing

Buy credits that can be used anywhere on Segmind

Input: $0.60 per million tokens; Output: $0.60 per million tokens
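As a quick sanity check on the rates above, here is a minimal sketch of the cost arithmetic in Python. The token counts in the example are made up; only the $0.60-per-million rate comes from the pricing above.

```python
# Serverless cost arithmetic at the listed rate.
# Input and output are billed at the same rate here, per the pricing above.
PRICE_PER_MILLION_USD = 0.60

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one call at $0.60 per million tokens each way."""
    return (input_tokens + output_tokens) / 1_000_000 * PRICE_PER_MILLION_USD

# e.g. a 1,500-token prompt with a 500-token completion:
print(f"${request_cost(1_500, 500):.4f}")  # $0.0012
```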

Mixtral 8x7B

Mixtral 8x7B is a large language model (LLM) created by Mistral AI. It's known for being efficient and powerful. Here's a quick rundown of its key features:

  1. Efficient: Mixtral 8x7B is a sparse Mixture-of-Experts model: a router sends each token to only 2 of its 8 expert networks, so roughly 13B of its ~47B total parameters are active per token. This makes it faster and cheaper to run than dense models of similar quality (see the routing sketch after this list).

  2. Powerful: Despite activating only a fraction of its parameters per token, Mixtral 8x7B matches or exceeds larger dense models such as Llama 2 70B on many benchmarks.

  3. Multilingual: It can understand and respond in English, French, Italian, German, and Spanish.

  4. Open-source: The model weights are released under the Apache 2.0 license, free for anyone to use and modify.
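To make the sparsity in point 1 concrete, here is a minimal, self-contained sketch of top-2 expert routing in Python with NumPy. The dimensions and weights are toy stand-ins, not Mixtral's real parameters (Mixtral uses 8 experts with a 4096-dimensional hidden state); the point is simply that each token's output is computed by only 2 of the 8 experts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions; Mixtral's real sizes are far larger.
d_model, n_experts, top_k = 16, 8, 2

# Hypothetical random weights standing in for trained parameters.
gate_w = rng.normal(size=(d_model, n_experts))             # router
expert_w = rng.normal(size=(n_experts, d_model, d_model))  # one (simplified) FFN per expert

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-2 experts and mix their outputs."""
    logits = x @ gate_w                            # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # top-2 expert indices per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = top[t]
        # Softmax over only the selected experts' logits.
        w = np.exp(logits[t, sel] - logits[t, sel].max())
        w /= w.sum()
        # Weighted sum of the two selected experts' outputs;
        # the other 6 experts are never evaluated for this token.
        for weight, e in zip(w, sel):
            out[t] += weight * (x[t] @ expert_w[e])
    return out

tokens = rng.normal(size=(4, d_model))  # 4 toy tokens
print(moe_layer(tokens).shape)          # (4, 16): each token touched only 2 of 8 experts
```

This is why the model's per-token compute tracks its ~13B active parameters rather than its ~47B total, even though all weights must still be held in memory.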