Mixtral 8x7b

Mistral's Mixtral 8x7B Instruct v0.1, a sparse Mixture-of-Experts (MoE) model fine-tuned for instruction following.


API

The API can be called from the programming language of your choice; the example below uses Node.js with axios.

POST
const axios = require('axios');

const api_key = "YOUR API-KEY";
const url = "https://api.segmind.com/v1/mixtral-8x7b-instruct";

// Conversation history: alternating user and assistant turns.
const data = {
  "messages": [
    {
      "role": "user",
      "content": "tell me a joke on cats"
    },
    {
      "role": "assistant",
      "content": "here is a joke about cats..."
    },
    {
      "role": "user",
      "content": "now a joke on dogs"
    }
  ]
};

(async function() {
  try {
    // Authenticate with the x-api-key header.
    const response = await axios.post(url, data, { headers: { 'x-api-key': api_key } });
    console.log(response.data);
  } catch (error) {
    console.error('Error:', error.response.data);
  }
})();
RESPONSE (application/json)
HTTP Response Codes

200 - OK: Response generated
401 - Unauthorized: User authentication failed
404 - Not Found: The requested URL does not exist
405 - Method Not Allowed: The requested HTTP method is not allowed
406 - Not Acceptable: Not enough credits
500 - Server Error: The server had an issue processing the request
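
If you want to branch on these codes in client code, a minimal sketch using axios might look like the following; the handling shown for each status is illustrative, not part of the API.

const axios = require('axios');

const api_key = "YOUR API-KEY";
const url = "https://api.segmind.com/v1/mixtral-8x7b-instruct";

async function callMixtral(data) {
  try {
    const response = await axios.post(url, data, { headers: { 'x-api-key': api_key } });
    return response.data; // 200 - OK
  } catch (error) {
    // For HTTP errors, axios attaches the status code to error.response.
    const status = error.response ? error.response.status : null;
    if (status === 401) {
      console.error('Unauthorized: check that your x-api-key is valid.');
    } else if (status === 406) {
      console.error('Not Acceptable: the account is out of credits.');
    } else if (status === 500) {
      console.error('Server Error: retry the request later.');
    } else {
      console.error('Request failed:', status, error.message);
    }
    throw error;
  }
}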

Attributes


messages (array)

An array of objects containing the role and content.


role (str)

Can be "user", "assistant", or "system".


content (str)

A string containing the user's query or the assistant's response.
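
Putting these attributes together, a minimal request body for a multi-turn conversation might look like the sketch below; the system message content is illustrative, not prescribed by the API.

const data = {
  "messages": [
    { "role": "system", "content": "You are a concise assistant." },
    { "role": "user", "content": "tell me a joke on cats" },
    { "role": "assistant", "content": "here is a joke about cats..." },
    { "role": "user", "content": "now a joke on dogs" }
  ]
};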

To keep track of your credit usage, you can inspect the response headers of each API call. The x-remaining-credits property will indicate the number of remaining credits in your account. Ensure you monitor this value to avoid any disruptions in your API usage.
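
For example, reusing axios, url, data, and api_key from the request example above, the header can be read off the response object (axios lower-cases response header names):

(async function() {
  const response = await axios.post(url, data, { headers: { 'x-api-key': api_key } });
  // x-remaining-credits reports the credit balance left on the account.
  console.log('Remaining credits:', response.headers['x-remaining-credits']);
})();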

Mixtral 8x7b

Mixtral 8x7b is a large language model (LLM) created by Mistral AI. It's known for being efficient and powerful. Here's a quick rundown of its key features:

  1. Efficient: Mixtral 8x7b is a sparse Mixture-of-Experts model: for each token, a router activates only two of the eight experts in each layer, so only a fraction of the total parameters is used at a time. This makes it faster and cheaper to run than some other LLMs.

  2. Powerful: Despite its efficiency, Mixtral 8x7b performs well on many benchmarks, even exceeding some larger models.

  3. Multilingual: It can understand and respond in English, French, Italian, German, and Spanish.

  4. Open-source: The model weights are released under the Apache 2.0 license, free for anyone to use and modify.