# Mistral-7B-Instruct-v0.1

**Developer Portal :** <https://api.market/store/bridgeml/mistralai7b>

<figure><img src="https://blog.api.market/wp-content/uploads/2024/06/mistralai-Mistral-7B-Instruct-v0.1.png" alt=""><figcaption><p>Mistral-7B-Instruct-v0.1</p></figcaption></figure>

Model name to use in API calls:

```
mistralai/Mistral-7B-Instruct-v0.1
```

The Mistral-7B-Instruct-v0.1 Large Language Model (LLM) is an instruct fine-tuned version of the Mistral-7B-v0.1 generative text model, trained on a variety of publicly available conversation datasets. This model supports function calling and JSON mode.

**Model Developers:** Mistral

**Input:** text only.

**Output:** generated text only.

**Model Architecture:** `Mistral-7B-v0.1`, a transformer model, serves as the base for this instruction model and incorporates the following architectural choices:

* Grouped-Query Attention
* Sliding-Window Attention
* Byte-fallback BPE tokenizer

**Context Length:** 16384

**License:** Apache 2.0

### Request and Response

**Request**

{% code overflow="wrap" %}

```bash
curl -X 'POST' \
  'https://prod.api.market/api/v1/bridgeml/mistralai7b/bridgeml/mistralai7b' \
  -H 'accept: application/json' \
  -H 'x-api-market-key: API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
  "messages": [
    {
      "role": "user",
      "content": "Use voice of customer from an Amazon review to write an ad for a webcam."
    }
  ],
  "temperature": 1,
  "max_tokens": 256,
  "top_p": 1,
  "frequency_penalty": 0,
  "stream": false
}'
```

{% endcode %}
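The same call can be made from Python using only the standard library. This sketch mirrors the curl example above (same endpoint, headers, and body); `API_KEY` is a placeholder you must replace with your own api.market key, and the actual network call is left commented out so the snippet runs without credentials.

```python
import json
import urllib.request

API_KEY = "API_KEY"  # placeholder: replace with your api.market key
URL = "https://prod.api.market/api/v1/bridgeml/mistralai7b/bridgeml/mistralai7b"

# Request body, matching the curl example in this page.
payload = {
    "messages": [
        {
            "role": "user",
            "content": "Use voice of customer from an Amazon review to write an ad for a webcam.",
        }
    ],
    "temperature": 1,
    "max_tokens": 256,
    "top_p": 1,
    "frequency_penalty": 0,
    "stream": False,
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "accept": "application/json",
        "x-api-market-key": API_KEY,
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment to send the request (requires a valid key):
# with urllib.request.urlopen(request) as resp:
#     reply = json.loads(resp.read())
#     print(reply["choices"][0]["message"]["content"])
```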

**Response**

{% code overflow="wrap" %}

```json
{
  "id": "mistralai/Mistral-7B-Instruct-v0.1-825ff876-d2eb-40ac-9012-5a3f1011db4e",
  "object": "text_completion",
  "created": 1718783411,
  "model": "mistralai/Mistral-7B-Instruct-v0.1",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "\n\r\nAre you looking for a reliable webcam that offers clear and crisp audio? Look no further! One of our satisfied customers had this to say:\n\n\"The only thing I've ever had problems with is lighting, but the USB-C port on the webcam allowed me to quickly connect to my computer and resolve this issue in seconds. Plus, the audio quality is top-notch. I highly recommend this webcam to anyone who wants a reliable and high-quality audio and video experience.\"\n\nDon't let low-quality audio and video hold you back! Get your hands on this Amazon-reviewed webcam today!",
        "tool_calls": null,
        "tool_call_id": null
      },
      "index": 0,
      "finish_reason": "stop",
      "logprobs": null
    }
  ],
  "usage": {
    "prompt_tokens": 74,
    "completion_tokens": 136,
    "total_tokens": 210
  }
}
```

{% endcode %}
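In client code, the fields you usually need from a response like the one above are the assistant's message and the token usage. A minimal parsing sketch (the `raw` string below is a trimmed-down copy of the sample response, kept only to make the snippet self-contained):

```python
import json

# Trimmed copy of the fields shown in the sample response above.
raw = """
{
  "model": "mistralai/Mistral-7B-Instruct-v0.1",
  "choices": [
    {
      "message": {"role": "assistant", "content": "Sample ad copy..."},
      "index": 0,
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 74, "completion_tokens": 136, "total_tokens": 210}
}
"""

response = json.loads(raw)

# The generated text lives under choices[0].message.content.
reply = response["choices"][0]["message"]["content"]

# Token accounting for billing/budgeting lives under usage.
tokens_used = response["usage"]["total_tokens"]

print(reply)        # Sample ad copy...
print(tokens_used)  # 210
```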

You can try out this affordable, easy-to-use LLM API at <https://api.market/store/bridgeml/mistralai7b>.


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.api.market/api-product-docs/bridgeml/mistral-7b-instruct-v0.1.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
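Building the `ask` URL is a one-liner in Python; the question just needs to be URL-encoded. The question text below is an arbitrary example, and the live request is left commented out:

```python
import urllib.parse
import urllib.request

DOC_URL = "https://docs.api.market/api-product-docs/bridgeml/mistral-7b-instruct-v0.1.md"

# Any specific, self-contained natural-language question (example only).
question = "What is the context length of this model?"

# URL-encode the question into the ask query parameter.
ask_url = DOC_URL + "?" + urllib.parse.urlencode({"ask": question})

# Uncomment to query the live documentation:
# with urllib.request.urlopen(ask_url) as resp:
#     print(resp.read().decode("utf-8"))

print(ask_url)
```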
