| Mistral Large 2 | https://models.mistralcdn.com/mistral-large-2407/mistral-large-instruct-2407.tar | `fc602155f9e39151fba81fcaab2fa7c4` |
Note:

**Important**:
- `mixtral-8x22B-Instruct-v0.3.tar` is exactly the same as [Mixtral-8x22B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1), only stored in `.safetensors` format.
- `mixtral-8x22B-v0.3.tar` is the same as [Mixtral-8x22B-v0.1](https://huggingface.co/mistralai/Mixtral-8x22B-v0.1), but has an extended vocabulary of 32768 tokens.
- `codestral-22B-v0.1.tar` has a custom non-commercial license, called the [Mistral AI Non-Production (MNPL) License](https://mistral.ai/licenses/MNPL-0.1.md).
- `mistral-large-instruct-2407.tar` has a custom non-commercial license, called the [Mistral AI Research (MRL) License](https://mistral.ai/licenses/MRL-0.1.md).
- All of the listed models above support function calling. For example, Mistral 7B Base/Instruct v3 is a minor update to Mistral 7B Base/Instruct v2, with the addition of function-calling capabilities.
- The "coming soon" models will include function calling as well.
- You can download previous versions of our models from our [docs](https://docs.mistral.ai/getting-started/open_weight_models/#downloading).
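After downloading one of the tarballs above, its integrity can be checked against the checksum column. A minimal sketch, assuming `md5sum` is available and the Mistral Large 2 tarball from the table sits in the current directory:

```shell
# Expected hash taken from the download table above.
expected="fc602155f9e39151fba81fcaab2fa7c4"

# md5sum prints "<hash>  <file>"; keep only the hash field.
# 2>/dev/null makes this a clean no-op message if the file is absent.
actual=$(md5sum mistral-large-instruct-2407.tar 2>/dev/null | cut -d' ' -f1)

if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum mismatch or file missing"
fi
```

The same pattern applies to every tarball in the table; only the file name and expected hash change.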
### From Hugging Face Hub
| Name | ID | URL |
|-------------|-------|-------|
| Pixtral Large Instruct | mistralai/Pixtral-Large-Instruct-2411 | https://huggingface.co/mistralai/Pixtral-Large-Instruct-2411 |
| Pixtral 12B Base | mistralai/Pixtral-12B-Base-2409 | https://huggingface.co/mistralai/Pixtral-12B-Base-2409 |
| Mistral Small 3.1 24B Base | mistralai/Mistral-Small-3.1-24B-Base-2503 | https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Base-2503 |
| Mistral Small 3.1 24B Instruct | mistralai/Mistral-Small-3.1-24B-Instruct-2503 | https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503 |

### Usage
**News!!!**: Mistral Large 2 is out. Read more about its capabilities [here](https://mistral.ai/news/mistral-large-2407/).
```sh
# Note: shell variable names cannot start with a digit, so use e.g. M12B_DIR.
export M12B_DIR=$MISTRAL_MODEL/12b_instruct
mkdir -p $M12B_DIR
tar -xf mistral-nemo-instruct-2407.tar -C $M12B_DIR
```

or

```sh
export M8x7B_DIR=$MISTRAL_MODEL/8x7b_instruct
mkdir -p $M8x7B_DIR
tar -xf Mixtral-8x7B-v0.1-Instruct.tar -C $M8x7B_DIR
```
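For scripted setups, the same extraction step can be done from Python with the standard library. A sketch equivalent to the `mkdir`/`tar` commands above, assuming the tar file is in the current working directory:

```python
import os
import tarfile
from pathlib import Path

# Mirror of: mkdir -p $M8x7B_DIR && tar -xf Mixtral-8x7B-v0.1-Instruct.tar -C $M8x7B_DIR
model_dir = Path(os.environ.get("MISTRAL_MODEL", str(Path.home() / "mistral_models"))) / "8x7b_instruct"
model_dir.mkdir(parents=True, exist_ok=True)

archive = Path("Mixtral-8x7B-v0.1-Instruct.tar")
if archive.exists():  # guard so the sketch is a no-op before the tarball is downloaded
    with tarfile.open(archive) as tf:
        tf.extractall(model_dir)
```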
For Hugging Face model weights, here is an example of how to download [Mistral Small 3.1 24B Instruct](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503):
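The download snippet itself is cut off in this excerpt; the sketch below shows how such a download is typically done with `huggingface_hub.snapshot_download`. The `allow_patterns` file names (`params.json`, `consolidated.safetensors`, `tekken.json`) are assumptions about which files the inference runtime needs:

```python
from pathlib import Path

model_path = Path.home() / "mistral_models" / "mistral-small-3.1-instruct"
model_path.mkdir(parents=True, exist_ok=True)

def download_weights(target: Path) -> None:
    # Requires: pip install huggingface_hub
    from huggingface_hub import snapshot_download

    # Fetch only the files assumed to be needed for inference.
    snapshot_download(
        repo_id="mistralai/Mistral-Small-3.1-24B-Instruct-2503",
        allow_patterns=["params.json", "consolidated.safetensors", "tekken.json"],
        local_dir=target,
    )

# download_weights(model_path)  # uncomment to download (the weights are tens of GB)
```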
If you prompt it with *"Albert likes to surf every week. Each surfing session la…"*, the model should answer with the computed cost.
You can then continue chatting afterwards, *e.g.* with *"How much would he spend in a year?"*.
**Chat with Mistral Small 3.1 24B Instruct**
To use [Mistral Small 3.1 24B Instruct](https://mistral.ai/news/mistral-small-3-1/) as an assistant you can run the following command using `mistral-chat`.
Make sure `$MISTRAL_SMALL_3_1_INSTRUCT` is set to a valid path to the downloaded Mistral Small folder, e.g. `$HOME/mistral_models/mistral-small-3.1-instruct`.
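The command itself is not shown in this excerpt; based on the `mistral-chat` extraction paths used elsewhere in this README, it is presumably of the following form (the `--instruct` and `--max_tokens` flags are assumptions):

```shell
# Assumes mistral_inference is installed and the weights were extracted to this path.
export MISTRAL_SMALL_3_1_INSTRUCT="$HOME/mistral_models/mistral-small-3.1-instruct"

# Guarded so the sketch is a no-op when mistral-chat is not on PATH.
if command -v mistral-chat >/dev/null 2>&1; then
  mistral-chat "$MISTRAL_SMALL_3_1_INSTRUCT" --instruct --max_tokens 256
fi
```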
If you prompt it with *"The above image presents an image of which park ? Please give the hints to identify the park."* with the following image URL *https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png*, the model should answer that the park is Yosemite National Park and give hints to identify it.
You can then continue chatting afterwards, *e.g.* with *"What is the name of the lake in the image?"*. The model should respond that it is not a lake but a river.
### Python
*Instruction Following*:

```python
# …(tokenizer/model setup and the `generate` call are elided in this excerpt)…
result = tokenizer.instruct_tokenizer.tokenizer.decode(out_tokens[0])
print(result)
```
*Multimodal Instruction Following*:

```python
from pathlib import Path

from huggingface_hub import snapshot_download
from mistral_common.protocol.instruct.messages import ImageURLChunk, TextChunk
from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
from mistral_inference.generate import generate
from mistral_inference.transformer import Transformer

model_path = Path.home().joinpath("mistral_models") / "mistral-small-3.1-instruct"  # change to the extracted model path
```