Add tool call chat template from vLLM
#78 opened 17 days ago by dsarfati

Multimodal ToolMessage
#77 opened 25 days ago by Butzermoggel

add chat_template key to the tokenizer_config
#76 opened 25 days ago by aleyfin

Fine tuning
#75 opened about 1 month ago by AtulOk

Running the model with 3*4090 will report an error about out of memory.
#73 opened about 2 months ago by iduzy

[vLLM] Stuck at "Waiting for output from MQLLMEngine"
#72 opened about 2 months ago by slemouellic

Test using M1 Max (64G) and Word
#71 opened about 2 months ago by gptlocalhost

Dirty
#70 opened about 2 months ago by Antonykmilk

Delete SYSTEM_PROMPT.txt
#69 opened about 2 months ago by Antonykmilk

Delete SYSTEM_PROMPT.txt
#68 opened about 2 months ago by Antonykmilk

Delete SYSTEM_PROMPT.txt
#67 opened about 2 months ago by Antonykmilk

Delete SYSTEM_PROMPT.txt
#66 opened about 2 months ago by Antonykmilk

Update SYSTEM_PROMPT.txt
#65 opened about 2 months ago by Antonykmilk

ValueError: Model architectures ['PixtralForConditionalGeneration'] failed to be inspected.
#64 opened about 2 months ago by sangwonjung

Add tool calling template for HF format
#63 opened 2 months ago by Frrosta

FSDP Training with Mistral-Small-3.1-24B-Instruct-2503 Model and DecoderLayer
#62 opened 2 months ago by ian00000

Unexpected PixtralProcessor use with Mistral-Small-3.1 on vLLM - text-only use case
#61 opened 2 months ago by AbasKhan

Consolidated safetensors
#60 opened 2 months ago by Aktsvigun

Removed redundancy in suggested system prompt
#59 opened 2 months ago by owao

Add chat_template to tokenizer_config
#58 opened 2 months ago by alexmarques

Create Mistral-Small-3.1-24B-Instruct-2503
#56 opened 2 months ago by dylanliao9191

Request: DOI
#55 opened 2 months ago by gmaterni

Address discrepancies in the languages supported by the Mistral Small 3.1 2503
#54 opened 2 months ago by fpaupier

chat template not working for tool calling
#52 opened 2 months ago by thies

[resolved] vllm nightly hf config
#51 opened 2 months ago by zaventh

Transformers Code Almost Works
#48 opened 2 months ago by binder11

Problem hosting the model using vllm
#45 opened 3 months ago by ShaoServient

FP8 Dynamic/W8A16 Quants Please
#44 opened 3 months ago by rjmehta

Speculative Decoding: I'd love to have a much smaller "companion model" (0.5B for example)
#43 opened 3 months ago by lstrozzi

model_card
#42 opened 3 months ago by Nahieli777777

Fix typos
#41 opened 3 months ago by sukrucildirr

Can you provide the finetune code?
#40 opened 3 months ago by jason500

Upload Gravity%2520Falls%2520Intro%2520x%2520playboi%2520carti%25203%205.mp3
#39 opened 3 months ago by Jasond6111

Can't determine properly which is greater between 9.9 and 9.11
#38 opened 3 months ago by sniffski

Add transformers snippet
#36 opened 3 months ago by merve

Please help with error: Mistral-Small is not running on macOS with CPU M2 Silicon, with assert error
#34 opened 3 months ago by NickolasCh

Deployment on Amazon SageMaker Endpoint
#33 opened 3 months ago by dgallitelli

Request support for text-only inference in transformers (Mistral3ForCausalLM class)
#32 opened 3 months ago by alan925

update metadata
#31 opened 3 months ago by nickname100231

Quantized models with vision included?
#27 opened 3 months ago by geoad

Corrected vllm link in readme
#26 opened 3 months ago by riversnow

Regarding Video Understanding
#25 opened 3 months ago by fensz

Support tool calls with chat template
#24 opened 3 months ago by CISCai

FIX for the pip install vllm --ugrade --> pip install vllm --upgrade
#23 opened 3 months ago by rbgo

How do we use it with Transformers? can you give some sample code ?
#22 opened 3 months ago by rameshch

Local Installation Video and Testing on Vision, Coding, Math, Text - Step by Step
#21 opened 3 months ago by fahdmirzac

Visual Grounding
#20 opened 3 months ago by Maverick17

Mistral-small
#19 opened 3 months ago by Melkiss

Add chat template to tokenizer config
#18 opened 3 months ago by mrfakename
