mistralai/Mistral-7B-v0.1
- Original model: Mistral-7B
rl4b
This is a model based on Mistral-7B, intended to generate specialized queries from natural-language queries.
Usage
To use rl4b to generate queries, you can load it with the `transformers` library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("fracarfer5/rl4b")
model = AutoModelForCausalLM.from_pretrained("fracarfer5/rl4b")

# Example inference; the input query below is illustrative, and the expected
# prompt format is an assumption.
inputs = tokenizer("List all customers who ordered in 2023", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```