MatrixIA/LLama-3-8B-SQL

Task: Text Generation
Libraries and tags: Transformers, PyTorch, Safetensors, text-generation-inference, unsloth, trl, sft, Inference Endpoints
Model type: llama
Language: English
Training dataset: OneGate/OGText2SQL
License: apache-2.0
Branch: main · 1 contributor · History: 17 commits
Latest commit: 09a2af8 (verified), "Upload model trained with Unsloth" by MatrixIA, 10 months ago
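A minimal sketch of running the model for text-to-SQL generation with the Transformers pipeline. The repo id MatrixIA/LLama-3-8B-SQL is taken from this page; everything else is an assumption: that the full checkpoint in this repo loads directly as a causal LM, that roughly 16 GB of fp16 weights fit on the available hardware, and that a plain instruction-style prompt works, since the model card does not document a prompt format.

# Sketch only: load MatrixIA/LLama-3-8B-SQL for text generation.
# The prompt format below is an assumption; the repo does not document one.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="MatrixIA/LLama-3-8B-SQL",
    torch_dtype=torch.float16,   # weights are stored in half precision
    device_map="auto",           # spread shards across available devices
)

prompt = (
    "Translate the question into SQL.\n"
    "Schema: CREATE TABLE employees (id INT, name TEXT, salary INT);\n"
    "Question: Who are the three highest-paid employees?\n"
    "SQL:"
)
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])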
Files and versions

.gitattributes                      1.52 kB            initial commit                       10 months ago
README.md                           333 Bytes          Update README.md                     10 months ago
adapter_config.json                 732 Bytes          Upload model trained with Unsloth    10 months ago
adapter_model.safetensors           168 MB    (LFS)    Upload model trained with Unsloth    10 months ago
config.json                         729 Bytes          Trained with Unsloth                 10 months ago
generation_config.json              143 Bytes          Trained with Unsloth                 10 months ago
pytorch_model-00001-of-00004.bin    4.98 GB   (LFS)    Trained with Unsloth                 10 months ago
pytorch_model-00002-of-00004.bin    5 GB      (LFS)    Trained with Unsloth                 10 months ago
pytorch_model-00003-of-00004.bin    4.92 GB   (LFS)    Trained with Unsloth                 10 months ago
pytorch_model-00004-of-00004.bin    1.17 GB   (LFS)    Trained with Unsloth                 10 months ago
pytorch_model.bin.index.json        24 kB              Trained with Unsloth                 10 months ago
special_tokens_map.json             449 Bytes          Upload tokenizer                     10 months ago
tokenizer.json                      9.09 MB   (LFS)    Upload tokenizer                     10 months ago
tokenizer_config.json               50.6 kB            Upload tokenizer                     10 months ago

The four pytorch_model-*-of-00004.bin shards are pickle-serialized PyTorch checkpoints; the scanner flags the same three pickle imports in each (torch.HalfStorage, torch._utils._rebuild_tensor_v2, collections.OrderedDict), all standard for tensors saved with torch.save. All other files are marked safe.
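Because the repo also ships a LoRA adapter (adapter_config.json plus the 168 MB adapter_model.safetensors), a lighter option is to attach only the adapter to a base Llama 3 8B checkpoint with PEFT. The sketch below assumes the base model is meta-llama/Meta-Llama-3-8B; the authoritative value is the base_model_name_or_path field in adapter_config.json, so check that before relying on this.

# Sketch only: load just the LoRA adapter on top of an assumed base model.
# Assumption: base_model_name_or_path in adapter_config.json points at
# meta-llama/Meta-Llama-3-8B; verify before use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, "MatrixIA/LLama-3-8B-SQL")
tokenizer = AutoTokenizer.from_pretrained("MatrixIA/LLama-3-8B-SQL")

inputs = tokenizer("List the names of all customers located in France.", return_tensors="pt").to(base.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Loading only the adapter keeps the download for this repo to a few hundred megabytes instead of the ~16 GB of full pytorch_model shards, at the cost of fetching the base weights separately.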