VMware / open-llama-0.3T-7B-instruct-dolly-hhrlhf
Text Generation · Transformers · PyTorch · Safetensors · Dataset: mosaicml/dolly_hhrlhf · English · llama · text-generation-inference · License: apache-2.0
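
The tags above (Transformers, PyTorch, Text Generation) point at the standard `transformers` loading path. A minimal usage sketch, assuming `transformers`, `torch`, and `accelerate` are installed; the plain prompt below is a stand-in for whatever instruction template the model card specifies:

```python
# Minimal sketch, not the model card's official example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "VMware/open-llama-0.3T-7B-instruct-dolly-hhrlhf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the shards total ~13.5 GB in half precision
    device_map="auto",          # needs `accelerate`; places layers on available devices
)

# Assumption: a bare prompt; check the model card for the exact instruct format.
prompt = "What is a large language model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```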
Files and versions
Branch: main
3 contributors · History: 12 commits
Latest commit: rbattle and SFconvertbot · Adding `safetensors` variant of this model (#1) · aed0cd9 · verified · 3 months ago
| File | Size | Last commit | Last updated |
|------|------|-------------|--------------|
| .gitattributes | 1.48 kB | initial commit | over 2 years ago |
| README.md | 2.04 kB | Update README.md | over 2 years ago |
| added_tokens.json | 21 Bytes | Upload tokenizer | over 2 years ago |
| config.json | 568 Bytes | Upload LlamaForCausalLM | over 2 years ago |
| generation_config.json | 132 Bytes | Upload LlamaForCausalLM | over 2 years ago |
| model-00001-of-00002.safetensors | 9.98 GB | Adding `safetensors` variant of this model (#1) | 3 months ago |
| model-00002-of-00002.safetensors | 3.5 GB | Adding `safetensors` variant of this model (#1) | 3 months ago |
| model.safetensors.index.json | 28.1 kB | Adding `safetensors` variant of this model (#1) | 3 months ago |
| pytorch_model-00001-of-00002.bin | 9.98 GB | Upload LlamaForCausalLM | over 2 years ago |
| pytorch_model-00002-of-00002.bin | 3.5 GB | Upload LlamaForCausalLM | over 2 years ago |
| pytorch_model.bin.index.json | 26.8 kB | Upload LlamaForCausalLM | over 2 years ago |
| special_tokens_map.json | 96 Bytes | Upload tokenizer | over 2 years ago |
| tokenizer.json | 1.99 MB | Upload tokenizer | over 2 years ago |
| tokenizer.model | 772 kB | Upload tokenizer | over 2 years ago |
| tokenizer_config.json | 714 Bytes | Upload tokenizer | over 2 years ago |

Note: the two `pytorch_model-*.bin` shards are Python pickle files. The Hub's scanner flags four pickle imports in each (`torch.HalfStorage`, `torch.FloatStorage`, `torch._utils._rebuild_tensor_v2`, `collections.OrderedDict`), all of which are standard for checkpoints produced by `torch.save`. The remaining files are marked safe.
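
Because deserializing a pickle can execute arbitrary Python code, a cautious loader can force the safetensors variant added in #1 instead of the `.bin` files. A minimal sketch, assuming a `transformers` version recent enough to support the `use_safetensors` flag:

```python
# Minimal sketch: prefer the `model-*.safetensors` shards over the pickle-based
# `pytorch_model-*.bin` checkpoints.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "VMware/open-llama-0.3T-7B-instruct-dolly-hhrlhf",
    use_safetensors=True,  # raise an error rather than fall back to the pickle files
)
```

Safetensors files store raw tensor data behind a JSON header, so loading them never runs code, which is why the Hub only shows pickle-import warnings on the `.bin` files above.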