yentinglin/Taiwan-LLaMa-v1.0
Pipeline: Text Generation
Datasets: yentinglin/zh_TW_c4, yentinglin/traditional_mandarin_instructions
Language: Chinese
Tags: Transformers, PyTorch, llama, conversational, text-generation-inference, Inference Endpoints
Paper: arXiv:2311.17487
License: llama2
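Given the Text Generation pipeline and Transformers/PyTorch tags above, a minimal loading sketch might look like the following. The model ID is taken from this page; the dtype, device placement, and prompt are illustrative assumptions, and the prompt template documented in the model card should take precedence.

```python
# Minimal sketch: loading yentinglin/Taiwan-LLaMa-v1.0 with transformers.
# Assumptions: fp16 and automatic device placement (requires the accelerate
# package) to fit the roughly 26 GB of full-precision shards listed below;
# the prompt is a placeholder, not the model card's template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yentinglin/Taiwan-LLaMa-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: fp16 halves memory vs. fp32
    device_map="auto",          # assumption: spread shards across devices
)

prompt = "What is the capital of Taiwan?"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```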
Files and versions
Taiwan-LLaMa-v1.0 · 1 contributor · History: 15 commits

Latest commit e9f04d1, about 1 year ago, by Luigi: "Adds /home/user/.cache/huggingface/hub/models--yentinglin--Taiwan-LLaMa-v1.0/snapshots/55d346ffb1ae7796bcb30c7562d4d10c8ce33463/model-00003-of-00003.safetensors"
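The path in that commit message follows the standard huggingface_hub cache layout, models--{owner}--{repo}/snapshots/{commit}/... . A hedged sketch of populating that cache programmatically; the allow_patterns filter is an illustrative choice, not something this repo prescribes:

```python
# Sketch: downloading this repo into the local Hugging Face cache.
# snapshot_download returns the snapshot directory, i.e. the same
# ~/.cache/huggingface/hub/models--yentinglin--Taiwan-LLaMa-v1.0/snapshots/<sha>
# layout seen in the commit message above.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="yentinglin/Taiwan-LLaMa-v1.0",
    # Assumption: fetch only safetensors weights, configs, and tokenizer
    # files, skipping the ~26 GB of duplicate pickle .bin shards.
    allow_patterns=["*.safetensors", "*.json", "tokenizer.model"],
)
print(local_dir)
```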
| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 1.52 kB | | initial commit | over 1 year ago |
| README.md | 13 kB | | Update README.md | about 1 year ago |
| config.json | 637 Bytes | | Upload LlamaForCausalLM | over 1 year ago |
| generation_config.json | 192 Bytes | | Upload LlamaForCausalLM | over 1 year ago |
| model-00003-of-00003.safetensors | 6.18 GB | LFS | Adds /home/user/.cache/huggingface/hub/models--yentinglin--Taiwan-LLaMa-v1.0/snapshots/55d346ffb1ae7796bcb30c7562d4d10c8ce33463/model-00003-of-00003.safetensors | about 1 year ago |
| model.safetensors.index.json | 35.1 kB | | Adding `safetensors` variant of this model | about 1 year ago |
| pytorch_model-00001-of-00003.bin | 9.95 GB | LFS | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model-00002-of-00003.bin | 9.9 GB | LFS | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model-00003-of-00003.bin | 6.18 GB | LFS | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model.bin.index.json | 33.4 kB | | Upload LlamaForCausalLM | over 1 year ago |
| special_tokens_map.json | 438 Bytes | | Upload tokenizer | over 1 year ago |
| tokenizer.json | 1.84 MB | | Upload tokenizer | over 1 year ago |
| tokenizer.model | 500 kB | LFS | Upload tokenizer | over 1 year ago |
| tokenizer_config.json | 749 Bytes | | Upload tokenizer | over 1 year ago |

Each of the three pytorch_model-*.bin shards is a pickle file; the scanner reports the same four pickle imports for each: torch.BFloat16Storage, torch.FloatStorage, collections.OrderedDict, and torch._utils._rebuild_tensor_v2.
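Those four pickle imports are the standard ones a torch.save() state dict needs, but pickle can in general execute arbitrary code on load, which is why a safetensors variant of the weights was added. A sketch of the more defensive loading paths; the weights_only flag is available in recent PyTorch releases (1.13+), and the file names come from the listing above:

```python
# Sketch: two safer ways to read these weights without trusting full pickle.
import torch
from safetensors.torch import load_file

# 1) Restrict torch.load to tensor/container types (PyTorch >= 1.13).
shard = torch.load("pytorch_model-00003-of-00003.bin",
                   map_location="cpu", weights_only=True)

# 2) Prefer the safetensors shard, which contains no executable code at all.
#    transformers users can instead pass use_safetensors=True to from_pretrained.
shard_st = load_file("model-00003-of-00003.safetensors", device="cpu")

print(len(shard), len(shard_st))  # number of tensors in each shard
```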
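Both index files, model.safetensors.index.json and pytorch_model.bin.index.json, describe how the checkpoint is split across shards. Assuming the usual transformers index schema (a metadata.total_size field plus a weight_map from tensor name to shard file), a small inspection sketch:

```python
# Sketch: inspecting the shard index after downloading it.
# Assumes the standard transformers schema:
#   {"metadata": {"total_size": ...}, "weight_map": {...}}.
import json

with open("model.safetensors.index.json") as f:
    index = json.load(f)

print(index["metadata"]["total_size"])   # total checkpoint size in bytes
weight_map = index["weight_map"]         # tensor name -> shard file
print(sorted(set(weight_map.values()))) # the shard files referenced
```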