togethercomputer/evo-1-131k-base
Tags: Text Generation · Transformers · Safetensors · stripedhyena · long context · deep signal processing · hybrid · biology · genomics · custom_code
arXiv: 7 papers
License: apache-2.0
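Because the repository carries the custom_code tag and ships its own StripedHyena implementation (configuration_hyena.py, modeling_hyena.py, tokenizer.py, wired up through the auto map in config.json), loading it through transformers requires trust_remote_code=True. Below is a minimal loading sketch, not the official snippet from the model card; the DNA prompt and generation settings are illustrative placeholders, and it assumes the repo's byte tokenizer follows the standard tokenizer call interface.

```python
# Minimal sketch: loading evo-1-131k-base via transformers.
# trust_remote_code=True pulls in the repo's own StripedHyena code
# referenced from config.json's auto map.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_name = "togethercomputer/evo-1-131k-base"

config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=config,
    trust_remote_code=True,
)

# Evo operates on nucleotide sequences with a byte-level tokenizer; the
# prompt below is an illustrative placeholder, not a meaningful sequence.
inputs = tokenizer("ACGTACGT", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```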
Files and versions

4 contributors · History: 24 commits
Latest commit: Update README.md by Zymrael (4a59285, verified) · 11 months ago
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 12 months ago |
| README.md | 4.1 kB | Update README.md | 11 months ago |
| cache.py | 1.38 kB | init | 12 months ago |
| config.json | 1.73 kB | Fix auto tokenizer import reference format in auto map as list for slow and fast. | 11 months ago |
| configuration_hyena.py | 3.13 kB | init | 12 months ago |
| engine.py | 13.5 kB | init | 12 months ago |
| generation_config.json | 69 Bytes | Upload model | 12 months ago |
| layers.py | 5.39 kB | init | 12 months ago |
| model-00001-of-00003.safetensors | 4.98 GB (LFS) | Upload model | 12 months ago |
| model-00002-of-00003.safetensors | 4.93 GB (LFS) | Upload model | 12 months ago |
| model-00003-of-00003.safetensors | 3 GB (LFS) | Upload model | 12 months ago |
| model.py | 19.4 kB | init | 12 months ago |
| model.safetensors.index.json | 34.9 kB | Upload model | 12 months ago |
| modeling_hyena.py | 5.55 kB | init | 12 months ago |
| positional_embeddings.py | 4.94 kB | init | 12 months ago |
| pytorch_model.pt | 16.8 GB (LFS) | add pt ckpt | 11 months ago |
| special_tokens_map.json | 3 Bytes | Update byte tokenizer to be compatible with auto tokenizer and clean-up. | 11 months ago |
| streamer.py | 3.94 kB | init | 12 months ago |
| tokenizer.py | 4.37 kB | Remove tokenizer.json and replace tokenizer.py with correct version. | 11 months ago |
| tokenizer_config.json | 299 Bytes | Fix auto tokenizer import reference format in auto map as list for slow and fast. | 11 months ago |
| utils.py | 2.87 kB | init | 12 months ago |

Note: pytorch_model.pt is a pickled PyTorch checkpoint; the Hub's scanner detected four pickle imports: torch.BFloat16Storage, collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage.
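Unlike the pickled pytorch_model.pt, the three sharded .safetensors files can be inspected without executing any code on load, since safetensors stores raw tensors behind a JSON header. A small sketch of header-only inspection follows, assuming the first shard has been downloaded locally:

```python
# Sketch: list tensor names and shapes in a safetensors shard without
# deserializing (or unpickling) anything. Reading the header is safe even
# for untrusted files; the path assumes a local copy of the shard.
from safetensors import safe_open

with safe_open("model-00001-of-00003.safetensors", framework="pt") as f:
    for name in f.keys():
        print(name, f.get_slice(name).get_shape())
```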