TPM-28 committed on
Commit 5c7573b · verified · 1 Parent(s): dab850c

fix architecture


```
ValueError: The checkpoint you are trying to load has model type `helium` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```
and
"Overall, our architecture is almost identical to the one introduced by LLaMA 1, allowing an easy and straightforward deployment using existing tools such as MLX, vLLM, ollama or llama.cpp."
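Since Helium is almost identical to the LLaMA architecture, pointing the `architectures` field at `LlamaForCausalLM` lets older Transformers releases (which do not know the `helium` model type) load the checkpoint through the existing Llama code path. A minimal sketch of the config edit this commit makes (the in-memory dict stands in for the repo's `config.json`; the fields shown are the ones visible in the diff below, not the full file):

```python
import json

# Illustrative subset of config.json before the fix.
config = {
    "architectures": ["HeliumForCausalLM"],
    "attention_bias": False,
    "attention_dropout": 0.0,
}

# The fix: swap the unrecognized Helium class for the Llama one,
# which older Transformers versions can instantiate.
config["architectures"] = ["LlamaForCausalLM"]

print(json.dumps(config["architectures"]))
```

With this change, loaders that dispatch on `architectures` resolve to the Llama implementation instead of raising the `ValueError` above.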

Files changed (1)
  1. config.json +1 -1
config.json CHANGED
```diff
@@ -1,6 +1,6 @@
 {
   "architectures": [
-    "HeliumForCausalLM"
+    "LlamaForCausalLM"
   ],
   "attention_bias": false,
   "attention_dropout": 0.0,
```