mlabonne committed (verified)
Commit 02ded20 · Parent(s): 9fdefed

Update README.md

Files changed (1):
  1. README.md (+2 -2)
README.md CHANGED
@@ -55,7 +55,7 @@ tags:
   </a>
   </center>
 
-# LFM2-1.2B
+# LFM2-700M
 
 LFM2 is a new generation of hybrid models developed by [Liquid AI](https://www.liquid.ai/), specifically designed for edge AI and on-device deployment. It sets a new standard in terms of quality, speed, and memory efficiency.
 
@@ -149,7 +149,7 @@ Here is an example of how to generate an answer with transformers in Python:
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
 # Load model and tokenizer
-model_id = "LiquidAI/LFM2-1.2B"
+model_id = "LiquidAI/LFM2-700M"
 model = AutoModelForCausalLM.from_pretrained(
     model_id,
     device_map="auto",