SmolLM Collection — A series of smol LLMs: 135M, 360M, and 1.7B. We release base and Instruct models as well as the training corpus and some WebGPU demos. 12 items • Updated May 5
Open ASR Leaderboard — View and request speech-model benchmark data.