/home/msyu/workspace/venv/bin/python -m mlc_llm gen_config /tmp/tmpy7oe9l51/repo --quantization q0f16 --conv-template phi-3 --output /tmp/tmpg82xy6xr
[2024-05-08 13:36:57] INFO auto_config.py:115: Found model configuration: /tmp/tmpy7oe9l51/repo/config.json
[2024-05-08 13:36:57] INFO auto_config.py:153: Found model type: phi3. Use `--model-type` to override.
[2024-05-08 13:36:57] INFO phi3_model.py:53: context_window_size not found in config.json. Falling back to max_position_embeddings (4096)
[2024-05-08 13:36:57] INFO phi3_model.py:68: prefill_chunk_size defaults to 2048
[2024-05-08 13:36:57] INFO config.py:106: Overriding max_batch_size from 1 to 80
[2024-05-08 13:36:57] INFO gen_config.py:255: [generation_config.json] Setting bos_token_id: 1
[2024-05-08 13:36:57] INFO gen_config.py:255: [generation_config.json] Setting eos_token_id: [32000, 32001, 32007]
[2024-05-08 13:36:57] INFO gen_config.py:255: [generation_config.json] Setting pad_token_id: 32000
[2024-05-08 13:36:57] INFO gen_config.py:267: Found tokenizer config: /tmp/tmpy7oe9l51/repo/tokenizer.model. Copying to /tmp/tmpg82xy6xr/tokenizer.model
[2024-05-08 13:36:57] INFO gen_config.py:267: Found tokenizer config: /tmp/tmpy7oe9l51/repo/tokenizer.json. Copying to /tmp/tmpg82xy6xr/tokenizer.json
[2024-05-08 13:36:57] INFO gen_config.py:269: Not found tokenizer config: /tmp/tmpy7oe9l51/repo/vocab.json
[2024-05-08 13:36:57] INFO gen_config.py:269: Not found tokenizer config: /tmp/tmpy7oe9l51/repo/merges.txt
[2024-05-08 13:36:57] INFO gen_config.py:267: Found tokenizer config: /tmp/tmpy7oe9l51/repo/added_tokens.json. Copying to /tmp/tmpg82xy6xr/added_tokens.json
[2024-05-08 13:36:57] INFO gen_config.py:267: Found tokenizer config: /tmp/tmpy7oe9l51/repo/tokenizer_config.json. Copying to /tmp/tmpg82xy6xr/tokenizer_config.json
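gen_config probes a fixed list of tokenizer artifacts and copies whichever exist, as the Found/Not found lines above show. A minimal sketch that reproduces the probe (paths from this log; the candidate list is read off the log lines above, not from mlc_llm's source):

```python
import os

# Candidate tokenizer files, in the order the log probes them.
candidates = ["tokenizer.model", "tokenizer.json", "vocab.json",
              "merges.txt", "added_tokens.json", "tokenizer_config.json"]
repo = "/tmp/tmpy7oe9l51/repo"  # source checkout from this log
for name in candidates:
    status = "Found" if os.path.isfile(os.path.join(repo, name)) else "Not found"
    print(f"{status}: {name}")
```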
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting temperature: 0.7
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting presence_penalty: 0.0
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting frequency_penalty: 0.0
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting repetition_penalty: 1.0
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting top_p: 0.95
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting mean_gen_len: 128
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting max_gen_len: 512
[2024-05-08 13:36:57] INFO gen_config.py:80: [System default] Setting shift_fill_factor: 0.3
[2024-05-08 13:36:57] INFO gen_config.py:335: Dumping configuration file to: /tmp/tmpg82xy6xr/mlc-chat-config.json
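The dumped mlc-chat-config.json can be spot-checked against the values reported above. A quick sketch, assuming the usual key names (the exact schema varies across mlc_llm versions, so `cfg.get` is used rather than indexing):

```python
import json

# Path taken from the "Dumping configuration file to" line above.
with open("/tmp/tmpg82xy6xr/mlc-chat-config.json") as f:
    cfg = json.load(f)

# Values expected from this run: phi3, q0f16, 4096, 2048, 0.7, 0.95.
for key in ("model_type", "quantization", "context_window_size",
            "prefill_chunk_size", "temperature", "top_p"):
    print(key, "=", cfg.get(key))
```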
/home/msyu/workspace/venv/bin/python -m mlc_llm convert_weight /tmp/tmpy7oe9l51/repo --quantization q0f16 --source-format auto --output /tmp/tmpg82xy6xr
[2024-05-08 13:36:58] INFO auto_config.py:115: Found model configuration: /tmp/tmpy7oe9l51/repo/config.json
[2024-05-08 13:36:59] INFO auto_device.py:79: Found device: cuda:0
[2024-05-08 13:36:59] INFO auto_device.py:88: Not found device: rocm:0
[2024-05-08 13:37:00] INFO auto_device.py:88: Not found device: metal:0
[2024-05-08 13:37:01] INFO auto_device.py:88: Not found device: vulkan:0
[2024-05-08 13:37:01] INFO auto_device.py:79: Found device: opencl:0
[2024-05-08 13:37:01] INFO auto_device.py:35: Using device: cuda:0
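Device detection probed cuda, rocm, metal, vulkan, and opencl and settled on cuda:0. Autodetection can be skipped by pinning the device; a sketch of the equivalent invocation driven from Python (flag names as in mlc_llm's CLI; paths are the temp dirs from this log):

```python
import subprocess

# Same conversion as above, with the device pinned instead of autodetected.
subprocess.run(
    ["python", "-m", "mlc_llm", "convert_weight",
     "/tmp/tmpy7oe9l51/repo",
     "--quantization", "q0f16",
     "--device", "cuda:0",
     "--output", "/tmp/tmpg82xy6xr"],
    check=True,
)
```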
[2024-05-08 13:37:01] INFO auto_weight.py:70: Finding weights in: /tmp/tmpy7oe9l51/repo
[2024-05-08 13:37:01] INFO auto_weight.py:136: Not found Huggingface PyTorch
[2024-05-08 13:37:01] INFO auto_weight.py:143: Found source weight format: huggingface-safetensor. Source configuration: /tmp/tmpy7oe9l51/repo/model.safetensors.index.json
[2024-05-08 13:37:01] INFO auto_weight.py:106: Using source weight configuration: /tmp/tmpy7oe9l51/repo/model.safetensors.index.json. Use `--source` to override.
[2024-05-08 13:37:01] INFO auto_weight.py:110: Using source weight format: huggingface-safetensor. Use `--source-format` to override.
[2024-05-08 13:37:01] INFO auto_config.py:153: Found model type: phi3. Use `--model-type` to override.
[2024-05-08 13:37:01] INFO phi3_model.py:53: context_window_size not found in config.json. Falling back to max_position_embeddings (4096)
[2024-05-08 13:37:01] INFO phi3_model.py:68: prefill_chunk_size defaults to 2048
Weight conversion with arguments:
  --config          /tmp/tmpy7oe9l51/repo/config.json
  --quantization    NoQuantize(name='q0f16', kind='no-quant', model_dtype='float16')
  --model-type      phi3
  --device          cuda:0
  --source          /tmp/tmpy7oe9l51/repo/model.safetensors.index.json
  --source-format   huggingface-safetensor
  --output          /tmp/tmpg82xy6xr
Start storing to cache /tmp/tmpg82xy6xr
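In the loading log that follows, parameters arrive grouped by shard because the loader walks model.safetensors.index.json rather than the shard files directly, which is also why shard 2 is read before shard 1. A small sketch that reproduces the grouping from the index (`weight_map` is the standard safetensors index field mapping tensor names to shard files):

```python
import json
from collections import defaultdict

# Index file named in the "Using source weight configuration" line above.
with open("/tmp/tmpy7oe9l51/repo/model.safetensors.index.json") as f:
    weight_map = json.load(f)["weight_map"]

shards = defaultdict(list)
for tensor, shard in weight_map.items():
    shards[shard].append(tensor)

for shard, tensors in sorted(shards.items()):
    print(f"{shard}: {len(tensors)} tensors")
```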
[2024-05-08 13:37:03] INFO huggingface_loader.py:184: Loading HF parameters from: /tmp/tmpy7oe9l51/repo/model-00002-of-00002.safetensors
[2024-05-08 13:37:03] INFO huggingface_loader.py:174: [Not quantized] Parameter: "lm_head.weight", shape: (32064, 3072), dtype: float16
[2024-05-08 13:37:04] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.21.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:04] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.21.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:04] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.21.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:04] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.21.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:04] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.21.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:04] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.22.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.22.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.22.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.22.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.22.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.22.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.23.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.23.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:05] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.23.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.23.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.23.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.23.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.24.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.24.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.24.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.24.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.24.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.24.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.25.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:06] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.25.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.25.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.25.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.25.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.25.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.26.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.26.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.26.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.26.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:07] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.26.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.26.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.27.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.27.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.27.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.27.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.27.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.27.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.28.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.28.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:08] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.28.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.28.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.28.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.28.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.29.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.29.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.29.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.29.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.29.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.29.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:09] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.30.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.30.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.30.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.30.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.30.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.30.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.31.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.31.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:10] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.31.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:11] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.31.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:11] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.31.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:11] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.31.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:11] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.norm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:11] INFO huggingface_loader.py:196: Unloading HF weight file: /tmp/tmpy7oe9l51/repo/model-00002-of-00002.safetensors
[2024-05-08 13:37:11] INFO huggingface_loader.py:184: Loading HF parameters from: /tmp/tmpy7oe9l51/repo/model-00001-of-00002.safetensors
[2024-05-08 13:37:11] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.embd.weight", shape: (32064, 3072), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.0.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.0.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.0.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.0.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.0.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.0.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.1.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:12] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.1.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:13] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.1.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:13] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.1.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:13] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.1.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:13] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.1.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:13] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.10.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:13] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.10.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:13] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.10.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.10.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.10.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.10.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.11.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.11.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.11.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.11.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.11.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.11.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.12.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:14] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.12.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.12.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.12.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.12.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.12.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.13.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.13.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.13.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.13.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:15] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.13.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.13.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.14.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.14.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.14.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.14.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.14.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.14.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.15.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:16] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.15.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.15.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.15.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.15.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.15.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.16.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.16.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.16.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.16.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.16.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:17] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.16.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.17.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.17.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.17.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.17.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.17.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.17.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.18.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.18.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:18] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.18.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.18.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.18.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.18.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.19.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.19.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.19.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.19.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.19.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.19.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.2.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:19] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.2.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.2.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.2.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.2.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.2.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.20.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.20.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.20.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.20.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:20] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.20.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.20.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.21.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.3.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.3.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.3.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.3.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.3.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.3.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.4.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:21] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.4.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.4.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.4.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.4.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.4.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.5.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.5.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.5.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.5.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.5.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:22] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.5.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.6.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.6.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.6.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.6.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.6.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.6.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.7.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.7.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:23] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.7.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.7.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.7.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.7.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.8.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.8.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.8.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.8.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.8.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.8.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.9.ln.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:24] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.9.mlp.down_proj.weight", shape: (3072, 8192), dtype: float16
[2024-05-08 13:37:25] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.9.mlp.gate_up_proj.weight", shape: (16384, 3072), dtype: float16
[2024-05-08 13:37:25] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.9.post_attention_layernorm.weight", shape: (3072,), dtype: float16
[2024-05-08 13:37:25] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.9.mixer.out_proj.weight", shape: (3072, 3072), dtype: float16
[2024-05-08 13:37:25] INFO huggingface_loader.py:174: [Not quantized] Parameter: "transformer.h.9.mixer.qkv_proj.weight", shape: (9216, 3072), dtype: float16
100%|ββββββββββ| 195/195 [00:22<00:00, 8.66it/s]
[2024-05-08 13:37:25] INFO huggingface_loader.py:196: Unloading HF weight file: /tmp/tmpy7oe9l51/repo/model-00001-of-00002.safetensors
[2024-05-08 13:37:25] INFO stats.py:76: Time usage: HF loading: 1.450 sec; Pre-quantization mapping: 5.723 sec; Quantization: 0.000 sec
[2024-05-08 13:37:25] INFO stats.py:90: RAM usage: Peak RAM: 9.262 GB. Total bytes loaded from disk: 14.235 GB
[2024-05-08 13:37:25] INFO convert_weight.py:155: Parameter size after quantization: 7.117 GB
[2024-05-08 13:37:25] INFO convert_weight.py:160: Total parameters: 3,821,079,552
[2024-05-08 13:37:25] INFO convert_weight.py:161: Bits per parameter: 16.000
[2024-05-08 13:37:25] INFO convert_weight.py:166: Saved to directory: /tmp/tmpg82xy6xr
All finished, 130 total shards committed, record saved to /tmp/tmpg82xy6xr/ndarray-cache.json
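As a sanity check, the final stats are self-consistent: q0f16 stores every parameter as float16 (2 bytes), so the reported parameter count reproduces both the 7.117 GB size (the log's "GB" is evidently 2^30 bytes) and the 16.000 bits per parameter:

```python
# q0f16 = no quantization, float16 storage: 2 bytes per parameter.
total_params = 3_821_079_552                       # from convert_weight's report
size_bytes = total_params * 2
print(f"{size_bytes / 2**30:.3f} GB")              # -> 7.117, matching the log
print(f"{size_bytes * 8 / total_params:.3f} bits") # -> 16.000 bits per parameter
```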