
Qwen3-30B-A3B-Q3_K_S.gguf - GGUF Internal File Dump

  • Endianness: little-endian

Key-Value Metadata Store

There are 45 key-value pairs in this file (the 3 GGUF header fields plus the 42 metadata keys reported by GGUF.kv_count)

POS TYPE Count Key Value
1 UINT32 1 GGUF.version 3
2 UINT64 1 GGUF.tensor_count 555
3 UINT64 1 GGUF.kv_count 42
4 STRING 1 general.architecture qwen3moe
5 STRING 1 general.type model
6 STRING 1 general.name Qwen3 30B A3B
7 STRING 1 general.basename Qwen3
8 STRING 1 general.size_label 30B-A3B
9 STRING 1 general.license apache-2.0
10 STRING 1 general.license.link https://huggingface.co/Qwen/Qwen3-30B-A3B/blob/main/LICENSE
11 UINT32 1 general.base_model.count 1
12 STRING 1 general.base_model.0.name Qwen3 30B A3B Base
13 STRING 1 general.base_model.0.organization Qwen
14 STRING 1 general.base_model.0.repo_url https://huggingface.co/Qwen/Qwen3-30B-A3B-Base
15 [STRING] 1 general.tags [ text-generation ]
16 UINT32 1 qwen3moe.context_length 40960
17 UINT32 1 qwen3moe.embedding_length 2048
18 UINT32 1 qwen3moe.feed_forward_length 6144
19 UINT32 1 qwen3moe.attention.head_count 32
20 UINT32 1 qwen3moe.attention.head_count_kv 4
21 FLOAT32 1 qwen3moe.rope.freq_base 1000000.0
22 FLOAT32 1 qwen3moe.attention.layer_norm_rms_epsilon 1e-06
23 UINT32 1 qwen3moe.expert_used_count 8
24 UINT32 1 qwen3moe.attention.key_length 128
25 UINT32 1 qwen3moe.attention.value_length 128
26 UINT32 1 qwen3moe.expert_count 128
27 UINT32 1 qwen3moe.expert_feed_forward_length 768
28 STRING 1 tokenizer.ggml.model gpt2
29 STRING 1 tokenizer.ggml.pre qwen2
30 [STRING] 151936 tokenizer.ggml.tokens [ !, ", #, $, %, ... ]
31 [INT32] 151936 tokenizer.ggml.token_type [ 1, 1, 1, 1, 1, 1, 1, ... ]
32 [STRING] 151387 tokenizer.ggml.merges [ Ġ Ġ, ĠĠ ĠĠ, i n, Ġ t, ĠĠĠĠ ĠĠĠĠ, ... ]
33 UINT32 1 tokenizer.ggml.eos_token_id 151645
34 UINT32 1 tokenizer.ggml.padding_token_id 151643
35 UINT32 1 tokenizer.ggml.bos_token_id 151643
36 BOOL 1 tokenizer.ggml.add_bos_token False
37 STRING 1 tokenizer.chat_template `{%- if tools %}{{- '<
38 UINT32 1 general.quantization_version 2
39 UINT32 1 general.file_type 11
40 BOOL 1 general.pruned True
41 UINT32 1 qwen3moe.block_count 46
42 STRING 1 quantize.imatrix.file ./imatrix/imatrix-Qwen3-30B-A3B-medium.dat
43 STRING 1 quantize.imatrix.dataset ../../datasets/imatrix/combined_all_medium.txt
44 INT32 1 quantize.imatrix.entries_count 385
45 INT32 1 quantize.imatrix.chunks_count 6946
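The three GGUF.* entries at the top of this table come from the file's fixed-size header, which precedes the key-value store. A minimal sketch of reading it (field layout per the GGUF v3 format: 4-byte magic, then little-endian uint32 version, uint64 tensor count, uint64 KV count):

```python
import struct

def parse_gguf_header(data: bytes) -> dict:
    """Parse the fixed-size GGUF header: magic, version, tensor_count, kv_count."""
    magic, version, tensor_count, kv_count = struct.unpack_from("<4sIQQ", data, 0)
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    return {"version": version, "tensor_count": tensor_count, "kv_count": kv_count}

# Synthetic header bytes matching the values this dump reports
header = struct.pack("<4sIQQ", b"GGUF", 3, 555, 42)
print(parse_gguf_header(header))
```

In a real file, the variable-length key-value pairs listed above follow immediately after these 28 header bytes.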

Tensors Overview ~29B Elements

Total number of elements across all tensors: 29285881344

Tensor Data Offset

This table lists each tensor's data offset and size in bytes, relative to the start of the file

T_ID Tensor Layer Name Data Offset (B) Data Size (B)
0 output.weight 0x5b12e0 0x7f82800
1 output_norm.weight 0x8533ae0 0x2000
2 token_embd.weight 0x8535ae0 0x615f000
3 blk.0.attn_k.weight 0xe694ae0 0x54000
4 blk.0.attn_k_norm.weight 0xe6e8ae0 0x200
5 blk.0.attn_norm.weight 0xe6e8ce0 0x2000
6 blk.0.attn_output.weight 0xe6eace0 0x370000
7 blk.0.attn_q.weight 0xea5ace0 0x2a0000
8 blk.0.attn_q_norm.weight 0xecface0 0x200
9 blk.0.attn_v.weight 0xecfaee0 0x6e000
10 blk.0.ffn_down_exps.weight 0xed68ee0 0x6c00000
11 blk.0.ffn_gate_exps.weight 0x15968ee0 0x3f00000
12 blk.0.ffn_gate_inp.weight 0x19868ee0 0x100000
13 blk.0.ffn_norm.weight 0x19968ee0 0x2000
14 blk.0.ffn_up_exps.weight 0x1996aee0 0x3f00000
15 blk.1.attn_k.weight 0x1d86aee0 0x54000
16 blk.1.attn_k_norm.weight 0x1d8beee0 0x200
17 blk.1.attn_norm.weight 0x1d8bf0e0 0x2000
18 blk.1.attn_output.weight 0x1d8c10e0 0x370000
19 blk.1.attn_q.weight 0x1dc310e0 0x2a0000
20 blk.1.attn_q_norm.weight 0x1ded10e0 0x200
21 blk.1.attn_v.weight 0x1ded12e0 0x6e000
22 blk.1.ffn_down_exps.weight 0x1df3f2e0 0x6c00000
23 blk.1.ffn_gate_exps.weight 0x24b3f2e0 0x3f00000
24 blk.1.ffn_gate_inp.weight 0x28a3f2e0 0x100000
25 blk.1.ffn_norm.weight 0x28b3f2e0 0x2000
26 blk.1.ffn_up_exps.weight 0x28b412e0 0x3f00000
27 blk.2.attn_k.weight 0x2ca412e0 0x54000
28 blk.2.attn_k_norm.weight 0x2ca952e0 0x200
29 blk.2.attn_norm.weight 0x2ca954e0 0x2000
30 blk.2.attn_output.weight 0x2ca974e0 0x370000
31 blk.2.attn_q.weight 0x2ce074e0 0x2a0000
32 blk.2.attn_q_norm.weight 0x2d0a74e0 0x200
33 blk.2.attn_v.weight 0x2d0a76e0 0x6e000
34 blk.2.ffn_down_exps.weight 0x2d1156e0 0x6c00000
35 blk.2.ffn_gate_exps.weight 0x33d156e0 0x3f00000
36 blk.2.ffn_gate_inp.weight 0x37c156e0 0x100000
37 blk.2.ffn_norm.weight 0x37d156e0 0x2000
38 blk.2.ffn_up_exps.weight 0x37d176e0 0x3f00000
39 blk.3.attn_k.weight 0x3bc176e0 0x54000
40 blk.3.attn_k_norm.weight 0x3bc6b6e0 0x200
41 blk.3.attn_norm.weight 0x3bc6b8e0 0x2000
42 blk.3.attn_output.weight 0x3bc6d8e0 0x370000
43 blk.3.attn_q.weight 0x3bfdd8e0 0x2a0000
44 blk.3.attn_q_norm.weight 0x3c27d8e0 0x200
45 blk.3.attn_v.weight 0x3c27dae0 0x6e000
46 blk.3.ffn_down_exps.weight 0x3c2ebae0 0x6c00000
47 blk.3.ffn_gate_exps.weight 0x42eebae0 0x3f00000
48 blk.3.ffn_gate_inp.weight 0x46debae0 0x100000
49 blk.3.ffn_norm.weight 0x46eebae0 0x2000
50 blk.3.ffn_up_exps.weight 0x46eedae0 0x3f00000
51 blk.4.attn_k.weight 0x4adedae0 0x54000
52 blk.4.attn_k_norm.weight 0x4ae41ae0 0x200
53 blk.4.attn_norm.weight 0x4ae41ce0 0x2000
54 blk.4.attn_output.weight 0x4ae43ce0 0x370000
55 blk.4.attn_q.weight 0x4b1b3ce0 0x2a0000
56 blk.4.attn_q_norm.weight 0x4b453ce0 0x200
57 blk.4.attn_v.weight 0x4b453ee0 0x6e000
58 blk.4.ffn_down_exps.weight 0x4b4c1ee0 0x6c00000
59 blk.4.ffn_gate_exps.weight 0x520c1ee0 0x3f00000
60 blk.4.ffn_gate_inp.weight 0x55fc1ee0 0x100000
61 blk.4.ffn_norm.weight 0x560c1ee0 0x2000
62 blk.4.ffn_up_exps.weight 0x560c3ee0 0x3f00000
63 blk.5.attn_k.weight 0x59fc3ee0 0x54000
64 blk.5.attn_k_norm.weight 0x5a017ee0 0x200
65 blk.5.attn_norm.weight 0x5a0180e0 0x2000
66 blk.5.attn_output.weight 0x5a01a0e0 0x370000
67 blk.5.attn_q.weight 0x5a38a0e0 0x2a0000
68 blk.5.attn_q_norm.weight 0x5a62a0e0 0x200
69 blk.5.attn_v.weight 0x5a62a2e0 0x6e000
70 blk.5.ffn_down_exps.weight 0x5a6982e0 0x6c00000
71 blk.5.ffn_gate_exps.weight 0x612982e0 0x3f00000
72 blk.5.ffn_gate_inp.weight 0x651982e0 0x100000
73 blk.5.ffn_norm.weight 0x652982e0 0x2000
74 blk.5.ffn_up_exps.weight 0x6529a2e0 0x3f00000
75 blk.6.attn_k.weight 0x6919a2e0 0x54000
76 blk.6.attn_k_norm.weight 0x691ee2e0 0x200
77 blk.6.attn_norm.weight 0x691ee4e0 0x2000
78 blk.6.attn_output.weight 0x691f04e0 0x370000
79 blk.6.attn_q.weight 0x695604e0 0x2a0000
80 blk.6.attn_q_norm.weight 0x698004e0 0x200
81 blk.6.attn_v.weight 0x698006e0 0x6e000
82 blk.6.ffn_down_exps.weight 0x6986e6e0 0x6c00000
83 blk.6.ffn_gate_exps.weight 0x7046e6e0 0x3f00000
84 blk.6.ffn_gate_inp.weight 0x7436e6e0 0x100000
85 blk.6.ffn_norm.weight 0x7446e6e0 0x2000
86 blk.6.ffn_up_exps.weight 0x744706e0 0x3f00000
87 blk.7.attn_k.weight 0x783706e0 0x54000
88 blk.7.attn_k_norm.weight 0x783c46e0 0x200
89 blk.7.attn_norm.weight 0x783c48e0 0x2000
90 blk.7.attn_output.weight 0x783c68e0 0x370000
91 blk.7.attn_q.weight 0x787368e0 0x2a0000
92 blk.7.attn_q_norm.weight 0x789d68e0 0x200
93 blk.7.attn_v.weight 0x789d6ae0 0x6e000
94 blk.7.ffn_down_exps.weight 0x78a44ae0 0x6c00000
95 blk.7.ffn_gate_exps.weight 0x7f644ae0 0x3f00000
96 blk.7.ffn_gate_inp.weight 0x83544ae0 0x100000
97 blk.7.ffn_norm.weight 0x83644ae0 0x2000
98 blk.7.ffn_up_exps.weight 0x83646ae0 0x3f00000
99 blk.8.attn_k.weight 0x87546ae0 0x54000
100 blk.8.attn_k_norm.weight 0x8759aae0 0x200
101 blk.8.attn_norm.weight 0x8759ace0 0x2000
102 blk.8.attn_output.weight 0x8759cce0 0x370000
103 blk.8.attn_q.weight 0x8790cce0 0x2a0000
104 blk.8.attn_q_norm.weight 0x87bacce0 0x200
105 blk.8.attn_v.weight 0x87bacee0 0x6e000
106 blk.8.ffn_down_exps.weight 0x87c1aee0 0x6c00000
107 blk.8.ffn_gate_exps.weight 0x8e81aee0 0x3f00000
108 blk.8.ffn_gate_inp.weight 0x9271aee0 0x100000
109 blk.8.ffn_norm.weight 0x9281aee0 0x2000
110 blk.8.ffn_up_exps.weight 0x9281cee0 0x3f00000
111 blk.9.attn_k.weight 0x9671cee0 0x54000
112 blk.9.attn_k_norm.weight 0x96770ee0 0x200
113 blk.9.attn_norm.weight 0x967710e0 0x2000
114 blk.9.attn_output.weight 0x967730e0 0x370000
115 blk.9.attn_q.weight 0x96ae30e0 0x2a0000
116 blk.9.attn_q_norm.weight 0x96d830e0 0x200
117 blk.9.attn_v.weight 0x96d832e0 0x6e000
118 blk.9.ffn_down_exps.weight 0x96df12e0 0x6c00000
119 blk.9.ffn_gate_exps.weight 0x9d9f12e0 0x3f00000
120 blk.9.ffn_gate_inp.weight 0xa18f12e0 0x100000
121 blk.9.ffn_norm.weight 0xa19f12e0 0x2000
122 blk.9.ffn_up_exps.weight 0xa19f32e0 0x3f00000
123 blk.10.attn_k.weight 0xa58f32e0 0x54000
124 blk.10.attn_k_norm.weight 0xa59472e0 0x200
125 blk.10.attn_norm.weight 0xa59474e0 0x2000
126 blk.10.attn_output.weight 0xa59494e0 0x370000
127 blk.10.attn_q.weight 0xa5cb94e0 0x2a0000
128 blk.10.attn_q_norm.weight 0xa5f594e0 0x200
129 blk.10.attn_v.weight 0xa5f596e0 0x6e000
130 blk.10.ffn_down_exps.weight 0xa5fc76e0 0x6c00000
131 blk.10.ffn_gate_exps.weight 0xacbc76e0 0x3f00000
132 blk.10.ffn_gate_inp.weight 0xb0ac76e0 0x100000
133 blk.10.ffn_norm.weight 0xb0bc76e0 0x2000
134 blk.10.ffn_up_exps.weight 0xb0bc96e0 0x3f00000
135 blk.11.attn_k.weight 0xb4ac96e0 0x54000
136 blk.11.attn_k_norm.weight 0xb4b1d6e0 0x200
137 blk.11.attn_norm.weight 0xb4b1d8e0 0x2000
138 blk.11.attn_output.weight 0xb4b1f8e0 0x370000
139 blk.11.attn_q.weight 0xb4e8f8e0 0x2a0000
140 blk.11.attn_q_norm.weight 0xb512f8e0 0x200
141 blk.11.attn_v.weight 0xb512fae0 0x6e000
142 blk.11.ffn_down_exps.weight 0xb519dae0 0x6c00000
143 blk.11.ffn_gate_exps.weight 0xbbd9dae0 0x3f00000
144 blk.11.ffn_gate_inp.weight 0xbfc9dae0 0x100000
145 blk.11.ffn_norm.weight 0xbfd9dae0 0x2000
146 blk.11.ffn_up_exps.weight 0xbfd9fae0 0x3f00000
147 blk.12.attn_k.weight 0xc3c9fae0 0x54000
148 blk.12.attn_k_norm.weight 0xc3cf3ae0 0x200
149 blk.12.attn_norm.weight 0xc3cf3ce0 0x2000
150 blk.12.attn_output.weight 0xc3cf5ce0 0x370000
151 blk.12.attn_q.weight 0xc4065ce0 0x2a0000
152 blk.12.attn_q_norm.weight 0xc4305ce0 0x200
153 blk.12.attn_v.weight 0xc4305ee0 0x6e000
154 blk.12.ffn_down_exps.weight 0xc4373ee0 0x6c00000
155 blk.12.ffn_gate_exps.weight 0xcaf73ee0 0x3f00000
156 blk.12.ffn_gate_inp.weight 0xcee73ee0 0x100000
157 blk.12.ffn_norm.weight 0xcef73ee0 0x2000
158 blk.12.ffn_up_exps.weight 0xcef75ee0 0x3f00000
159 blk.13.attn_k.weight 0xd2e75ee0 0x54000
160 blk.13.attn_k_norm.weight 0xd2ec9ee0 0x200
161 blk.13.attn_norm.weight 0xd2eca0e0 0x2000
162 blk.13.attn_output.weight 0xd2ecc0e0 0x370000
163 blk.13.attn_q.weight 0xd323c0e0 0x2a0000
164 blk.13.attn_q_norm.weight 0xd34dc0e0 0x200
165 blk.13.attn_v.weight 0xd34dc2e0 0x6e000
166 blk.13.ffn_down_exps.weight 0xd354a2e0 0x6c00000
167 blk.13.ffn_gate_exps.weight 0xda14a2e0 0x3f00000
168 blk.13.ffn_gate_inp.weight 0xde04a2e0 0x100000
169 blk.13.ffn_norm.weight 0xde14a2e0 0x2000
170 blk.13.ffn_up_exps.weight 0xde14c2e0 0x3f00000
171 blk.14.attn_k.weight 0xe204c2e0 0x54000
172 blk.14.attn_k_norm.weight 0xe20a02e0 0x200
173 blk.14.attn_norm.weight 0xe20a04e0 0x2000
174 blk.14.attn_output.weight 0xe20a24e0 0x370000
175 blk.14.attn_q.weight 0xe24124e0 0x2a0000
176 blk.14.attn_q_norm.weight 0xe26b24e0 0x200
177 blk.14.attn_v.weight 0xe26b26e0 0x6e000
178 blk.14.ffn_down_exps.weight 0xe27206e0 0x6c00000
179 blk.14.ffn_gate_exps.weight 0xe93206e0 0x3f00000
180 blk.14.ffn_gate_inp.weight 0xed2206e0 0x100000
181 blk.14.ffn_norm.weight 0xed3206e0 0x2000
182 blk.14.ffn_up_exps.weight 0xed3226e0 0x3f00000
183 blk.15.attn_k.weight 0xf12226e0 0x54000
184 blk.15.attn_k_norm.weight 0xf12766e0 0x200
185 blk.15.attn_norm.weight 0xf12768e0 0x2000
186 blk.15.attn_output.weight 0xf12788e0 0x370000
187 blk.15.attn_q.weight 0xf15e88e0 0x2a0000
188 blk.15.attn_q_norm.weight 0xf18888e0 0x200
189 blk.15.attn_v.weight 0xf1888ae0 0x6e000
190 blk.15.ffn_down_exps.weight 0xf18f6ae0 0x6c00000
191 blk.15.ffn_gate_exps.weight 0xf84f6ae0 0x3f00000
192 blk.15.ffn_gate_inp.weight 0xfc3f6ae0 0x100000
193 blk.15.ffn_norm.weight 0xfc4f6ae0 0x2000
194 blk.15.ffn_up_exps.weight 0xfc4f8ae0 0x3f00000
195 blk.16.attn_k.weight 0x1003f8ae0 0x54000
196 blk.16.attn_k_norm.weight 0x10044cae0 0x200
197 blk.16.attn_norm.weight 0x10044cce0 0x2000
198 blk.16.attn_output.weight 0x10044ece0 0x370000
199 blk.16.attn_q.weight 0x1007bece0 0x2a0000
200 blk.16.attn_q_norm.weight 0x100a5ece0 0x200
201 blk.16.attn_v.weight 0x100a5eee0 0x6e000
202 blk.16.ffn_down_exps.weight 0x100accee0 0x6c00000
203 blk.16.ffn_gate_exps.weight 0x1076ccee0 0x3f00000
204 blk.16.ffn_gate_inp.weight 0x10b5ccee0 0x100000
205 blk.16.ffn_norm.weight 0x10b6ccee0 0x2000
206 blk.16.ffn_up_exps.weight 0x10b6ceee0 0x3f00000
207 blk.17.attn_k.weight 0x10f5ceee0 0x54000
208 blk.17.attn_k_norm.weight 0x10f622ee0 0x200
209 blk.17.attn_norm.weight 0x10f6230e0 0x2000
210 blk.17.attn_output.weight 0x10f6250e0 0x370000
211 blk.17.attn_q.weight 0x10f9950e0 0x2a0000
212 blk.17.attn_q_norm.weight 0x10fc350e0 0x200
213 blk.17.attn_v.weight 0x10fc352e0 0x6e000
214 blk.17.ffn_down_exps.weight 0x10fca32e0 0x6c00000
215 blk.17.ffn_gate_exps.weight 0x1168a32e0 0x3f00000
216 blk.17.ffn_gate_inp.weight 0x11a7a32e0 0x100000
217 blk.17.ffn_norm.weight 0x11a8a32e0 0x2000
218 blk.17.ffn_up_exps.weight 0x11a8a52e0 0x3f00000
219 blk.18.attn_k.weight 0x11e7a52e0 0x54000
220 blk.18.attn_k_norm.weight 0x11e7f92e0 0x200
221 blk.18.attn_norm.weight 0x11e7f94e0 0x2000
222 blk.18.attn_output.weight 0x11e7fb4e0 0x370000
223 blk.18.attn_q.weight 0x11eb6b4e0 0x2a0000
224 blk.18.attn_q_norm.weight 0x11ee0b4e0 0x200
225 blk.18.attn_v.weight 0x11ee0b6e0 0x6e000
226 blk.18.ffn_down_exps.weight 0x11ee796e0 0x6c00000
227 blk.18.ffn_gate_exps.weight 0x125a796e0 0x5280000
228 blk.18.ffn_gate_inp.weight 0x12acf96e0 0x100000
229 blk.18.ffn_norm.weight 0x12adf96e0 0x2000
230 blk.18.ffn_up_exps.weight 0x12adfb6e0 0x5280000
231 blk.19.attn_k.weight 0x13007b6e0 0x54000
232 blk.19.attn_k_norm.weight 0x1300cf6e0 0x200
233 blk.19.attn_norm.weight 0x1300cf8e0 0x2000
234 blk.19.attn_output.weight 0x1300d18e0 0x370000
235 blk.19.attn_q.weight 0x1304418e0 0x2a0000
236 blk.19.attn_q_norm.weight 0x1306e18e0 0x200
237 blk.19.attn_v.weight 0x1306e1ae0 0x6e000
238 blk.19.ffn_down_exps.weight 0x13074fae0 0x6c00000
239 blk.19.ffn_gate_exps.weight 0x13734fae0 0x3f00000
240 blk.19.ffn_gate_inp.weight 0x13b24fae0 0x100000
241 blk.19.ffn_norm.weight 0x13b34fae0 0x2000
242 blk.19.ffn_up_exps.weight 0x13b351ae0 0x3f00000
243 blk.20.attn_k.weight 0x13f251ae0 0x54000
244 blk.20.attn_k_norm.weight 0x13f2a5ae0 0x200
245 blk.20.attn_norm.weight 0x13f2a5ce0 0x2000
246 blk.20.attn_output.weight 0x13f2a7ce0 0x370000
247 blk.20.attn_q.weight 0x13f617ce0 0x2a0000
248 blk.20.attn_q_norm.weight 0x13f8b7ce0 0x200
249 blk.20.attn_v.weight 0x13f8b7ee0 0x6e000
250 blk.20.ffn_down_exps.weight 0x13f925ee0 0x6c00000
251 blk.20.ffn_gate_exps.weight 0x146525ee0 0x3f00000
252 blk.20.ffn_gate_inp.weight 0x14a425ee0 0x100000
253 blk.20.ffn_norm.weight 0x14a525ee0 0x2000
254 blk.20.ffn_up_exps.weight 0x14a527ee0 0x3f00000
255 blk.21.attn_k.weight 0x14e427ee0 0x54000
256 blk.21.attn_k_norm.weight 0x14e47bee0 0x200
257 blk.21.attn_norm.weight 0x14e47c0e0 0x2000
258 blk.21.attn_output.weight 0x14e47e0e0 0x370000
259 blk.21.attn_q.weight 0x14e7ee0e0 0x2a0000
260 blk.21.attn_q_norm.weight 0x14ea8e0e0 0x200
261 blk.21.attn_v.weight 0x14ea8e2e0 0x6e000
262 blk.21.ffn_down_exps.weight 0x14eafc2e0 0x6c00000
263 blk.21.ffn_gate_exps.weight 0x1556fc2e0 0x3f00000
264 blk.21.ffn_gate_inp.weight 0x1595fc2e0 0x100000
265 blk.21.ffn_norm.weight 0x1596fc2e0 0x2000
266 blk.21.ffn_up_exps.weight 0x1596fe2e0 0x3f00000
267 blk.22.attn_k.weight 0x15d5fe2e0 0x54000
268 blk.22.attn_k_norm.weight 0x15d6522e0 0x200
269 blk.22.attn_norm.weight 0x15d6524e0 0x2000
270 blk.22.attn_output.weight 0x15d6544e0 0x370000
271 blk.22.attn_q.weight 0x15d9c44e0 0x2a0000
272 blk.22.attn_q_norm.weight 0x15dc644e0 0x200
273 blk.22.attn_v.weight 0x15dc646e0 0x6e000
274 blk.22.ffn_down_exps.weight 0x15dcd26e0 0x6c00000
275 blk.22.ffn_gate_exps.weight 0x1648d26e0 0x3f00000
276 blk.22.ffn_gate_inp.weight 0x1687d26e0 0x100000
277 blk.22.ffn_norm.weight 0x1688d26e0 0x2000
278 blk.22.ffn_up_exps.weight 0x1688d46e0 0x3f00000
279 blk.23.attn_k.weight 0x16c7d46e0 0x54000
280 blk.23.attn_k_norm.weight 0x16c8286e0 0x200
281 blk.23.attn_norm.weight 0x16c8288e0 0x2000
282 blk.23.attn_output.weight 0x16c82a8e0 0x370000
283 blk.23.attn_q.weight 0x16cb9a8e0 0x2a0000
284 blk.23.attn_q_norm.weight 0x16ce3a8e0 0x200
285 blk.23.attn_v.weight 0x16ce3aae0 0x6e000
286 blk.23.ffn_down_exps.weight 0x16cea8ae0 0x6c00000
287 blk.23.ffn_gate_exps.weight 0x173aa8ae0 0x3f00000
288 blk.23.ffn_gate_inp.weight 0x1779a8ae0 0x100000
289 blk.23.ffn_norm.weight 0x177aa8ae0 0x2000
290 blk.23.ffn_up_exps.weight 0x177aaaae0 0x3f00000
291 blk.24.attn_k.weight 0x17b9aaae0 0x6e000
292 blk.24.attn_k_norm.weight 0x17ba18ae0 0x200
293 blk.24.attn_norm.weight 0x17ba18ce0 0x2000
294 blk.24.attn_output.weight 0x17ba1ace0 0x370000
295 blk.24.attn_q.weight 0x17bd8ace0 0x370000
296 blk.24.attn_q_norm.weight 0x17c0face0 0x200
297 blk.24.attn_v.weight 0x17c0faee0 0x6e000
298 blk.24.ffn_down_exps.weight 0x17c168ee0 0x6c00000
299 blk.24.ffn_gate_exps.weight 0x182d68ee0 0x3f00000
300 blk.24.ffn_gate_inp.weight 0x186c68ee0 0x100000
301 blk.24.ffn_norm.weight 0x186d68ee0 0x2000
302 blk.24.ffn_up_exps.weight 0x186d6aee0 0x3f00000
303 blk.25.attn_k.weight 0x18ac6aee0 0x6e000
304 blk.25.attn_k_norm.weight 0x18acd8ee0 0x200
305 blk.25.attn_norm.weight 0x18acd90e0 0x2000
306 blk.25.attn_output.weight 0x18acdb0e0 0x370000
307 blk.25.attn_q.weight 0x18b04b0e0 0x370000
308 blk.25.attn_q_norm.weight 0x18b3bb0e0 0x200
309 blk.25.attn_v.weight 0x18b3bb2e0 0x6e000
310 blk.25.ffn_down_exps.weight 0x18b4292e0 0x6c00000
311 blk.25.ffn_gate_exps.weight 0x1920292e0 0x5280000
312 blk.25.ffn_gate_inp.weight 0x1972a92e0 0x100000
313 blk.25.ffn_norm.weight 0x1973a92e0 0x2000
314 blk.25.ffn_up_exps.weight 0x1973ab2e0 0x5280000
315 blk.26.attn_k.weight 0x19c62b2e0 0x6e000
316 blk.26.attn_k_norm.weight 0x19c6992e0 0x200
317 blk.26.attn_norm.weight 0x19c6994e0 0x2000
318 blk.26.attn_output.weight 0x19c69b4e0 0x370000
319 blk.26.attn_q.weight 0x19ca0b4e0 0x370000
320 blk.26.attn_q_norm.weight 0x19cd7b4e0 0x200
321 blk.26.attn_v.weight 0x19cd7b6e0 0x6e000
322 blk.26.ffn_down_exps.weight 0x19cde96e0 0x6c00000
323 blk.26.ffn_gate_exps.weight 0x1a39e96e0 0x5280000
324 blk.26.ffn_gate_inp.weight 0x1a8c696e0 0x100000
325 blk.26.ffn_norm.weight 0x1a8d696e0 0x2000
326 blk.26.ffn_up_exps.weight 0x1a8d6b6e0 0x5280000
327 blk.27.attn_k.weight 0x1adfeb6e0 0x6e000
328 blk.27.attn_k_norm.weight 0x1ae0596e0 0x200
329 blk.27.attn_norm.weight 0x1ae0598e0 0x2000
330 blk.27.attn_output.weight 0x1ae05b8e0 0x370000
331 blk.27.attn_q.weight 0x1ae3cb8e0 0x370000
332 blk.27.attn_q_norm.weight 0x1ae73b8e0 0x200
333 blk.27.attn_v.weight 0x1ae73bae0 0x6e000
334 blk.27.ffn_down_exps.weight 0x1ae7a9ae0 0x6c00000
335 blk.27.ffn_gate_exps.weight 0x1b53a9ae0 0x5280000
336 blk.27.ffn_gate_inp.weight 0x1ba629ae0 0x100000
337 blk.27.ffn_norm.weight 0x1ba729ae0 0x2000
338 blk.27.ffn_up_exps.weight 0x1ba72bae0 0x5280000
339 blk.28.attn_k.weight 0x1bf9abae0 0x6e000
340 blk.28.attn_k_norm.weight 0x1bfa19ae0 0x200
341 blk.28.attn_norm.weight 0x1bfa19ce0 0x2000
342 blk.28.attn_output.weight 0x1bfa1bce0 0x370000
343 blk.28.attn_q.weight 0x1bfd8bce0 0x370000
344 blk.28.attn_q_norm.weight 0x1c00fbce0 0x200
345 blk.28.attn_v.weight 0x1c00fbee0 0x6e000
346 blk.28.ffn_down_exps.weight 0x1c0169ee0 0x6c00000
347 blk.28.ffn_gate_exps.weight 0x1c6d69ee0 0x5280000
348 blk.28.ffn_gate_inp.weight 0x1cbfe9ee0 0x100000
349 blk.28.ffn_norm.weight 0x1cc0e9ee0 0x2000
350 blk.28.ffn_up_exps.weight 0x1cc0ebee0 0x5280000
351 blk.29.attn_k.weight 0x1d136bee0 0x6e000
352 blk.29.attn_k_norm.weight 0x1d13d9ee0 0x200
353 blk.29.attn_norm.weight 0x1d13da0e0 0x2000
354 blk.29.attn_output.weight 0x1d13dc0e0 0x370000
355 blk.29.attn_q.weight 0x1d174c0e0 0x370000
356 blk.29.attn_q_norm.weight 0x1d1abc0e0 0x200
357 blk.29.attn_v.weight 0x1d1abc2e0 0x6e000
358 blk.29.ffn_down_exps.weight 0x1d1b2a2e0 0x6c00000
359 blk.29.ffn_gate_exps.weight 0x1d872a2e0 0x5280000
360 blk.29.ffn_gate_inp.weight 0x1dd9aa2e0 0x100000
361 blk.29.ffn_norm.weight 0x1ddaaa2e0 0x2000
362 blk.29.ffn_up_exps.weight 0x1ddaac2e0 0x5280000
363 blk.30.attn_k.weight 0x1e2d2c2e0 0x6e000
364 blk.30.attn_k_norm.weight 0x1e2d9a2e0 0x200
365 blk.30.attn_norm.weight 0x1e2d9a4e0 0x2000
366 blk.30.attn_output.weight 0x1e2d9c4e0 0x370000
367 blk.30.attn_q.weight 0x1e310c4e0 0x370000
368 blk.30.attn_q_norm.weight 0x1e347c4e0 0x200
369 blk.30.attn_v.weight 0x1e347c6e0 0x6e000
370 blk.30.ffn_down_exps.weight 0x1e34ea6e0 0x6c00000
371 blk.30.ffn_gate_exps.weight 0x1ea0ea6e0 0x5280000
372 blk.30.ffn_gate_inp.weight 0x1ef36a6e0 0x100000
373 blk.30.ffn_norm.weight 0x1ef46a6e0 0x2000
374 blk.30.ffn_up_exps.weight 0x1ef46c6e0 0x5280000
375 blk.31.attn_k.weight 0x1f46ec6e0 0x6e000
376 blk.31.attn_k_norm.weight 0x1f475a6e0 0x200
377 blk.31.attn_norm.weight 0x1f475a8e0 0x2000
378 blk.31.attn_output.weight 0x1f475c8e0 0x370000
379 blk.31.attn_q.weight 0x1f4acc8e0 0x370000
380 blk.31.attn_q_norm.weight 0x1f4e3c8e0 0x200
381 blk.31.attn_v.weight 0x1f4e3cae0 0x6e000
382 blk.31.ffn_down_exps.weight 0x1f4eaaae0 0x6c00000
383 blk.31.ffn_gate_exps.weight 0x1fbaaaae0 0x5280000
384 blk.31.ffn_gate_inp.weight 0x200d2aae0 0x100000
385 blk.31.ffn_norm.weight 0x200e2aae0 0x2000
386 blk.31.ffn_up_exps.weight 0x200e2cae0 0x5280000
387 blk.32.attn_k.weight 0x2060acae0 0x6e000
388 blk.32.attn_k_norm.weight 0x20611aae0 0x200
389 blk.32.attn_norm.weight 0x20611ace0 0x2000
390 blk.32.attn_output.weight 0x20611cce0 0x370000
391 blk.32.attn_q.weight 0x20648cce0 0x370000
392 blk.32.attn_q_norm.weight 0x2067fcce0 0x200
393 blk.32.attn_v.weight 0x2067fcee0 0x6e000
394 blk.32.ffn_down_exps.weight 0x20686aee0 0x6c00000
395 blk.32.ffn_gate_exps.weight 0x20d46aee0 0x5280000
396 blk.32.ffn_gate_inp.weight 0x2126eaee0 0x100000
397 blk.32.ffn_norm.weight 0x2127eaee0 0x2000
398 blk.32.ffn_up_exps.weight 0x2127ecee0 0x5280000
399 blk.33.attn_k.weight 0x217a6cee0 0x6e000
400 blk.33.attn_k_norm.weight 0x217adaee0 0x200
401 blk.33.attn_norm.weight 0x217adb0e0 0x2000
402 blk.33.attn_output.weight 0x217add0e0 0x370000
403 blk.33.attn_q.weight 0x217e4d0e0 0x370000
404 blk.33.attn_q_norm.weight 0x2181bd0e0 0x200
405 blk.33.attn_v.weight 0x2181bd2e0 0x6e000
406 blk.33.ffn_down_exps.weight 0x21822b2e0 0x6c00000
407 blk.33.ffn_gate_exps.weight 0x21ee2b2e0 0x5280000
408 blk.33.ffn_gate_inp.weight 0x2240ab2e0 0x100000
409 blk.33.ffn_norm.weight 0x2241ab2e0 0x2000
410 blk.33.ffn_up_exps.weight 0x2241ad2e0 0x5280000
411 blk.34.attn_k.weight 0x22942d2e0 0x6e000
412 blk.34.attn_k_norm.weight 0x22949b2e0 0x200
413 blk.34.attn_norm.weight 0x22949b4e0 0x2000
414 blk.34.attn_output.weight 0x22949d4e0 0x370000
415 blk.34.attn_q.weight 0x22980d4e0 0x370000
416 blk.34.attn_q_norm.weight 0x229b7d4e0 0x200
417 blk.34.attn_v.weight 0x229b7d6e0 0x6e000
418 blk.34.ffn_down_exps.weight 0x229beb6e0 0x6c00000
419 blk.34.ffn_gate_exps.weight 0x2307eb6e0 0x5280000
420 blk.34.ffn_gate_inp.weight 0x235a6b6e0 0x100000
421 blk.34.ffn_norm.weight 0x235b6b6e0 0x2000
422 blk.34.ffn_up_exps.weight 0x235b6d6e0 0x5280000
423 blk.35.attn_k.weight 0x23aded6e0 0x6e000
424 blk.35.attn_k_norm.weight 0x23ae5b6e0 0x200
425 blk.35.attn_norm.weight 0x23ae5b8e0 0x2000
426 blk.35.attn_output.weight 0x23ae5d8e0 0x370000
427 blk.35.attn_q.weight 0x23b1cd8e0 0x370000
428 blk.35.attn_q_norm.weight 0x23b53d8e0 0x200
429 blk.35.attn_v.weight 0x23b53dae0 0x6e000
430 blk.35.ffn_down_exps.weight 0x23b5abae0 0x6c00000
431 blk.35.ffn_gate_exps.weight 0x2421abae0 0x5280000
432 blk.35.ffn_gate_inp.weight 0x24742bae0 0x100000
433 blk.35.ffn_norm.weight 0x24752bae0 0x2000
434 blk.35.ffn_up_exps.weight 0x24752dae0 0x5280000
435 blk.36.attn_k.weight 0x24c7adae0 0x6e000
436 blk.36.attn_k_norm.weight 0x24c81bae0 0x200
437 blk.36.attn_norm.weight 0x24c81bce0 0x2000
438 blk.36.attn_output.weight 0x24c81dce0 0x370000
439 blk.36.attn_q.weight 0x24cb8dce0 0x370000
440 blk.36.attn_q_norm.weight 0x24cefdce0 0x200
441 blk.36.attn_v.weight 0x24cefdee0 0x6e000
442 blk.36.ffn_down_exps.weight 0x24cf6bee0 0x6c00000
443 blk.36.ffn_gate_exps.weight 0x253b6bee0 0x5280000
444 blk.36.ffn_gate_inp.weight 0x258debee0 0x100000
445 blk.36.ffn_norm.weight 0x258eebee0 0x2000
446 blk.36.ffn_up_exps.weight 0x258eedee0 0x5280000
447 blk.37.attn_k.weight 0x25e16dee0 0x6e000
448 blk.37.attn_k_norm.weight 0x25e1dbee0 0x200
449 blk.37.attn_norm.weight 0x25e1dc0e0 0x2000
450 blk.37.attn_output.weight 0x25e1de0e0 0x370000
451 blk.37.attn_q.weight 0x25e54e0e0 0x370000
452 blk.37.attn_q_norm.weight 0x25e8be0e0 0x200
453 blk.37.attn_v.weight 0x25e8be2e0 0x6e000
454 blk.37.ffn_down_exps.weight 0x25e92c2e0 0x6c00000
455 blk.37.ffn_gate_exps.weight 0x26552c2e0 0x5280000
456 blk.37.ffn_gate_inp.weight 0x26a7ac2e0 0x100000
457 blk.37.ffn_norm.weight 0x26a8ac2e0 0x2000
458 blk.37.ffn_up_exps.weight 0x26a8ae2e0 0x5280000
459 blk.38.attn_k.weight 0x26fb2e2e0 0x6e000
460 blk.38.attn_k_norm.weight 0x26fb9c2e0 0x200
461 blk.38.attn_norm.weight 0x26fb9c4e0 0x2000
462 blk.38.attn_output.weight 0x26fb9e4e0 0x370000
463 blk.38.attn_q.weight 0x26ff0e4e0 0x370000
464 blk.38.attn_q_norm.weight 0x27027e4e0 0x200
465 blk.38.attn_v.weight 0x27027e6e0 0x6e000
466 blk.38.ffn_down_exps.weight 0x2702ec6e0 0x6c00000
467 blk.38.ffn_gate_exps.weight 0x276eec6e0 0x5280000
468 blk.38.ffn_gate_inp.weight 0x27c16c6e0 0x100000
469 blk.38.ffn_norm.weight 0x27c26c6e0 0x2000
470 blk.38.ffn_up_exps.weight 0x27c26e6e0 0x5280000
471 blk.39.attn_k.weight 0x2814ee6e0 0x6e000
472 blk.39.attn_k_norm.weight 0x28155c6e0 0x200
473 blk.39.attn_norm.weight 0x28155c8e0 0x2000
474 blk.39.attn_output.weight 0x28155e8e0 0x370000
475 blk.39.attn_q.weight 0x2818ce8e0 0x370000
476 blk.39.attn_q_norm.weight 0x281c3e8e0 0x200
477 blk.39.attn_v.weight 0x281c3eae0 0x6e000
478 blk.39.ffn_down_exps.weight 0x281cacae0 0x6c00000
479 blk.39.ffn_gate_exps.weight 0x2888acae0 0x5280000
480 blk.39.ffn_gate_inp.weight 0x28db2cae0 0x100000
481 blk.39.ffn_norm.weight 0x28dc2cae0 0x2000
482 blk.39.ffn_up_exps.weight 0x28dc2eae0 0x5280000
483 blk.40.attn_k.weight 0x292eaeae0 0x6e000
484 blk.40.attn_k_norm.weight 0x292f1cae0 0x200
485 blk.40.attn_norm.weight 0x292f1cce0 0x2000
486 blk.40.attn_output.weight 0x292f1ece0 0x370000
487 blk.40.attn_q.weight 0x29328ece0 0x370000
488 blk.40.attn_q_norm.weight 0x2935fece0 0x200
489 blk.40.attn_v.weight 0x2935feee0 0x6e000
490 blk.40.ffn_down_exps.weight 0x29366cee0 0x6c00000
491 blk.40.ffn_gate_exps.weight 0x29a26cee0 0x5280000
492 blk.40.ffn_gate_inp.weight 0x29f4ecee0 0x100000
493 blk.40.ffn_norm.weight 0x29f5ecee0 0x2000
494 blk.40.ffn_up_exps.weight 0x29f5eeee0 0x5280000
495 blk.41.attn_k.weight 0x2a486eee0 0x6e000
496 blk.41.attn_k_norm.weight 0x2a48dcee0 0x200
497 blk.41.attn_norm.weight 0x2a48dd0e0 0x2000
498 blk.41.attn_output.weight 0x2a48df0e0 0x370000
499 blk.41.attn_q.weight 0x2a4c4f0e0 0x370000
500 blk.41.attn_q_norm.weight 0x2a4fbf0e0 0x200
501 blk.41.attn_v.weight 0x2a4fbf2e0 0x6e000
502 blk.41.ffn_down_exps.weight 0x2a502d2e0 0x6c00000
503 blk.41.ffn_gate_exps.weight 0x2abc2d2e0 0x5280000
504 blk.41.ffn_gate_inp.weight 0x2b0ead2e0 0x100000
505 blk.41.ffn_norm.weight 0x2b0fad2e0 0x2000
506 blk.41.ffn_up_exps.weight 0x2b0faf2e0 0x5280000
507 blk.42.attn_k.weight 0x2b622f2e0 0x6e000
508 blk.42.attn_k_norm.weight 0x2b629d2e0 0x200
509 blk.42.attn_norm.weight 0x2b629d4e0 0x2000
510 blk.42.attn_output.weight 0x2b629f4e0 0x370000
511 blk.42.attn_q.weight 0x2b660f4e0 0x370000
512 blk.42.attn_q_norm.weight 0x2b697f4e0 0x200
513 blk.42.attn_v.weight 0x2b697f6e0 0x6e000
514 blk.42.ffn_down_exps.weight 0x2b69ed6e0 0x6c00000
515 blk.42.ffn_gate_exps.weight 0x2bd5ed6e0 0x5280000
516 blk.42.ffn_gate_inp.weight 0x2c286d6e0 0x100000
517 blk.42.ffn_norm.weight 0x2c296d6e0 0x2000
518 blk.42.ffn_up_exps.weight 0x2c296f6e0 0x5280000
519 blk.43.attn_k.weight 0x2c7bef6e0 0x6e000
520 blk.43.attn_k_norm.weight 0x2c7c5d6e0 0x200
521 blk.43.attn_norm.weight 0x2c7c5d8e0 0x2000
522 blk.43.attn_output.weight 0x2c7c5f8e0 0x370000
523 blk.43.attn_q.weight 0x2c7fcf8e0 0x370000
524 blk.43.attn_q_norm.weight 0x2c833f8e0 0x200
525 blk.43.attn_v.weight 0x2c833fae0 0x6e000
526 blk.43.ffn_down_exps.weight 0x2c83adae0 0x6c00000
527 blk.43.ffn_gate_exps.weight 0x2cefadae0 0x5280000
528 blk.43.ffn_gate_inp.weight 0x2d422dae0 0x100000
529 blk.43.ffn_norm.weight 0x2d432dae0 0x2000
530 blk.43.ffn_up_exps.weight 0x2d432fae0 0x5280000
531 blk.44.attn_k.weight 0x2d95afae0 0x6e000
532 blk.44.attn_k_norm.weight 0x2d961dae0 0x200
533 blk.44.attn_norm.weight 0x2d961dce0 0x2000
534 blk.44.attn_output.weight 0x2d961fce0 0x370000
535 blk.44.attn_q.weight 0x2d998fce0 0x370000
536 blk.44.attn_q_norm.weight 0x2d9cffce0 0x200
537 blk.44.attn_v.weight 0x2d9cffee0 0x6e000
538 blk.44.ffn_down_exps.weight 0x2d9d6dee0 0x6c00000
539 blk.44.ffn_gate_exps.weight 0x2e096dee0 0x5280000
540 blk.44.ffn_gate_inp.weight 0x2e5bedee0 0x100000
541 blk.44.ffn_norm.weight 0x2e5cedee0 0x2000
542 blk.44.ffn_up_exps.weight 0x2e5cefee0 0x5280000
543 blk.45.attn_k.weight 0x2eaf6fee0 0x6e000
544 blk.45.attn_k_norm.weight 0x2eafddee0 0x200
545 blk.45.attn_norm.weight 0x2eafde0e0 0x2000
546 blk.45.attn_output.weight 0x2eafe00e0 0x370000
547 blk.45.attn_q.weight 0x2eb3500e0 0x370000
548 blk.45.attn_q_norm.weight 0x2eb6c00e0 0x200
549 blk.45.attn_v.weight 0x2eb6c02e0 0x6e000
550 blk.45.ffn_down_exps.weight 0x2eb72e2e0 0x6c00000
551 blk.45.ffn_gate_exps.weight 0x2f232e2e0 0x5280000
552 blk.45.ffn_gate_inp.weight 0x2f75ae2e0 0x100000
553 blk.45.ffn_norm.weight 0x2f76ae2e0 0x2000
554 blk.45.ffn_up_exps.weight 0x2f76b02e0 0x5280000
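The data sizes in the table above can be reproduced from the k-quant block layouts. A sanity-check sketch, assuming the standard ggml super-block byte counts (256 elements per block: 84 bytes for Q2_K, 110 for Q3_K, 144 for Q4_K; F32 is 4 bytes per element):

```python
# Bytes per 256-element super-block for the k-quant types used in this file
BLOCK_BYTES = {"Q2_K": 84, "Q3_K": 110, "Q4_K": 144}

def tensor_bytes(n_elements: int, qtype: str) -> int:
    """Size in bytes of a tensor's data segment for the given ggml type."""
    if qtype == "F32":
        return n_elements * 4
    assert n_elements % 256 == 0, "k-quants pack 256 elements per super-block"
    return (n_elements // 256) * BLOCK_BYTES[qtype]

# Cross-checks against the offset table and tensor groups below:
assert tensor_bytes(311164928, "Q3_K") == 0x7F82800  # output.weight
assert tensor_bytes(311164928, "Q2_K") == 0x615F000  # token_embd.weight
assert tensor_bytes(201326592, "Q4_K") == 0x6C00000  # blk.0.ffn_down_exps.weight
assert tensor_bytes(2048, "F32") == 0x2000           # output_norm.weight
```

The same arithmetic explains the size jump at blk.18 and blk.25 onward: ffn_gate_exps/ffn_up_exps grow from 0x3f00000 (Q2_K) to 0x5280000, consistent with those layers being stored at a higher-precision quant.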

Base Tensor Group : ~622M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
0 output.weight Output (W) (~311M) 311164928 2048 x 151936 x 1 x 1 Q3_K
1 output_norm.weight Output Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
2 token_embd.weight Token Embedding (W) (~311M) 311164928 2048 x 151936 x 1 x 1 Q2_K
  • Total elements in base: (~622M) 622331904
  • Percentage of total elements: 2.13%

Block 0 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
3 blk.0.attn_k.weight Block 0 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
4 blk.0.attn_k_norm.weight Block 0 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
5 blk.0.attn_norm.weight Block 0 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
6 blk.0.attn_output.weight Block 0 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
7 blk.0.attn_q.weight Block 0 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
8 blk.0.attn_q_norm.weight Block 0 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
9 blk.0.attn_v.weight Block 0 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
10 blk.0.ffn_down_exps.weight Block 0 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
11 blk.0.ffn_gate_exps.weight Block 0 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
12 blk.0.ffn_gate_inp.weight Block 0 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
13 blk.0.ffn_norm.weight Block 0 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
14 blk.0.ffn_up_exps.weight Block 0 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.0: (~623M) 623120640
  • Percentage of total elements: 2.13%
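The element counts in this group follow directly from the qwen3moe metadata in the key-value store. A sketch of the derivation (constants copied from the metadata table above):

```python
# Architecture constants from the key-value metadata store
embd   = 2048   # qwen3moe.embedding_length
heads  = 32     # qwen3moe.attention.head_count
kv     = 4      # qwen3moe.attention.head_count_kv
d_head = 128    # qwen3moe.attention.key_length / value_length
exp_ff = 768    # qwen3moe.expert_feed_forward_length
n_exp  = 128    # qwen3moe.expert_count

assert embd * heads * d_head == 8388608    # attn_q / attn_output
assert embd * kv * d_head == 1048576       # attn_k / attn_v
assert embd * exp_ff * n_exp == 201326592  # each ffn_*_exps tensor
assert embd * n_exp == 262144              # ffn_gate_inp expert router

# One block plus the base group reproduces the total element count
block = (2 * 8388608 + 2 * 1048576 + 3 * 201326592
         + 262144 + 2 * 2048 + 2 * 128)
assert block == 623120640                  # matches every per-block total
assert 2 * 311164928 + 2048 + 46 * block == 29285881344
```

This also confirms the 2.13% figure: each of the 46 blocks and the base group contributes roughly 1/47 of the ~29B total.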

Block 1 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
15 blk.1.attn_k.weight Block 1 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
16 blk.1.attn_k_norm.weight Block 1 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
17 blk.1.attn_norm.weight Block 1 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
18 blk.1.attn_output.weight Block 1 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
19 blk.1.attn_q.weight Block 1 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
20 blk.1.attn_q_norm.weight Block 1 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
21 blk.1.attn_v.weight Block 1 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
22 blk.1.ffn_down_exps.weight Block 1 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
23 blk.1.ffn_gate_exps.weight Block 1 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
24 blk.1.ffn_gate_inp.weight Block 1 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
25 blk.1.ffn_norm.weight Block 1 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
26 blk.1.ffn_up_exps.weight Block 1 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.1: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 2 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
27 blk.2.attn_k.weight Block 2 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
28 blk.2.attn_k_norm.weight Block 2 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
29 blk.2.attn_norm.weight Block 2 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
30 blk.2.attn_output.weight Block 2 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
31 blk.2.attn_q.weight Block 2 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
32 blk.2.attn_q_norm.weight Block 2 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
33 blk.2.attn_v.weight Block 2 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
34 blk.2.ffn_down_exps.weight Block 2 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
35 blk.2.ffn_gate_exps.weight Block 2 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
36 blk.2.ffn_gate_inp.weight Block 2 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
37 blk.2.ffn_norm.weight Block 2 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
38 blk.2.ffn_up_exps.weight Block 2 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.2: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 3 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
39 blk.3.attn_k.weight Block 3 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
40 blk.3.attn_k_norm.weight Block 3 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
41 blk.3.attn_norm.weight Block 3 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
42 blk.3.attn_output.weight Block 3 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
43 blk.3.attn_q.weight Block 3 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
44 blk.3.attn_q_norm.weight Block 3 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
45 blk.3.attn_v.weight Block 3 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
46 blk.3.ffn_down_exps.weight Block 3 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
47 blk.3.ffn_gate_exps.weight Block 3 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
48 blk.3.ffn_gate_inp.weight Block 3 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
49 blk.3.ffn_norm.weight Block 3 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
50 blk.3.ffn_up_exps.weight Block 3 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.3: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 4 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
51 blk.4.attn_k.weight Block 4 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
52 blk.4.attn_k_norm.weight Block 4 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
53 blk.4.attn_norm.weight Block 4 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
54 blk.4.attn_output.weight Block 4 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
55 blk.4.attn_q.weight Block 4 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
56 blk.4.attn_q_norm.weight Block 4 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
57 blk.4.attn_v.weight Block 4 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
58 blk.4.ffn_down_exps.weight Block 4 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
59 blk.4.ffn_gate_exps.weight Block 4 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
60 blk.4.ffn_gate_inp.weight Block 4 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
61 blk.4.ffn_norm.weight Block 4 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
62 blk.4.ffn_up_exps.weight Block 4 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.4: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 5 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
63 blk.5.attn_k.weight Block 5 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
64 blk.5.attn_k_norm.weight Block 5 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
65 blk.5.attn_norm.weight Block 5 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
66 blk.5.attn_output.weight Block 5 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
67 blk.5.attn_q.weight Block 5 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
68 blk.5.attn_q_norm.weight Block 5 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
69 blk.5.attn_v.weight Block 5 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
70 blk.5.ffn_down_exps.weight Block 5 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
71 blk.5.ffn_gate_exps.weight Block 5 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
72 blk.5.ffn_gate_inp.weight Block 5 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
73 blk.5.ffn_norm.weight Block 5 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
74 blk.5.ffn_up_exps.weight Block 5 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.5: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 6 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
75 blk.6.attn_k.weight Block 6 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
76 blk.6.attn_k_norm.weight Block 6 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
77 blk.6.attn_norm.weight Block 6 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
78 blk.6.attn_output.weight Block 6 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
79 blk.6.attn_q.weight Block 6 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
80 blk.6.attn_q_norm.weight Block 6 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
81 blk.6.attn_v.weight Block 6 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
82 blk.6.ffn_down_exps.weight Block 6 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
83 blk.6.ffn_gate_exps.weight Block 6 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
84 blk.6.ffn_gate_inp.weight Block 6 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
85 blk.6.ffn_norm.weight Block 6 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
86 blk.6.ffn_up_exps.weight Block 6 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.6: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 7 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
87 blk.7.attn_k.weight Block 7 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
88 blk.7.attn_k_norm.weight Block 7 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
89 blk.7.attn_norm.weight Block 7 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
90 blk.7.attn_output.weight Block 7 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
91 blk.7.attn_q.weight Block 7 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
92 blk.7.attn_q_norm.weight Block 7 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
93 blk.7.attn_v.weight Block 7 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
94 blk.7.ffn_down_exps.weight Block 7 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
95 blk.7.ffn_gate_exps.weight Block 7 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
96 blk.7.ffn_gate_inp.weight Block 7 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
97 blk.7.ffn_norm.weight Block 7 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
98 blk.7.ffn_up_exps.weight Block 7 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.7: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 8 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
99 blk.8.attn_k.weight Block 8 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
100 blk.8.attn_k_norm.weight Block 8 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
101 blk.8.attn_norm.weight Block 8 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
102 blk.8.attn_output.weight Block 8 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
103 blk.8.attn_q.weight Block 8 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
104 blk.8.attn_q_norm.weight Block 8 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
105 blk.8.attn_v.weight Block 8 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
106 blk.8.ffn_down_exps.weight Block 8 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
107 blk.8.ffn_gate_exps.weight Block 8 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
108 blk.8.ffn_gate_inp.weight Block 8 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
109 blk.8.ffn_norm.weight Block 8 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
110 blk.8.ffn_up_exps.weight Block 8 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.8: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 9 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
111 blk.9.attn_k.weight Block 9 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
112 blk.9.attn_k_norm.weight Block 9 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
113 blk.9.attn_norm.weight Block 9 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
114 blk.9.attn_output.weight Block 9 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
115 blk.9.attn_q.weight Block 9 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
116 blk.9.attn_q_norm.weight Block 9 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
117 blk.9.attn_v.weight Block 9 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
118 blk.9.ffn_down_exps.weight Block 9 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
119 blk.9.ffn_gate_exps.weight Block 9 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
120 blk.9.ffn_gate_inp.weight Block 9 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
121 blk.9.ffn_norm.weight Block 9 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
122 blk.9.ffn_up_exps.weight Block 9 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.9: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 10 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
123 blk.10.attn_k.weight Block 10 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
124 blk.10.attn_k_norm.weight Block 10 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
125 blk.10.attn_norm.weight Block 10 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
126 blk.10.attn_output.weight Block 10 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
127 blk.10.attn_q.weight Block 10 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
128 blk.10.attn_q_norm.weight Block 10 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
129 blk.10.attn_v.weight Block 10 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
130 blk.10.ffn_down_exps.weight Block 10 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
131 blk.10.ffn_gate_exps.weight Block 10 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
132 blk.10.ffn_gate_inp.weight Block 10 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
133 blk.10.ffn_norm.weight Block 10 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
134 blk.10.ffn_up_exps.weight Block 10 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.10: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 11 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
135 blk.11.attn_k.weight Block 11 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
136 blk.11.attn_k_norm.weight Block 11 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
137 blk.11.attn_norm.weight Block 11 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
138 blk.11.attn_output.weight Block 11 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
139 blk.11.attn_q.weight Block 11 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
140 blk.11.attn_q_norm.weight Block 11 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
141 blk.11.attn_v.weight Block 11 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
142 blk.11.ffn_down_exps.weight Block 11 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
143 blk.11.ffn_gate_exps.weight Block 11 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
144 blk.11.ffn_gate_inp.weight Block 11 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
145 blk.11.ffn_norm.weight Block 11 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
146 blk.11.ffn_up_exps.weight Block 11 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.11: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 12 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
147 blk.12.attn_k.weight Block 12 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
148 blk.12.attn_k_norm.weight Block 12 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
149 blk.12.attn_norm.weight Block 12 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
150 blk.12.attn_output.weight Block 12 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
151 blk.12.attn_q.weight Block 12 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
152 blk.12.attn_q_norm.weight Block 12 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
153 blk.12.attn_v.weight Block 12 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
154 blk.12.ffn_down_exps.weight Block 12 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
155 blk.12.ffn_gate_exps.weight Block 12 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
156 blk.12.ffn_gate_inp.weight Block 12 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
157 blk.12.ffn_norm.weight Block 12 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
158 blk.12.ffn_up_exps.weight Block 12 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.12: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 13 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
159 blk.13.attn_k.weight Block 13 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
160 blk.13.attn_k_norm.weight Block 13 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
161 blk.13.attn_norm.weight Block 13 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
162 blk.13.attn_output.weight Block 13 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
163 blk.13.attn_q.weight Block 13 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
164 blk.13.attn_q_norm.weight Block 13 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
165 blk.13.attn_v.weight Block 13 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
166 blk.13.ffn_down_exps.weight Block 13 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
167 blk.13.ffn_gate_exps.weight Block 13 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
168 blk.13.ffn_gate_inp.weight Block 13 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
169 blk.13.ffn_norm.weight Block 13 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
170 blk.13.ffn_up_exps.weight Block 13 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.13: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 14 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
171 blk.14.attn_k.weight Block 14 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
172 blk.14.attn_k_norm.weight Block 14 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
173 blk.14.attn_norm.weight Block 14 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
174 blk.14.attn_output.weight Block 14 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
175 blk.14.attn_q.weight Block 14 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
176 blk.14.attn_q_norm.weight Block 14 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
177 blk.14.attn_v.weight Block 14 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
178 blk.14.ffn_down_exps.weight Block 14 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
179 blk.14.ffn_gate_exps.weight Block 14 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
180 blk.14.ffn_gate_inp.weight Block 14 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
181 blk.14.ffn_norm.weight Block 14 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
182 blk.14.ffn_up_exps.weight Block 14 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.14: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 15 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
183 blk.15.attn_k.weight Block 15 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
184 blk.15.attn_k_norm.weight Block 15 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
185 blk.15.attn_norm.weight Block 15 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
186 blk.15.attn_output.weight Block 15 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
187 blk.15.attn_q.weight Block 15 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
188 blk.15.attn_q_norm.weight Block 15 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
189 blk.15.attn_v.weight Block 15 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
190 blk.15.ffn_down_exps.weight Block 15 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
191 blk.15.ffn_gate_exps.weight Block 15 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
192 blk.15.ffn_gate_inp.weight Block 15 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
193 blk.15.ffn_norm.weight Block 15 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
194 blk.15.ffn_up_exps.weight Block 15 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.15: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 16 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
195 blk.16.attn_k.weight Block 16 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
196 blk.16.attn_k_norm.weight Block 16 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
197 blk.16.attn_norm.weight Block 16 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
198 blk.16.attn_output.weight Block 16 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
199 blk.16.attn_q.weight Block 16 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
200 blk.16.attn_q_norm.weight Block 16 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
201 blk.16.attn_v.weight Block 16 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
202 blk.16.ffn_down_exps.weight Block 16 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
203 blk.16.ffn_gate_exps.weight Block 16 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
204 blk.16.ffn_gate_inp.weight Block 16 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
205 blk.16.ffn_norm.weight Block 16 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
206 blk.16.ffn_up_exps.weight Block 16 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.16: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 17 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
207 blk.17.attn_k.weight Block 17 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
208 blk.17.attn_k_norm.weight Block 17 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
209 blk.17.attn_norm.weight Block 17 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
210 blk.17.attn_output.weight Block 17 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
211 blk.17.attn_q.weight Block 17 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
212 blk.17.attn_q_norm.weight Block 17 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
213 blk.17.attn_v.weight Block 17 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
214 blk.17.ffn_down_exps.weight Block 17 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
215 blk.17.ffn_gate_exps.weight Block 17 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
216 blk.17.ffn_gate_inp.weight Block 17 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
217 blk.17.ffn_norm.weight Block 17 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
218 blk.17.ffn_up_exps.weight Block 17 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.17: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 18 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
219 blk.18.attn_k.weight Block 18 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
220 blk.18.attn_k_norm.weight Block 18 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
221 blk.18.attn_norm.weight Block 18 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
222 blk.18.attn_output.weight Block 18 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
223 blk.18.attn_q.weight Block 18 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
224 blk.18.attn_q_norm.weight Block 18 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
225 blk.18.attn_v.weight Block 18 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
226 blk.18.ffn_down_exps.weight Block 18 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
227 blk.18.ffn_gate_exps.weight Block 18 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
228 blk.18.ffn_gate_inp.weight Block 18 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
229 blk.18.ffn_norm.weight Block 18 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
230 blk.18.ffn_up_exps.weight Block 18 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.18: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 19 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
231 blk.19.attn_k.weight Block 19 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
232 blk.19.attn_k_norm.weight Block 19 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
233 blk.19.attn_norm.weight Block 19 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
234 blk.19.attn_output.weight Block 19 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
235 blk.19.attn_q.weight Block 19 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
236 blk.19.attn_q_norm.weight Block 19 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
237 blk.19.attn_v.weight Block 19 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
238 blk.19.ffn_down_exps.weight Block 19 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
239 blk.19.ffn_gate_exps.weight Block 19 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
240 blk.19.ffn_gate_inp.weight Block 19 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
241 blk.19.ffn_norm.weight Block 19 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
242 blk.19.ffn_up_exps.weight Block 19 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.19: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 20 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
243 blk.20.attn_k.weight Block 20 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
244 blk.20.attn_k_norm.weight Block 20 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
245 blk.20.attn_norm.weight Block 20 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
246 blk.20.attn_output.weight Block 20 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
247 blk.20.attn_q.weight Block 20 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
248 blk.20.attn_q_norm.weight Block 20 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
249 blk.20.attn_v.weight Block 20 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
250 blk.20.ffn_down_exps.weight Block 20 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
251 blk.20.ffn_gate_exps.weight Block 20 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
252 blk.20.ffn_gate_inp.weight Block 20 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
253 blk.20.ffn_norm.weight Block 20 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
254 blk.20.ffn_up_exps.weight Block 20 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.20: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 21 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
255 blk.21.attn_k.weight Block 21 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
256 blk.21.attn_k_norm.weight Block 21 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
257 blk.21.attn_norm.weight Block 21 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
258 blk.21.attn_output.weight Block 21 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
259 blk.21.attn_q.weight Block 21 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
260 blk.21.attn_q_norm.weight Block 21 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
261 blk.21.attn_v.weight Block 21 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
262 blk.21.ffn_down_exps.weight Block 21 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
263 blk.21.ffn_gate_exps.weight Block 21 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
264 blk.21.ffn_gate_inp.weight Block 21 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
265 blk.21.ffn_norm.weight Block 21 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
266 blk.21.ffn_up_exps.weight Block 21 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.21: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 22 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
267 blk.22.attn_k.weight Block 22 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
268 blk.22.attn_k_norm.weight Block 22 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
269 blk.22.attn_norm.weight Block 22 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
270 blk.22.attn_output.weight Block 22 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
271 blk.22.attn_q.weight Block 22 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
272 blk.22.attn_q_norm.weight Block 22 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
273 blk.22.attn_v.weight Block 22 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
274 blk.22.ffn_down_exps.weight Block 22 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
275 blk.22.ffn_gate_exps.weight Block 22 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
276 blk.22.ffn_gate_inp.weight Block 22 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
277 blk.22.ffn_norm.weight Block 22 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
278 blk.22.ffn_up_exps.weight Block 22 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.22: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 23 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
279 blk.23.attn_k.weight Block 23 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q2_K
280 blk.23.attn_k_norm.weight Block 23 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
281 blk.23.attn_norm.weight Block 23 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
282 blk.23.attn_output.weight Block 23 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
283 blk.23.attn_q.weight Block 23 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q2_K
284 blk.23.attn_q_norm.weight Block 23 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
285 blk.23.attn_v.weight Block 23 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
286 blk.23.ffn_down_exps.weight Block 23 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
287 blk.23.ffn_gate_exps.weight Block 23 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
288 blk.23.ffn_gate_inp.weight Block 23 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
289 blk.23.ffn_norm.weight Block 23 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
290 blk.23.ffn_up_exps.weight Block 23 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.23: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 24 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
291 blk.24.attn_k.weight Block 24 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
292 blk.24.attn_k_norm.weight Block 24 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
293 blk.24.attn_norm.weight Block 24 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
294 blk.24.attn_output.weight Block 24 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
295 blk.24.attn_q.weight Block 24 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
296 blk.24.attn_q_norm.weight Block 24 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
297 blk.24.attn_v.weight Block 24 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
298 blk.24.ffn_down_exps.weight Block 24 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
299 blk.24.ffn_gate_exps.weight Block 24 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
300 blk.24.ffn_gate_inp.weight Block 24 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
301 blk.24.ffn_norm.weight Block 24 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
302 blk.24.ffn_up_exps.weight Block 24 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q2_K
  • Total elements in blk.24: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 25 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
303 blk.25.attn_k.weight Block 25 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
304 blk.25.attn_k_norm.weight Block 25 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
305 blk.25.attn_norm.weight Block 25 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
306 blk.25.attn_output.weight Block 25 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
307 blk.25.attn_q.weight Block 25 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
308 blk.25.attn_q_norm.weight Block 25 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
309 blk.25.attn_v.weight Block 25 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
310 blk.25.ffn_down_exps.weight Block 25 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
311 blk.25.ffn_gate_exps.weight Block 25 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
312 blk.25.ffn_gate_inp.weight Block 25 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
313 blk.25.ffn_norm.weight Block 25 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
314 blk.25.ffn_up_exps.weight Block 25 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.25: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 26 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
315 blk.26.attn_k.weight Block 26 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
316 blk.26.attn_k_norm.weight Block 26 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
317 blk.26.attn_norm.weight Block 26 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
318 blk.26.attn_output.weight Block 26 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
319 blk.26.attn_q.weight Block 26 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
320 blk.26.attn_q_norm.weight Block 26 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
321 blk.26.attn_v.weight Block 26 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
322 blk.26.ffn_down_exps.weight Block 26 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
323 blk.26.ffn_gate_exps.weight Block 26 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
324 blk.26.ffn_gate_inp.weight Block 26 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
325 blk.26.ffn_norm.weight Block 26 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
326 blk.26.ffn_up_exps.weight Block 26 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.26: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 27 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
327 blk.27.attn_k.weight Block 27 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
328 blk.27.attn_k_norm.weight Block 27 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
329 blk.27.attn_norm.weight Block 27 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
330 blk.27.attn_output.weight Block 27 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
331 blk.27.attn_q.weight Block 27 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
332 blk.27.attn_q_norm.weight Block 27 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
333 blk.27.attn_v.weight Block 27 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
334 blk.27.ffn_down_exps.weight Block 27 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
335 blk.27.ffn_gate_exps.weight Block 27 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
336 blk.27.ffn_gate_inp.weight Block 27 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
337 blk.27.ffn_norm.weight Block 27 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
338 blk.27.ffn_up_exps.weight Block 27 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.27: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 28 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
339 blk.28.attn_k.weight Block 28 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
340 blk.28.attn_k_norm.weight Block 28 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
341 blk.28.attn_norm.weight Block 28 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
342 blk.28.attn_output.weight Block 28 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
343 blk.28.attn_q.weight Block 28 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
344 blk.28.attn_q_norm.weight Block 28 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
345 blk.28.attn_v.weight Block 28 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
346 blk.28.ffn_down_exps.weight Block 28 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
347 blk.28.ffn_gate_exps.weight Block 28 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
348 blk.28.ffn_gate_inp.weight Block 28 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
349 blk.28.ffn_norm.weight Block 28 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
350 blk.28.ffn_up_exps.weight Block 28 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.28: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 29 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
351 blk.29.attn_k.weight Block 29 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
352 blk.29.attn_k_norm.weight Block 29 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
353 blk.29.attn_norm.weight Block 29 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
354 blk.29.attn_output.weight Block 29 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
355 blk.29.attn_q.weight Block 29 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
356 blk.29.attn_q_norm.weight Block 29 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
357 blk.29.attn_v.weight Block 29 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
358 blk.29.ffn_down_exps.weight Block 29 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
359 blk.29.ffn_gate_exps.weight Block 29 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
360 blk.29.ffn_gate_inp.weight Block 29 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
361 blk.29.ffn_norm.weight Block 29 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
362 blk.29.ffn_up_exps.weight Block 29 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.29: (~623M) 623120640
  • Percentage of total elements: 2.13%
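The `ffn_gate_inp` tensor (2048 x 128, F32) in each block is the expert router: it maps the 2048-dim hidden state to one logit per expert (`expert_count` = 128), of which `expert_used_count` = 8 are activated per token. An illustrative sketch of one common top-k routing variant (not llama.cpp's exact code; the router matrix here is random stand-in data):

```python
import numpy as np

rng = np.random.default_rng(0)
router = rng.standard_normal((2048, 128)).astype(np.float32)  # stand-in for blk.N.ffn_gate_inp.weight
hidden = rng.standard_normal(2048).astype(np.float32)         # one token's hidden state

logits = hidden @ router                    # (128,) one score per expert
top8 = np.argsort(logits)[-8:]              # indices of the 8 selected experts
sel = np.exp(logits[top8] - logits[top8].max())
weights = sel / sel.sum()                   # normalized mixing weights over the selected experts

assert top8.shape == (8,)
assert abs(float(weights.sum()) - 1.0) < 1e-6
```

Each selected expert then runs its own 2048 -> 768 -> 2048 gate/up/down FFN (the `ffn_*_exps` tensors), and the outputs are summed with these weights.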

Block 30 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
363 blk.30.attn_k.weight Block 30 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
364 blk.30.attn_k_norm.weight Block 30 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
365 blk.30.attn_norm.weight Block 30 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
366 blk.30.attn_output.weight Block 30 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
367 blk.30.attn_q.weight Block 30 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
368 blk.30.attn_q_norm.weight Block 30 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
369 blk.30.attn_v.weight Block 30 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
370 blk.30.ffn_down_exps.weight Block 30 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
371 blk.30.ffn_gate_exps.weight Block 30 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
372 blk.30.ffn_gate_inp.weight Block 30 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
373 blk.30.ffn_norm.weight Block 30 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
374 blk.30.ffn_up_exps.weight Block 30 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.30: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 31 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
375 blk.31.attn_k.weight Block 31 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
376 blk.31.attn_k_norm.weight Block 31 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
377 blk.31.attn_norm.weight Block 31 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
378 blk.31.attn_output.weight Block 31 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
379 blk.31.attn_q.weight Block 31 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
380 blk.31.attn_q_norm.weight Block 31 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
381 blk.31.attn_v.weight Block 31 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
382 blk.31.ffn_down_exps.weight Block 31 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
383 blk.31.ffn_gate_exps.weight Block 31 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
384 blk.31.ffn_gate_inp.weight Block 31 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
385 blk.31.ffn_norm.weight Block 31 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
386 blk.31.ffn_up_exps.weight Block 31 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.31: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 32 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
387 blk.32.attn_k.weight Block 32 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
388 blk.32.attn_k_norm.weight Block 32 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
389 blk.32.attn_norm.weight Block 32 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
390 blk.32.attn_output.weight Block 32 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
391 blk.32.attn_q.weight Block 32 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
392 blk.32.attn_q_norm.weight Block 32 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
393 blk.32.attn_v.weight Block 32 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
394 blk.32.ffn_down_exps.weight Block 32 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
395 blk.32.ffn_gate_exps.weight Block 32 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
396 blk.32.ffn_gate_inp.weight Block 32 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
397 blk.32.ffn_norm.weight Block 32 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
398 blk.32.ffn_up_exps.weight Block 32 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.32: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 33 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
399 blk.33.attn_k.weight Block 33 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
400 blk.33.attn_k_norm.weight Block 33 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
401 blk.33.attn_norm.weight Block 33 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
402 blk.33.attn_output.weight Block 33 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
403 blk.33.attn_q.weight Block 33 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
404 blk.33.attn_q_norm.weight Block 33 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
405 blk.33.attn_v.weight Block 33 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
406 blk.33.ffn_down_exps.weight Block 33 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
407 blk.33.ffn_gate_exps.weight Block 33 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
408 blk.33.ffn_gate_inp.weight Block 33 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
409 blk.33.ffn_norm.weight Block 33 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
410 blk.33.ffn_up_exps.weight Block 33 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.33: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 34 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
411 blk.34.attn_k.weight Block 34 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
412 blk.34.attn_k_norm.weight Block 34 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
413 blk.34.attn_norm.weight Block 34 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
414 blk.34.attn_output.weight Block 34 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
415 blk.34.attn_q.weight Block 34 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
416 blk.34.attn_q_norm.weight Block 34 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
417 blk.34.attn_v.weight Block 34 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
418 blk.34.ffn_down_exps.weight Block 34 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
419 blk.34.ffn_gate_exps.weight Block 34 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
420 blk.34.ffn_gate_inp.weight Block 34 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
421 blk.34.ffn_norm.weight Block 34 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
422 blk.34.ffn_up_exps.weight Block 34 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.34: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 35 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
423 blk.35.attn_k.weight Block 35 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
424 blk.35.attn_k_norm.weight Block 35 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
425 blk.35.attn_norm.weight Block 35 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
426 blk.35.attn_output.weight Block 35 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
427 blk.35.attn_q.weight Block 35 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
428 blk.35.attn_q_norm.weight Block 35 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
429 blk.35.attn_v.weight Block 35 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
430 blk.35.ffn_down_exps.weight Block 35 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
431 blk.35.ffn_gate_exps.weight Block 35 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
432 blk.35.ffn_gate_inp.weight Block 35 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
433 blk.35.ffn_norm.weight Block 35 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
434 blk.35.ffn_up_exps.weight Block 35 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.35: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 36 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
435 blk.36.attn_k.weight Block 36 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
436 blk.36.attn_k_norm.weight Block 36 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
437 blk.36.attn_norm.weight Block 36 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
438 blk.36.attn_output.weight Block 36 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
439 blk.36.attn_q.weight Block 36 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
440 blk.36.attn_q_norm.weight Block 36 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
441 blk.36.attn_v.weight Block 36 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
442 blk.36.ffn_down_exps.weight Block 36 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
443 blk.36.ffn_gate_exps.weight Block 36 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
444 blk.36.ffn_gate_inp.weight Block 36 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
445 blk.36.ffn_norm.weight Block 36 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
446 blk.36.ffn_up_exps.weight Block 36 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.36: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 37 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
447 blk.37.attn_k.weight Block 37 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
448 blk.37.attn_k_norm.weight Block 37 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
449 blk.37.attn_norm.weight Block 37 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
450 blk.37.attn_output.weight Block 37 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
451 blk.37.attn_q.weight Block 37 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
452 blk.37.attn_q_norm.weight Block 37 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
453 blk.37.attn_v.weight Block 37 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
454 blk.37.ffn_down_exps.weight Block 37 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
455 blk.37.ffn_gate_exps.weight Block 37 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
456 blk.37.ffn_gate_inp.weight Block 37 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
457 blk.37.ffn_norm.weight Block 37 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
458 blk.37.ffn_up_exps.weight Block 37 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.37: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 38 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
459 blk.38.attn_k.weight Block 38 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
460 blk.38.attn_k_norm.weight Block 38 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
461 blk.38.attn_norm.weight Block 38 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
462 blk.38.attn_output.weight Block 38 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
463 blk.38.attn_q.weight Block 38 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
464 blk.38.attn_q_norm.weight Block 38 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
465 blk.38.attn_v.weight Block 38 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
466 blk.38.ffn_down_exps.weight Block 38 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
467 blk.38.ffn_gate_exps.weight Block 38 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
468 blk.38.ffn_gate_inp.weight Block 38 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
469 blk.38.ffn_norm.weight Block 38 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
470 blk.38.ffn_up_exps.weight Block 38 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.38: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 39 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
471 blk.39.attn_k.weight Block 39 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
472 blk.39.attn_k_norm.weight Block 39 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
473 blk.39.attn_norm.weight Block 39 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
474 blk.39.attn_output.weight Block 39 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
475 blk.39.attn_q.weight Block 39 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
476 blk.39.attn_q_norm.weight Block 39 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
477 blk.39.attn_v.weight Block 39 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
478 blk.39.ffn_down_exps.weight Block 39 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
479 blk.39.ffn_gate_exps.weight Block 39 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
480 blk.39.ffn_gate_inp.weight Block 39 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
481 blk.39.ffn_norm.weight Block 39 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
482 blk.39.ffn_up_exps.weight Block 39 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.39: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 40 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
483 blk.40.attn_k.weight Block 40 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
484 blk.40.attn_k_norm.weight Block 40 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
485 blk.40.attn_norm.weight Block 40 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
486 blk.40.attn_output.weight Block 40 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
487 blk.40.attn_q.weight Block 40 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
488 blk.40.attn_q_norm.weight Block 40 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
489 blk.40.attn_v.weight Block 40 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
490 blk.40.ffn_down_exps.weight Block 40 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
491 blk.40.ffn_gate_exps.weight Block 40 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
492 blk.40.ffn_gate_inp.weight Block 40 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
493 blk.40.ffn_norm.weight Block 40 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
494 blk.40.ffn_up_exps.weight Block 40 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.40: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 41 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
495 blk.41.attn_k.weight Block 41 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
496 blk.41.attn_k_norm.weight Block 41 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
497 blk.41.attn_norm.weight Block 41 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
498 blk.41.attn_output.weight Block 41 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
499 blk.41.attn_q.weight Block 41 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
500 blk.41.attn_q_norm.weight Block 41 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
501 blk.41.attn_v.weight Block 41 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
502 blk.41.ffn_down_exps.weight Block 41 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
503 blk.41.ffn_gate_exps.weight Block 41 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
504 blk.41.ffn_gate_inp.weight Block 41 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
505 blk.41.ffn_norm.weight Block 41 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
506 blk.41.ffn_up_exps.weight Block 41 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.41: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 42 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
507 blk.42.attn_k.weight Block 42 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
508 blk.42.attn_k_norm.weight Block 42 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
509 blk.42.attn_norm.weight Block 42 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
510 blk.42.attn_output.weight Block 42 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
511 blk.42.attn_q.weight Block 42 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
512 blk.42.attn_q_norm.weight Block 42 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
513 blk.42.attn_v.weight Block 42 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
514 blk.42.ffn_down_exps.weight Block 42 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
515 blk.42.ffn_gate_exps.weight Block 42 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
516 blk.42.ffn_gate_inp.weight Block 42 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
517 blk.42.ffn_norm.weight Block 42 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
518 blk.42.ffn_up_exps.weight Block 42 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.42: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 43 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
519 blk.43.attn_k.weight Block 43 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
520 blk.43.attn_k_norm.weight Block 43 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
521 blk.43.attn_norm.weight Block 43 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
522 blk.43.attn_output.weight Block 43 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
523 blk.43.attn_q.weight Block 43 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
524 blk.43.attn_q_norm.weight Block 43 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
525 blk.43.attn_v.weight Block 43 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
526 blk.43.ffn_down_exps.weight Block 43 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
527 blk.43.ffn_gate_exps.weight Block 43 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
528 blk.43.ffn_gate_inp.weight Block 43 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
529 blk.43.ffn_norm.weight Block 43 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
530 blk.43.ffn_up_exps.weight Block 43 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.43: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 44 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
531 blk.44.attn_k.weight Block 44 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
532 blk.44.attn_k_norm.weight Block 44 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
533 blk.44.attn_norm.weight Block 44 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
534 blk.44.attn_output.weight Block 44 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
535 blk.44.attn_q.weight Block 44 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
536 blk.44.attn_q_norm.weight Block 44 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
537 blk.44.attn_v.weight Block 44 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
538 blk.44.ffn_down_exps.weight Block 44 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
539 blk.44.ffn_gate_exps.weight Block 44 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
540 blk.44.ffn_gate_inp.weight Block 44 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
541 blk.44.ffn_norm.weight Block 44 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
542 blk.44.ffn_up_exps.weight Block 44 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.44: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 45 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
543 blk.45.attn_k.weight Block 45 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
544 blk.45.attn_k_norm.weight Block 45 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
545 blk.45.attn_norm.weight Block 45 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
546 blk.45.attn_output.weight Block 45 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q3_K
547 blk.45.attn_q.weight Block 45 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
548 blk.45.attn_q_norm.weight Block 45 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
549 blk.45.attn_v.weight Block 45 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
550 blk.45.ffn_down_exps.weight Block 45 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
551 blk.45.ffn_gate_exps.weight Block 45 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
552 blk.45.ffn_gate_inp.weight Block 45 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
553 blk.45.ffn_norm.weight Block 45 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
554 blk.45.ffn_up_exps.weight Block 45 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.45: (~623M) 623120640
  • Percentage of total elements: 2.13%
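The Type column translates directly into on-disk size. A rough estimate, assuming the standard llama.cpp k-quant super-block sizes (256 elements per super-block; 84 bytes for Q2_K, 110 for Q3_K, 144 for Q4_K — values taken from the ggml block definitions, worth re-checking against the exact ggml version used):

```python
# Bytes per 256-element super-block for each quant type seen in this dump.
BYTES_PER_BLOCK = {"Q2_K": 84, "Q3_K": 110, "Q4_K": 144}

def tensor_bytes(n_elements: int, qtype: str) -> int:
    """Estimated stored size of a k-quant tensor."""
    assert n_elements % 256 == 0, "k-quants pack 256 elements per super-block"
    return n_elements // 256 * BYTES_PER_BLOCK[qtype]

n = 201326592  # element count of each ffn_{down,gate,up}_exps tensor

down_mib = tensor_bytes(n, "Q4_K") / 2**20   # ffn_down_exps at Q4_K
gate_mib = tensor_bytes(n, "Q3_K") / 2**20   # gate/up at Q3_K (blocks 25+)

assert down_mib == 108.0
assert gate_mib == 82.5
```

So each block's three expert tensors account for roughly 108 + 2 x 82.5 MiB at this quant mix, which is why the Q3_K_S file size is dominated by the MoE FFN weights rather than attention.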