
Qwen3-30B-A3B-Q4_K_S.gguf - GGUF Internal File Dump

  • Endian: LITTLE endian
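
The reported endianness can be confirmed directly from the file header: a GGUF file starts with the 4-byte magic `GGUF`, followed by a `uint32` version, a `uint64` tensor count, and a `uint64` KV count, all in the file's byte order. A minimal sketch (the path is up to the caller; little-endian is assumed, which is the common case):

```python
import struct

def read_gguf_header(path):
    """Read the fixed GGUF header fields from the start of the file."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError("not a GGUF file")
        # Little-endian assumed; a big-endian file would decode to an
        # implausible version number here.
        version, tensor_count, kv_count = struct.unpack("<IQQ", f.read(20))
    return version, tensor_count, kv_count
```

Run against this file, it should return `(3, 555, 42)`, matching the first three metadata rows below.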

Key Value Metadata Store

There are 45 key-value pairs in this file (the 3 GGUF header fields plus the 42 metadata entries reported by GGUF.kv_count)

POS TYPE Count Key Value
1 UINT32 1 GGUF.version 3
2 UINT64 1 GGUF.tensor_count 555
3 UINT64 1 GGUF.kv_count 42
4 STRING 1 general.architecture qwen3moe
5 STRING 1 general.type model
6 STRING 1 general.name Qwen3 30B A3B
7 STRING 1 general.basename Qwen3
8 STRING 1 general.size_label 30B-A3B
9 STRING 1 general.license apache-2.0
10 STRING 1 general.license.link https://huggingface.co/Qwen/Qwen3-30B-A3B/blob/main/LICENSE
11 UINT32 1 general.base_model.count 1
12 STRING 1 general.base_model.0.name Qwen3 30B A3B Base
13 STRING 1 general.base_model.0.organization Qwen
14 STRING 1 general.base_model.0.repo_url https://huggingface.co/Qwen/Qwen3-30B-A3B-Base
15 [STRING] 1 general.tags [ text-generation ]
16 UINT32 1 qwen3moe.context_length 40960
17 UINT32 1 qwen3moe.embedding_length 2048
18 UINT32 1 qwen3moe.feed_forward_length 6144
19 UINT32 1 qwen3moe.attention.head_count 32
20 UINT32 1 qwen3moe.attention.head_count_kv 4
21 FLOAT32 1 qwen3moe.rope.freq_base 1000000.0
22 FLOAT32 1 qwen3moe.attention.layer_norm_rms_epsilon 1e-06
23 UINT32 1 qwen3moe.expert_used_count 8
24 UINT32 1 qwen3moe.attention.key_length 128
25 UINT32 1 qwen3moe.attention.value_length 128
26 UINT32 1 qwen3moe.expert_count 128
27 UINT32 1 qwen3moe.expert_feed_forward_length 768
28 STRING 1 tokenizer.ggml.model gpt2
29 STRING 1 tokenizer.ggml.pre qwen2
30 [STRING] 151936 tokenizer.ggml.tokens [ !, ", #, $, %, ... ]
31 [INT32] 151936 tokenizer.ggml.token_type [ 1, 1, 1, 1, 1, 1, 1, ... ]
32 [STRING] 151387 tokenizer.ggml.merges [ Ġ Ġ, ĠĠ ĠĠ, i n, Ġ t, ĠĠĠĠ ĠĠĠĠ, ... ]
33 UINT32 1 tokenizer.ggml.eos_token_id 151645
34 UINT32 1 tokenizer.ggml.padding_token_id 151643
35 UINT32 1 tokenizer.ggml.bos_token_id 151643
36 BOOL 1 tokenizer.ggml.add_bos_token False
37 STRING 1 tokenizer.chat_template `{%- if tools %}{{- '<
38 UINT32 1 general.quantization_version 2
39 UINT32 1 general.file_type 14
40 BOOL 1 general.pruned True
41 UINT32 1 qwen3moe.block_count 46
42 STRING 1 quantize.imatrix.file ./imatrix/imatrix-Qwen3-30B-A3B-medium.dat
43 STRING 1 quantize.imatrix.dataset ../../datasets/imatrix/combined_all_medium.txt
44 INT32 1 quantize.imatrix.entries_count 385
45 INT32 1 quantize.imatrix.chunks_count 6946
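
The `general.file_type` value of 14 is the llama.cpp `LLAMA_FTYPE` enum; a partial mapping (values taken from the llama.cpp source at time of writing, so check against your version) shows it corresponds to Q4_K_S, consistent with the filename:

```python
# Partial map of llama.cpp LLAMA_FTYPE values to quantisation names.
# Assumption: values copied from the llama.cpp enum; verify for your version.
LLAMA_FTYPE = {
    0: "F32", 1: "F16", 2: "Q4_0", 3: "Q4_1",
    7: "Q8_0", 8: "Q5_0", 9: "Q5_1", 10: "Q2_K",
    11: "Q3_K_S", 12: "Q3_K_M", 13: "Q3_K_L",
    14: "Q4_K_S", 15: "Q4_K_M", 16: "Q5_K_S", 17: "Q5_K_M",
}

print(LLAMA_FTYPE[14])  # the general.file_type value in this dump -> Q4_K_S
```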

Tensors Overview ~29B Elements

Total number of elements in all tensors: 29285881344

Tensor Data Offset

This table lists each tensor's data offset and size in bytes, relative to the start of the file

T_ID Tensor Layer Name Data Offset (B) Data Size (B)
0 output.weight 0x5b12e0 0xa6ec000
1 output_norm.weight 0xac9d2e0 0x2000
2 token_embd.weight 0xac9f2e0 0x7f82800
3 blk.0.attn_k.weight 0x12c21ae0 0x6e000
4 blk.0.attn_k_norm.weight 0x12c8fae0 0x200
5 blk.0.attn_norm.weight 0x12c8fce0 0x2000
6 blk.0.attn_output.weight 0x12c91ce0 0x480000
7 blk.0.attn_q.weight 0x13111ce0 0x370000
8 blk.0.attn_q_norm.weight 0x13481ce0 0x200
9 blk.0.attn_v.weight 0x13481ee0 0x90000
10 blk.0.ffn_down_exps.weight 0x13511ee0 0x6c00000
11 blk.0.ffn_gate_exps.weight 0x1a111ee0 0x5280000
12 blk.0.ffn_gate_inp.weight 0x1f391ee0 0x100000
13 blk.0.ffn_norm.weight 0x1f491ee0 0x2000
14 blk.0.ffn_up_exps.weight 0x1f493ee0 0x5280000
15 blk.1.attn_k.weight 0x24713ee0 0x6e000
16 blk.1.attn_k_norm.weight 0x24781ee0 0x200
17 blk.1.attn_norm.weight 0x247820e0 0x2000
18 blk.1.attn_output.weight 0x247840e0 0x480000
19 blk.1.attn_q.weight 0x24c040e0 0x370000
20 blk.1.attn_q_norm.weight 0x24f740e0 0x200
21 blk.1.attn_v.weight 0x24f742e0 0x90000
22 blk.1.ffn_down_exps.weight 0x250042e0 0x6c00000
23 blk.1.ffn_gate_exps.weight 0x2bc042e0 0x5280000
24 blk.1.ffn_gate_inp.weight 0x30e842e0 0x100000
25 blk.1.ffn_norm.weight 0x30f842e0 0x2000
26 blk.1.ffn_up_exps.weight 0x30f862e0 0x5280000
27 blk.2.attn_k.weight 0x362062e0 0x6e000
28 blk.2.attn_k_norm.weight 0x362742e0 0x200
29 blk.2.attn_norm.weight 0x362744e0 0x2000
30 blk.2.attn_output.weight 0x362764e0 0x480000
31 blk.2.attn_q.weight 0x366f64e0 0x370000
32 blk.2.attn_q_norm.weight 0x36a664e0 0x200
33 blk.2.attn_v.weight 0x36a666e0 0x90000
34 blk.2.ffn_down_exps.weight 0x36af66e0 0x8400000
35 blk.2.ffn_gate_exps.weight 0x3eef66e0 0x5280000
36 blk.2.ffn_gate_inp.weight 0x441766e0 0x100000
37 blk.2.ffn_norm.weight 0x442766e0 0x2000
38 blk.2.ffn_up_exps.weight 0x442786e0 0x5280000
39 blk.3.attn_k.weight 0x494f86e0 0x6e000
40 blk.3.attn_k_norm.weight 0x495666e0 0x200
41 blk.3.attn_norm.weight 0x495668e0 0x2000
42 blk.3.attn_output.weight 0x495688e0 0x480000
43 blk.3.attn_q.weight 0x499e88e0 0x370000
44 blk.3.attn_q_norm.weight 0x49d588e0 0x200
45 blk.3.attn_v.weight 0x49d58ae0 0x90000
46 blk.3.ffn_down_exps.weight 0x49de8ae0 0x6c00000
47 blk.3.ffn_gate_exps.weight 0x509e8ae0 0x5280000
48 blk.3.ffn_gate_inp.weight 0x55c68ae0 0x100000
49 blk.3.ffn_norm.weight 0x55d68ae0 0x2000
50 blk.3.ffn_up_exps.weight 0x55d6aae0 0x5280000
51 blk.4.attn_k.weight 0x5afeaae0 0x6e000
52 blk.4.attn_k_norm.weight 0x5b058ae0 0x200
53 blk.4.attn_norm.weight 0x5b058ce0 0x2000
54 blk.4.attn_output.weight 0x5b05ace0 0x480000
55 blk.4.attn_q.weight 0x5b4dace0 0x370000
56 blk.4.attn_q_norm.weight 0x5b84ace0 0x200
57 blk.4.attn_v.weight 0x5b84aee0 0x90000
58 blk.4.ffn_down_exps.weight 0x5b8daee0 0x6c00000
59 blk.4.ffn_gate_exps.weight 0x624daee0 0x5280000
60 blk.4.ffn_gate_inp.weight 0x6775aee0 0x100000
61 blk.4.ffn_norm.weight 0x6785aee0 0x2000
62 blk.4.ffn_up_exps.weight 0x6785cee0 0x5280000
63 blk.5.attn_k.weight 0x6cadcee0 0x6e000
64 blk.5.attn_k_norm.weight 0x6cb4aee0 0x200
65 blk.5.attn_norm.weight 0x6cb4b0e0 0x2000
66 blk.5.attn_output.weight 0x6cb4d0e0 0x480000
67 blk.5.attn_q.weight 0x6cfcd0e0 0x370000
68 blk.5.attn_q_norm.weight 0x6d33d0e0 0x200
69 blk.5.attn_v.weight 0x6d33d2e0 0x90000
70 blk.5.ffn_down_exps.weight 0x6d3cd2e0 0x6c00000
71 blk.5.ffn_gate_exps.weight 0x73fcd2e0 0x5280000
72 blk.5.ffn_gate_inp.weight 0x7924d2e0 0x100000
73 blk.5.ffn_norm.weight 0x7934d2e0 0x2000
74 blk.5.ffn_up_exps.weight 0x7934f2e0 0x5280000
75 blk.6.attn_k.weight 0x7e5cf2e0 0x6e000
76 blk.6.attn_k_norm.weight 0x7e63d2e0 0x200
77 blk.6.attn_norm.weight 0x7e63d4e0 0x2000
78 blk.6.attn_output.weight 0x7e63f4e0 0x480000
79 blk.6.attn_q.weight 0x7eabf4e0 0x370000
80 blk.6.attn_q_norm.weight 0x7ee2f4e0 0x200
81 blk.6.attn_v.weight 0x7ee2f6e0 0x90000
82 blk.6.ffn_down_exps.weight 0x7eebf6e0 0x8400000
83 blk.6.ffn_gate_exps.weight 0x872bf6e0 0x5280000
84 blk.6.ffn_gate_inp.weight 0x8c53f6e0 0x100000
85 blk.6.ffn_norm.weight 0x8c63f6e0 0x2000
86 blk.6.ffn_up_exps.weight 0x8c6416e0 0x5280000
87 blk.7.attn_k.weight 0x918c16e0 0x6e000
88 blk.7.attn_k_norm.weight 0x9192f6e0 0x200
89 blk.7.attn_norm.weight 0x9192f8e0 0x2000
90 blk.7.attn_output.weight 0x919318e0 0x480000
91 blk.7.attn_q.weight 0x91db18e0 0x370000
92 blk.7.attn_q_norm.weight 0x921218e0 0x200
93 blk.7.attn_v.weight 0x92121ae0 0x90000
94 blk.7.ffn_down_exps.weight 0x921b1ae0 0x8400000
95 blk.7.ffn_gate_exps.weight 0x9a5b1ae0 0x5280000
96 blk.7.ffn_gate_inp.weight 0x9f831ae0 0x100000
97 blk.7.ffn_norm.weight 0x9f931ae0 0x2000
98 blk.7.ffn_up_exps.weight 0x9f933ae0 0x5280000
99 blk.8.attn_k.weight 0xa4bb3ae0 0x6e000
100 blk.8.attn_k_norm.weight 0xa4c21ae0 0x200
101 blk.8.attn_norm.weight 0xa4c21ce0 0x2000
102 blk.8.attn_output.weight 0xa4c23ce0 0x480000
103 blk.8.attn_q.weight 0xa50a3ce0 0x370000
104 blk.8.attn_q_norm.weight 0xa5413ce0 0x200
105 blk.8.attn_v.weight 0xa5413ee0 0x90000
106 blk.8.ffn_down_exps.weight 0xa54a3ee0 0x8400000
107 blk.8.ffn_gate_exps.weight 0xad8a3ee0 0x5280000
108 blk.8.ffn_gate_inp.weight 0xb2b23ee0 0x100000
109 blk.8.ffn_norm.weight 0xb2c23ee0 0x2000
110 blk.8.ffn_up_exps.weight 0xb2c25ee0 0x5280000
111 blk.9.attn_k.weight 0xb7ea5ee0 0x6e000
112 blk.9.attn_k_norm.weight 0xb7f13ee0 0x200
113 blk.9.attn_norm.weight 0xb7f140e0 0x2000
114 blk.9.attn_output.weight 0xb7f160e0 0x480000
115 blk.9.attn_q.weight 0xb83960e0 0x370000
116 blk.9.attn_q_norm.weight 0xb87060e0 0x200
117 blk.9.attn_v.weight 0xb87062e0 0x90000
118 blk.9.ffn_down_exps.weight 0xb87962e0 0x8400000
119 blk.9.ffn_gate_exps.weight 0xc0b962e0 0x5280000
120 blk.9.ffn_gate_inp.weight 0xc5e162e0 0x100000
121 blk.9.ffn_norm.weight 0xc5f162e0 0x2000
122 blk.9.ffn_up_exps.weight 0xc5f182e0 0x5280000
123 blk.10.attn_k.weight 0xcb1982e0 0x6e000
124 blk.10.attn_k_norm.weight 0xcb2062e0 0x200
125 blk.10.attn_norm.weight 0xcb2064e0 0x2000
126 blk.10.attn_output.weight 0xcb2084e0 0x480000
127 blk.10.attn_q.weight 0xcb6884e0 0x370000
128 blk.10.attn_q_norm.weight 0xcb9f84e0 0x200
129 blk.10.attn_v.weight 0xcb9f86e0 0x90000
130 blk.10.ffn_down_exps.weight 0xcba886e0 0x8400000
131 blk.10.ffn_gate_exps.weight 0xd3e886e0 0x5280000
132 blk.10.ffn_gate_inp.weight 0xd91086e0 0x100000
133 blk.10.ffn_norm.weight 0xd92086e0 0x2000
134 blk.10.ffn_up_exps.weight 0xd920a6e0 0x5280000
135 blk.11.attn_k.weight 0xde48a6e0 0x6e000
136 blk.11.attn_k_norm.weight 0xde4f86e0 0x200
137 blk.11.attn_norm.weight 0xde4f88e0 0x2000
138 blk.11.attn_output.weight 0xde4fa8e0 0x480000
139 blk.11.attn_q.weight 0xde97a8e0 0x370000
140 blk.11.attn_q_norm.weight 0xdecea8e0 0x200
141 blk.11.attn_v.weight 0xdeceaae0 0x90000
142 blk.11.ffn_down_exps.weight 0xded7aae0 0x8400000
143 blk.11.ffn_gate_exps.weight 0xe717aae0 0x5280000
144 blk.11.ffn_gate_inp.weight 0xec3faae0 0x100000
145 blk.11.ffn_norm.weight 0xec4faae0 0x2000
146 blk.11.ffn_up_exps.weight 0xec4fcae0 0x5280000
147 blk.12.attn_k.weight 0xf177cae0 0x6e000
148 blk.12.attn_k_norm.weight 0xf17eaae0 0x200
149 blk.12.attn_norm.weight 0xf17eace0 0x2000
150 blk.12.attn_output.weight 0xf17ecce0 0x480000
151 blk.12.attn_q.weight 0xf1c6cce0 0x370000
152 blk.12.attn_q_norm.weight 0xf1fdcce0 0x200
153 blk.12.attn_v.weight 0xf1fdcee0 0x90000
154 blk.12.ffn_down_exps.weight 0xf206cee0 0x8400000
155 blk.12.ffn_gate_exps.weight 0xfa46cee0 0x5280000
156 blk.12.ffn_gate_inp.weight 0xff6ecee0 0x100000
157 blk.12.ffn_norm.weight 0xff7ecee0 0x2000
158 blk.12.ffn_up_exps.weight 0xff7eeee0 0x5280000
159 blk.13.attn_k.weight 0x104a6eee0 0x6e000
160 blk.13.attn_k_norm.weight 0x104adcee0 0x200
161 blk.13.attn_norm.weight 0x104add0e0 0x2000
162 blk.13.attn_output.weight 0x104adf0e0 0x480000
163 blk.13.attn_q.weight 0x104f5f0e0 0x370000
164 blk.13.attn_q_norm.weight 0x1052cf0e0 0x200
165 blk.13.attn_v.weight 0x1052cf2e0 0x90000
166 blk.13.ffn_down_exps.weight 0x10535f2e0 0x8400000
167 blk.13.ffn_gate_exps.weight 0x10d75f2e0 0x5280000
168 blk.13.ffn_gate_inp.weight 0x1129df2e0 0x100000
169 blk.13.ffn_norm.weight 0x112adf2e0 0x2000
170 blk.13.ffn_up_exps.weight 0x112ae12e0 0x5280000
171 blk.14.attn_k.weight 0x117d612e0 0x6e000
172 blk.14.attn_k_norm.weight 0x117dcf2e0 0x200
173 blk.14.attn_norm.weight 0x117dcf4e0 0x2000
174 blk.14.attn_output.weight 0x117dd14e0 0x480000
175 blk.14.attn_q.weight 0x1182514e0 0x370000
176 blk.14.attn_q_norm.weight 0x1185c14e0 0x200
177 blk.14.attn_v.weight 0x1185c16e0 0x90000
178 blk.14.ffn_down_exps.weight 0x1186516e0 0x8400000
179 blk.14.ffn_gate_exps.weight 0x120a516e0 0x5280000
180 blk.14.ffn_gate_inp.weight 0x125cd16e0 0x100000
181 blk.14.ffn_norm.weight 0x125dd16e0 0x2000
182 blk.14.ffn_up_exps.weight 0x125dd36e0 0x5280000
183 blk.15.attn_k.weight 0x12b0536e0 0x6e000
184 blk.15.attn_k_norm.weight 0x12b0c16e0 0x200
185 blk.15.attn_norm.weight 0x12b0c18e0 0x2000
186 blk.15.attn_output.weight 0x12b0c38e0 0x480000
187 blk.15.attn_q.weight 0x12b5438e0 0x370000
188 blk.15.attn_q_norm.weight 0x12b8b38e0 0x200
189 blk.15.attn_v.weight 0x12b8b3ae0 0x90000
190 blk.15.ffn_down_exps.weight 0x12b943ae0 0x8400000
191 blk.15.ffn_gate_exps.weight 0x133d43ae0 0x5280000
192 blk.15.ffn_gate_inp.weight 0x138fc3ae0 0x100000
193 blk.15.ffn_norm.weight 0x1390c3ae0 0x2000
194 blk.15.ffn_up_exps.weight 0x1390c5ae0 0x5280000
195 blk.16.attn_k.weight 0x13e345ae0 0x6e000
196 blk.16.attn_k_norm.weight 0x13e3b3ae0 0x200
197 blk.16.attn_norm.weight 0x13e3b3ce0 0x2000
198 blk.16.attn_output.weight 0x13e3b5ce0 0x480000
199 blk.16.attn_q.weight 0x13e835ce0 0x370000
200 blk.16.attn_q_norm.weight 0x13eba5ce0 0x200
201 blk.16.attn_v.weight 0x13eba5ee0 0x90000
202 blk.16.ffn_down_exps.weight 0x13ec35ee0 0x8400000
203 blk.16.ffn_gate_exps.weight 0x147035ee0 0x5280000
204 blk.16.ffn_gate_inp.weight 0x14c2b5ee0 0x100000
205 blk.16.ffn_norm.weight 0x14c3b5ee0 0x2000
206 blk.16.ffn_up_exps.weight 0x14c3b7ee0 0x5280000
207 blk.17.attn_k.weight 0x151637ee0 0x6e000
208 blk.17.attn_k_norm.weight 0x1516a5ee0 0x200
209 blk.17.attn_norm.weight 0x1516a60e0 0x2000
210 blk.17.attn_output.weight 0x1516a80e0 0x480000
211 blk.17.attn_q.weight 0x151b280e0 0x370000
212 blk.17.attn_q_norm.weight 0x151e980e0 0x200
213 blk.17.attn_v.weight 0x151e982e0 0x90000
214 blk.17.ffn_down_exps.weight 0x151f282e0 0x8400000
215 blk.17.ffn_gate_exps.weight 0x15a3282e0 0x5280000
216 blk.17.ffn_gate_inp.weight 0x15f5a82e0 0x100000
217 blk.17.ffn_norm.weight 0x15f6a82e0 0x2000
218 blk.17.ffn_up_exps.weight 0x15f6aa2e0 0x5280000
219 blk.18.attn_k.weight 0x16492a2e0 0x6e000
220 blk.18.attn_k_norm.weight 0x1649982e0 0x200
221 blk.18.attn_norm.weight 0x1649984e0 0x2000
222 blk.18.attn_output.weight 0x16499a4e0 0x480000
223 blk.18.attn_q.weight 0x164e1a4e0 0x370000
224 blk.18.attn_q_norm.weight 0x16518a4e0 0x200
225 blk.18.attn_v.weight 0x16518a6e0 0x90000
226 blk.18.ffn_down_exps.weight 0x16521a6e0 0x8400000
227 blk.18.ffn_gate_exps.weight 0x16d61a6e0 0x6c00000
228 blk.18.ffn_gate_inp.weight 0x17421a6e0 0x100000
229 blk.18.ffn_norm.weight 0x17431a6e0 0x2000
230 blk.18.ffn_up_exps.weight 0x17431c6e0 0x6c00000
231 blk.19.attn_k.weight 0x17af1c6e0 0x6e000
232 blk.19.attn_k_norm.weight 0x17af8a6e0 0x200
233 blk.19.attn_norm.weight 0x17af8a8e0 0x2000
234 blk.19.attn_output.weight 0x17af8c8e0 0x480000
235 blk.19.attn_q.weight 0x17b40c8e0 0x370000
236 blk.19.attn_q_norm.weight 0x17b77c8e0 0x200
237 blk.19.attn_v.weight 0x17b77cae0 0x90000
238 blk.19.ffn_down_exps.weight 0x17b80cae0 0x8400000
239 blk.19.ffn_gate_exps.weight 0x183c0cae0 0x5280000
240 blk.19.ffn_gate_inp.weight 0x188e8cae0 0x100000
241 blk.19.ffn_norm.weight 0x188f8cae0 0x2000
242 blk.19.ffn_up_exps.weight 0x188f8eae0 0x5280000
243 blk.20.attn_k.weight 0x18e20eae0 0x6e000
244 blk.20.attn_k_norm.weight 0x18e27cae0 0x200
245 blk.20.attn_norm.weight 0x18e27cce0 0x2000
246 blk.20.attn_output.weight 0x18e27ece0 0x480000
247 blk.20.attn_q.weight 0x18e6fece0 0x370000
248 blk.20.attn_q_norm.weight 0x18ea6ece0 0x200
249 blk.20.attn_v.weight 0x18ea6eee0 0x90000
250 blk.20.ffn_down_exps.weight 0x18eafeee0 0x8400000
251 blk.20.ffn_gate_exps.weight 0x196efeee0 0x5280000
252 blk.20.ffn_gate_inp.weight 0x19c17eee0 0x100000
253 blk.20.ffn_norm.weight 0x19c27eee0 0x2000
254 blk.20.ffn_up_exps.weight 0x19c280ee0 0x5280000
255 blk.21.attn_k.weight 0x1a1500ee0 0x6e000
256 blk.21.attn_k_norm.weight 0x1a156eee0 0x200
257 blk.21.attn_norm.weight 0x1a156f0e0 0x2000
258 blk.21.attn_output.weight 0x1a15710e0 0x480000
259 blk.21.attn_q.weight 0x1a19f10e0 0x370000
260 blk.21.attn_q_norm.weight 0x1a1d610e0 0x200
261 blk.21.attn_v.weight 0x1a1d612e0 0x90000
262 blk.21.ffn_down_exps.weight 0x1a1df12e0 0x8400000
263 blk.21.ffn_gate_exps.weight 0x1aa1f12e0 0x5280000
264 blk.21.ffn_gate_inp.weight 0x1af4712e0 0x100000
265 blk.21.ffn_norm.weight 0x1af5712e0 0x2000
266 blk.21.ffn_up_exps.weight 0x1af5732e0 0x5280000
267 blk.22.attn_k.weight 0x1b47f32e0 0x6e000
268 blk.22.attn_k_norm.weight 0x1b48612e0 0x200
269 blk.22.attn_norm.weight 0x1b48614e0 0x2000
270 blk.22.attn_output.weight 0x1b48634e0 0x480000
271 blk.22.attn_q.weight 0x1b4ce34e0 0x370000
272 blk.22.attn_q_norm.weight 0x1b50534e0 0x200
273 blk.22.attn_v.weight 0x1b50536e0 0x90000
274 blk.22.ffn_down_exps.weight 0x1b50e36e0 0x8400000
275 blk.22.ffn_gate_exps.weight 0x1bd4e36e0 0x5280000
276 blk.22.ffn_gate_inp.weight 0x1c27636e0 0x100000
277 blk.22.ffn_norm.weight 0x1c28636e0 0x2000
278 blk.22.ffn_up_exps.weight 0x1c28656e0 0x5280000
279 blk.23.attn_k.weight 0x1c7ae56e0 0x6e000
280 blk.23.attn_k_norm.weight 0x1c7b536e0 0x200
281 blk.23.attn_norm.weight 0x1c7b538e0 0x2000
282 blk.23.attn_output.weight 0x1c7b558e0 0x480000
283 blk.23.attn_q.weight 0x1c7fd58e0 0x370000
284 blk.23.attn_q_norm.weight 0x1c83458e0 0x200
285 blk.23.attn_v.weight 0x1c8345ae0 0x90000
286 blk.23.ffn_down_exps.weight 0x1c83d5ae0 0x8400000
287 blk.23.ffn_gate_exps.weight 0x1d07d5ae0 0x5280000
288 blk.23.ffn_gate_inp.weight 0x1d5a55ae0 0x100000
289 blk.23.ffn_norm.weight 0x1d5b55ae0 0x2000
290 blk.23.ffn_up_exps.weight 0x1d5b57ae0 0x5280000
291 blk.24.attn_k.weight 0x1dadd7ae0 0x90000
292 blk.24.attn_k_norm.weight 0x1dae67ae0 0x200
293 blk.24.attn_norm.weight 0x1dae67ce0 0x2000
294 blk.24.attn_output.weight 0x1dae69ce0 0x480000
295 blk.24.attn_q.weight 0x1db2e9ce0 0x480000
296 blk.24.attn_q_norm.weight 0x1db769ce0 0x200
297 blk.24.attn_v.weight 0x1db769ee0 0x90000
298 blk.24.ffn_down_exps.weight 0x1db7f9ee0 0x8400000
299 blk.24.ffn_gate_exps.weight 0x1e3bf9ee0 0x5280000
300 blk.24.ffn_gate_inp.weight 0x1e8e79ee0 0x100000
301 blk.24.ffn_norm.weight 0x1e8f79ee0 0x2000
302 blk.24.ffn_up_exps.weight 0x1e8f7bee0 0x5280000
303 blk.25.attn_k.weight 0x1ee1fbee0 0x90000
304 blk.25.attn_k_norm.weight 0x1ee28bee0 0x200
305 blk.25.attn_norm.weight 0x1ee28c0e0 0x2000
306 blk.25.attn_output.weight 0x1ee28e0e0 0x480000
307 blk.25.attn_q.weight 0x1ee70e0e0 0x480000
308 blk.25.attn_q_norm.weight 0x1eeb8e0e0 0x200
309 blk.25.attn_v.weight 0x1eeb8e2e0 0x90000
310 blk.25.ffn_down_exps.weight 0x1eec1e2e0 0x8400000
311 blk.25.ffn_gate_exps.weight 0x1f701e2e0 0x6c00000
312 blk.25.ffn_gate_inp.weight 0x1fdc1e2e0 0x100000
313 blk.25.ffn_norm.weight 0x1fdd1e2e0 0x2000
314 blk.25.ffn_up_exps.weight 0x1fdd202e0 0x6c00000
315 blk.26.attn_k.weight 0x2049202e0 0x90000
316 blk.26.attn_k_norm.weight 0x2049b02e0 0x200
317 blk.26.attn_norm.weight 0x2049b04e0 0x2000
318 blk.26.attn_output.weight 0x2049b24e0 0x480000
319 blk.26.attn_q.weight 0x204e324e0 0x480000
320 blk.26.attn_q_norm.weight 0x2052b24e0 0x200
321 blk.26.attn_v.weight 0x2052b26e0 0x90000
322 blk.26.ffn_down_exps.weight 0x2053426e0 0x8400000
323 blk.26.ffn_gate_exps.weight 0x20d7426e0 0x6c00000
324 blk.26.ffn_gate_inp.weight 0x2143426e0 0x100000
325 blk.26.ffn_norm.weight 0x2144426e0 0x2000
326 blk.26.ffn_up_exps.weight 0x2144446e0 0x6c00000
327 blk.27.attn_k.weight 0x21b0446e0 0x90000
328 blk.27.attn_k_norm.weight 0x21b0d46e0 0x200
329 blk.27.attn_norm.weight 0x21b0d48e0 0x2000
330 blk.27.attn_output.weight 0x21b0d68e0 0x480000
331 blk.27.attn_q.weight 0x21b5568e0 0x480000
332 blk.27.attn_q_norm.weight 0x21b9d68e0 0x200
333 blk.27.attn_v.weight 0x21b9d6ae0 0x90000
334 blk.27.ffn_down_exps.weight 0x21ba66ae0 0x8400000
335 blk.27.ffn_gate_exps.weight 0x223e66ae0 0x6c00000
336 blk.27.ffn_gate_inp.weight 0x22aa66ae0 0x100000
337 blk.27.ffn_norm.weight 0x22ab66ae0 0x2000
338 blk.27.ffn_up_exps.weight 0x22ab68ae0 0x6c00000
339 blk.28.attn_k.weight 0x231768ae0 0x90000
340 blk.28.attn_k_norm.weight 0x2317f8ae0 0x200
341 blk.28.attn_norm.weight 0x2317f8ce0 0x2000
342 blk.28.attn_output.weight 0x2317face0 0x480000
343 blk.28.attn_q.weight 0x231c7ace0 0x480000
344 blk.28.attn_q_norm.weight 0x2320face0 0x200
345 blk.28.attn_v.weight 0x2320faee0 0x90000
346 blk.28.ffn_down_exps.weight 0x23218aee0 0x8400000
347 blk.28.ffn_gate_exps.weight 0x23a58aee0 0x6c00000
348 blk.28.ffn_gate_inp.weight 0x24118aee0 0x100000
349 blk.28.ffn_norm.weight 0x24128aee0 0x2000
350 blk.28.ffn_up_exps.weight 0x24128cee0 0x6c00000
351 blk.29.attn_k.weight 0x247e8cee0 0x90000
352 blk.29.attn_k_norm.weight 0x247f1cee0 0x200
353 blk.29.attn_norm.weight 0x247f1d0e0 0x2000
354 blk.29.attn_output.weight 0x247f1f0e0 0x480000
355 blk.29.attn_q.weight 0x24839f0e0 0x480000
356 blk.29.attn_q_norm.weight 0x24881f0e0 0x200
357 blk.29.attn_v.weight 0x24881f2e0 0x90000
358 blk.29.ffn_down_exps.weight 0x2488af2e0 0x8400000
359 blk.29.ffn_gate_exps.weight 0x250caf2e0 0x6c00000
360 blk.29.ffn_gate_inp.weight 0x2578af2e0 0x100000
361 blk.29.ffn_norm.weight 0x2579af2e0 0x2000
362 blk.29.ffn_up_exps.weight 0x2579b12e0 0x6c00000
363 blk.30.attn_k.weight 0x25e5b12e0 0x90000
364 blk.30.attn_k_norm.weight 0x25e6412e0 0x200
365 blk.30.attn_norm.weight 0x25e6414e0 0x2000
366 blk.30.attn_output.weight 0x25e6434e0 0x480000
367 blk.30.attn_q.weight 0x25eac34e0 0x480000
368 blk.30.attn_q_norm.weight 0x25ef434e0 0x200
369 blk.30.attn_v.weight 0x25ef436e0 0x90000
370 blk.30.ffn_down_exps.weight 0x25efd36e0 0x8400000
371 blk.30.ffn_gate_exps.weight 0x2673d36e0 0x6c00000
372 blk.30.ffn_gate_inp.weight 0x26dfd36e0 0x100000
373 blk.30.ffn_norm.weight 0x26e0d36e0 0x2000
374 blk.30.ffn_up_exps.weight 0x26e0d56e0 0x6c00000
375 blk.31.attn_k.weight 0x274cd56e0 0x90000
376 blk.31.attn_k_norm.weight 0x274d656e0 0x200
377 blk.31.attn_norm.weight 0x274d658e0 0x2000
378 blk.31.attn_output.weight 0x274d678e0 0x480000
379 blk.31.attn_q.weight 0x2751e78e0 0x480000
380 blk.31.attn_q_norm.weight 0x2756678e0 0x200
381 blk.31.attn_v.weight 0x275667ae0 0x90000
382 blk.31.ffn_down_exps.weight 0x2756f7ae0 0x8400000
383 blk.31.ffn_gate_exps.weight 0x27daf7ae0 0x6c00000
384 blk.31.ffn_gate_inp.weight 0x2846f7ae0 0x100000
385 blk.31.ffn_norm.weight 0x2847f7ae0 0x2000
386 blk.31.ffn_up_exps.weight 0x2847f9ae0 0x6c00000
387 blk.32.attn_k.weight 0x28b3f9ae0 0x90000
388 blk.32.attn_k_norm.weight 0x28b489ae0 0x200
389 blk.32.attn_norm.weight 0x28b489ce0 0x2000
390 blk.32.attn_output.weight 0x28b48bce0 0x480000
391 blk.32.attn_q.weight 0x28b90bce0 0x480000
392 blk.32.attn_q_norm.weight 0x28bd8bce0 0x200
393 blk.32.attn_v.weight 0x28bd8bee0 0x90000
394 blk.32.ffn_down_exps.weight 0x28be1bee0 0x8400000
395 blk.32.ffn_gate_exps.weight 0x29421bee0 0x6c00000
396 blk.32.ffn_gate_inp.weight 0x29ae1bee0 0x100000
397 blk.32.ffn_norm.weight 0x29af1bee0 0x2000
398 blk.32.ffn_up_exps.weight 0x29af1dee0 0x6c00000
399 blk.33.attn_k.weight 0x2a1b1dee0 0x90000
400 blk.33.attn_k_norm.weight 0x2a1badee0 0x200
401 blk.33.attn_norm.weight 0x2a1bae0e0 0x2000
402 blk.33.attn_output.weight 0x2a1bb00e0 0x480000
403 blk.33.attn_q.weight 0x2a20300e0 0x480000
404 blk.33.attn_q_norm.weight 0x2a24b00e0 0x200
405 blk.33.attn_v.weight 0x2a24b02e0 0x90000
406 blk.33.ffn_down_exps.weight 0x2a25402e0 0x8400000
407 blk.33.ffn_gate_exps.weight 0x2aa9402e0 0x6c00000
408 blk.33.ffn_gate_inp.weight 0x2b15402e0 0x100000
409 blk.33.ffn_norm.weight 0x2b16402e0 0x2000
410 blk.33.ffn_up_exps.weight 0x2b16422e0 0x6c00000
411 blk.34.attn_k.weight 0x2b82422e0 0x90000
412 blk.34.attn_k_norm.weight 0x2b82d22e0 0x200
413 blk.34.attn_norm.weight 0x2b82d24e0 0x2000
414 blk.34.attn_output.weight 0x2b82d44e0 0x480000
415 blk.34.attn_q.weight 0x2b87544e0 0x480000
416 blk.34.attn_q_norm.weight 0x2b8bd44e0 0x200
417 blk.34.attn_v.weight 0x2b8bd46e0 0x90000
418 blk.34.ffn_down_exps.weight 0x2b8c646e0 0x8400000
419 blk.34.ffn_gate_exps.weight 0x2c10646e0 0x6c00000
420 blk.34.ffn_gate_inp.weight 0x2c7c646e0 0x100000
421 blk.34.ffn_norm.weight 0x2c7d646e0 0x2000
422 blk.34.ffn_up_exps.weight 0x2c7d666e0 0x6c00000
423 blk.35.attn_k.weight 0x2ce9666e0 0x90000
424 blk.35.attn_k_norm.weight 0x2ce9f66e0 0x200
425 blk.35.attn_norm.weight 0x2ce9f68e0 0x2000
426 blk.35.attn_output.weight 0x2ce9f88e0 0x480000
427 blk.35.attn_q.weight 0x2cee788e0 0x480000
428 blk.35.attn_q_norm.weight 0x2cf2f88e0 0x200
429 blk.35.attn_v.weight 0x2cf2f8ae0 0x90000
430 blk.35.ffn_down_exps.weight 0x2cf388ae0 0x8400000
431 blk.35.ffn_gate_exps.weight 0x2d7788ae0 0x6c00000
432 blk.35.ffn_gate_inp.weight 0x2de388ae0 0x100000
433 blk.35.ffn_norm.weight 0x2de488ae0 0x2000
434 blk.35.ffn_up_exps.weight 0x2de48aae0 0x6c00000
435 blk.36.attn_k.weight 0x2e508aae0 0x90000
436 blk.36.attn_k_norm.weight 0x2e511aae0 0x200
437 blk.36.attn_norm.weight 0x2e511ace0 0x2000
438 blk.36.attn_output.weight 0x2e511cce0 0x480000
439 blk.36.attn_q.weight 0x2e559cce0 0x480000
440 blk.36.attn_q_norm.weight 0x2e5a1cce0 0x200
441 blk.36.attn_v.weight 0x2e5a1cee0 0x90000
442 blk.36.ffn_down_exps.weight 0x2e5aacee0 0x8400000
443 blk.36.ffn_gate_exps.weight 0x2edeacee0 0x6c00000
444 blk.36.ffn_gate_inp.weight 0x2f4aacee0 0x100000
445 blk.36.ffn_norm.weight 0x2f4bacee0 0x2000
446 blk.36.ffn_up_exps.weight 0x2f4baeee0 0x6c00000
447 blk.37.attn_k.weight 0x2fb7aeee0 0x90000
448 blk.37.attn_k_norm.weight 0x2fb83eee0 0x200
449 blk.37.attn_norm.weight 0x2fb83f0e0 0x2000
450 blk.37.attn_output.weight 0x2fb8410e0 0x480000
451 blk.37.attn_q.weight 0x2fbcc10e0 0x480000
452 blk.37.attn_q_norm.weight 0x2fc1410e0 0x200
453 blk.37.attn_v.weight 0x2fc1412e0 0x90000
454 blk.37.ffn_down_exps.weight 0x2fc1d12e0 0x8400000
455 blk.37.ffn_gate_exps.weight 0x3045d12e0 0x6c00000
456 blk.37.ffn_gate_inp.weight 0x30b1d12e0 0x100000
457 blk.37.ffn_norm.weight 0x30b2d12e0 0x2000
458 blk.37.ffn_up_exps.weight 0x30b2d32e0 0x6c00000
459 blk.38.attn_k.weight 0x311ed32e0 0x90000
460 blk.38.attn_k_norm.weight 0x311f632e0 0x200
461 blk.38.attn_norm.weight 0x311f634e0 0x2000
462 blk.38.attn_output.weight 0x311f654e0 0x480000
463 blk.38.attn_q.weight 0x3123e54e0 0x480000
464 blk.38.attn_q_norm.weight 0x3128654e0 0x200
465 blk.38.attn_v.weight 0x3128656e0 0x90000
466 blk.38.ffn_down_exps.weight 0x3128f56e0 0x8400000
467 blk.38.ffn_gate_exps.weight 0x31acf56e0 0x6c00000
468 blk.38.ffn_gate_inp.weight 0x3218f56e0 0x100000
469 blk.38.ffn_norm.weight 0x3219f56e0 0x2000
470 blk.38.ffn_up_exps.weight 0x3219f76e0 0x6c00000
471 blk.39.attn_k.weight 0x3285f76e0 0x90000
472 blk.39.attn_k_norm.weight 0x3286876e0 0x200
473 blk.39.attn_norm.weight 0x3286878e0 0x2000
474 blk.39.attn_output.weight 0x3286898e0 0x480000
475 blk.39.attn_q.weight 0x328b098e0 0x480000
476 blk.39.attn_q_norm.weight 0x328f898e0 0x200
477 blk.39.attn_v.weight 0x328f89ae0 0x90000
478 blk.39.ffn_down_exps.weight 0x329019ae0 0x8400000
479 blk.39.ffn_gate_exps.weight 0x331419ae0 0x6c00000
480 blk.39.ffn_gate_inp.weight 0x338019ae0 0x100000
481 blk.39.ffn_norm.weight 0x338119ae0 0x2000
482 blk.39.ffn_up_exps.weight 0x33811bae0 0x6c00000
483 blk.40.attn_k.weight 0x33ed1bae0 0x90000
484 blk.40.attn_k_norm.weight 0x33edabae0 0x200
485 blk.40.attn_norm.weight 0x33edabce0 0x2000
486 blk.40.attn_output.weight 0x33edadce0 0x480000
487 blk.40.attn_q.weight 0x33f22dce0 0x480000
488 blk.40.attn_q_norm.weight 0x33f6adce0 0x200
489 blk.40.attn_v.weight 0x33f6adee0 0x90000
490 blk.40.ffn_down_exps.weight 0x33f73dee0 0x8400000
491 blk.40.ffn_gate_exps.weight 0x347b3dee0 0x6c00000
492 blk.40.ffn_gate_inp.weight 0x34e73dee0 0x100000
493 blk.40.ffn_norm.weight 0x34e83dee0 0x2000
494 blk.40.ffn_up_exps.weight 0x34e83fee0 0x6c00000
495 blk.41.attn_k.weight 0x35543fee0 0x90000
496 blk.41.attn_k_norm.weight 0x3554cfee0 0x200
497 blk.41.attn_norm.weight 0x3554d00e0 0x2000
498 blk.41.attn_output.weight 0x3554d20e0 0x480000
499 blk.41.attn_q.weight 0x3559520e0 0x480000
500 blk.41.attn_q_norm.weight 0x355dd20e0 0x200
501 blk.41.attn_v.weight 0x355dd22e0 0x90000
502 blk.41.ffn_down_exps.weight 0x355e622e0 0x8400000
503 blk.41.ffn_gate_exps.weight 0x35e2622e0 0x6c00000
504 blk.41.ffn_gate_inp.weight 0x364e622e0 0x100000
505 blk.41.ffn_norm.weight 0x364f622e0 0x2000
506 blk.41.ffn_up_exps.weight 0x364f642e0 0x6c00000
507 blk.42.attn_k.weight 0x36bb642e0 0x90000
508 blk.42.attn_k_norm.weight 0x36bbf42e0 0x200
509 blk.42.attn_norm.weight 0x36bbf44e0 0x2000
510 blk.42.attn_output.weight 0x36bbf64e0 0x480000
511 blk.42.attn_q.weight 0x36c0764e0 0x480000
512 blk.42.attn_q_norm.weight 0x36c4f64e0 0x200
513 blk.42.attn_v.weight 0x36c4f66e0 0x90000
514 blk.42.ffn_down_exps.weight 0x36c5866e0 0x8400000
515 blk.42.ffn_gate_exps.weight 0x3749866e0 0x6c00000
516 blk.42.ffn_gate_inp.weight 0x37b5866e0 0x100000
517 blk.42.ffn_norm.weight 0x37b6866e0 0x2000
518 blk.42.ffn_up_exps.weight 0x37b6886e0 0x6c00000
519 blk.43.attn_k.weight 0x3822886e0 0x90000
520 blk.43.attn_k_norm.weight 0x3823186e0 0x200
521 blk.43.attn_norm.weight 0x3823188e0 0x2000
522 blk.43.attn_output.weight 0x38231a8e0 0x480000
523 blk.43.attn_q.weight 0x38279a8e0 0x480000
524 blk.43.attn_q_norm.weight 0x382c1a8e0 0x200
525 blk.43.attn_v.weight 0x382c1aae0 0x90000
526 blk.43.ffn_down_exps.weight 0x382caaae0 0x8400000
527 blk.43.ffn_gate_exps.weight 0x38b0aaae0 0x6c00000
528 blk.43.ffn_gate_inp.weight 0x391caaae0 0x100000
529 blk.43.ffn_norm.weight 0x391daaae0 0x2000
530 blk.43.ffn_up_exps.weight 0x391dacae0 0x6c00000
531 blk.44.attn_k.weight 0x3989acae0 0x90000
532 blk.44.attn_k_norm.weight 0x398a3cae0 0x200
533 blk.44.attn_norm.weight 0x398a3cce0 0x2000
534 blk.44.attn_output.weight 0x398a3ece0 0x480000
535 blk.44.attn_q.weight 0x398ebece0 0x480000
536 blk.44.attn_q_norm.weight 0x39933ece0 0x200
537 blk.44.attn_v.weight 0x39933eee0 0x90000
538 blk.44.ffn_down_exps.weight 0x3993ceee0 0x8400000
539 blk.44.ffn_gate_exps.weight 0x3a17ceee0 0x6c00000
540 blk.44.ffn_gate_inp.weight 0x3a83ceee0 0x100000
541 blk.44.ffn_norm.weight 0x3a84ceee0 0x2000
542 blk.44.ffn_up_exps.weight 0x3a84d0ee0 0x6c00000
543 blk.45.attn_k.weight 0x3af0d0ee0 0x90000
544 blk.45.attn_k_norm.weight 0x3af160ee0 0x200
545 blk.45.attn_norm.weight 0x3af1610e0 0x2000
546 blk.45.attn_output.weight 0x3af1630e0 0x480000
547 blk.45.attn_q.weight 0x3af5e30e0 0x480000
548 blk.45.attn_q_norm.weight 0x3afa630e0 0x200
549 blk.45.attn_v.weight 0x3afa632e0 0x90000
550 blk.45.ffn_down_exps.weight 0x3afaf32e0 0x8400000
551 blk.45.ffn_gate_exps.weight 0x3b7ef32e0 0x6c00000
552 blk.45.ffn_gate_inp.weight 0x3beaf32e0 0x100000
553 blk.45.ffn_norm.weight 0x3bebf32e0 0x2000
554 blk.45.ffn_up_exps.weight 0x3bebf52e0 0x6c00000
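
Tensor data in a GGUF file is packed back-to-back (subject to the file's alignment), so each offset should equal the previous tensor's offset plus its size. A quick sketch checking this against the first rows of the table above:

```python
# (offset, size) pairs copied from the first four rows of the table above.
rows = [
    (0x5b12e0,   0xa6ec000),   # output.weight
    (0xac9d2e0,  0x2000),      # output_norm.weight
    (0xac9f2e0,  0x7f82800),   # token_embd.weight
    (0x12c21ae0, 0x6e000),     # blk.0.attn_k.weight
]

# Each tensor should start where the previous one ends; these sizes are
# already multiples of the GGUF default alignment of 32 bytes, so no
# padding shows up between them.
for (off, size), (next_off, _) in zip(rows, rows[1:]):
    assert off + size == next_off, hex(off + size)
print("offsets are contiguous")
```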

Base Tensor Group : ~622M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
0 output.weight Output (W) (~311M) 311164928 2048 x 151936 x 1 x 1 Q4_K
1 output_norm.weight Output Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
2 token_embd.weight Token Embedding (W) (~311M) 311164928 2048 x 151936 x 1 x 1 Q3_K
  • Total elements in base: (~622M) 622331904
  • Percentage of total elements: 2.13%

Block 0 Tensor Group : ~623M Elements

T_ID Tensor Layer Name Human Friendly Tensor Layer Name Elements Shape Type
3 blk.0.attn_k.weight Block 0 Attention Key (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q3_K
4 blk.0.attn_k_norm.weight Block 0 Attn_K_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
5 blk.0.attn_norm.weight Block 0 Attention Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
6 blk.0.attn_output.weight Block 0 Attention Output (W) ( ~8M) 8388608 4096 x 2048 x 1 x 1 Q4_K
7 blk.0.attn_q.weight Block 0 Attention Query (W) ( ~8M) 8388608 2048 x 4096 x 1 x 1 Q3_K
8 blk.0.attn_q_norm.weight Block 0 Attn_Q_Norm (W) ( 128) 128 128 x 1 x 1 x 1 F32
9 blk.0.attn_v.weight Block 0 Attention Value (W) ( ~1M) 1048576 2048 x 512 x 1 x 1 Q4_K
10 blk.0.ffn_down_exps.weight Block 0 Ffn_Down_Exps (W) (~201M) 201326592 768 x 2048 x 128 x 1 Q4_K
11 blk.0.ffn_gate_exps.weight Block 0 Ffn_Gate_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
12 blk.0.ffn_gate_inp.weight Block 0 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) (~262K) 262144 2048 x 128 x 1 x 1 F32
13 blk.0.ffn_norm.weight Block 0 Feed-Forward Network Normalization (W) ( ~2K) 2048 2048 x 1 x 1 x 1 F32
14 blk.0.ffn_up_exps.weight Block 0 Ffn_Up_Exps (W) (~201M) 201326592 2048 x 768 x 128 x 1 Q3_K
  • Total elements in blk.0: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 1 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 15 | blk.1.attn_k.weight | Block 1 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 16 | blk.1.attn_k_norm.weight | Block 1 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 17 | blk.1.attn_norm.weight | Block 1 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 18 | blk.1.attn_output.weight | Block 1 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 19 | blk.1.attn_q.weight | Block 1 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 20 | blk.1.attn_q_norm.weight | Block 1 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 21 | blk.1.attn_v.weight | Block 1 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 22 | blk.1.ffn_down_exps.weight | Block 1 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q4_K |
| 23 | blk.1.ffn_gate_exps.weight | Block 1 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 24 | blk.1.ffn_gate_inp.weight | Block 1 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 25 | blk.1.ffn_norm.weight | Block 1 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 26 | blk.1.ffn_up_exps.weight | Block 1 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.1: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 2 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 27 | blk.2.attn_k.weight | Block 2 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 28 | blk.2.attn_k_norm.weight | Block 2 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 29 | blk.2.attn_norm.weight | Block 2 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 30 | blk.2.attn_output.weight | Block 2 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 31 | blk.2.attn_q.weight | Block 2 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 32 | blk.2.attn_q_norm.weight | Block 2 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 33 | blk.2.attn_v.weight | Block 2 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 34 | blk.2.ffn_down_exps.weight | Block 2 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 35 | blk.2.ffn_gate_exps.weight | Block 2 Ffn_Gate_exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 36 | blk.2.ffn_gate_inp.weight | Block 2 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 37 | blk.2.ffn_norm.weight | Block 2 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 38 | blk.2.ffn_up_exps.weight | Block 2 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.2: (~623M) 623120640
  • Percentage of total elements: 2.13%
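Note that blk.2 stores `ffn_down_exps` in Q5_K where blk.0 and blk.1 use Q4_K. A rough way to see what that mix costs (a hedged sketch using the k-quant super-block sizes from llama.cpp, 256 weights per super-block, not values taken from this dump):

```python
# Bytes per 256-weight super-block for the k-quant types in this file
# (sizes of llama.cpp's block_q3_K / block_q4_K / block_q5_K structs).
BYTES_PER_SUPERBLOCK = {"Q3_K": 110, "Q4_K": 144, "Q5_K": 176}

def bits_per_weight(qtype: str) -> float:
    """Effective storage cost of one weight under the given quant type."""
    return BYTES_PER_SUPERBLOCK[qtype] * 8 / 256

for q in ("Q3_K", "Q4_K", "Q5_K"):
    print(q, bits_per_weight(q))
# Q3_K 3.4375, Q4_K 4.5, Q5_K 5.5
```

So promoting one ~201M-element `ffn_down_exps` tensor from Q4_K to Q5_K adds roughly 1 bit per weight, about 25 MB per block, which is the trade-off the Q4_K_S recipe makes on the blocks it considers more sensitive.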

Block 3 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 39 | blk.3.attn_k.weight | Block 3 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 40 | blk.3.attn_k_norm.weight | Block 3 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 41 | blk.3.attn_norm.weight | Block 3 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 42 | blk.3.attn_output.weight | Block 3 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 43 | blk.3.attn_q.weight | Block 3 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 44 | blk.3.attn_q_norm.weight | Block 3 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 45 | blk.3.attn_v.weight | Block 3 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 46 | blk.3.ffn_down_exps.weight | Block 3 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q4_K |
| 47 | blk.3.ffn_gate_exps.weight | Block 3 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 48 | blk.3.ffn_gate_inp.weight | Block 3 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 49 | blk.3.ffn_norm.weight | Block 3 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 50 | blk.3.ffn_up_exps.weight | Block 3 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.3: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 4 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 51 | blk.4.attn_k.weight | Block 4 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 52 | blk.4.attn_k_norm.weight | Block 4 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 53 | blk.4.attn_norm.weight | Block 4 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 54 | blk.4.attn_output.weight | Block 4 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 55 | blk.4.attn_q.weight | Block 4 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 56 | blk.4.attn_q_norm.weight | Block 4 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 57 | blk.4.attn_v.weight | Block 4 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 58 | blk.4.ffn_down_exps.weight | Block 4 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q4_K |
| 59 | blk.4.ffn_gate_exps.weight | Block 4 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 60 | blk.4.ffn_gate_inp.weight | Block 4 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 61 | blk.4.ffn_norm.weight | Block 4 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 62 | blk.4.ffn_up_exps.weight | Block 4 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.4: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 5 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 63 | blk.5.attn_k.weight | Block 5 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 64 | blk.5.attn_k_norm.weight | Block 5 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 65 | blk.5.attn_norm.weight | Block 5 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 66 | blk.5.attn_output.weight | Block 5 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 67 | blk.5.attn_q.weight | Block 5 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 68 | blk.5.attn_q_norm.weight | Block 5 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 69 | blk.5.attn_v.weight | Block 5 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 70 | blk.5.ffn_down_exps.weight | Block 5 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q4_K |
| 71 | blk.5.ffn_gate_exps.weight | Block 5 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 72 | blk.5.ffn_gate_inp.weight | Block 5 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 73 | blk.5.ffn_norm.weight | Block 5 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 74 | blk.5.ffn_up_exps.weight | Block 5 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.5: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 6 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 75 | blk.6.attn_k.weight | Block 6 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 76 | blk.6.attn_k_norm.weight | Block 6 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 77 | blk.6.attn_norm.weight | Block 6 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 78 | blk.6.attn_output.weight | Block 6 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 79 | blk.6.attn_q.weight | Block 6 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 80 | blk.6.attn_q_norm.weight | Block 6 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 81 | blk.6.attn_v.weight | Block 6 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 82 | blk.6.ffn_down_exps.weight | Block 6 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 83 | blk.6.ffn_gate_exps.weight | Block 6 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 84 | blk.6.ffn_gate_inp.weight | Block 6 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 85 | blk.6.ffn_norm.weight | Block 6 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 86 | blk.6.ffn_up_exps.weight | Block 6 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.6: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 7 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 87 | blk.7.attn_k.weight | Block 7 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 88 | blk.7.attn_k_norm.weight | Block 7 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 89 | blk.7.attn_norm.weight | Block 7 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 90 | blk.7.attn_output.weight | Block 7 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 91 | blk.7.attn_q.weight | Block 7 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 92 | blk.7.attn_q_norm.weight | Block 7 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 93 | blk.7.attn_v.weight | Block 7 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 94 | blk.7.ffn_down_exps.weight | Block 7 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 95 | blk.7.ffn_gate_exps.weight | Block 7 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 96 | blk.7.ffn_gate_inp.weight | Block 7 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 97 | blk.7.ffn_norm.weight | Block 7 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 98 | blk.7.ffn_up_exps.weight | Block 7 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.7: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 8 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 99 | blk.8.attn_k.weight | Block 8 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 100 | blk.8.attn_k_norm.weight | Block 8 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 101 | blk.8.attn_norm.weight | Block 8 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 102 | blk.8.attn_output.weight | Block 8 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 103 | blk.8.attn_q.weight | Block 8 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 104 | blk.8.attn_q_norm.weight | Block 8 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 105 | blk.8.attn_v.weight | Block 8 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 106 | blk.8.ffn_down_exps.weight | Block 8 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 107 | blk.8.ffn_gate_exps.weight | Block 8 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 108 | blk.8.ffn_gate_inp.weight | Block 8 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 109 | blk.8.ffn_norm.weight | Block 8 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 110 | blk.8.ffn_up_exps.weight | Block 8 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.8: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 9 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 111 | blk.9.attn_k.weight | Block 9 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 112 | blk.9.attn_k_norm.weight | Block 9 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 113 | blk.9.attn_norm.weight | Block 9 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 114 | blk.9.attn_output.weight | Block 9 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 115 | blk.9.attn_q.weight | Block 9 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 116 | blk.9.attn_q_norm.weight | Block 9 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 117 | blk.9.attn_v.weight | Block 9 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 118 | blk.9.ffn_down_exps.weight | Block 9 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 119 | blk.9.ffn_gate_exps.weight | Block 9 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 120 | blk.9.ffn_gate_inp.weight | Block 9 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 121 | blk.9.ffn_norm.weight | Block 9 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 122 | blk.9.ffn_up_exps.weight | Block 9 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.9: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 10 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 123 | blk.10.attn_k.weight | Block 10 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 124 | blk.10.attn_k_norm.weight | Block 10 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 125 | blk.10.attn_norm.weight | Block 10 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 126 | blk.10.attn_output.weight | Block 10 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 127 | blk.10.attn_q.weight | Block 10 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 128 | blk.10.attn_q_norm.weight | Block 10 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 129 | blk.10.attn_v.weight | Block 10 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 130 | blk.10.ffn_down_exps.weight | Block 10 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 131 | blk.10.ffn_gate_exps.weight | Block 10 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 132 | blk.10.ffn_gate_inp.weight | Block 10 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 133 | blk.10.ffn_norm.weight | Block 10 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 134 | blk.10.ffn_up_exps.weight | Block 10 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.10: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 11 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 135 | blk.11.attn_k.weight | Block 11 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 136 | blk.11.attn_k_norm.weight | Block 11 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 137 | blk.11.attn_norm.weight | Block 11 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 138 | blk.11.attn_output.weight | Block 11 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 139 | blk.11.attn_q.weight | Block 11 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 140 | blk.11.attn_q_norm.weight | Block 11 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 141 | blk.11.attn_v.weight | Block 11 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 142 | blk.11.ffn_down_exps.weight | Block 11 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 143 | blk.11.ffn_gate_exps.weight | Block 11 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 144 | blk.11.ffn_gate_inp.weight | Block 11 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 145 | blk.11.ffn_norm.weight | Block 11 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 146 | blk.11.ffn_up_exps.weight | Block 11 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.11: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 12 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 147 | blk.12.attn_k.weight | Block 12 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 148 | blk.12.attn_k_norm.weight | Block 12 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 149 | blk.12.attn_norm.weight | Block 12 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 150 | blk.12.attn_output.weight | Block 12 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 151 | blk.12.attn_q.weight | Block 12 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 152 | blk.12.attn_q_norm.weight | Block 12 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 153 | blk.12.attn_v.weight | Block 12 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 154 | blk.12.ffn_down_exps.weight | Block 12 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 155 | blk.12.ffn_gate_exps.weight | Block 12 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 156 | blk.12.ffn_gate_inp.weight | Block 12 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 157 | blk.12.ffn_norm.weight | Block 12 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 158 | blk.12.ffn_up_exps.weight | Block 12 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.12: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 13 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 159 | blk.13.attn_k.weight | Block 13 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 160 | blk.13.attn_k_norm.weight | Block 13 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 161 | blk.13.attn_norm.weight | Block 13 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 162 | blk.13.attn_output.weight | Block 13 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 163 | blk.13.attn_q.weight | Block 13 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 164 | blk.13.attn_q_norm.weight | Block 13 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 165 | blk.13.attn_v.weight | Block 13 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 166 | blk.13.ffn_down_exps.weight | Block 13 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 167 | blk.13.ffn_gate_exps.weight | Block 13 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 168 | blk.13.ffn_gate_inp.weight | Block 13 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 169 | blk.13.ffn_norm.weight | Block 13 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 170 | blk.13.ffn_up_exps.weight | Block 13 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.13: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 14 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 171 | blk.14.attn_k.weight | Block 14 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 172 | blk.14.attn_k_norm.weight | Block 14 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 173 | blk.14.attn_norm.weight | Block 14 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 174 | blk.14.attn_output.weight | Block 14 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 175 | blk.14.attn_q.weight | Block 14 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 176 | blk.14.attn_q_norm.weight | Block 14 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 177 | blk.14.attn_v.weight | Block 14 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 178 | blk.14.ffn_down_exps.weight | Block 14 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 179 | blk.14.ffn_gate_exps.weight | Block 14 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 180 | blk.14.ffn_gate_inp.weight | Block 14 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 181 | blk.14.ffn_norm.weight | Block 14 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 182 | blk.14.ffn_up_exps.weight | Block 14 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.14: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 15 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 183 | blk.15.attn_k.weight | Block 15 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 184 | blk.15.attn_k_norm.weight | Block 15 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 185 | blk.15.attn_norm.weight | Block 15 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 186 | blk.15.attn_output.weight | Block 15 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 187 | blk.15.attn_q.weight | Block 15 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 188 | blk.15.attn_q_norm.weight | Block 15 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 189 | blk.15.attn_v.weight | Block 15 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 190 | blk.15.ffn_down_exps.weight | Block 15 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 191 | blk.15.ffn_gate_exps.weight | Block 15 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 192 | blk.15.ffn_gate_inp.weight | Block 15 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 193 | blk.15.ffn_norm.weight | Block 15 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 194 | blk.15.ffn_up_exps.weight | Block 15 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.15: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 16 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 195 | blk.16.attn_k.weight | Block 16 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 196 | blk.16.attn_k_norm.weight | Block 16 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 197 | blk.16.attn_norm.weight | Block 16 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 198 | blk.16.attn_output.weight | Block 16 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 199 | blk.16.attn_q.weight | Block 16 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 200 | blk.16.attn_q_norm.weight | Block 16 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 201 | blk.16.attn_v.weight | Block 16 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 202 | blk.16.ffn_down_exps.weight | Block 16 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 203 | blk.16.ffn_gate_exps.weight | Block 16 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 204 | blk.16.ffn_gate_inp.weight | Block 16 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 205 | blk.16.ffn_norm.weight | Block 16 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 206 | blk.16.ffn_up_exps.weight | Block 16 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.16: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 17 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 207 | blk.17.attn_k.weight | Block 17 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 208 | blk.17.attn_k_norm.weight | Block 17 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 209 | blk.17.attn_norm.weight | Block 17 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 210 | blk.17.attn_output.weight | Block 17 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 211 | blk.17.attn_q.weight | Block 17 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 212 | blk.17.attn_q_norm.weight | Block 17 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 213 | blk.17.attn_v.weight | Block 17 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 214 | blk.17.ffn_down_exps.weight | Block 17 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 215 | blk.17.ffn_gate_exps.weight | Block 17 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 216 | blk.17.ffn_gate_inp.weight | Block 17 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 217 | blk.17.ffn_norm.weight | Block 17 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 218 | blk.17.ffn_up_exps.weight | Block 17 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.17: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 18 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 219 | blk.18.attn_k.weight | Block 18 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 220 | blk.18.attn_k_norm.weight | Block 18 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 221 | blk.18.attn_norm.weight | Block 18 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 222 | blk.18.attn_output.weight | Block 18 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 223 | blk.18.attn_q.weight | Block 18 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 224 | blk.18.attn_q_norm.weight | Block 18 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 225 | blk.18.attn_v.weight | Block 18 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 226 | blk.18.ffn_down_exps.weight | Block 18 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 227 | blk.18.ffn_gate_exps.weight | Block 18 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 228 | blk.18.ffn_gate_inp.weight | Block 18 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 229 | blk.18.ffn_norm.weight | Block 18 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 230 | blk.18.ffn_up_exps.weight | Block 18 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
  • Total elements in blk.18: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 19 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 231 | blk.19.attn_k.weight | Block 19 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 232 | blk.19.attn_k_norm.weight | Block 19 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 233 | blk.19.attn_norm.weight | Block 19 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 234 | blk.19.attn_output.weight | Block 19 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 235 | blk.19.attn_q.weight | Block 19 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 236 | blk.19.attn_q_norm.weight | Block 19 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 237 | blk.19.attn_v.weight | Block 19 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 238 | blk.19.ffn_down_exps.weight | Block 19 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 239 | blk.19.ffn_gate_exps.weight | Block 19 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 240 | blk.19.ffn_gate_inp.weight | Block 19 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 241 | blk.19.ffn_norm.weight | Block 19 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 242 | blk.19.ffn_up_exps.weight | Block 19 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.19: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 20 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 243 | blk.20.attn_k.weight | Block 20 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 244 | blk.20.attn_k_norm.weight | Block 20 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 245 | blk.20.attn_norm.weight | Block 20 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 246 | blk.20.attn_output.weight | Block 20 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 247 | blk.20.attn_q.weight | Block 20 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 248 | blk.20.attn_q_norm.weight | Block 20 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 249 | blk.20.attn_v.weight | Block 20 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 250 | blk.20.ffn_down_exps.weight | Block 20 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 251 | blk.20.ffn_gate_exps.weight | Block 20 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 252 | blk.20.ffn_gate_inp.weight | Block 20 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 253 | blk.20.ffn_norm.weight | Block 20 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 254 | blk.20.ffn_up_exps.weight | Block 20 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.20: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 21 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 255 | blk.21.attn_k.weight | Block 21 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 256 | blk.21.attn_k_norm.weight | Block 21 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 257 | blk.21.attn_norm.weight | Block 21 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 258 | blk.21.attn_output.weight | Block 21 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 259 | blk.21.attn_q.weight | Block 21 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 260 | blk.21.attn_q_norm.weight | Block 21 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 261 | blk.21.attn_v.weight | Block 21 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 262 | blk.21.ffn_down_exps.weight | Block 21 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 263 | blk.21.ffn_gate_exps.weight | Block 21 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 264 | blk.21.ffn_gate_inp.weight | Block 21 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 265 | blk.21.ffn_norm.weight | Block 21 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 266 | blk.21.ffn_up_exps.weight | Block 21 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.21: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 22 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 267 | blk.22.attn_k.weight | Block 22 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 268 | blk.22.attn_k_norm.weight | Block 22 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 269 | blk.22.attn_norm.weight | Block 22 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 270 | blk.22.attn_output.weight | Block 22 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 271 | blk.22.attn_q.weight | Block 22 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 272 | blk.22.attn_q_norm.weight | Block 22 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 273 | blk.22.attn_v.weight | Block 22 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 274 | blk.22.ffn_down_exps.weight | Block 22 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 275 | blk.22.ffn_gate_exps.weight | Block 22 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 276 | blk.22.ffn_gate_inp.weight | Block 22 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 277 | blk.22.ffn_norm.weight | Block 22 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 278 | blk.22.ffn_up_exps.weight | Block 22 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.22: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 23 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 279 | blk.23.attn_k.weight | Block 23 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q3_K |
| 280 | blk.23.attn_k_norm.weight | Block 23 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 281 | blk.23.attn_norm.weight | Block 23 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 282 | blk.23.attn_output.weight | Block 23 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 283 | blk.23.attn_q.weight | Block 23 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q3_K |
| 284 | blk.23.attn_q_norm.weight | Block 23 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 285 | blk.23.attn_v.weight | Block 23 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 286 | blk.23.ffn_down_exps.weight | Block 23 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 287 | blk.23.ffn_gate_exps.weight | Block 23 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 288 | blk.23.ffn_gate_inp.weight | Block 23 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 289 | blk.23.ffn_norm.weight | Block 23 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 290 | blk.23.ffn_up_exps.weight | Block 23 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.23: (~623M) 623120640
  • Percentage of total elements: 2.13%

Block 24 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 291 | blk.24.attn_k.weight | Block 24 Attention Key (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 292 | blk.24.attn_k_norm.weight | Block 24 Attn_K_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 293 | blk.24.attn_norm.weight | Block 24 Attention Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 294 | blk.24.attn_output.weight | Block 24 Attention Output (W) | ( ~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 295 | blk.24.attn_q.weight | Block 24 Attention Query (W) | ( ~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 296 | blk.24.attn_q_norm.weight | Block 24 Attn_Q_Norm (W) | ( 128) 128 | 128 x 1 x 1 x 1 | F32 |
| 297 | blk.24.attn_v.weight | Block 24 Attention Value (W) | ( ~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 298 | blk.24.ffn_down_exps.weight | Block 24 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 299 | blk.24.ffn_gate_exps.weight | Block 24 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
| 300 | blk.24.ffn_gate_inp.weight | Block 24 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 301 | blk.24.ffn_norm.weight | Block 24 Feed-Forward Network Normalization (W) | ( ~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 302 | blk.24.ffn_up_exps.weight | Block 24 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q3_K |
  • Total elements in blk.24: (~623M) 623120640
  • Percentage of total elements: 2.13%

### Block 25 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 303 | blk.25.attn_k.weight | Block 25 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 304 | blk.25.attn_k_norm.weight | Block 25 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 305 | blk.25.attn_norm.weight | Block 25 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 306 | blk.25.attn_output.weight | Block 25 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 307 | blk.25.attn_q.weight | Block 25 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 308 | blk.25.attn_q_norm.weight | Block 25 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 309 | blk.25.attn_v.weight | Block 25 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 310 | blk.25.ffn_down_exps.weight | Block 25 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 311 | blk.25.ffn_gate_exps.weight | Block 25 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 312 | blk.25.ffn_gate_inp.weight | Block 25 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 313 | blk.25.ffn_norm.weight | Block 25 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 314 | blk.25.ffn_up_exps.weight | Block 25 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.25: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 26 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 315 | blk.26.attn_k.weight | Block 26 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 316 | blk.26.attn_k_norm.weight | Block 26 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 317 | blk.26.attn_norm.weight | Block 26 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 318 | blk.26.attn_output.weight | Block 26 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 319 | blk.26.attn_q.weight | Block 26 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 320 | blk.26.attn_q_norm.weight | Block 26 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 321 | blk.26.attn_v.weight | Block 26 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 322 | blk.26.ffn_down_exps.weight | Block 26 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 323 | blk.26.ffn_gate_exps.weight | Block 26 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 324 | blk.26.ffn_gate_inp.weight | Block 26 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 325 | blk.26.ffn_norm.weight | Block 26 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 326 | blk.26.ffn_up_exps.weight | Block 26 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.26: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 27 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 327 | blk.27.attn_k.weight | Block 27 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 328 | blk.27.attn_k_norm.weight | Block 27 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 329 | blk.27.attn_norm.weight | Block 27 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 330 | blk.27.attn_output.weight | Block 27 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 331 | blk.27.attn_q.weight | Block 27 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 332 | blk.27.attn_q_norm.weight | Block 27 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 333 | blk.27.attn_v.weight | Block 27 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 334 | blk.27.ffn_down_exps.weight | Block 27 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 335 | blk.27.ffn_gate_exps.weight | Block 27 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 336 | blk.27.ffn_gate_inp.weight | Block 27 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 337 | blk.27.ffn_norm.weight | Block 27 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 338 | blk.27.ffn_up_exps.weight | Block 27 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.27: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 28 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 339 | blk.28.attn_k.weight | Block 28 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 340 | blk.28.attn_k_norm.weight | Block 28 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 341 | blk.28.attn_norm.weight | Block 28 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 342 | blk.28.attn_output.weight | Block 28 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 343 | blk.28.attn_q.weight | Block 28 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 344 | blk.28.attn_q_norm.weight | Block 28 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 345 | blk.28.attn_v.weight | Block 28 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 346 | blk.28.ffn_down_exps.weight | Block 28 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 347 | blk.28.ffn_gate_exps.weight | Block 28 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 348 | blk.28.ffn_gate_inp.weight | Block 28 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 349 | blk.28.ffn_norm.weight | Block 28 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 350 | blk.28.ffn_up_exps.weight | Block 28 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.28: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 29 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 351 | blk.29.attn_k.weight | Block 29 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 352 | blk.29.attn_k_norm.weight | Block 29 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 353 | blk.29.attn_norm.weight | Block 29 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 354 | blk.29.attn_output.weight | Block 29 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 355 | blk.29.attn_q.weight | Block 29 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 356 | blk.29.attn_q_norm.weight | Block 29 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 357 | blk.29.attn_v.weight | Block 29 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 358 | blk.29.ffn_down_exps.weight | Block 29 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 359 | blk.29.ffn_gate_exps.weight | Block 29 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 360 | blk.29.ffn_gate_inp.weight | Block 29 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 361 | blk.29.ffn_norm.weight | Block 29 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 362 | blk.29.ffn_up_exps.weight | Block 29 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.29: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 30 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 363 | blk.30.attn_k.weight | Block 30 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 364 | blk.30.attn_k_norm.weight | Block 30 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 365 | blk.30.attn_norm.weight | Block 30 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 366 | blk.30.attn_output.weight | Block 30 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 367 | blk.30.attn_q.weight | Block 30 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 368 | blk.30.attn_q_norm.weight | Block 30 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 369 | blk.30.attn_v.weight | Block 30 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 370 | blk.30.ffn_down_exps.weight | Block 30 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 371 | blk.30.ffn_gate_exps.weight | Block 30 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 372 | blk.30.ffn_gate_inp.weight | Block 30 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 373 | blk.30.ffn_norm.weight | Block 30 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 374 | blk.30.ffn_up_exps.weight | Block 30 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.30: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 31 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 375 | blk.31.attn_k.weight | Block 31 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 376 | blk.31.attn_k_norm.weight | Block 31 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 377 | blk.31.attn_norm.weight | Block 31 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 378 | blk.31.attn_output.weight | Block 31 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 379 | blk.31.attn_q.weight | Block 31 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 380 | blk.31.attn_q_norm.weight | Block 31 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 381 | blk.31.attn_v.weight | Block 31 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 382 | blk.31.ffn_down_exps.weight | Block 31 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 383 | blk.31.ffn_gate_exps.weight | Block 31 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 384 | blk.31.ffn_gate_inp.weight | Block 31 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 385 | blk.31.ffn_norm.weight | Block 31 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 386 | blk.31.ffn_up_exps.weight | Block 31 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.31: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 32 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 387 | blk.32.attn_k.weight | Block 32 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 388 | blk.32.attn_k_norm.weight | Block 32 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 389 | blk.32.attn_norm.weight | Block 32 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 390 | blk.32.attn_output.weight | Block 32 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 391 | blk.32.attn_q.weight | Block 32 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 392 | blk.32.attn_q_norm.weight | Block 32 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 393 | blk.32.attn_v.weight | Block 32 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 394 | blk.32.ffn_down_exps.weight | Block 32 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 395 | blk.32.ffn_gate_exps.weight | Block 32 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 396 | blk.32.ffn_gate_inp.weight | Block 32 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 397 | blk.32.ffn_norm.weight | Block 32 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 398 | blk.32.ffn_up_exps.weight | Block 32 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.32: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 33 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 399 | blk.33.attn_k.weight | Block 33 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 400 | blk.33.attn_k_norm.weight | Block 33 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 401 | blk.33.attn_norm.weight | Block 33 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 402 | blk.33.attn_output.weight | Block 33 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 403 | blk.33.attn_q.weight | Block 33 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 404 | blk.33.attn_q_norm.weight | Block 33 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 405 | blk.33.attn_v.weight | Block 33 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 406 | blk.33.ffn_down_exps.weight | Block 33 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 407 | blk.33.ffn_gate_exps.weight | Block 33 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 408 | blk.33.ffn_gate_inp.weight | Block 33 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 409 | blk.33.ffn_norm.weight | Block 33 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 410 | blk.33.ffn_up_exps.weight | Block 33 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.33: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 34 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 411 | blk.34.attn_k.weight | Block 34 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 412 | blk.34.attn_k_norm.weight | Block 34 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 413 | blk.34.attn_norm.weight | Block 34 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 414 | blk.34.attn_output.weight | Block 34 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 415 | blk.34.attn_q.weight | Block 34 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 416 | blk.34.attn_q_norm.weight | Block 34 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 417 | blk.34.attn_v.weight | Block 34 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 418 | blk.34.ffn_down_exps.weight | Block 34 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 419 | blk.34.ffn_gate_exps.weight | Block 34 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 420 | blk.34.ffn_gate_inp.weight | Block 34 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 421 | blk.34.ffn_norm.weight | Block 34 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 422 | blk.34.ffn_up_exps.weight | Block 34 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.34: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 35 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 423 | blk.35.attn_k.weight | Block 35 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 424 | blk.35.attn_k_norm.weight | Block 35 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 425 | blk.35.attn_norm.weight | Block 35 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 426 | blk.35.attn_output.weight | Block 35 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 427 | blk.35.attn_q.weight | Block 35 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 428 | blk.35.attn_q_norm.weight | Block 35 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 429 | blk.35.attn_v.weight | Block 35 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 430 | blk.35.ffn_down_exps.weight | Block 35 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 431 | blk.35.ffn_gate_exps.weight | Block 35 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 432 | blk.35.ffn_gate_inp.weight | Block 35 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 433 | blk.35.ffn_norm.weight | Block 35 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 434 | blk.35.ffn_up_exps.weight | Block 35 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.35: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 36 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 435 | blk.36.attn_k.weight | Block 36 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 436 | blk.36.attn_k_norm.weight | Block 36 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 437 | blk.36.attn_norm.weight | Block 36 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 438 | blk.36.attn_output.weight | Block 36 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 439 | blk.36.attn_q.weight | Block 36 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 440 | blk.36.attn_q_norm.weight | Block 36 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 441 | blk.36.attn_v.weight | Block 36 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 442 | blk.36.ffn_down_exps.weight | Block 36 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 443 | blk.36.ffn_gate_exps.weight | Block 36 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 444 | blk.36.ffn_gate_inp.weight | Block 36 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 445 | blk.36.ffn_norm.weight | Block 36 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 446 | blk.36.ffn_up_exps.weight | Block 36 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.36: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 37 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 447 | blk.37.attn_k.weight | Block 37 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 448 | blk.37.attn_k_norm.weight | Block 37 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 449 | blk.37.attn_norm.weight | Block 37 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 450 | blk.37.attn_output.weight | Block 37 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 451 | blk.37.attn_q.weight | Block 37 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 452 | blk.37.attn_q_norm.weight | Block 37 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 453 | blk.37.attn_v.weight | Block 37 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 454 | blk.37.ffn_down_exps.weight | Block 37 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 455 | blk.37.ffn_gate_exps.weight | Block 37 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 456 | blk.37.ffn_gate_inp.weight | Block 37 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 457 | blk.37.ffn_norm.weight | Block 37 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 458 | blk.37.ffn_up_exps.weight | Block 37 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.37: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 38 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 459 | blk.38.attn_k.weight | Block 38 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 460 | blk.38.attn_k_norm.weight | Block 38 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 461 | blk.38.attn_norm.weight | Block 38 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 462 | blk.38.attn_output.weight | Block 38 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 463 | blk.38.attn_q.weight | Block 38 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 464 | blk.38.attn_q_norm.weight | Block 38 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 465 | blk.38.attn_v.weight | Block 38 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 466 | blk.38.ffn_down_exps.weight | Block 38 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 467 | blk.38.ffn_gate_exps.weight | Block 38 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 468 | blk.38.ffn_gate_inp.weight | Block 38 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 469 | blk.38.ffn_norm.weight | Block 38 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 470 | blk.38.ffn_up_exps.weight | Block 38 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.38: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 39 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 471 | blk.39.attn_k.weight | Block 39 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 472 | blk.39.attn_k_norm.weight | Block 39 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 473 | blk.39.attn_norm.weight | Block 39 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 474 | blk.39.attn_output.weight | Block 39 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 475 | blk.39.attn_q.weight | Block 39 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 476 | blk.39.attn_q_norm.weight | Block 39 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 477 | blk.39.attn_v.weight | Block 39 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 478 | blk.39.ffn_down_exps.weight | Block 39 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 479 | blk.39.ffn_gate_exps.weight | Block 39 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 480 | blk.39.ffn_gate_inp.weight | Block 39 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 481 | blk.39.ffn_norm.weight | Block 39 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 482 | blk.39.ffn_up_exps.weight | Block 39 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.39: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 40 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 483 | blk.40.attn_k.weight | Block 40 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 484 | blk.40.attn_k_norm.weight | Block 40 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 485 | blk.40.attn_norm.weight | Block 40 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 486 | blk.40.attn_output.weight | Block 40 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 487 | blk.40.attn_q.weight | Block 40 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 488 | blk.40.attn_q_norm.weight | Block 40 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 489 | blk.40.attn_v.weight | Block 40 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 490 | blk.40.ffn_down_exps.weight | Block 40 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 491 | blk.40.ffn_gate_exps.weight | Block 40 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 492 | blk.40.ffn_gate_inp.weight | Block 40 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 493 | blk.40.ffn_norm.weight | Block 40 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 494 | blk.40.ffn_up_exps.weight | Block 40 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.40: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 41 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 495 | blk.41.attn_k.weight | Block 41 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 496 | blk.41.attn_k_norm.weight | Block 41 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 497 | blk.41.attn_norm.weight | Block 41 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 498 | blk.41.attn_output.weight | Block 41 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 499 | blk.41.attn_q.weight | Block 41 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 500 | blk.41.attn_q_norm.weight | Block 41 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 501 | blk.41.attn_v.weight | Block 41 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 502 | blk.41.ffn_down_exps.weight | Block 41 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 503 | blk.41.ffn_gate_exps.weight | Block 41 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 504 | blk.41.ffn_gate_inp.weight | Block 41 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 505 | blk.41.ffn_norm.weight | Block 41 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 506 | blk.41.ffn_up_exps.weight | Block 41 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.41: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 42 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 507 | blk.42.attn_k.weight | Block 42 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 508 | blk.42.attn_k_norm.weight | Block 42 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 509 | blk.42.attn_norm.weight | Block 42 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 510 | blk.42.attn_output.weight | Block 42 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 511 | blk.42.attn_q.weight | Block 42 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 512 | blk.42.attn_q_norm.weight | Block 42 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 513 | blk.42.attn_v.weight | Block 42 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 514 | blk.42.ffn_down_exps.weight | Block 42 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 515 | blk.42.ffn_gate_exps.weight | Block 42 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 516 | blk.42.ffn_gate_inp.weight | Block 42 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 517 | blk.42.ffn_norm.weight | Block 42 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 518 | blk.42.ffn_up_exps.weight | Block 42 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.42: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 43 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 519 | blk.43.attn_k.weight | Block 43 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 520 | blk.43.attn_k_norm.weight | Block 43 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 521 | blk.43.attn_norm.weight | Block 43 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 522 | blk.43.attn_output.weight | Block 43 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 523 | blk.43.attn_q.weight | Block 43 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 524 | blk.43.attn_q_norm.weight | Block 43 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 525 | blk.43.attn_v.weight | Block 43 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 526 | blk.43.ffn_down_exps.weight | Block 43 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 527 | blk.43.ffn_gate_exps.weight | Block 43 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 528 | blk.43.ffn_gate_inp.weight | Block 43 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 529 | blk.43.ffn_norm.weight | Block 43 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 530 | blk.43.ffn_up_exps.weight | Block 43 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.43: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 44 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 531 | blk.44.attn_k.weight | Block 44 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 532 | blk.44.attn_k_norm.weight | Block 44 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 533 | blk.44.attn_norm.weight | Block 44 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 534 | blk.44.attn_output.weight | Block 44 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 535 | blk.44.attn_q.weight | Block 44 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 536 | blk.44.attn_q_norm.weight | Block 44 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 537 | blk.44.attn_v.weight | Block 44 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 538 | blk.44.ffn_down_exps.weight | Block 44 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 539 | blk.44.ffn_gate_exps.weight | Block 44 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 540 | blk.44.ffn_gate_inp.weight | Block 44 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 541 | blk.44.ffn_norm.weight | Block 44 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 542 | blk.44.ffn_up_exps.weight | Block 44 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.44: (~623M) 623120640
- Percentage of total elements: 2.13%

### Block 45 Tensor Group : ~623M Elements

| T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
|------|-------------------|----------------------------------|----------|-------|------|
| 543 | blk.45.attn_k.weight | Block 45 Attention Key (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 544 | blk.45.attn_k_norm.weight | Block 45 Attn_K_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 545 | blk.45.attn_norm.weight | Block 45 Attention Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 546 | blk.45.attn_output.weight | Block 45 Attention Output (W) | (~8M) 8388608 | 4096 x 2048 x 1 x 1 | Q4_K |
| 547 | blk.45.attn_q.weight | Block 45 Attention Query (W) | (~8M) 8388608 | 2048 x 4096 x 1 x 1 | Q4_K |
| 548 | blk.45.attn_q_norm.weight | Block 45 Attn_Q_Norm (W) | (128) 128 | 128 x 1 x 1 x 1 | F32 |
| 549 | blk.45.attn_v.weight | Block 45 Attention Value (W) | (~1M) 1048576 | 2048 x 512 x 1 x 1 | Q4_K |
| 550 | blk.45.ffn_down_exps.weight | Block 45 Ffn_Down_Exps (W) | (~201M) 201326592 | 768 x 2048 x 128 x 1 | Q5_K |
| 551 | blk.45.ffn_gate_exps.weight | Block 45 Ffn_Gate_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |
| 552 | blk.45.ffn_gate_inp.weight | Block 45 Expert-Routing Layer For The Feed-Forward Network In Mixture Of Expert Models (W) | (~262K) 262144 | 2048 x 128 x 1 x 1 | F32 |
| 553 | blk.45.ffn_norm.weight | Block 45 Feed-Forward Network Normalization (W) | (~2K) 2048 | 2048 x 1 x 1 x 1 | F32 |
| 554 | blk.45.ffn_up_exps.weight | Block 45 Ffn_Up_Exps (W) | (~201M) 201326592 | 2048 x 768 x 128 x 1 | Q4_K |

- Total elements in blk.45: (~623M) 623120640
- Percentage of total elements: 2.13%
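The per-block totals above can be reproduced from the listed shapes alone. A minimal sketch in plain Python (the shapes are hard-coded from the tables in this dump, not read from the GGUF file itself):

```python
from math import prod

# Per-tensor shapes for one qwen3moe block, copied from the tables above
# (trailing 1-dimensions dropped; quantization type does not affect element count).
block_shapes = {
    "attn_k.weight":        (2048, 512),
    "attn_k_norm.weight":   (128,),
    "attn_norm.weight":     (2048,),
    "attn_output.weight":   (4096, 2048),
    "attn_q.weight":        (2048, 4096),
    "attn_q_norm.weight":   (128,),
    "attn_v.weight":        (2048, 512),
    "ffn_down_exps.weight": (768, 2048, 128),
    "ffn_gate_exps.weight": (2048, 768, 128),
    "ffn_gate_inp.weight":  (2048, 128),
    "ffn_norm.weight":      (2048,),
    "ffn_up_exps.weight":   (2048, 768, 128),
}

total = sum(prod(shape) for shape in block_shapes.values())
print(total)  # 623120640, matching the "Total elements" line of each block
```

The three expert tensors (`ffn_down_exps`, `ffn_gate_exps`, `ffn_up_exps`) dominate at ~201M elements each, which is why expert quantization choices (Q3_K/Q4_K/Q5_K) drive almost all of the file-size difference between blocks.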