doberst committed
Commit: c895d91
Parent(s): a38d9b5

Upload 15 files
LICENSE.txt ADDED
@@ -0,0 +1,201 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright 2023-2024 by AI Bloks for LLMWare
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
README.md CHANGED
@@ -1,3 +1,38 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ inference: false
+ tags: [green, p1, llmware-fx, ov]
+ ---
+
+ # slim-extract-tiny-ov
+
+ <!-- Provide a quick summary of what the model is/does. -->
+
+ **slim-extract-tiny-ov** is an OpenVINO int4-quantized version of slim-extract-tiny, providing a very fast, very small inference implementation, optimized for AI PCs using Intel GPU, CPU and NPU.
+
+ [**slim-extract-tiny**](https://huggingface.co/llmware/slim-extract-tiny) is a specialized extraction function-calling model: given a complex business document and a designated key, it returns a Python dictionary mapping that key to the values found for it in the text.
+
+ Get started right away with [OpenVINO](https://github.com/openvinotoolkit/openvino).
+
+ Looking for AI PC solutions and demos? Contact us at [llmware](https://www.llmware.ai).
+
+ ### Model Description
+
+ - **Developed by:** llmware
+ - **Model type:** tinyllama
+ - **Parameters:** 1.1 billion
+ - **Model Parent:** llmware/slim-extract-tiny
+ - **Language(s) (NLP):** English
+ - **License:** Apache 2.0
+ - **Uses:** Extraction of values from complex business documents
+ - **RAG Benchmark Accuracy Score:** NA
+ - **Quantization:** int4
+
+ ## Model Card Contact
+
+ [llmware on hf](https://www.huggingface.co/llmware)
+
+ [llmware website](https://www.llmware.ai)
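Since the model's answer arrives as a Python-dictionary string, a caller needs to turn that text back into a real `dict`. A minimal sketch of doing this safely (the sample string here is a hypothetical model output, not taken from the repo):

```python
import ast

def parse_extract_output(generated_text: str) -> dict:
    """Parse the dictionary string emitted by an extraction model.

    ast.literal_eval parses a Python dict literal without executing
    arbitrary code; on malformed output we fall back to an empty dict.
    """
    try:
        result = ast.literal_eval(generated_text.strip())
        return result if isinstance(result, dict) else {}
    except (ValueError, SyntaxError):
        return {}

# Hypothetical output for a "invoice_total" extraction, for illustration only
sample = "{'invoice_total': ['$12,000']}"
print(parse_extract_output(sample))  # {'invoice_total': ['$12,000']}
```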
config.json ADDED
@@ -0,0 +1,29 @@
+ {
+ "_name_or_path": "llmware/slim-extract-tiny",
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "hidden_act": "silu",
+ "hidden_size": 2048,
+ "initializer_range": 0.02,
+ "intermediate_size": 5632,
+ "max_position_embeddings": 2048,
+ "mlp_bias": false,
+ "model_type": "llama",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 22,
+ "num_key_value_heads": 4,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": null,
+ "rope_theta": 10000.0,
+ "tie_word_embeddings": false,
+ "trained": "custom training",
+ "transformers_version": "4.41.2",
+ "use_cache": true,
+ "vocab_size": 32000
+ }
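The model card's 1.1-billion-parameter figure can be sanity-checked directly from the hyperparameters in this config. A sketch of the arithmetic (layer norms omitted as negligible):

```python
# Hyperparameters from config.json
vocab_size, hidden, inter, layers = 32000, 2048, 5632, 22
heads, kv_heads = 32, 4
head_dim = hidden // heads      # 64
kv_dim = kv_heads * head_dim    # 256 -- grouped-query attention

# Embedding and LM head are separate matrices (tie_word_embeddings: false)
embed = 2 * vocab_size * hidden
# Per layer: Q and O projections are hidden x hidden, K and V are
# hidden x kv_dim, and the SwiGLU MLP has gate/up/down projections.
attn = 2 * hidden * hidden + 2 * hidden * kv_dim
mlp = 3 * hidden * inter
total = embed + layers * (attn + mlp)
print(f"{total / 1e9:.2f}B parameters")  # 1.10B parameters
```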
generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "max_length": 2048,
+ "pad_token_id": 0,
+ "transformers_version": "4.41.2"
+ }
hash_record_sha256.json ADDED
@@ -0,0 +1,61 @@
+ {
+ "10017076745956895557.cl_cache": "c577f12978b402f57213ed1e629d992752d794dba54e3911b5820f30eaf45194",
+ "10251972156381614606.cl_cache": "5959a3ff3c3f4de279c598988d0c8b073f64e40b22a21b99bf908ae89d59f3d0",
+ "10421905850448152292.cl_cache": "fe6108c19d92051f1e71c34a2e2101921653305d5e6f24c2c1c389a76a37a5dd",
+ "10892667568239075546.cl_cache": "3efcb57833b73cf4753cdea79a1a2a9dc4a7fcb5f40de11a1c446eb5767a9b9a",
+ "11863316609947626651.cl_cache": "c0227172d4cc404fd44d795fbddc30af4eb6851612e9478900c81aca6d182be7",
+ "12067983408593617887.cl_cache": "454c8fa989bf18ad2be4912f568d90c5c4a97c85a473bf203f420fa51bd7dc22",
+ "1211408416710559981.cl_cache": "59dce57b33796967f999790431545b2fd520fb68eb4691e51a74ca3cd9fbacec",
+ "13663747248802902894.cl_cache": "d8b03cb6893828777b052495f9515cbcea4f305980065e2a372e9444ddc1154e",
+ "1488223384954392242.cl_cache": "6323070d23e9f59cf9ba74d16d875b606e2ea98fe4d9f682e86a1a99b450e1e2",
+ "15273068526190347127.cl_cache": "672ef4305a856ecfa6f6e7d00898a62cb1f32dec3d1924db5e451c4b1815b8a5",
+ "16013559403963528396.cl_cache": "db356ca8754384e288c5b0feb2d84420ace8f747bf1f1e8e75706dafad1ec5ab",
+ "16092546339969726550.cl_cache": "dbffe2e62e21d45f9578d06fd591d709f44e659d9901352255ce180072cb2427",
+ "16092858344562685539.blob": "2f78ecaf272850cf5ac6c4f153d5cda92bd898c1ac34b6a0fac1a66c5a00fc31",
+ "16277720051429187430.cl_cache": "4b5a1ca09887aa7c3befcde9836976edcb496aaf4cbf3e40feabf9434c461cc8",
+ "16328907459157599865.cl_cache": "b3bebf27924ac15c160fc08692b5d50fe18ec87fd9acdf57b47d338c30538aba",
+ "16435465179122196361.cl_cache": "2fb148696302c418625df7959872fcac4247be2ab6a9b4058d438b47c15f9ccb",
+ "16530507146309972796.cl_cache": "7ee25ce4aae462047d81808e34d3960ef0919e6e7e18be191bc5e99172d89eb0",
+ "17467966780094775964.cl_cache": "48151bbeaf2fedcc0d20f794870306847b7123a606fe984b41bf9f6eb74d9d16",
+ "17647940908008978230.cl_cache": "7fafba90a242268dd4007404270649f219793e6d42969ec22bb97239f7fe52df",
+ "2435439586092129613.cl_cache": "52cc7e24fb7e53793668412a0c6b4446ebe1ecec427afec061b4f12a01a00d6d",
+ "2511366062664625784.cl_cache": "8f512c619d50e6b3bdca532b24c00967e838d3511737a2ee3b4597dffb835d8e",
+ "2761668099468498134.cl_cache": "e6dd1d9d7fb9e8c9c9883b563cf358b450328b2915cea2b0078a0f6b4cb3d499",
+ "3160767641594385912.cl_cache": "07563cba5d38cf1b2e317287a8251016e78ad907fa4fee4b422e489cee1b4467",
+ "3695349945631169630.cl_cache": "aeddca817b0efb238dee40ca9eafe66acbb34e0f2ee33cac3561eef15fe523a8",
+ "3810181821840626033.cl_cache": "8640a7321a46422ee8ad4610b59e61849f48cd95955772b231925d946beebbcb",
+ "40332748099245647.cl_cache": "9077162ba910fc3ffb003ed9d127b754e03c0224f64f52dc5365b74b71a66b55",
+ "4058783824731931821.cl_cache": "c691c4711a79d8a8b5e96088d096badab662418cd9e3f54433dd14b2a4b4327e",
+ "424192793973889885.cl_cache": "46219da618af8b3f4db58ed8ed6255c4483f683b8ddef2bd09306ff31a49c782",
+ "4479419931704673578.cl_cache": "0500dc2b2058eaea577b0f68f790563f3c6a1b621f8b13e4412ebe549e8f44d3",
+ "467977919019461964.cl_cache": "82819601a6a6d1acdbf37ae926419619dd7dde1f7fc82b976947693bab366467",
+ "5182695880491208015.cl_cache": "0110b7f1be7e93594c7ac3a83155129f36ab0cb3db1d1b9c107166b4e6fd46c7",
+ "5202977111404062721.cl_cache": "7395f1279deb2b0435790277bfcb476699fde059b096b5e7101c592a2ba13ada",
+ "5471596125032205356.cl_cache": "7fcaab2e4ea948319674d03c465cd7af3a04a08235c20b487f2c03d5a2cf1858",
+ "5884047582914130790.cl_cache": "188a7b9a6ac3b15e8125afa22cdb7b4007681fa734d2d31ab08e7be6f741d309",
+ "652848059456515006.cl_cache": "5fa409a659acaca18bc76b7353a50acebba83fe63f91428a6648872b8dc97922",
+ "7029891696310807781.cl_cache": "c850a9583fc352edf1d6eea12b707c0ed3f737d0f78f1fe3b3107bfd17f52836",
+ "7545666204379722701.cl_cache": "bff765cdf8c03bac411ee3d872dc643e68509d43cc3486a2fd25948e41e9602b",
+ "8184188448506965886.cl_cache": "2e965d5f18a7bb663d09c36509306cf443da661da6e798bbd561b1928487e1ad",
+ "8362003048715687901.cl_cache": "132db0081135e6288bb3ce617c7ec33a1e83c380408694f680eff4b94c11f1db",
+ "8435292243739618997.cl_cache": "fb6c4b9ccad3c453cd061743fe73f9b0e022acc9efcd2d56a09917fb424f23c7",
+ "8477905602395726110.cl_cache": "ccf97f377f23de7c9b9262e8e08209d6dbca5c79a9eb733c4198249b59280892",
+ "8786306630874388824.cl_cache": "bcc527c393b0e9686e4fc4f57df0bdc7c86eac8f153262381e399da1d530ae53",
+ "8796041534153307604.cl_cache": "cc9c7f46e28321d50ccd205b20c893527c92fde47b26aa487054ed0594efc5b1",
+ "8941530927531791590.cl_cache": "a351fcbceacc61c52197507cd78b46283d31e522801d454bff4e2cbab4165c49",
+ "9180670305511071321.cl_cache": "d7f7444d268c184c5b8b2164295e89f161aa48d6a70fd2edc91a5a772940ca68",
+ "9352059739805730388.cl_cache": "865aa215a833a30dd4acc4961da404dfc185406d4159b2cd9cd95e6ac0241502",
+ "config.json": "044488e608126643c88a20b776ebbdbbb09b121464087c5e330c16d8e6172937",
+ "generation_config.json": "b713fd4ac99c3979dfb3bb079f723295c28201b8c4044458291cd5a72cf64215",
+ "openvino_detokenizer.bin": "9556d0a1f310629e217450ac4198c49f5457f1a69e22ce7c9f8e81fab4d530a7",
+ "openvino_detokenizer.xml": "241f5e58a7334526ef5043463715c183963cda14cdb3c0917c5120f6a4753104",
+ "openvino_model.bin": "f65622bfd71c9beaf1a10ad2d54cf3edeec7bbe37066d9cd906ad96f319c1fe6",
+ "openvino_model.xml": "e61e47fbc7c790f4eea8034de37e336a5558b4f778722d580813340d10b75b43",
+ "openvino_tokenizer.bin": "9ec38b843ff6c229ec650d242feaad2c9e4fa694c69d5e20a8e68b95a878c7ad",
+ "openvino_tokenizer.xml": "74b1968374a9cb11548ee03d879254796c84cc53797921d00dccb6ec8fa8d527",
+ "special_tokens_map.json": "b779ce0b2be00f7cfd9d6188ed88e94700dafc1a7138fef8fca88f06aca11dac",
+ "tokenizer.json": "bf467c9e0f536bda271283c6ef85eb1a943e3196b621c8a912d64953b205df83",
+ "tokenizer.model": "9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347",
+ "tokenizer_config.json": "bd238ac5e35809a935fe299d5cd22fe94b702613a1468c5d3f12211674990cef",
+ "time_stamp": "2024-07-29_150702"
+ }
openvino_detokenizer.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9556d0a1f310629e217450ac4198c49f5457f1a69e22ce7c9f8e81fab4d530a7
+ size 499723
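The three-line stanza above is a Git LFS pointer, not the binary itself: `version` names the spec, `oid` carries the content hash, and `size` is the byte count of the real file. A sketch of parsing one:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its version, oid, and size fields."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "algorithm": algo,
        "digest": digest,
        "size_bytes": int(fields["size"]),
    }

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:9556d0a1f310629e217450ac4198c49f5457f1a69e22ce7c9f8e81fab4d530a7
size 499723"""
print(parse_lfs_pointer(pointer)["size_bytes"])  # 499723
```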
openvino_detokenizer.xml ADDED
@@ -0,0 +1,97 @@
+ <?xml version="1.0"?>
+ <net name="detokenizer" version="11">
+ <layers>
+ <layer id="0" name="Parameter_159882" type="Parameter" version="opset1">
+ <data shape="?,?" element_type="i64" />
+ <output>
+ <port id="0" precision="I64" names="Parameter_159882">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="Constant_159862" type="Const" version="opset1">
+ <data element_type="u8" shape="499723" offset="0" size="499723" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>499723</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="2" name="Convert_159892" type="Convert" version="opset1">
+ <data destination_type="i32" />
+ <input>
+ <port id="0" precision="I64">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="SentencepieceDetokenizer_159883" type="SentencepieceDetokenizer" version="extension">
+ <input>
+ <port id="0" precision="U8">
+ <dim>499723</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="4" precision="U8">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="StringTensorPack_159884" type="StringTensorPack" version="extension">
+ <data mode="begins_ends" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="2" precision="U8">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="3" precision="STRING" names="string_output">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="Result_159885" type="Result" version="opset1">
+ <input>
+ <port id="0" precision="STRING">
+ <dim>-1</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="2" to-port="0" />
+ <edge from-layer="1" from-port="0" to-layer="3" to-port="0" />
+ <edge from-layer="2" from-port="1" to-layer="3" to-port="1" />
+ <edge from-layer="3" from-port="2" to-layer="4" to-port="0" />
+ <edge from-layer="3" from-port="3" to-layer="4" to-port="1" />
+ <edge from-layer="3" from-port="4" to-layer="4" to-port="2" />
+ <edge from-layer="4" from-port="3" to-layer="5" to-port="0" />
+ </edges>
+ <rt_info>
+ <eos_token_id value="2" />
+ </rt_info>
+ </net>
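The detokenizer above is a small OpenVINO IR graph: `<layers>` declares the operations and `<edges>` wires their ports together. Its structure can be inspected with the standard library alone; a sketch over an inline fragment (the real .xml file would be parsed the same way):

```python
import xml.etree.ElementTree as ET

# Trimmed fragment of the detokenizer IR above, for illustration
ir = """<net name="detokenizer" version="11">
  <layers>
    <layer id="0" name="Parameter_159882" type="Parameter" version="opset1"/>
    <layer id="3" name="SentencepieceDetokenizer_159883" type="SentencepieceDetokenizer" version="extension"/>
    <layer id="5" name="Result_159885" type="Result" version="opset1"/>
  </layers>
  <edges>
    <edge from-layer="0" from-port="0" to-layer="2" to-port="0"/>
  </edges>
</net>"""

root = ET.fromstring(ir)
# Custom tokenizer ops are flagged with version="extension"
types = [layer.get("type") for layer in root.iter("layer")]
print(types)  # ['Parameter', 'SentencepieceDetokenizer', 'Result']
```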
openvino_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f65622bfd71c9beaf1a10ad2d54cf3edeec7bbe37066d9cd906ad96f319c1fe6
+ size 731519552
openvino_model.xml ADDED
The diff for this file is too large to render. See raw diff
 
openvino_tokenizer.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9ec38b843ff6c229ec650d242feaad2c9e4fa694c69d5e20a8e68b95a878c7ad
+ size 499731
openvino_tokenizer.xml ADDED
@@ -0,0 +1,231 @@
+ <?xml version="1.0"?>
+ <net name="tokenizer" version="11">
+ <layers>
+ <layer id="0" name="string_input" type="Parameter" version="opset1">
+ <data shape="?" element_type="string" />
+ <output>
+ <port id="0" precision="STRING" names="string_input">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="1" name="Constant_159868" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="0" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="2" name="Constant_159861" type="Const" version="opset1">
+ <data element_type="u8" shape="499723" offset="4" size="499723" />
+ <output>
+ <port id="0" precision="U8">
+ <dim>499723</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="3" name="SentencepieceTokenizer_159864" type="SentencepieceTokenizer" version="extension">
+ <data nbest_size="0" alpha="0" add_bos="true" add_eos="false" reverse="false" />
+ <input>
+ <port id="0" precision="U8">
+ <dim>499723</dim>
+ </port>
+ <port id="1" precision="STRING">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="I64">
+ <dim>-1</dim>
+ <dim>2</dim>
+ </port>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ </port>
+ <port id="4" precision="I64">
+ <dim>2</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="4" name="Broadcast_159869" type="Broadcast" version="opset3">
+ <data mode="numpy" />
+ <input>
+ <port id="0" precision="I32" />
+ <port id="1" precision="I64">
+ <dim>2</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="5" name="Constant_159870" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="499727" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="6" name="ShapeOf_159871" type="ShapeOf" version="opset3">
+ <data output_type="i64" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I64">
+ <dim>1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="7" name="Broadcast_159872" type="Broadcast" version="opset3">
+ <data mode="numpy" />
+ <input>
+ <port id="0" precision="I32" />
+ <port id="1" precision="I64">
+ <dim>1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="8" name="ScatterNDUpdate_159876" type="ScatterNDUpdate" version="opset4">
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>-1</dim>
+ <dim>2</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="9" name="ScatterNDUpdate_159876" type="Convert" version="opset1">
+ <data destination_type="i64" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I64" names="attention_mask">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="11" name="Constant_159865" type="Const" version="opset1">
+ <data element_type="i32" shape="" offset="0" size="4" />
+ <output>
+ <port id="0" precision="I32" />
+ </output>
+ </layer>
+ <layer id="12" name="Broadcast_159866" type="Broadcast" version="opset3">
+ <data mode="numpy" />
+ <input>
+ <port id="0" precision="I32" />
+ <port id="1" precision="I64">
+ <dim>2</dim>
+ </port>
+ </input>
+ <output>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="13" name="ScatterNDUpdate_159867" type="ScatterNDUpdate" version="opset4">
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ <port id="1" precision="I64">
+ <dim>-1</dim>
+ <dim>2</dim>
+ </port>
+ <port id="2" precision="I32">
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="3" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="14" name="ScatterNDUpdate_159867" type="Convert" version="opset1">
+ <data destination_type="i64" />
+ <input>
+ <port id="0" precision="I32">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ <output>
+ <port id="1" precision="I64" names="input_ids">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </output>
+ </layer>
+ <layer id="15" name="Result_159877" type="Result" version="opset1">
+ <input>
+ <port id="0" precision="I64">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ </layer>
+ <layer id="10" name="Result_159878" type="Result" version="opset1">
+ <input>
+ <port id="0" precision="I64">
+ <dim>-1</dim>
+ <dim>-1</dim>
+ </port>
+ </input>
+ </layer>
+ </layers>
+ <edges>
+ <edge from-layer="0" from-port="0" to-layer="3" to-port="1" />
+ <edge from-layer="1" from-port="0" to-layer="4" to-port="0" />
+ <edge from-layer="2" from-port="0" to-layer="3" to-port="0" />
+ <edge from-layer="3" from-port="4" to-layer="4" to-port="1" />
+ <edge from-layer="3" from-port="3" to-layer="6" to-port="0" />
+ <edge from-layer="3" from-port="2" to-layer="8" to-port="1" />
+ <edge from-layer="3" from-port="4" to-layer="12" to-port="1" />
+ <edge from-layer="3" from-port="2" to-layer="13" to-port="1" />
+ <edge from-layer="3" from-port="3" to-layer="13" to-port="2" />
+ <edge from-layer="4" from-port="2" to-layer="8" to-port="0" />
+ <edge from-layer="5" from-port="0" to-layer="7" to-port="0" />
+ <edge from-layer="6" from-port="1" to-layer="7" to-port="1" />
+ <edge from-layer="7" from-port="2" to-layer="8" to-port="2" />
+ <edge from-layer="8" from-port="3" to-layer="9" to-port="0" />
+ <edge from-layer="9" from-port="1" to-layer="10" to-port="0" />
+ <edge from-layer="11" from-port="0" to-layer="12" to-port="0" />
+ <edge from-layer="12" from-port="2" to-layer="13" to-port="0" />
+ <edge from-layer="13" from-port="3" to-layer="14" to-port="0" />
+ <edge from-layer="14" from-port="1" to-layer="15" to-port="0" />
+ </edges>
+ <rt_info>
+ <eos_token_id value="2" />
+ </rt_info>
+ </net>
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
+ size 499723
tokenizer_config.json ADDED
@@ -0,0 +1,41 @@
+ {
+ "add_bos_token": true,
+ "add_eos_token": false,
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "</s>",
+ "legacy": false,
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": null,
+ "padding_side": "right",
+ "sp_model_kwargs": {},
+ "tokenizer_class": "LlamaTokenizer",
+ "unk_token": "<unk>",
+ "use_default_system_prompt": false
+ }
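The added_tokens_decoder block above pins ids 0-2 to the SentencePiece specials, consistent with bos_token_id 1 and eos_token_id 2 in config.json. A sketch of recovering that id-to-token mapping from a config fragment like this one:

```python
import json

# Trimmed fragment of tokenizer_config.json above, for illustration
config_fragment = """{
  "added_tokens_decoder": {
    "0": {"content": "<unk>", "special": true},
    "1": {"content": "<s>", "special": true},
    "2": {"content": "</s>", "special": true}
  },
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "<unk>"
}"""

cfg = json.loads(config_fragment)
# JSON keys are strings; cast them back to integer token ids
id_to_token = {int(i): t["content"] for i, t in cfg["added_tokens_decoder"].items()}
print(id_to_token)  # {0: '<unk>', 1: '<s>', 2: '</s>'}
```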