avinashhm committed
Commit e2044cd · verified · 1 Parent(s): d1487aa

Upload folder using huggingface_hub

This view is limited to 50 files because it contains too many changes. See raw diff.
Files changed (50)
  1. .config/.last_opt_in_prompt.yaml +1 -0
  2. .config/.last_survey_prompt.yaml +1 -0
  3. .config/.last_update_check.json +1 -0
  4. .config/active_config +1 -0
  5. .config/config_sentinel +0 -0
  6. .config/configurations/config_default +6 -0
  7. .config/default_configs.db +0 -0
  8. .config/gce +1 -0
  9. .config/hidden_gcloud_config_universe_descriptor_data_cache_configs.db +0 -0
  10. .config/logs/2025.05.23/13.38.11.844188.log +765 -0
  11. .config/logs/2025.05.23/13.38.32.731517.log +5 -0
  12. .config/logs/2025.05.23/13.38.40.911008.log +153 -0
  13. .config/logs/2025.05.23/13.38.42.097421.log +5 -0
  14. .config/logs/2025.05.23/13.38.50.979302.log +8 -0
  15. .config/logs/2025.05.23/13.38.51.693491.log +8 -0
  16. .gitattributes +125 -0
  17. llama.cpp/.clang-format +161 -0
  18. llama.cpp/.clang-tidy +27 -0
  19. llama.cpp/.devops/cloud-v-pipeline +22 -0
  20. llama.cpp/.devops/cpu.Dockerfile +92 -0
  21. llama.cpp/.devops/cuda.Dockerfile +94 -0
  22. llama.cpp/.devops/intel.Dockerfile +91 -0
  23. llama.cpp/.devops/llama-cli-cann.Dockerfile +44 -0
  24. llama.cpp/.devops/llama-cpp-cuda.srpm.spec +83 -0
  25. llama.cpp/.devops/llama-cpp.srpm.spec +85 -0
  26. llama.cpp/.devops/musa.Dockerfile +101 -0
  27. llama.cpp/.devops/nix/apps.nix +21 -0
  28. llama.cpp/.devops/nix/devshells.nix +52 -0
  29. llama.cpp/.devops/nix/docker.nix +37 -0
  30. llama.cpp/.devops/nix/jetson-support.nix +39 -0
  31. llama.cpp/.devops/nix/nixpkgs-instances.nix +45 -0
  32. llama.cpp/.devops/nix/package-gguf-py.nix +36 -0
  33. llama.cpp/.devops/nix/package.nix +247 -0
  34. llama.cpp/.devops/nix/python-scripts.nix +66 -0
  35. llama.cpp/.devops/nix/scope.nix +41 -0
  36. llama.cpp/.devops/nix/sif.nix +27 -0
  37. llama.cpp/.devops/rocm.Dockerfile +113 -0
  38. llama.cpp/.devops/tools.sh +49 -0
  39. llama.cpp/.devops/vulkan.Dockerfile +89 -0
  40. llama.cpp/.dockerignore +20 -0
  41. llama.cpp/.ecrc +6 -0
  42. llama.cpp/.editorconfig +54 -0
  43. llama.cpp/.flake8 +18 -0
  44. llama.cpp/.github/ISSUE_TEMPLATE/010-bug-compilation.yml +87 -0
  45. llama.cpp/.github/ISSUE_TEMPLATE/011-bug-results.yml +101 -0
  46. llama.cpp/.github/ISSUE_TEMPLATE/019-bug-misc.yml +91 -0
  47. llama.cpp/.github/ISSUE_TEMPLATE/020-enhancement.yml +51 -0
  48. llama.cpp/.github/ISSUE_TEMPLATE/030-research.yml +52 -0
  49. llama.cpp/.github/ISSUE_TEMPLATE/040-refactor.yml +28 -0
  50. llama.cpp/.github/ISSUE_TEMPLATE/config.yml +11 -0
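
The commit message above matches the default message that huggingface_hub's upload_folder API attaches when none is given, so the push was most likely a few lines of Python. A minimal sketch follows; the folder path and repo id are hypothetical placeholders, not values taken from this commit:

    # Minimal sketch of a huggingface_hub folder upload (path and repo id are hypothetical).
    from huggingface_hub import HfApi

    api = HfApi()  # picks up the token from `huggingface-cli login` or the HF_TOKEN env var
    api.upload_folder(
        folder_path="./workdir",           # hypothetical local folder
        repo_id="avinashhm/some-repo",     # hypothetical target repo
        repo_type="model",
        commit_message="Upload folder using huggingface_hub",
    )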
.config/.last_opt_in_prompt.yaml ADDED
@@ -0,0 +1 @@
+ {}
.config/.last_survey_prompt.yaml ADDED
@@ -0,0 +1 @@
+ last_prompt_time: 1748007520.2370543
.config/.last_update_check.json ADDED
@@ -0,0 +1 @@
+ {"last_update_check_time": 1748007521.5707717, "last_update_check_revision": 20250522182510, "notifications": [], "last_nag_times": {}}
.config/active_config ADDED
@@ -0,0 +1 @@
+ default
.config/config_sentinel ADDED
File without changes
.config/configurations/config_default ADDED
@@ -0,0 +1,6 @@
+ [component_manager]
+ disable_update_check = true
+
+ [compute]
+ gce_metadata_read_timeout_sec = 0
+
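
The two properties added above can also be set through the CLI's `gcloud config set SECTION/PROPERTY VALUE` form. The first command below is exactly the invocation recorded in the 13.38.50.979302 log at the end of this diff; the second is the analogous (assumed) call for the timeout property:

    gcloud config set component_manager/disable_update_check true
    gcloud config set compute/gce_metadata_read_timeout_sec 0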
.config/default_configs.db ADDED
Binary file (12.3 kB).
.config/gce ADDED
@@ -0,0 +1 @@
+ False
.config/hidden_gcloud_config_universe_descriptor_data_cache_configs.db ADDED
Binary file (12.3 kB).
.config/logs/2025.05.23/13.38.11.844188.log ADDED
@@ -0,0 +1,765 @@
+ 2025-05-23 13:38:23,865 DEBUG root Loaded Command Group: ['gcloud', 'components']
+ 2025-05-23 13:38:23,868 DEBUG root Loaded Command Group: ['gcloud', 'components', 'update']
+ 2025-05-23 13:38:23,870 DEBUG root Running [gcloud.components.update] with arguments: [--compile-python: "True", --quiet: "True", COMPONENT-IDS:6: "['core', 'gcloud-deps', 'bq', 'gcloud', 'gcloud-crc32c', 'gsutil']"]
+ 2025-05-23 13:38:23,871 INFO ___FILE_ONLY___ Beginning update. This process may take several minutes.
+ 2025-05-23 13:38:23,914 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
+ 2025-05-23 13:38:23,928 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components-2.json HTTP/11" 200 233994
+ 2025-05-23 13:38:23,939 INFO ___FILE_ONLY___ Your current Google Cloud CLI version is: 523.0.1
+ 2025-05-23 13:38:23,939 INFO ___FILE_ONLY___ Installing components from version: 523.0.1
+ [Three DEBUG "Chosen display Format:table[...]" lines elided. The installation table, emitted fragment-by-fragment in the raw log, renders as:]
+ ┌─────────────────────────────────────────────────────────────────────────────┐
+ │                     These components will be installed.                     │
+ ├─────────────────────────────────────────────────────┬────────────┬──────────┤
+ │ Name                                                │    Version │     Size │
+ ├─────────────────────────────────────────────────────┼────────────┼──────────┤
+ │ BigQuery Command Line Tool                          │     2.1.17 │  1.8 MiB │
+ │ BigQuery Command Line Tool (Platform Specific)      │     2.1.16 │  < 1 MiB │
+ │ Bundled Python 3.12 (Platform Specific)             │     3.12.9 │ 89.3 MiB │
+ │ Cloud Storage Command Line Tool                     │       5.34 │ 11.8 MiB │
+ │ Cloud Storage Command Line Tool (Platform Specific) │       5.34 │  < 1 MiB │
+ │ Google Cloud CLI Core Libraries (Platform Specific) │ 2025.05.02 │  < 1 MiB │
+ │ Google Cloud CRC32C Hash Tool (Platform Specific)   │      1.0.0 │  1.4 MiB │
+ │ gcloud cli dependencies (Platform Specific)         │ 2021.04.16 │  < 1 MiB │
+ └─────────────────────────────────────────────────────┴────────────┴──────────┘
+ 2025-05-23 13:38:24,004 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/RELEASE_NOTES HTTP/11" 200 1422252
+ 2025-05-23 13:38:24,485 INFO ___FILE_ONLY___ For the latest full release notes, please visit: https://cloud.google.com/sdk/release_notes
+ 2025-05-23 13:38:24,486 INFO ___FILE_ONLY___ Performing in place update...
+ [For each component below, the raw log repeats a urllib3 "Starting new HTTPS connection" DEBUG line, box-drawing borders, and one INFO line per "═" progress tick; those runs are elided, keeping each banner and the GET request for each actual download.]
+ 2025-05-23 13:38:24,488 INFO ___FILE_ONLY___ ╠═ Downloading: BigQuery Command Line Tool ═╣
+ 2025-05-23 13:38:24,505 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-bq-20250516153006.tar.gz HTTP/11" 200 1846801
+ 2025-05-23 13:38:24,529 INFO ___FILE_ONLY___ ╠═ Downloading: BigQuery Command Line Tool (Platform Spe... ═╣
+ 2025-05-23 13:38:24,543 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-bq-nix-20250502143716.tar.gz HTTP/11" 200 1912
+ 2025-05-23 13:38:24,546 INFO ___FILE_ONLY___ ╠═ Downloading: Bundled Python 3.12 ═╣
+ 2025-05-23 13:38:24,548 INFO ___FILE_ONLY___ ╠═ Downloading: Bundled Python 3.12 (Platform Specific) ═╣
+ 2025-05-23 13:38:24,569 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-bundled-python3-unix-linux-x86_64-20250502143716.tar.gz HTTP/11" 200 93610468
+ 2025-05-23 13:38:25,027 INFO ___FILE_ONLY___ ╠═ Downloading: Cloud Storage Command Line Tool ═╣
+ 2025-05-23 13:38:25,045 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gsutil-20250418150427.tar.gz HTTP/11" 200 12382702
+ 2025-05-23 13:38:25,112 INFO ___FILE_ONLY___ ╠═ Downloading: Cloud Storage Command Line Tool (Platfor... ═╣
+ 2025-05-23 13:38:25,125 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gsutil-nix-20250502143716.tar.gz HTTP/11" 200 1930
+ 2025-05-23 13:38:25,128 INFO ___FILE_ONLY___ ╠═ Downloading: Default set of gcloud commands ═╣
+ 2025-05-23 13:38:25,130 INFO ___FILE_ONLY___ ╠═ Downloading: Google Cloud CLI Core Libraries (Platfor... ═╣
+ 2025-05-23 13:38:25,148 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-core-nix-20250502143716.tar.gz HTTP/11" 200 2309
+ 2025-05-23 13:38:25,151 INFO ___FILE_ONLY___ ╠═ Downloading: Google Cloud CRC32C Hash Tool ═╣
+ 2025-05-23 13:38:25,153 INFO ___FILE_ONLY___ ╠═ Downloading: Google Cloud CRC32C Hash Tool (Platform ... ═╣
+ 2025-05-23 13:38:25,171 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gcloud-crc32c-linux-x86_64-20250110133808.tar.gz HTTP/11" 200 1478989
+ 2025-05-23 13:38:25,189 INFO ___FILE_ONLY___ ╠═ Downloading: gcloud cli dependencies (Platform Specific) ═╣
+ 2025-05-23 13:38:25,205 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-gcloud-deps-linux-x86_64-20210416153011.tar.gz HTTP/11" 200 104
+ 2025-05-23 13:38:25,207 INFO ___FILE_ONLY___ ╠═ Installing: BigQuery Command Line Tool ═╣
+ 2025-05-23 13:38:25,456 INFO ___FILE_ONLY___ ╠═ Installing: BigQuery Command Line Tool (Platform Spec... ═╣
+ 2025-05-23 13:38:25,463 INFO ___FILE_ONLY___ ╠═ Installing: Bundled Python 3.12 ═╣
+ 2025-05-23 13:38:25,468 INFO ___FILE_ONLY___ ╠═ Installing: Bundled Python 3.12 (Platform Specific) ═╣
+ 2025-05-23 13:38:30,757 INFO ___FILE_ONLY___ ╠═ Installing: Cloud Storage Command Line Tool ═╣
+ 2025-05-23 13:38:32,127 INFO ___FILE_ONLY___ ╠═ Installing: Cloud Storage Command Line Tool (Platform... ═╣
+ 2025-05-23 13:38:32,133 INFO ___FILE_ONLY___ ╠═ Installing: Default set of gcloud commands ═╣
+ 2025-05-23 13:38:32,138 INFO ___FILE_ONLY___ ╠═ Installing: Google Cloud CLI Core Libraries (Platform... ═╣
+ 2025-05-23 13:38:32,143 INFO ___FILE_ONLY___ ╠═ Installing: Google Cloud CRC32C Hash Tool ═╣
+ 2025-05-23 13:38:32,148 INFO ___FILE_ONLY___ ╠═ Installing: Google Cloud CRC32C Hash Tool (Platform S... ═╣
+ 2025-05-23 13:38:32,192 INFO ___FILE_ONLY___ ╠═ Installing: gcloud cli dependencies (Platform Specific) ═╣
+ 2025-05-23 13:38:32,198 DEBUG root Updating notification cache...
+ 2025-05-23 13:38:32,200 INFO ___FILE_ONLY___ Performing post processing steps...
+ 2025-05-23 13:38:32,200 DEBUG root Executing command: ['/tools/google-cloud-sdk/bin/gcloud', 'components', 'post-process']
+ 2025-05-23 13:38:40,232 INFO root descriptor_list: [{'universeDomain': 'googleapis.com', 'universeShortName': '', 'authenticationDomain': 'auth.cloud.google.com', 'projectPrefix': '', 'cloudWebDomain': 'cloud.google.com', 'documentationDomain': 'cloud.google.com', 'version': '1.0.0', 'state': 'primary', 'artifactRegistryDomain': 'pkg.dev'}]
+ 2025-05-23 13:38:40,232 INFO ___FILE_ONLY___ Update done!
+ 2025-05-23 13:38:40,236 DEBUG root Chosen display Format:none
+ 2025-05-23 13:38:40,236 INFO root Display format: "none"
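
From the argument list recorded on this log's third line, the underlying invocation was roughly the following shell command (a reconstruction, not a recorded shell history; the log also notes an internal --compile-python=True flag):

    gcloud components update --quiet core gcloud-deps bq gcloud gcloud-crc32c gsutil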
.config/logs/2025.05.23/13.38.32.731517.log ADDED
@@ -0,0 +1,5 @@
+ 2025-05-23 13:38:32,732 DEBUG root Loaded Command Group: ['gcloud', 'components']
+ 2025-05-23 13:38:32,734 DEBUG root Loaded Command Group: ['gcloud', 'components', 'post_process']
+ 2025-05-23 13:38:32,735 DEBUG root Running [gcloud.components.post-process] with arguments: []
+ 2025-05-23 13:38:40,021 DEBUG root Chosen display Format:none
+ 2025-05-23 13:38:40,022 INFO root Display format: "none"
.config/logs/2025.05.23/13.38.40.911008.log ADDED
@@ -0,0 +1,153 @@
+ 2025-05-23 13:38:40,911 DEBUG root Loaded Command Group: ['gcloud', 'components']
+ 2025-05-23 13:38:40,914 DEBUG root Loaded Command Group: ['gcloud', 'components', 'update']
+ 2025-05-23 13:38:40,915 DEBUG root Running [gcloud.components.update] with arguments: [--quiet: "True", COMPONENT-IDS:8: "['gcloud', 'core', 'bq', 'gsutil', 'compute', 'preview', 'alpha', 'beta']"]
+ 2025-05-23 13:38:40,916 INFO ___FILE_ONLY___ Beginning update. This process may take several minutes.
+ 2025-05-23 13:38:40,926 DEBUG urllib3.connectionpool Starting new HTTPS connection (1): dl.google.com:443
+ 2025-05-23 13:38:40,943 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components-2.json HTTP/11" 200 233994
+ 2025-05-23 13:38:40,957 WARNING root Component [compute] no longer exists.
+ 2025-05-23 13:38:40,958 INFO ___FILE_ONLY___ Your current Google Cloud CLI version is: 523.0.1
+ 2025-05-23 13:38:40,958 INFO ___FILE_ONLY___ Installing components from version: 523.0.1
+ [Display-format DEBUG lines and box-drawing fragments elided; the rendered table follows. The gcloud Preview Commands row carries no version string in the raw log.]
+ ┌────────────────────────────────────────────────┐
+ │      These components will be installed.      │
+ ├─────────────────────────┬────────────┬─────────┤
+ │ Name                    │    Version │    Size │
+ ├─────────────────────────┼────────────┼─────────┤
+ │ gcloud Alpha Commands   │ 2025.05.22 │ < 1 MiB │
+ │ gcloud Beta Commands    │ 2025.05.22 │ < 1 MiB │
+ │ gcloud Preview Commands │            │ < 1 MiB │
+ └─────────────────────────┴────────────┴─────────┘
+ 2025-05-23 13:38:40,994 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/RELEASE_NOTES HTTP/11" 200 1422252
+ 2025-05-23 13:38:41,470 INFO ___FILE_ONLY___ For the latest full release notes, please visit: https://cloud.google.com/sdk/release_notes
+ 2025-05-23 13:38:41,471 INFO ___FILE_ONLY___ Performing in place update...
+ [As in the previous log, repeated "Starting new HTTPS connection" DEBUG lines, box borders, and progress-tick lines are elided below.]
+ 2025-05-23 13:38:41,473 INFO ___FILE_ONLY___ ╠═ Downloading: gcloud Alpha Commands ═╣
+ 2025-05-23 13:38:41,488 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-alpha-20250522182510.tar.gz HTTP/11" 200 800
+ 2025-05-23 13:38:41,491 INFO ___FILE_ONLY___ ╠═ Downloading: gcloud Beta Commands ═╣
+ 2025-05-23 13:38:41,510 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-beta-20250522182510.tar.gz HTTP/11" 200 797
+ 2025-05-23 13:38:41,513 INFO ___FILE_ONLY___ ╠═ Downloading: gcloud Preview Commands ═╣
+ 2025-05-23 13:38:41,549 DEBUG urllib3.connectionpool https://dl.google.com:443 "GET /dl/cloudsdk/channels/rapid/components/google-cloud-sdk-preview-20241115154308.tar.gz HTTP/11" 200 823
+ 2025-05-23 13:38:41,552 INFO ___FILE_ONLY___ ╠═ Installing: gcloud Alpha Commands ═╣
+ 2025-05-23 13:38:41,558 INFO ___FILE_ONLY___ ╠═ Installing: gcloud Beta Commands ═╣
+ 2025-05-23 13:38:41,564 INFO ___FILE_ONLY___ ╠═ Installing: gcloud Preview Commands ═╣
+ 2025-05-23 13:38:41,570 DEBUG root Updating notification cache...
+ 2025-05-23 13:38:41,572 INFO ___FILE_ONLY___ Performing post processing steps...
+ 2025-05-23 13:38:41,573 DEBUG root Executing command: ['/tools/google-cloud-sdk/bin/gcloud', 'components', 'post-process']
+ 2025-05-23 13:38:50,334 INFO root descriptor_list: [{'universeDomain': 'googleapis.com', 'universeShortName': '', 'authenticationDomain': 'auth.cloud.google.com', 'projectPrefix': '', 'cloudWebDomain': 'cloud.google.com', 'documentationDomain': 'cloud.google.com', 'version': '1.0.0', 'state': 'primary', 'artifactRegistryDomain': 'pkg.dev'}]
+ 2025-05-23 13:38:50,334 INFO ___FILE_ONLY___ Update done!
+ 2025-05-23 13:38:50,337 DEBUG root Chosen display Format:none
+ 2025-05-23 13:38:50,337 INFO root Display format: "none"
.config/logs/2025.05.23/13.38.42.097421.log ADDED
@@ -0,0 +1,5 @@
1
+ 2025-05-23 13:38:42,098 DEBUG root Loaded Command Group: ['gcloud', 'components']
2
+ 2025-05-23 13:38:42,099 DEBUG root Loaded Command Group: ['gcloud', 'components', 'post_process']
3
+ 2025-05-23 13:38:42,101 DEBUG root Running [gcloud.components.post-process] with arguments: []
4
+ 2025-05-23 13:38:50,147 DEBUG root Chosen display Format:none
5
+ 2025-05-23 13:38:50,147 INFO root Display format: "none"
.config/logs/2025.05.23/13.38.50.979302.log ADDED
@@ -0,0 +1,8 @@
1
+ 2025-05-23 13:38:50,981 DEBUG root Loaded Command Group: ['gcloud', 'config']
2
+ 2025-05-23 13:38:51,037 DEBUG root Loaded Command Group: ['gcloud', 'config', 'set']
3
+ 2025-05-23 13:38:51,039 DEBUG root Running [gcloud.config.set] with arguments: [SECTION/PROPERTY: "component_manager/disable_update_check", VALUE: "true"]
4
+ 2025-05-23 13:38:51,040 INFO ___FILE_ONLY___ Updated property [component_manager/disable_update_check].
5
+
6
+ 2025-05-23 13:38:51,041 DEBUG root Chosen display Format:default
7
+ 2025-05-23 13:38:51,041 INFO root Display format: "default"
8
+ 2025-05-23 13:38:51,042 DEBUG root SDK update checks are disabled.
.config/logs/2025.05.23/13.38.51.693491.log ADDED
@@ -0,0 +1,8 @@
1
+ 2025-05-23 13:38:51,695 DEBUG root Loaded Command Group: ['gcloud', 'config']
2
+ 2025-05-23 13:38:51,743 DEBUG root Loaded Command Group: ['gcloud', 'config', 'set']
3
+ 2025-05-23 13:38:51,746 DEBUG root Running [gcloud.config.set] with arguments: [SECTION/PROPERTY: "compute/gce_metadata_read_timeout_sec", VALUE: "0"]
4
+ 2025-05-23 13:38:51,746 INFO ___FILE_ONLY___ Updated property [compute/gce_metadata_read_timeout_sec].
5
+
6
+ 2025-05-23 13:38:51,747 DEBUG root Chosen display Format:default
7
+ 2025-05-23 13:38:51,748 INFO root Display format: "default"
8
+ 2025-05-23 13:38:51,748 DEBUG root SDK update checks are disabled.
.gitattributes CHANGED
@@ -33,3 +33,128 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ llama.cpp/build/CMakeFiles/3.31.6/CompilerIdCUDA/a.out filter=lfs diff=lfs merge=lfs -text
37
+ llama.cpp/build/bin/libggml-base.so filter=lfs diff=lfs merge=lfs -text
38
+ llama.cpp/build/bin/libggml-cpu.so filter=lfs diff=lfs merge=lfs -text
39
+ llama.cpp/build/bin/libllama.so filter=lfs diff=lfs merge=lfs -text
40
+ llama.cpp/build/bin/libmtmd_helper_shared.so filter=lfs diff=lfs merge=lfs -text
41
+ llama.cpp/build/bin/libmtmd_shared.so filter=lfs diff=lfs merge=lfs -text
42
+ llama.cpp/build/bin/llama-batched filter=lfs diff=lfs merge=lfs -text
43
+ llama.cpp/build/bin/llama-batched-bench filter=lfs diff=lfs merge=lfs -text
44
+ llama.cpp/build/bin/llama-bench filter=lfs diff=lfs merge=lfs -text
45
+ llama.cpp/build/bin/llama-cli filter=lfs diff=lfs merge=lfs -text
46
+ llama.cpp/build/bin/llama-convert-llama2c-to-ggml filter=lfs diff=lfs merge=lfs -text
47
+ llama.cpp/build/bin/llama-cvector-generator filter=lfs diff=lfs merge=lfs -text
48
+ llama.cpp/build/bin/llama-embedding filter=lfs diff=lfs merge=lfs -text
49
+ llama.cpp/build/bin/llama-eval-callback filter=lfs diff=lfs merge=lfs -text
50
+ llama.cpp/build/bin/llama-export-lora filter=lfs diff=lfs merge=lfs -text
51
+ llama.cpp/build/bin/llama-finetune filter=lfs diff=lfs merge=lfs -text
52
+ llama.cpp/build/bin/llama-gen-docs filter=lfs diff=lfs merge=lfs -text
53
+ llama.cpp/build/bin/llama-gguf-hash filter=lfs diff=lfs merge=lfs -text
54
+ llama.cpp/build/bin/llama-gritlm filter=lfs diff=lfs merge=lfs -text
55
+ llama.cpp/build/bin/llama-imatrix filter=lfs diff=lfs merge=lfs -text
56
+ llama.cpp/build/bin/llama-lookahead filter=lfs diff=lfs merge=lfs -text
57
+ llama.cpp/build/bin/llama-lookup filter=lfs diff=lfs merge=lfs -text
58
+ llama.cpp/build/bin/llama-lookup-create filter=lfs diff=lfs merge=lfs -text
59
+ llama.cpp/build/bin/llama-lookup-stats filter=lfs diff=lfs merge=lfs -text
60
+ llama.cpp/build/bin/llama-mtmd-cli filter=lfs diff=lfs merge=lfs -text
61
+ llama.cpp/build/bin/llama-parallel filter=lfs diff=lfs merge=lfs -text
62
+ llama.cpp/build/bin/llama-passkey filter=lfs diff=lfs merge=lfs -text
63
+ llama.cpp/build/bin/llama-perplexity filter=lfs diff=lfs merge=lfs -text
64
+ llama.cpp/build/bin/llama-quantize filter=lfs diff=lfs merge=lfs -text
65
+ llama.cpp/build/bin/llama-retrieval filter=lfs diff=lfs merge=lfs -text
66
+ llama.cpp/build/bin/llama-run filter=lfs diff=lfs merge=lfs -text
67
+ llama.cpp/build/bin/llama-save-load-state filter=lfs diff=lfs merge=lfs -text
68
+ llama.cpp/build/bin/llama-server filter=lfs diff=lfs merge=lfs -text
69
+ llama.cpp/build/bin/llama-speculative filter=lfs diff=lfs merge=lfs -text
70
+ llama.cpp/build/bin/llama-speculative-simple filter=lfs diff=lfs merge=lfs -text
71
+ llama.cpp/build/bin/llama-tokenize filter=lfs diff=lfs merge=lfs -text
72
+ llama.cpp/build/bin/llama-tts filter=lfs diff=lfs merge=lfs -text
73
+ llama.cpp/build/bin/test-arg-parser filter=lfs diff=lfs merge=lfs -text
74
+ llama.cpp/build/bin/test-backend-ops filter=lfs diff=lfs merge=lfs -text
75
+ llama.cpp/build/bin/test-chat filter=lfs diff=lfs merge=lfs -text
76
+ llama.cpp/build/bin/test-chat-parser filter=lfs diff=lfs merge=lfs -text
77
+ llama.cpp/build/bin/test-chat-template filter=lfs diff=lfs merge=lfs -text
78
+ llama.cpp/build/bin/test-grammar-integration filter=lfs diff=lfs merge=lfs -text
79
+ llama.cpp/build/bin/test-json-partial filter=lfs diff=lfs merge=lfs -text
80
+ llama.cpp/build/bin/test-json-schema-to-grammar filter=lfs diff=lfs merge=lfs -text
81
+ llama.cpp/build/bin/test-mtmd-c-api filter=lfs diff=lfs merge=lfs -text
82
+ llama.cpp/build/bin/test-quantize-stats filter=lfs diff=lfs merge=lfs -text
83
+ llama.cpp/build/bin/test-regex-partial filter=lfs diff=lfs merge=lfs -text
84
+ llama.cpp/build/bin/test-tokenizer-0 filter=lfs diff=lfs merge=lfs -text
85
+ llama.cpp/build/bin/test-tokenizer-1-bpe filter=lfs diff=lfs merge=lfs -text
86
+ llama.cpp/build/bin/test-tokenizer-1-spm filter=lfs diff=lfs merge=lfs -text
87
+ llama.cpp/build/common/CMakeFiles/common.dir/arg.cpp.o filter=lfs diff=lfs merge=lfs -text
88
+ llama.cpp/build/common/CMakeFiles/common.dir/chat-parser.cpp.o filter=lfs diff=lfs merge=lfs -text
89
+ llama.cpp/build/common/CMakeFiles/common.dir/chat.cpp.o filter=lfs diff=lfs merge=lfs -text
90
+ llama.cpp/build/common/CMakeFiles/common.dir/common.cpp.o filter=lfs diff=lfs merge=lfs -text
91
+ llama.cpp/build/common/CMakeFiles/common.dir/json-partial.cpp.o filter=lfs diff=lfs merge=lfs -text
92
+ llama.cpp/build/common/CMakeFiles/common.dir/json-schema-to-grammar.cpp.o filter=lfs diff=lfs merge=lfs -text
93
+ llama.cpp/build/common/CMakeFiles/common.dir/regex-partial.cpp.o filter=lfs diff=lfs merge=lfs -text
94
+ llama.cpp/build/common/libcommon.a filter=lfs diff=lfs merge=lfs -text
95
+ llama.cpp/build/ggml/src/CMakeFiles/ggml-base.dir/ggml-quants.c.o filter=lfs diff=lfs merge=lfs -text
96
+ llama.cpp/build/ggml/src/CMakeFiles/ggml-base.dir/ggml.c.o filter=lfs diff=lfs merge=lfs -text
97
+ llama.cpp/build/ggml/src/CMakeFiles/ggml-base.dir/gguf.cpp.o filter=lfs diff=lfs merge=lfs -text
98
+ llama.cpp/build/ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/ggml-cpu-aarch64.cpp.o filter=lfs diff=lfs merge=lfs -text
99
+ llama.cpp/build/ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/llamafile/sgemm.cpp.o filter=lfs diff=lfs merge=lfs -text
100
+ llama.cpp/build/ggml/src/CMakeFiles/ggml-cpu.dir/ggml-cpu/ops.cpp.o filter=lfs diff=lfs merge=lfs -text
101
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-arch.cpp.o filter=lfs diff=lfs merge=lfs -text
102
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-context.cpp.o filter=lfs diff=lfs merge=lfs -text
103
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-grammar.cpp.o filter=lfs diff=lfs merge=lfs -text
104
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-kv-cache.cpp.o filter=lfs diff=lfs merge=lfs -text
105
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-model-loader.cpp.o filter=lfs diff=lfs merge=lfs -text
106
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-model.cpp.o filter=lfs diff=lfs merge=lfs -text
107
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-quant.cpp.o filter=lfs diff=lfs merge=lfs -text
108
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-sampling.cpp.o filter=lfs diff=lfs merge=lfs -text
109
+ llama.cpp/build/src/CMakeFiles/llama.dir/llama-vocab.cpp.o filter=lfs diff=lfs merge=lfs -text
110
+ llama.cpp/build/src/CMakeFiles/llama.dir/unicode.cpp.o filter=lfs diff=lfs merge=lfs -text
111
+ llama.cpp/build/tests/CMakeFiles/test-backend-ops.dir/test-backend-ops.cpp.o filter=lfs diff=lfs merge=lfs -text
112
+ llama.cpp/build/tests/CMakeFiles/test-chat-parser.dir/test-chat-parser.cpp.o filter=lfs diff=lfs merge=lfs -text
113
+ llama.cpp/build/tests/CMakeFiles/test-chat-template.dir/test-chat-template.cpp.o filter=lfs diff=lfs merge=lfs -text
114
+ llama.cpp/build/tests/CMakeFiles/test-chat.dir/test-chat.cpp.o filter=lfs diff=lfs merge=lfs -text
115
+ llama.cpp/build/tests/CMakeFiles/test-grammar-integration.dir/test-grammar-integration.cpp.o filter=lfs diff=lfs merge=lfs -text
116
+ llama.cpp/build/tests/CMakeFiles/test-json-partial.dir/test-json-partial.cpp.o filter=lfs diff=lfs merge=lfs -text
117
+ llama.cpp/build/tests/CMakeFiles/test-json-schema-to-grammar.dir/test-json-schema-to-grammar.cpp.o filter=lfs diff=lfs merge=lfs -text
118
+ llama.cpp/build/tests/CMakeFiles/test-quantize-stats.dir/test-quantize-stats.cpp.o filter=lfs diff=lfs merge=lfs -text
119
+ llama.cpp/build/tools/imatrix/CMakeFiles/llama-imatrix.dir/imatrix.cpp.o filter=lfs diff=lfs merge=lfs -text
120
+ llama.cpp/build/tools/llama-bench/CMakeFiles/llama-bench.dir/llama-bench.cpp.o filter=lfs diff=lfs merge=lfs -text
121
+ llama.cpp/build/tools/main/CMakeFiles/llama-cli.dir/main.cpp.o filter=lfs diff=lfs merge=lfs -text
122
+ llama.cpp/build/tools/mtmd/CMakeFiles/mtmd.dir/clip.cpp.o filter=lfs diff=lfs merge=lfs -text
123
+ llama.cpp/build/tools/mtmd/CMakeFiles/mtmd.dir/mtmd.cpp.o filter=lfs diff=lfs merge=lfs -text
124
+ llama.cpp/build/tools/mtmd/CMakeFiles/mtmd_helper.dir/mtmd-helper.cpp.o filter=lfs diff=lfs merge=lfs -text
125
+ llama.cpp/build/tools/perplexity/CMakeFiles/llama-perplexity.dir/perplexity.cpp.o filter=lfs diff=lfs merge=lfs -text
126
+ llama.cpp/build/tools/run/CMakeFiles/llama-run.dir/run.cpp.o filter=lfs diff=lfs merge=lfs -text
127
+ llama.cpp/build/tools/server/CMakeFiles/llama-server.dir/server.cpp.o filter=lfs diff=lfs merge=lfs -text
128
+ llama.cpp/build/tools/tts/CMakeFiles/llama-tts.dir/tts.cpp.o filter=lfs diff=lfs merge=lfs -text
129
+ llama.cpp/docs/development/llama-star/idea-arch.key filter=lfs diff=lfs merge=lfs -text
130
+ llama.cpp/gguf-py/gguf/__pycache__/quants.cpython-311.pyc filter=lfs diff=lfs merge=lfs -text
131
+ llama.cpp/media/llama0-banner.png filter=lfs diff=lfs merge=lfs -text
132
+ llama.cpp/media/llama0-logo.png filter=lfs diff=lfs merge=lfs -text
133
+ llama.cpp/media/matmul.png filter=lfs diff=lfs merge=lfs -text
134
+ llama.cpp/models/ggml-vocab-aquila.gguf filter=lfs diff=lfs merge=lfs -text
135
+ llama.cpp/models/ggml-vocab-baichuan.gguf filter=lfs diff=lfs merge=lfs -text
136
+ llama.cpp/models/ggml-vocab-bert-bge.gguf filter=lfs diff=lfs merge=lfs -text
137
+ llama.cpp/models/ggml-vocab-command-r.gguf filter=lfs diff=lfs merge=lfs -text
138
+ llama.cpp/models/ggml-vocab-deepseek-coder.gguf filter=lfs diff=lfs merge=lfs -text
139
+ llama.cpp/models/ggml-vocab-deepseek-llm.gguf filter=lfs diff=lfs merge=lfs -text
140
+ llama.cpp/models/ggml-vocab-falcon.gguf filter=lfs diff=lfs merge=lfs -text
141
+ llama.cpp/models/ggml-vocab-gpt-2.gguf filter=lfs diff=lfs merge=lfs -text
142
+ llama.cpp/models/ggml-vocab-gpt-neox.gguf filter=lfs diff=lfs merge=lfs -text
143
+ llama.cpp/models/ggml-vocab-llama-bpe.gguf filter=lfs diff=lfs merge=lfs -text
144
+ llama.cpp/models/ggml-vocab-llama-spm.gguf filter=lfs diff=lfs merge=lfs -text
145
+ llama.cpp/models/ggml-vocab-mpt.gguf filter=lfs diff=lfs merge=lfs -text
146
+ llama.cpp/models/ggml-vocab-nomic-bert-moe.gguf filter=lfs diff=lfs merge=lfs -text
147
+ llama.cpp/models/ggml-vocab-phi-3.gguf filter=lfs diff=lfs merge=lfs -text
148
+ llama.cpp/models/ggml-vocab-qwen2.gguf filter=lfs diff=lfs merge=lfs -text
149
+ llama.cpp/models/ggml-vocab-refact.gguf filter=lfs diff=lfs merge=lfs -text
150
+ llama.cpp/models/ggml-vocab-starcoder.gguf filter=lfs diff=lfs merge=lfs -text
151
+ llama.cpp/tools/mtmd/test-1.jpeg filter=lfs diff=lfs merge=lfs -text
152
+ llama.cpp/tools/mtmd/test-2.mp3 filter=lfs diff=lfs merge=lfs -text
153
+ llama.cpp/tools/server/themes/buttons-top/buttons_top.png filter=lfs diff=lfs merge=lfs -text
154
+ llama.cpp/tools/server/themes/wild/llamapattern.png filter=lfs diff=lfs merge=lfs -text
155
+ llama.cpp/tools/server/themes/wild/wild.png filter=lfs diff=lfs merge=lfs -text
156
+ qwen3_1.7b_hf/tokenizer.json filter=lfs diff=lfs merge=lfs -text
157
+ qwen3_1.7b_q4_0.gguf filter=lfs diff=lfs merge=lfs -text
158
+ qwen3_8b.gguf filter=lfs diff=lfs merge=lfs -text
159
+ sample_data/mnist_test.csv filter=lfs diff=lfs merge=lfs -text
160
+ sample_data/mnist_train_small.csv filter=lfs diff=lfs merge=lfs -text
llama.cpp/.clang-format ADDED
@@ -0,0 +1,161 @@
1
+ ---
2
+ Language: Cpp
3
+ AlignAfterOpenBracket: Align
4
+ AlignArrayOfStructures: Left
5
+ AlignConsecutiveAssignments: AcrossComments
6
+ AlignConsecutiveBitFields: AcrossComments
7
+ AlignConsecutiveDeclarations: AcrossComments
8
+ AlignConsecutiveMacros: AcrossComments
9
+ # AlignConsecutiveShortCaseStatements: AcrossComments
10
+ AlignEscapedNewlines: Left # LeftWithLastLine
11
+ AlignOperands: Align
12
+ AlignTrailingComments:
13
+ Kind: Always
14
+ OverEmptyLines: 1
15
+ AllowAllArgumentsOnNextLine: true
16
+ AllowAllParametersOfDeclarationOnNextLine: false
17
+ # AllowBreakBeforeNoexceptSpecifier: OnlyWithParen
18
+ AllowShortBlocksOnASingleLine: Never
19
+ AllowShortCaseLabelsOnASingleLine: false
20
+ AllowShortFunctionsOnASingleLine: Inline
21
+ AllowShortIfStatementsOnASingleLine: Never
22
+ AllowShortLambdasOnASingleLine: Inline
23
+ AllowShortLoopsOnASingleLine: false
24
+ AlwaysBreakBeforeMultilineStrings: true
25
+ BinPackArguments: true
26
+ BinPackParameters: true # OnePerLine
27
+ BitFieldColonSpacing: Both
28
+ BreakBeforeBraces: Custom # Attach
29
+ BraceWrapping:
30
+ AfterCaseLabel: true
31
+ AfterClass: false
32
+ AfterControlStatement: false
33
+ AfterEnum: false
34
+ AfterFunction: false
35
+ AfterNamespace: false
36
+ AfterObjCDeclaration: false
37
+ AfterStruct: false
38
+ AfterUnion: false
39
+ AfterExternBlock: false
40
+ BeforeCatch: false
41
+ BeforeElse: false
42
+ BeforeLambdaBody: false
43
+ BeforeWhile: false
44
+ IndentBraces: false
45
+ SplitEmptyFunction: false
46
+ SplitEmptyRecord: false
47
+ SplitEmptyNamespace: false
48
+ # BreakAdjacentStringLiterals: true
49
+ BreakAfterAttributes: Never
50
+ BreakBeforeBinaryOperators: None
51
+ BreakBeforeInlineASMColon: OnlyMultiline
52
+ BreakBeforeTernaryOperators: false
53
+ # BreakBinaryOperations: Never
54
+ BreakConstructorInitializers: AfterColon
55
+ # BreakFunctionDefinitionParameters: false
56
+ BreakInheritanceList: AfterComma
57
+ BreakStringLiterals: true
58
+ # BreakTemplateDeclarations: Yes
59
+ ColumnLimit: 120
60
+ CommentPragmas: '^ IWYU pragma:'
61
+ CompactNamespaces: false
62
+ ConstructorInitializerIndentWidth: 4
63
+ ContinuationIndentWidth: 4
64
+ Cpp11BracedListStyle: false
65
+ DerivePointerAlignment: false
66
+ DisableFormat: false
67
+ EmptyLineBeforeAccessModifier: Leave
68
+ EmptyLineAfterAccessModifier: Never
69
+ ExperimentalAutoDetectBinPacking: false
70
+ FixNamespaceComments: true
71
+ IncludeBlocks: Regroup
72
+ IncludeCategories:
73
+ - Regex: '^<.*\.h>'
74
+ Priority: 1
75
+ SortPriority: 0
76
+ - Regex: '^<.*'
77
+ Priority: 2
78
+ SortPriority: 0
79
+ - Regex: '.*'
80
+ Priority: 3
81
+ SortPriority: 0
82
+ IncludeIsMainRegex: '([-_](test|unittest))?$'
83
+ IncludeIsMainSourceRegex: ''
84
+ IndentAccessModifiers: false
85
+ IndentCaseBlocks: true
86
+ IndentCaseLabels: true
87
+ IndentExternBlock: NoIndent
88
+ IndentGotoLabels: false
89
+ IndentPPDirectives: AfterHash
90
+ IndentWidth: 4
91
+ IndentWrappedFunctionNames: false
92
+ InsertBraces: true # NOTE: may lead to incorrect formatting
93
+ InsertNewlineAtEOF: true
94
+ JavaScriptQuotes: Leave
95
+ JavaScriptWrapImports: true
96
+ KeepEmptyLinesAtTheStartOfBlocks: false
97
+ LambdaBodyIndentation: Signature
98
+ LineEnding: LF
99
+ MacroBlockBegin: ''
100
+ MacroBlockEnd: ''
101
+ MaxEmptyLinesToKeep: 1
102
+ NamespaceIndentation: None
103
+ ObjCBinPackProtocolList: Auto
104
+ ObjCBlockIndentWidth: 4
105
+ ObjCSpaceAfterProperty: true
106
+ ObjCSpaceBeforeProtocolList: true
107
+ PPIndentWidth: -1
108
+ PackConstructorInitializers: CurrentLine
109
+ PenaltyBreakAssignment: 2
110
+ PenaltyBreakBeforeFirstCallParameter: 1
111
+ PenaltyBreakComment: 300
112
+ PenaltyBreakFirstLessLess: 120
113
+ PenaltyBreakString: 1000
114
+ PenaltyBreakTemplateDeclaration: 10
115
+ PenaltyExcessCharacter: 1000000
116
+ PenaltyReturnTypeOnItsOwnLine: 200
117
+ PointerAlignment: Middle
118
+ QualifierAlignment: Left
119
+ #QualifierOrder: ['static', 'inline', 'friend', 'constexpr', 'const', 'volatile', 'type', 'restrict']
120
+ RawStringFormats:
121
+ - Language: Cpp
122
+ Delimiters:
123
+ - cc
124
+ - CC
125
+ - cpp
126
+ - Cpp
127
+ - CPP
128
+ - 'c++'
129
+ - 'C++'
130
+ CanonicalDelimiter: ''
131
+ ReferenceAlignment: Middle
132
+ ReflowComments: false # IndentOnly
133
+ SeparateDefinitionBlocks: Always
134
+ SortIncludes: CaseInsensitive
135
+ SortUsingDeclarations: LexicographicNumeric
136
+ SpaceAfterCStyleCast: true
137
+ SpaceAfterLogicalNot: false
138
+ SpaceAfterTemplateKeyword: true
139
+ SpaceBeforeAssignmentOperators: true
140
+ SpaceBeforeCpp11BracedList: false
141
+ SpaceBeforeCtorInitializerColon: true
142
+ SpaceBeforeInheritanceColon: true
143
+ SpaceBeforeParens: ControlStatements
144
+ SpaceBeforeRangeBasedForLoopColon: true
145
+ SpaceInEmptyBlock: false
146
+ SpaceInEmptyParentheses: false
147
+ SpacesBeforeTrailingComments: 2
148
+ SpacesInAngles: Never
149
+ SpacesInContainerLiterals: true
150
+ SpacesInLineCommentPrefix:
151
+ Minimum: 1
152
+ Maximum: -1
153
+ SpacesInParentheses: false
154
+ SpacesInSquareBrackets: false
155
+ SpaceBeforeSquareBrackets: false
156
+ Standard: c++17
157
+ TabWidth: 4
158
+ UseTab: Never
159
+ WhitespaceSensitiveMacros: ['STRINGIZE']
160
+ ...
161
+
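Note: a minimal usage sketch for the style file above, assuming a reasonably recent clang-format is installed (the commented-out keys such as AlignConsecutiveShortCaseStatements hint that some options require newer releases). The file path is illustrative:

    # Reformat one source file in place; clang-format walks up from the
    # file and picks up the nearest .clang-format, i.e. this one.
    clang-format -i src/llama.cpp
    # Or, with git-clang-format available, reformat only the lines you changed.
    git clang-format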
llama.cpp/.clang-tidy ADDED
@@ -0,0 +1,27 @@
1
+ ---
2
+ Checks: >
3
+ bugprone-*,
4
+ -bugprone-easily-swappable-parameters,
5
+ -bugprone-implicit-widening-of-multiplication-result,
6
+ -bugprone-misplaced-widening-cast,
7
+ -bugprone-narrowing-conversions,
8
+ readability-*,
9
+ -readability-avoid-unconditional-preprocessor-if,
10
+ -readability-function-cognitive-complexity,
11
+ -readability-identifier-length,
12
+ -readability-implicit-bool-conversion,
13
+ -readability-magic-numbers,
14
+ -readability-uppercase-literal-suffix,
15
+ -readability-simplify-boolean-expr,
16
+ -readability-math-missing-parentheses,
17
+ clang-analyzer-*,
18
+ -clang-analyzer-security.insecureAPI.DeprecatedOrUnsafeBufferHandling,
19
+ performance-*,
20
+ portability-*,
21
+ -portability-simd-intrinsics,
22
+ misc-*,
23
+ -misc-const-correctness,
24
+ -misc-non-private-member-variables-in-classes,
25
+ -misc-no-recursion,
26
+ -misc-use-anonymous-namespace,
27
+ FormatStyle: none
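Note: a hedged sketch of running these checks, assuming LLVM's run-clang-tidy wrapper is installed and the CMake build tree exports a compilation database; the target file is illustrative:

    # Generate compile_commands.json so clang-tidy knows each file's flags.
    cmake -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
    # Apply the checks configured in .clang-tidy to one translation unit.
    run-clang-tidy -p build src/llama.cpp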
llama.cpp/.devops/cloud-v-pipeline ADDED
@@ -0,0 +1,22 @@
1
+ node('x86_runner1'){ // Running on x86 runner containing latest vector qemu, latest vector gcc and all the necessary libraries
2
+ stage('Cleanup'){
3
+ cleanWs() // Cleaning previous CI build in workspace
4
+ }
5
+ stage('checkout repo'){
6
+ retry(5){ // Retry if the cloning fails due to some reason
7
+ checkout scm // Clone the repo on Runner
8
+ }
9
+ }
10
+ stage('Compiling llama.cpp'){
11
+ sh'''#!/bin/bash
12
+ make RISCV=1 RISCV_CROSS_COMPILE=1 # Compiling llama for RISC-V
13
+ '''
14
+ }
15
+ stage('Running llama.cpp'){
16
+ sh'''#!/bin/bash
17
+ module load gnu-bin2/0.1 # loading latest versions of vector qemu and vector gcc
18
+ qemu-riscv64 -L /softwares/gnu-bin2/sysroot -cpu rv64,v=true,vlen=256,elen=64,vext_spec=v1.0 ./llama-cli -m /home/alitariq/codellama-7b.Q4_K_M.gguf -p "Anything" -n 9 > llama_log.txt # Running llama.cpp on vector qemu-riscv64
19
+ cat llama_log.txt # Printing results
20
+ '''
21
+ }
22
+ }
llama.cpp/.devops/cpu.Dockerfile ADDED
@@ -0,0 +1,92 @@
1
+ ARG UBUNTU_VERSION=22.04
2
+
3
+ FROM ubuntu:$UBUNTU_VERSION AS build
4
+
5
+ ARG TARGETARCH
6
+
7
+ ARG GGML_CPU_ARM_ARCH=armv8-a
8
+
9
+ RUN apt-get update && \
10
+ apt-get install -y build-essential git cmake libcurl4-openssl-dev
11
+
12
+ WORKDIR /app
13
+
14
+ COPY . .
15
+
16
+ RUN if [ "$TARGETARCH" = "amd64" ]; then \
17
+ cmake -S . -B build -DCMAKE_BUILD_TYPE=Release -DGGML_NATIVE=OFF -DLLAMA_BUILD_TESTS=OFF -DGGML_BACKEND_DL=ON -DGGML_CPU_ALL_VARIANTS=ON; \
18
+ elif [ "$TARGETARCH" = "arm64" ]; then \
19
+ cmake -S . -B build -DCMAKE_BUILD_TYPE=Release -DGGML_NATIVE=OFF -DLLAMA_BUILD_TESTS=OFF -DGGML_CPU_ARM_ARCH=${GGML_CPU_ARM_ARCH}; \
20
+ else \
21
+ echo "Unsupported architecture"; \
22
+ exit 1; \
23
+ fi && \
24
+ cmake --build build -j $(nproc)
25
+
26
+ RUN mkdir -p /app/lib && \
27
+ find build -name "*.so" -exec cp {} /app/lib \;
28
+
29
+ RUN mkdir -p /app/full \
30
+ && cp build/bin/* /app/full \
31
+ && cp *.py /app/full \
32
+ && cp -r gguf-py /app/full \
33
+ && cp -r requirements /app/full \
34
+ && cp requirements.txt /app/full \
35
+ && cp .devops/tools.sh /app/full/tools.sh
36
+
37
+ ## Base image
38
+ FROM ubuntu:$UBUNTU_VERSION AS base
39
+
40
+ RUN apt-get update \
41
+ && apt-get install -y libgomp1 curl\
42
+ && apt autoremove -y \
43
+ && apt clean -y \
44
+ && rm -rf /tmp/* /var/tmp/* \
45
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
46
+ && find /var/cache -type f -delete
47
+
48
+ COPY --from=build /app/lib/ /app
49
+
50
+ ### Full
51
+ FROM base AS full
52
+
53
+ COPY --from=build /app/full /app
54
+
55
+ WORKDIR /app
56
+
57
+ RUN apt-get update \
58
+ && apt-get install -y \
59
+ git \
60
+ python3 \
61
+ python3-pip \
62
+ && pip install --upgrade pip setuptools wheel \
63
+ && pip install -r requirements.txt \
64
+ && apt autoremove -y \
65
+ && apt clean -y \
66
+ && rm -rf /tmp/* /var/tmp/* \
67
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
68
+ && find /var/cache -type f -delete
69
+
70
+ ENTRYPOINT ["/app/tools.sh"]
71
+
72
+ ### Light, CLI only
73
+ FROM base AS light
74
+
75
+ COPY --from=build /app/full/llama-cli /app
76
+
77
+ WORKDIR /app
78
+
79
+ ENTRYPOINT [ "/app/llama-cli" ]
80
+
81
+ ### Server, Server only
82
+ FROM base AS server
83
+
84
+ ENV LLAMA_ARG_HOST=0.0.0.0
85
+
86
+ COPY --from=build /app/full/llama-server /app
87
+
88
+ WORKDIR /app
89
+
90
+ HEALTHCHECK CMD [ "curl", "-f", "http://localhost:8080/health" ]
91
+
92
+ ENTRYPOINT [ "/app/llama-server" ]
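Note: a hedged sketch of building and running the server stage above from the repository root; the image tag and model path are illustrative, not defined by the file:

    # Build only the `server` stage of the multi-stage image.
    docker build -t llama-cpu-server --target server -f .devops/cpu.Dockerfile .
    # llama-server listens on port 8080 by default; mount a local GGUF model.
    docker run -p 8080:8080 -v "$PWD/models:/models" llama-cpu-server -m /models/model.gguf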
llama.cpp/.devops/cuda.Dockerfile ADDED
@@ -0,0 +1,94 @@
1
+ ARG UBUNTU_VERSION=22.04
2
+ # This needs to generally match the container host's environment.
3
+ ARG CUDA_VERSION=12.4.0
4
+ # Target the CUDA build image
5
+ ARG BASE_CUDA_DEV_CONTAINER=nvidia/cuda:${CUDA_VERSION}-devel-ubuntu${UBUNTU_VERSION}
6
+
7
+ ARG BASE_CUDA_RUN_CONTAINER=nvidia/cuda:${CUDA_VERSION}-runtime-ubuntu${UBUNTU_VERSION}
8
+
9
+ FROM ${BASE_CUDA_DEV_CONTAINER} AS build
10
+
11
+ # CUDA architecture to build for (defaults to all supported archs)
12
+ ARG CUDA_DOCKER_ARCH=default
13
+
14
+ RUN apt-get update && \
15
+ apt-get install -y build-essential cmake python3 python3-pip git libcurl4-openssl-dev libgomp1
16
+
17
+ WORKDIR /app
18
+
19
+ COPY . .
20
+
21
+ RUN if [ "${CUDA_DOCKER_ARCH}" != "default" ]; then \
22
+ export CMAKE_ARGS="-DCMAKE_CUDA_ARCHITECTURES=${CUDA_DOCKER_ARCH}"; \
23
+ fi && \
24
+ cmake -B build -DGGML_NATIVE=OFF -DGGML_CUDA=ON -DGGML_BACKEND_DL=ON -DGGML_CPU_ALL_VARIANTS=ON -DLLAMA_BUILD_TESTS=OFF ${CMAKE_ARGS} -DCMAKE_EXE_LINKER_FLAGS=-Wl,--allow-shlib-undefined . && \
25
+ cmake --build build --config Release -j$(nproc)
26
+
27
+ RUN mkdir -p /app/lib && \
28
+ find build -name "*.so" -exec cp {} /app/lib \;
29
+
30
+ RUN mkdir -p /app/full \
31
+ && cp build/bin/* /app/full \
32
+ && cp *.py /app/full \
33
+ && cp -r gguf-py /app/full \
34
+ && cp -r requirements /app/full \
35
+ && cp requirements.txt /app/full \
36
+ && cp .devops/tools.sh /app/full/tools.sh
37
+
38
+ ## Base image
39
+ FROM ${BASE_CUDA_RUN_CONTAINER} AS base
40
+
41
+ RUN apt-get update \
42
+ && apt-get install -y libgomp1 curl\
43
+ && apt autoremove -y \
44
+ && apt clean -y \
45
+ && rm -rf /tmp/* /var/tmp/* \
46
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
47
+ && find /var/cache -type f -delete
48
+
49
+ COPY --from=build /app/lib/ /app
50
+
51
+ ### Full
52
+ FROM base AS full
53
+
54
+ COPY --from=build /app/full /app
55
+
56
+ WORKDIR /app
57
+
58
+ RUN apt-get update \
59
+ && apt-get install -y \
60
+ git \
61
+ python3 \
62
+ python3-pip \
63
+ && pip install --upgrade pip setuptools wheel \
64
+ && pip install -r requirements.txt \
65
+ && apt autoremove -y \
66
+ && apt clean -y \
67
+ && rm -rf /tmp/* /var/tmp/* \
68
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
69
+ && find /var/cache -type f -delete
70
+
71
+
72
+ ENTRYPOINT ["/app/tools.sh"]
73
+
74
+ ### Light, CLI only
75
+ FROM base AS light
76
+
77
+ COPY --from=build /app/full/llama-cli /app
78
+
79
+ WORKDIR /app
80
+
81
+ ENTRYPOINT [ "/app/llama-cli" ]
82
+
83
+ ### Server, Server only
84
+ FROM base AS server
85
+
86
+ ENV LLAMA_ARG_HOST=0.0.0.0
87
+
88
+ COPY --from=build /app/full/llama-server /app
89
+
90
+ WORKDIR /app
91
+
92
+ HEALTHCHECK CMD [ "curl", "-f", "http://localhost:8080/health" ]
93
+
94
+ ENTRYPOINT [ "/app/llama-server" ]
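Note: the CUDA image builds the same way; a sketch that pins one compute capability via CUDA_DOCKER_ARCH to shorten the build. The value 86, the tag, and the use of --gpus all (which assumes the NVIDIA Container Toolkit on the host) are illustrative:

    docker build -t llama-cuda-server --target server \
        --build-arg CUDA_DOCKER_ARCH=86 -f .devops/cuda.Dockerfile .
    docker run --gpus all -p 8080:8080 -v "$PWD/models:/models" llama-cuda-server -m /models/model.gguf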
llama.cpp/.devops/intel.Dockerfile ADDED
@@ -0,0 +1,91 @@
1
+ ARG ONEAPI_VERSION=2025.1.1-0-devel-ubuntu24.04
2
+
3
+ ## Build Image
4
+
5
+ FROM intel/oneapi-basekit:$ONEAPI_VERSION AS build
6
+
7
+ ARG GGML_SYCL_F16=OFF
8
+ RUN apt-get update && \
9
+ apt-get install -y git libcurl4-openssl-dev
10
+
11
+ WORKDIR /app
12
+
13
+ COPY . .
14
+
15
+ RUN if [ "${GGML_SYCL_F16}" = "ON" ]; then \
16
+ echo "GGML_SYCL_F16 is set" \
17
+ && export OPT_SYCL_F16="-DGGML_SYCL_F16=ON"; \
18
+ fi && \
19
+ echo "Building with dynamic libs" && \
20
+ cmake -B build -DGGML_NATIVE=OFF -DGGML_SYCL=ON -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DGGML_BACKEND_DL=ON -DGGML_CPU_ALL_VARIANTS=ON -DLLAMA_BUILD_TESTS=OFF ${OPT_SYCL_F16} && \
21
+ cmake --build build --config Release -j$(nproc)
22
+
23
+ RUN mkdir -p /app/lib && \
24
+ find build -name "*.so" -exec cp {} /app/lib \;
25
+
26
+ RUN mkdir -p /app/full \
27
+ && cp build/bin/* /app/full \
28
+ && cp *.py /app/full \
29
+ && cp -r gguf-py /app/full \
30
+ && cp -r requirements /app/full \
31
+ && cp requirements.txt /app/full \
32
+ && cp .devops/tools.sh /app/full/tools.sh
33
+
34
+ FROM intel/oneapi-basekit:$ONEAPI_VERSION AS base
35
+
36
+ RUN apt-get update \
37
+ && apt-get install -y libgomp1 curl\
38
+ && apt autoremove -y \
39
+ && apt clean -y \
40
+ && rm -rf /tmp/* /var/tmp/* \
41
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
42
+ && find /var/cache -type f -delete
43
+
44
+ ### Full
45
+ FROM base AS full
46
+
47
+ COPY --from=build /app/lib/ /app
48
+ COPY --from=build /app/full /app
49
+
50
+ WORKDIR /app
51
+
52
+ RUN apt-get update \
53
+ && apt-get install -y \
54
+ git \
55
+ python3 \
56
+ python3-pip \
57
+ && pip install --upgrade pip setuptools wheel \
58
+ && pip install -r requirements.txt \
59
+ && apt autoremove -y \
60
+ && apt clean -y \
61
+ && rm -rf /tmp/* /var/tmp/* \
62
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
63
+ && find /var/cache -type f -delete
64
+
65
+
66
+ ENTRYPOINT ["/app/tools.sh"]
67
+
68
+ ### Light, CLI only
69
+ FROM base AS light
70
+
71
+ COPY --from=build /app/lib/ /app
72
+ COPY --from=build /app/full/llama-cli /app
73
+
74
+ WORKDIR /app
75
+
76
+ ENTRYPOINT [ "/app/llama-cli" ]
77
+
78
+ ### Server, Server only
79
+ FROM base AS server
80
+
81
+ ENV LLAMA_ARG_HOST=0.0.0.0
82
+
83
+ COPY --from=build /app/lib/ /app
84
+ COPY --from=build /app/full/llama-server /app
85
+
86
+ WORKDIR /app
87
+
88
+ HEALTHCHECK CMD [ "curl", "-f", "http://localhost:8080/health" ]
89
+
90
+ ENTRYPOINT [ "/app/llama-server" ]
91
+
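Note: a hedged sketch for the SYCL image, enabling FP16 through the build arg declared above; exposing /dev/dri is the usual way to hand an Intel GPU to the container, but details vary by host, and the tag is illustrative:

    docker build -t llama-intel-server --target server \
        --build-arg GGML_SYCL_F16=ON -f .devops/intel.Dockerfile .
    docker run --device /dev/dri -p 8080:8080 -v "$PWD/models:/models" llama-intel-server -m /models/model.gguf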
llama.cpp/.devops/llama-cli-cann.Dockerfile ADDED
@@ -0,0 +1,44 @@
1
+ ARG ASCEND_VERSION=8.1.RC1.alpha001-910b-openeuler22.03-py3.10
2
+
3
+ FROM ascendai/cann:$ASCEND_VERSION AS build
4
+
5
+ WORKDIR /app
6
+
7
+ COPY . .
8
+
9
+ RUN yum install -y gcc g++ cmake make libcurl-devel
10
+ ENV ASCEND_TOOLKIT_HOME=/usr/local/Ascend/ascend-toolkit/latest
11
+ ENV LIBRARY_PATH=${ASCEND_TOOLKIT_HOME}/lib64:$LIBRARY_PATH
12
+ ENV LD_LIBRARY_PATH=${ASCEND_TOOLKIT_HOME}/lib64:${ASCEND_TOOLKIT_HOME}/lib64/plugin/opskernel:${ASCEND_TOOLKIT_HOME}/lib64/plugin/nnengine:${ASCEND_TOOLKIT_HOME}/opp/built-in/op_impl/ai_core/tbe/op_tiling:${LD_LIBRARY_PATH}
13
+ ENV PYTHONPATH=${ASCEND_TOOLKIT_HOME}/python/site-packages:${ASCEND_TOOLKIT_HOME}/opp/built-in/op_impl/ai_core/tbe:${PYTHONPATH}
14
+ ENV PATH=${ASCEND_TOOLKIT_HOME}/bin:${ASCEND_TOOLKIT_HOME}/compiler/ccec_compiler/bin:${PATH}
15
+ ENV ASCEND_AICPU_PATH=${ASCEND_TOOLKIT_HOME}
16
+ ENV ASCEND_OPP_PATH=${ASCEND_TOOLKIT_HOME}/opp
17
+ ENV TOOLCHAIN_HOME=${ASCEND_TOOLKIT_HOME}/toolkit
18
+ ENV ASCEND_HOME_PATH=${ASCEND_TOOLKIT_HOME}
19
+
20
+ # find libascend_hal.so, because the driver hasn't been mounted at build time.
21
+ ENV LD_LIBRARY_PATH=${ASCEND_TOOLKIT_HOME}/runtime/lib64/stub:$LD_LIBRARY_PATH
22
+
23
+ RUN echo "Building with static libs" && \
24
+ source /usr/local/Ascend/ascend-toolkit/set_env.sh --force && \
25
+ cmake -B build -DGGML_NATIVE=OFF -DGGML_CANN=ON -DBUILD_SHARED_LIBS=OFF -DLLAMA_BUILD_TESTS=OFF && \
26
+ cmake --build build --config Release --target llama-cli
27
+
28
+ # TODO: use image with NNRT
29
+ FROM ascendai/cann:$ASCEND_VERSION AS runtime
30
+ COPY --from=build /app/build/bin/llama-cli /llama-cli
31
+
32
+ ENV LC_ALL=C.utf8
33
+
34
+ ENV ASCEND_TOOLKIT_HOME=/usr/local/Ascend/ascend-toolkit/latest
35
+ ENV LIBRARY_PATH=${ASCEND_TOOLKIT_HOME}/lib64:$LIBRARY_PATH
36
+ ENV LD_LIBRARY_PATH=${ASCEND_TOOLKIT_HOME}/lib64:${ASCEND_TOOLKIT_HOME}/lib64/plugin/opskernel:${ASCEND_TOOLKIT_HOME}/lib64/plugin/nnengine:${ASCEND_TOOLKIT_HOME}/opp/built-in/op_impl/ai_core/tbe/op_tiling:${LD_LIBRARY_PATH}
37
+ ENV PYTHONPATH=${ASCEND_TOOLKIT_HOME}/python/site-packages:${ASCEND_TOOLKIT_HOME}/opp/built-in/op_impl/ai_core/tbe:${PYTHONPATH}
38
+ ENV PATH=${ASCEND_TOOLKIT_HOME}/bin:${ASCEND_TOOLKIT_HOME}/compiler/ccec_compiler/bin:${PATH}
39
+ ENV ASCEND_AICPU_PATH=${ASCEND_TOOLKIT_HOME}
40
+ ENV ASCEND_OPP_PATH=${ASCEND_TOOLKIT_HOME}/opp
41
+ ENV TOOLCHAIN_HOME=${ASCEND_TOOLKIT_HOME}/toolkit
42
+ ENV ASCEND_HOME_PATH=${ASCEND_TOOLKIT_HOME}
43
+
44
+ ENTRYPOINT ["/llama-cli" ]
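Note: a hedged build-and-run sketch; at run time the container needs the host's Ascend driver and NPU device nodes, which is exactly why the stub LD_LIBRARY_PATH entry exists at build time. The device list below is typical for Ascend hosts but is an assumption and must be adapted:

    docker build -t llama-cann -f .devops/llama-cli-cann.Dockerfile .
    docker run --device /dev/davinci0 --device /dev/davinci_manager --device /dev/devmm_svm \
        --device /dev/hisi_hdc -v /usr/local/Ascend/driver:/usr/local/Ascend/driver \
        -v "$PWD/models:/models" llama-cann -m /models/model.gguf -p "Hello"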
llama.cpp/.devops/llama-cpp-cuda.srpm.spec ADDED
@@ -0,0 +1,83 @@
1
+ # SRPM for building from source and packaging an RPM for RPM-based distros.
2
+ # https://docs.fedoraproject.org/en-US/quick-docs/creating-rpm-packages
3
+ # Built and maintained by John Boero - [email protected]
4
+ # In honor of Seth Vidal https://www.redhat.com/it/blog/thank-you-seth-vidal
5
+
6
+ # Notes for llama.cpp:
7
+ # 1. Tags are currently based on hash - which will not sort asciibetically.
8
+ # We need to declare standard versioning if people want to sort latest releases.
9
+ # 2. Builds for CUDA/OpenCL support are separate, with different dependencies.
10
+ # 3. NVidia's developer repo must be enabled with nvcc, cublas, clblas, etc installed.
11
+ # Example: https://developer.download.nvidia.com/compute/cuda/repos/fedora37/x86_64/cuda-fedora37.repo
12
+ # 4. OpenCL/CLBLAST support simply requires the ICD loader and basic opencl libraries.
13
+ # It is up to the user to install the correct vendor-specific support.
14
+
15
+ Name: llama.cpp-cuda
16
+ Version: %( date "+%%Y%%m%%d" )
17
+ Release: 1%{?dist}
18
+ Summary: CUDA Inference of LLaMA model in pure C/C++
19
+ License: MIT
20
+ Source0: https://github.com/ggml-org/llama.cpp/archive/refs/heads/master.tar.gz
21
+ BuildRequires: coreutils make gcc-c++ git cuda-toolkit
22
+ Requires: cuda-toolkit
23
+ URL: https://github.com/ggml-org/llama.cpp
24
+
25
+ %define debug_package %{nil}
26
+ %define source_date_epoch_from_changelog 0
27
+
28
+ %description
29
+ CUDA-accelerated inference for Meta's Llama 2 models using default options.
30
+
31
+ %prep
32
+ %setup -n llama.cpp-master
33
+
34
+ %build
35
+ make -j GGML_CUDA=1
36
+
37
+ %install
38
+ mkdir -p %{buildroot}%{_bindir}/
39
+ cp -p llama-cli %{buildroot}%{_bindir}/llama-cuda-cli
40
+ cp -p llama-server %{buildroot}%{_bindir}/llama-cuda-server
41
+ cp -p llama-simple %{buildroot}%{_bindir}/llama-cuda-simple
42
+
43
+ mkdir -p %{buildroot}/usr/lib/systemd/system
44
+ %{__cat} <<EOF > %{buildroot}/usr/lib/systemd/system/llamacuda.service
45
+ [Unit]
46
+ Description=Llama.cpp server, CUDA build.
47
+ After=syslog.target network.target local-fs.target remote-fs.target nss-lookup.target
48
+
49
+ [Service]
50
+ Type=simple
51
+ EnvironmentFile=/etc/sysconfig/llama
52
+ ExecStart=/usr/bin/llama-cuda-server $LLAMA_ARGS
53
+ ExecReload=/bin/kill -s HUP $MAINPID
54
+ Restart=never
55
+
56
+ [Install]
57
+ WantedBy=default.target
58
+ EOF
59
+
60
+ mkdir -p %{buildroot}/etc/sysconfig
61
+ %{__cat} <<EOF > %{buildroot}/etc/sysconfig/llama
62
+ LLAMA_ARGS="-m /opt/llama2/ggml-model-f32.bin"
63
+ EOF
64
+
65
+ %clean
66
+ rm -rf %{buildroot}
67
+ rm -rf %{_builddir}/*
68
+
69
+ %files
70
+ %{_bindir}/llama-cuda-cli
71
+ %{_bindir}/llama-cuda-server
72
+ %{_bindir}/llama-cuda-simple
73
+ /usr/lib/systemd/system/llamacuda.service
74
+ %config /etc/sysconfig/llama
75
+
76
+ %pre
77
+
78
+ %post
79
+
80
+ %preun
81
+ %postun
82
+
83
+ %changelog
llama.cpp/.devops/llama-cpp.srpm.spec ADDED
@@ -0,0 +1,85 @@
1
+ # SRPM for building from source and packaging an RPM for RPM-based distros.
2
+ # https://docs.fedoraproject.org/en-US/quick-docs/creating-rpm-packages
3
+ # Built and maintained by John Boero - [email protected]
4
+ # In honor of Seth Vidal https://www.redhat.com/it/blog/thank-you-seth-vidal
5
+
6
+ # Notes for llama.cpp:
7
+ # 1. Tags are currently based on hash - which will not sort asciibetically.
8
+ # We need to declare standard versioning if people want to sort latest releases.
9
+ # In the meantime, YYYYMMDD format will be used.
10
+ # 2. Builds for CUDA/OpenCL support are separate, with different dependencies.
11
+ # 3. NVidia's developer repo must be enabled with nvcc, cublas, clblas, etc installed.
12
+ # Example: https://developer.download.nvidia.com/compute/cuda/repos/fedora37/x86_64/cuda-fedora37.repo
13
+ # 4. OpenCL/CLBLAST support simply requires the ICD loader and basic opencl libraries.
14
+ # It is up to the user to install the correct vendor-specific support.
15
+
16
+ Name: llama.cpp
17
+ Version: %( date "+%%Y%%m%%d" )
18
+ Release: 1%{?dist}
19
+ Summary: CPU Inference of LLaMA model in pure C/C++ (no CUDA/OpenCL)
20
+ License: MIT
21
+ Source0: https://github.com/ggml-org/llama.cpp/archive/refs/heads/master.tar.gz
22
+ BuildRequires: coreutils make gcc-c++ git libstdc++-devel
23
+ Requires: libstdc++
24
+ URL: https://github.com/ggml-org/llama.cpp
25
+
26
+ %define debug_package %{nil}
27
+ %define source_date_epoch_from_changelog 0
28
+
29
+ %description
30
+ CPU inference for Meta's Llama 2 models using default options.
31
+ Models are not included in this package and must be downloaded separately.
32
+
33
+ %prep
34
+ %setup -n llama.cpp-master
35
+
36
+ %build
37
+ make -j
38
+
39
+ %install
40
+ mkdir -p %{buildroot}%{_bindir}/
41
+ cp -p llama-cli %{buildroot}%{_bindir}/llama-cli
42
+ cp -p llama-server %{buildroot}%{_bindir}/llama-server
43
+ cp -p llama-simple %{buildroot}%{_bindir}/llama-simple
44
+
45
+ mkdir -p %{buildroot}/usr/lib/systemd/system
46
+ %{__cat} <<EOF > %{buildroot}/usr/lib/systemd/system/llama.service
47
+ [Unit]
48
+ Description=Llama.cpp server, CPU only (no GPU support in this build).
49
+ After=syslog.target network.target local-fs.target remote-fs.target nss-lookup.target
50
+
51
+ [Service]
52
+ Type=simple
53
+ EnvironmentFile=/etc/sysconfig/llama
54
+ ExecStart=/usr/bin/llama-server $LLAMA_ARGS
55
+ ExecReload=/bin/kill -s HUP $MAINPID
56
+ Restart=never
57
+
58
+ [Install]
59
+ WantedBy=default.target
60
+ EOF
61
+
62
+ mkdir -p %{buildroot}/etc/sysconfig
63
+ %{__cat} <<EOF > %{buildroot}/etc/sysconfig/llama
64
+ LLAMA_ARGS="-m /opt/llama2/ggml-model-f32.bin"
65
+ EOF
66
+
67
+ %clean
68
+ rm -rf %{buildroot}
69
+ rm -rf %{_builddir}/*
70
+
71
+ %files
72
+ %{_bindir}/llama-cli
73
+ %{_bindir}/llama-server
74
+ %{_bindir}/llama-simple
75
+ /usr/lib/systemd/system/llama.service
76
+ %config /etc/sysconfig/llama
77
+
78
+ %pre
79
+
80
+ %post
81
+
82
+ %preun
83
+ %postun
84
+
85
+ %changelog
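Note: a hedged sketch of turning either spec above into RPMs, assuming rpm-build and rpmdevtools are installed; spectool fetches Source0 into ~/rpmbuild/SOURCES before the build:

    rpmdev-setuptree                              # create the ~/rpmbuild tree
    spectool -g -R .devops/llama-cpp.srpm.spec    # download master.tar.gz into SOURCES
    rpmbuild -ba .devops/llama-cpp.srpm.spec      # build the binary and source RPMs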
llama.cpp/.devops/musa.Dockerfile ADDED
@@ -0,0 +1,101 @@
1
+ ARG UBUNTU_VERSION=22.04
2
+ # This needs to generally match the container host's environment.
3
+ ARG MUSA_VERSION=rc4.0.1
4
+ # Target the MUSA build image
5
+ ARG BASE_MUSA_DEV_CONTAINER=mthreads/musa:${MUSA_VERSION}-mudnn-devel-ubuntu${UBUNTU_VERSION}
6
+
7
+ ARG BASE_MUSA_RUN_CONTAINER=mthreads/musa:${MUSA_VERSION}-mudnn-runtime-ubuntu${UBUNTU_VERSION}
8
+
9
+ FROM ${BASE_MUSA_DEV_CONTAINER} AS build
10
+
11
+ # MUSA architecture to build for (defaults to all supported archs)
12
+ ARG MUSA_DOCKER_ARCH=default
13
+
14
+ RUN apt-get update && \
15
+ apt-get install -y \
16
+ build-essential \
17
+ cmake \
18
+ python3 \
19
+ python3-pip \
20
+ git \
21
+ libcurl4-openssl-dev \
22
+ libgomp1
23
+
24
+ WORKDIR /app
25
+
26
+ COPY . .
27
+
28
+ RUN if [ "${MUSA_DOCKER_ARCH}" != "default" ]; then \
29
+ export CMAKE_ARGS="-DMUSA_ARCHITECTURES=${MUSA_DOCKER_ARCH}"; \
30
+ fi && \
31
+ cmake -B build -DGGML_NATIVE=OFF -DGGML_MUSA=ON -DGGML_BACKEND_DL=ON -DGGML_CPU_ALL_VARIANTS=ON -DLLAMA_BUILD_TESTS=OFF ${CMAKE_ARGS} -DCMAKE_EXE_LINKER_FLAGS=-Wl,--allow-shlib-undefined . && \
32
+ cmake --build build --config Release -j$(nproc)
33
+
34
+ RUN mkdir -p /app/lib && \
35
+ find build -name "*.so" -exec cp {} /app/lib \;
36
+
37
+ RUN mkdir -p /app/full \
38
+ && cp build/bin/* /app/full \
39
+ && cp *.py /app/full \
40
+ && cp -r gguf-py /app/full \
41
+ && cp -r requirements /app/full \
42
+ && cp requirements.txt /app/full \
43
+ && cp .devops/tools.sh /app/full/tools.sh
44
+
45
+ ## Base image
46
+ FROM ${BASE_MUSA_RUN_CONTAINER} AS base
47
+
48
+ RUN apt-get update \
49
+ && apt-get install -y libgomp1 curl\
50
+ && apt autoremove -y \
51
+ && apt clean -y \
52
+ && rm -rf /tmp/* /var/tmp/* \
53
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
54
+ && find /var/cache -type f -delete
55
+
56
+ COPY --from=build /app/lib/ /app
57
+
58
+ ### Full
59
+ FROM base AS full
60
+
61
+ COPY --from=build /app/full /app
62
+
63
+ WORKDIR /app
64
+
65
+ RUN apt-get update \
66
+ && apt-get install -y \
67
+ git \
68
+ python3 \
69
+ python3-pip \
70
+ && pip install --upgrade pip setuptools wheel \
71
+ && pip install -r requirements.txt \
72
+ && apt autoremove -y \
73
+ && apt clean -y \
74
+ && rm -rf /tmp/* /var/tmp/* \
75
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
76
+ && find /var/cache -type f -delete
77
+
78
+
79
+ ENTRYPOINT ["/app/tools.sh"]
80
+
81
+ ### Light, CLI only
82
+ FROM base AS light
83
+
84
+ COPY --from=build /app/full/llama-cli /app
85
+
86
+ WORKDIR /app
87
+
88
+ ENTRYPOINT [ "/app/llama-cli" ]
89
+
90
+ ### Server, Server only
91
+ FROM base AS server
92
+
93
+ ENV LLAMA_ARG_HOST=0.0.0.0
94
+
95
+ COPY --from=build /app/full/llama-server /app
96
+
97
+ WORKDIR /app
98
+
99
+ HEALTHCHECK CMD [ "curl", "-f", "http://localhost:8080/health" ]
100
+
101
+ ENTRYPOINT [ "/app/llama-server" ]
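Note: the MUSA image mirrors the CUDA one; a sketch that narrows the build to a single Moore Threads architecture via the build arg above (the value 21 and the tag are illustrative assumptions):

    docker build -t llama-musa-server --target server \
        --build-arg MUSA_DOCKER_ARCH=21 -f .devops/musa.Dockerfile .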
llama.cpp/.devops/nix/apps.nix ADDED
@@ -0,0 +1,21 @@
1
+ {
2
+ perSystem =
3
+ { config, lib, ... }:
4
+ {
5
+ apps =
6
+ let
7
+ inherit (config.packages) default;
8
+ binaries = [
9
+ "llama-cli"
10
+ "llama-embedding"
11
+ "llama-server"
12
+ "llama-quantize"
13
+ ];
14
+ mkApp = name: {
15
+ type = "app";
16
+ program = "${default}/bin/${name}";
17
+ };
18
+ in
19
+ lib.genAttrs binaries mkApp;
20
+ };
21
+ }
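Note: with these app definitions each listed binary is runnable straight from the flake; a sketch with an illustrative model path:

    nix run .#llama-cli -- -m ./models/model.gguf -p "Hello"
    nix run .#llama-server -- -m ./models/model.gguf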
llama.cpp/.devops/nix/devshells.nix ADDED
@@ -0,0 +1,52 @@
1
+ { inputs, ... }:
2
+
3
+ {
4
+ perSystem =
5
+ {
6
+ config,
7
+ lib,
8
+ system,
9
+ ...
10
+ }:
11
+ {
12
+ devShells =
13
+ let
14
+ pkgs = import inputs.nixpkgs { inherit system; };
15
+ stdenv = pkgs.stdenv;
16
+ scripts = config.packages.python-scripts;
17
+ in
18
+ lib.pipe (config.packages) [
19
+ (lib.concatMapAttrs (
20
+ name: package: {
21
+ ${name} = pkgs.mkShell {
22
+ name = "${name}";
23
+ inputsFrom = [ package ];
24
+ shellHook = ''
25
+ echo "Entering ${name} devShell"
26
+ '';
27
+ };
28
+ "${name}-extra" =
29
+ if (name == "python-scripts") then
30
+ null
31
+ else
32
+ pkgs.mkShell {
33
+ name = "${name}-extra";
34
+ inputsFrom = [
35
+ package
36
+ scripts
37
+ ];
38
+ # Extra packages that *may* be used by some scripts
39
+ packages = [
40
+ pkgs.python3Packages.tiktoken
41
+ ];
42
+ shellHook = ''
43
+ echo "Entering ${name} devShell"
44
+ addToSearchPath "LD_LIBRARY_PATH" "${lib.getLib stdenv.cc.cc}/lib"
45
+ '';
46
+ };
47
+ }
48
+ ))
49
+ (lib.filterAttrs (name: value: value != null))
50
+ ];
51
+ };
52
+ }
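Note: assuming the package set exposes a `default` package, the generated shells can be entered as sketched below; the `-extra` variant layers in the Python scripts plus tiktoken:

    nix develop                   # shell matching the default package
    nix develop .#default-extra   # same, plus python-scripts and tiktoken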
llama.cpp/.devops/nix/docker.nix ADDED
@@ -0,0 +1,37 @@
1
+ {
2
+ lib,
3
+ dockerTools,
4
+ buildEnv,
5
+ llama-cpp,
6
+ interactive ? true,
7
+ coreutils,
8
+ }:
9
+
10
+ # A tar that can be fed into `docker load`:
11
+ #
12
+ # $ nix build .#llamaPackages.docker
13
+ # $ docker load < result
14
+
15
+ # For details and variations cf.
16
+ # - https://nixos.org/manual/nixpkgs/unstable/#ssec-pkgs-dockerTools-buildLayeredImage
17
+ # - https://discourse.nixos.org/t/a-faster-dockertools-buildimage-prototype/16922
18
+ # - https://nixery.dev/
19
+
20
+ # Approximate (compressed) sizes, at the time of writing, are:
21
+ #
22
+ # .#llamaPackages.docker: 125M;
23
+ # .#llamaPackagesCuda.docker: 537M;
24
+ # .#legacyPackages.aarch64-linux.llamaPackagesXavier.docker: 415M.
25
+
26
+ dockerTools.buildLayeredImage {
27
+ name = llama-cpp.pname;
28
+ tag = "latest";
29
+
30
+ contents =
31
+ [ llama-cpp ]
32
+ ++ lib.optionals interactive [
33
+ coreutils
34
+ dockerTools.binSh
35
+ dockerTools.caCertificates
36
+ ];
37
+ }
llama.cpp/.devops/nix/jetson-support.nix ADDED
@@ -0,0 +1,39 @@
1
+ { inputs, ... }:
2
+ {
3
+ perSystem =
4
+ {
5
+ config,
6
+ system,
7
+ lib,
8
+ pkgsCuda,
9
+ ...
10
+ }:
11
+ {
12
+ legacyPackages =
13
+ let
14
+ caps.llamaPackagesXavier = "7.2";
15
+ caps.llamaPackagesOrin = "8.7";
16
+ caps.llamaPackagesTX2 = "6.2";
17
+ caps.llamaPackagesNano = "5.3";
18
+
19
+ pkgsFor =
20
+ cap:
21
+ import inputs.nixpkgs {
22
+ inherit system;
23
+ config = {
24
+ cudaSupport = true;
25
+ cudaCapabilities = [ cap ];
26
+ cudaEnableForwardCompat = false;
27
+ inherit (pkgsCuda.config) allowUnfreePredicate;
28
+ };
29
+ };
30
+ in
31
+ builtins.mapAttrs (name: cap: (pkgsFor cap).callPackage ./scope.nix { }) caps;
32
+
33
+ packages = lib.optionalAttrs (system == "aarch64-linux") {
34
+ jetson-xavier = config.legacyPackages.llamaPackagesXavier.llama-cpp;
35
+ jetson-orin = config.legacyPackages.llamaPackagesOrin.llama-cpp;
36
+ jetson-nano = config.legacyPackages.llamaPackagesNano.llama-cpp;
37
+ };
38
+ };
39
+ }
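Note: on an aarch64-linux builder the aliases above make device-pinned builds one-liners, e.g.:

    nix build .#jetson-orin   # CUDA build fixed to compute capability 8.7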
llama.cpp/.devops/nix/nixpkgs-instances.nix ADDED
@@ -0,0 +1,45 @@
1
+ { inputs, ... }:
2
+ {
3
+ # The _module.args definitions are passed on to modules as arguments. E.g.
4
+ # the module `{ pkgs ... }: { /* config */ }` implicitly uses
5
+ # `_module.args.pkgs` (defined in this case by flake-parts).
6
+ perSystem =
7
+ { system, ... }:
8
+ {
9
+ _module.args = {
10
+ # Note: bringing up https://zimbatm.com/notes/1000-instances-of-nixpkgs
11
+ # again, the below creates several nixpkgs instances which the
12
+ # flake-centric CLI will be forced to evaluate e.g. on `nix flake show`.
13
+ #
14
+ # This is currently "slow" and "expensive", on a certain scale.
15
+ # This also isn't "right" in that this hinders dependency injection at
16
+ # the level of flake inputs. This might get removed in the foreseeable
17
+ # future.
18
+ #
19
+ # Note that you can use these expressions without Nix
20
+ # (`pkgs.callPackage ./devops/nix/scope.nix { }` is the entry point).
21
+
22
+ pkgsCuda = import inputs.nixpkgs {
23
+ inherit system;
24
+ # Ensure dependencies use CUDA consistently (e.g. that openmpi, ucc,
25
+ # and ucx are built with CUDA support)
26
+ config.cudaSupport = true;
27
+ config.allowUnfreePredicate =
28
+ p:
29
+ builtins.all (
30
+ license:
31
+ license.free
32
+ || builtins.elem license.shortName [
33
+ "CUDA EULA"
34
+ "cuDNN EULA"
35
+ ]
36
+ ) (p.meta.licenses or [ p.meta.license ]);
37
+ };
38
+ # Ensure dependencies use ROCm consistently
39
+ pkgsRocm = import inputs.nixpkgs {
40
+ inherit system;
41
+ config.rocmSupport = true;
42
+ };
43
+ };
44
+ };
45
+ }
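Note: as the comment above says, the expressions also work without the flake CLI; a hedged sketch with plain nix-build (the attribute name llama-cpp follows the scope in .devops/nix/scope.nix and is an assumption here):

    nix-build -E '(import <nixpkgs> { config.cudaSupport = true; config.allowUnfree = true; }).callPackage ./.devops/nix/scope.nix { }' -A llama-cpp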
llama.cpp/.devops/nix/package-gguf-py.nix ADDED
@@ -0,0 +1,36 @@
1
+ {
2
+ lib,
3
+ llamaVersion,
4
+ numpy,
5
+ tqdm,
6
+ sentencepiece,
7
+ pyyaml,
8
+ poetry-core,
9
+ buildPythonPackage,
10
+ pytestCheckHook,
11
+ }:
12
+
13
+ buildPythonPackage {
14
+ pname = "gguf";
15
+ version = llamaVersion;
16
+ pyproject = true;
17
+ nativeBuildInputs = [ poetry-core ];
18
+ propagatedBuildInputs = [
19
+ numpy
20
+ tqdm
21
+ sentencepiece
22
+ pyyaml
23
+ ];
24
+ src = lib.cleanSource ../../gguf-py;
25
+ pythonImportsCheck = [
26
+ "numpy"
27
+ "gguf"
28
+ ];
29
+ nativeCheckInputs = [ pytestCheckHook ];
30
+ doCheck = true;
31
+ meta = with lib; {
32
+ description = "Python package for writing binary files in the GGUF format";
33
+ license = licenses.mit;
34
+ maintainers = [ maintainers.ditsuke ];
35
+ };
36
+ }
llama.cpp/.devops/nix/package.nix ADDED
@@ -0,0 +1,247 @@
1
+ {
2
+ lib,
3
+ glibc,
4
+ config,
5
+ stdenv,
6
+ runCommand,
7
+ cmake,
8
+ ninja,
9
+ pkg-config,
10
+ git,
11
+ mpi,
12
+ blas,
13
+ cudaPackages,
14
+ autoAddDriverRunpath,
15
+ darwin,
16
+ rocmPackages,
17
+ vulkan-headers,
18
+ vulkan-loader,
19
+ curl,
20
+ shaderc,
21
+ useBlas ?
22
+ builtins.all (x: !x) [
23
+ useCuda
24
+ useMetalKit
25
+ useRocm
26
+ useVulkan
27
+ ]
28
+ && blas.meta.available,
29
+ useCuda ? config.cudaSupport,
30
+ useMetalKit ? stdenv.isAarch64 && stdenv.isDarwin,
31
+ # Increases the runtime closure size by ~700M
32
+ useMpi ? false,
33
+ useRocm ? config.rocmSupport,
34
+ rocmGpuTargets ? builtins.concatStringsSep ";" rocmPackages.clr.gpuTargets,
35
+ enableCurl ? true,
36
+ useVulkan ? false,
37
+ llamaVersion ? "0.0.0", # Arbitrary version, substituted by the flake
38
+
39
+ # It's necessary to consistently use backendStdenv when building with CUDA support,
40
+ # otherwise we get libstdc++ errors downstream.
41
+ effectiveStdenv ? if useCuda then cudaPackages.backendStdenv else stdenv,
42
+ enableStatic ? effectiveStdenv.hostPlatform.isStatic,
43
+ precompileMetalShaders ? false,
44
+ }:
45
+
46
+ let
47
+ inherit (lib)
48
+ cmakeBool
49
+ cmakeFeature
50
+ optionals
51
+ strings
52
+ ;
53
+
54
+ stdenv = throw "Use effectiveStdenv instead";
55
+
56
+ suffices =
57
+ lib.optionals useBlas [ "BLAS" ]
58
+ ++ lib.optionals useCuda [ "CUDA" ]
59
+ ++ lib.optionals useMetalKit [ "MetalKit" ]
60
+ ++ lib.optionals useMpi [ "MPI" ]
61
+ ++ lib.optionals useRocm [ "ROCm" ]
62
+ ++ lib.optionals useVulkan [ "Vulkan" ];
63
+
64
+ pnameSuffix =
65
+ strings.optionalString (suffices != [ ])
66
+ "-${strings.concatMapStringsSep "-" strings.toLower suffices}";
67
+ descriptionSuffix = strings.optionalString (
68
+ suffices != [ ]
69
+ ) ", accelerated with ${strings.concatStringsSep ", " suffices}";
70
+
71
+ xcrunHost = runCommand "xcrunHost" { } ''
72
+ mkdir -p $out/bin
73
+ ln -s /usr/bin/xcrun $out/bin
74
+ '';
75
+
76
+ # apple_sdk is supposed to choose sane defaults, no need to handle isAarch64
77
+ # separately
78
+ darwinBuildInputs =
79
+ with darwin.apple_sdk.frameworks;
80
+ [
81
+ Accelerate
82
+ CoreVideo
83
+ CoreGraphics
84
+ ]
85
+ ++ optionals useMetalKit [ MetalKit ];
86
+
87
+ cudaBuildInputs = with cudaPackages; [
88
+ cuda_cudart
89
+ cuda_cccl # <nv/target>
90
+ libcublas
91
+ ];
92
+
93
+ rocmBuildInputs = with rocmPackages; [
94
+ clr
95
+ hipblas
96
+ rocblas
97
+ ];
98
+
99
+ vulkanBuildInputs = [
100
+ vulkan-headers
101
+ vulkan-loader
102
+ shaderc
103
+ ];
104
+ in
105
+
106
+ effectiveStdenv.mkDerivation (finalAttrs: {
107
+ pname = "llama-cpp${pnameSuffix}";
108
+ version = llamaVersion;
109
+
110
+ # Note: none of the files discarded here are visible in the sandbox or
111
+ # affect the output hash. This also means they can be modified without
112
+ # triggering a rebuild.
113
+ src = lib.cleanSourceWith {
114
+ filter =
115
+ name: type:
116
+ let
117
+ noneOf = builtins.all (x: !x);
118
+ baseName = baseNameOf name;
119
+ in
120
+ noneOf [
121
+ (lib.hasSuffix ".nix" name) # Ignore *.nix files when computing outPaths
122
+ (lib.hasSuffix ".md" name) # Ignore *.md changes when computing outPaths
123
+ (lib.hasPrefix "." baseName) # Skip hidden files and directories
124
+ (baseName == "flake.lock")
125
+ ];
126
+ src = lib.cleanSource ../../.;
127
+ };
128
+
129
+ postPatch = ''
130
+ substituteInPlace ./ggml/src/ggml-metal/ggml-metal.m \
131
+ --replace '[bundle pathForResource:@"ggml-metal" ofType:@"metal"];' "@\"$out/bin/ggml-metal.metal\";"
132
+ substituteInPlace ./ggml/src/ggml-metal/ggml-metal.m \
133
+ --replace '[bundle pathForResource:@"default" ofType:@"metallib"];' "@\"$out/bin/default.metallib\";"
134
+ '';
135
+
136
+ # With PR#6015 https://github.com/ggml-org/llama.cpp/pull/6015,
137
+ # `default.metallib` may be compiled with the Metal compiler from Xcode,
138
+ # and we need to escape the sandbox on macOS to access the Metal compiler.
139
+ # `xcrun` is used to find the path of the Metal compiler, which is variable
140
+ # and not on $PATH.
141
+ # see https://github.com/ggml-org/llama.cpp/pull/6118 for discussion
142
+ __noChroot = effectiveStdenv.isDarwin && useMetalKit && precompileMetalShaders;
+
+ nativeBuildInputs =
+ [
+ cmake
+ ninja
+ pkg-config
+ git
+ ]
+ ++ optionals useCuda [
+ cudaPackages.cuda_nvcc
+
+ autoAddDriverRunpath
+ ]
+ ++ optionals (effectiveStdenv.hostPlatform.isGnu && enableStatic) [ glibc.static ]
+ ++ optionals (effectiveStdenv.isDarwin && useMetalKit && precompileMetalShaders) [ xcrunHost ];
+
+ buildInputs =
+ optionals effectiveStdenv.isDarwin darwinBuildInputs
+ ++ optionals useCuda cudaBuildInputs
+ ++ optionals useMpi [ mpi ]
+ ++ optionals useRocm rocmBuildInputs
+ ++ optionals useBlas [ blas ]
+ ++ optionals useVulkan vulkanBuildInputs
+ ++ optionals enableCurl [ curl ];
+
+ cmakeFlags =
+ [
+ (cmakeBool "LLAMA_BUILD_SERVER" true)
+ (cmakeBool "BUILD_SHARED_LIBS" (!enableStatic))
+ (cmakeBool "CMAKE_SKIP_BUILD_RPATH" true)
+ (cmakeBool "LLAMA_CURL" enableCurl)
+ (cmakeBool "GGML_NATIVE" false)
+ (cmakeBool "GGML_BLAS" useBlas)
+ (cmakeBool "GGML_CUDA" useCuda)
+ (cmakeBool "GGML_HIP" useRocm)
+ (cmakeBool "GGML_METAL" useMetalKit)
+ (cmakeBool "GGML_VULKAN" useVulkan)
+ (cmakeBool "GGML_STATIC" enableStatic)
+ ]
+ ++ optionals useCuda [
+ (
+ with cudaPackages.flags;
+ cmakeFeature "CMAKE_CUDA_ARCHITECTURES" (
+ builtins.concatStringsSep ";" (map dropDot cudaCapabilities)
+ )
+ )
+ ]
+ ++ optionals useRocm [
+ (cmakeFeature "CMAKE_HIP_COMPILER" "${rocmPackages.llvm.clang}/bin/clang")
+ (cmakeFeature "CMAKE_HIP_ARCHITECTURES" rocmGpuTargets)
+ ]
+ ++ optionals useMetalKit [
+ (lib.cmakeFeature "CMAKE_C_FLAGS" "-D__ARM_FEATURE_DOTPROD=1")
+ (cmakeBool "GGML_METAL_EMBED_LIBRARY" (!precompileMetalShaders))
+ ];
+
+ # Environment variables needed for ROCm
+ env = lib.optionalAttrs useRocm {
+ ROCM_PATH = "${rocmPackages.clr}";
+ HIP_DEVICE_LIB_PATH = "${rocmPackages.rocm-device-libs}/amdgcn/bitcode";
+ };
+
+ # TODO(SomeoneSerge): It's better to add proper install targets at the CMake level,
+ # if they haven't been added yet.
+ postInstall = ''
+ mkdir -p $out/include
+ cp $src/include/llama.h $out/include/
+ '';
+
+ meta = {
+ # Configurations we don't want even the CI to evaluate. Results in the
+ # "unsupported platform" messages. This is mostly a no-op, because
+ # cudaPackages would've refused to evaluate anyway.
+ badPlatforms = optionals useCuda lib.platforms.darwin;
+
+ # Configurations that are known to result in build failures. Can be
+ # overridden by importing Nixpkgs with `allowBroken = true`.
+ broken = (useMetalKit && !effectiveStdenv.isDarwin);
+
+ description = "Inference of LLaMA model in pure C/C++${descriptionSuffix}";
+ homepage = "https://github.com/ggml-org/llama.cpp/";
+ license = lib.licenses.mit;
+
+ # Accommodates `nix run` and `lib.getExe`
+ mainProgram = "llama-cli";
+
+ # These people might respond, on a best-effort basis, if you ping them
+ # in case of Nix-specific regressions or for reviewing Nix-specific PRs.
+ # Consider adding yourself to this list if you want to ensure this flake
+ # stays maintained and you're willing to invest your time. Do not add
+ # other people without their consent. Consider removing people after
+ # they've been unreachable for long periods of time.
+
+ # Note that lib.maintainers is defined in Nixpkgs, but you may just add
+ # an attrset following the same format as in
+ # https://github.com/NixOS/nixpkgs/blob/f36a80e54da29775c78d7eff0e628c2b4e34d1d7/maintainers/maintainer-list.nix
+ maintainers = with lib.maintainers; [
+ philiptaron
+ SomeoneSerge
+ ];
+
+ # Extend `badPlatforms` instead
+ platforms = lib.platforms.all;
+ };
+ })
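The `suffices` list above drives both naming and description: a build with `useCuda = true` yields `pname = "llama-cpp-cuda"` and a description ending in ", accelerated with CUDA". A minimal sketch of consuming this derivation with `nix build`, assuming the flake wires the package in as scope.nix below suggests (the flake attribute paths are illustrative, not part of this diff):

    # Build the default variant of the package defined above
    nix build .#llama-cpp
    # Build a CUDA-accelerated variant by overriding the package options
    nix build --impure --expr '(builtins.getFlake (toString ./.)).packages.${builtins.currentSystem}.default.override { useCuda = true; }'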
llama.cpp/.devops/nix/python-scripts.nix ADDED
@@ -0,0 +1,66 @@
+ {
+ lib,
+ stdenv,
+ buildPythonPackage,
+ poetry-core,
+ mkShell,
+ python3Packages,
+ gguf-py,
+ }@inputs:
+
+ let
+ llama-python-deps = with python3Packages; [
+ numpy
+ sentencepiece
+ transformers
+ protobuf
+ torchWithoutCuda
+ gguf-py
+ tqdm
+
+ # for scripts/compare-llama-bench.py
+ gitpython
+ tabulate
+
+ # for examples/pydantic-models-to-grammar-examples.py
+ docstring-parser
+ pydantic
+
+ ];
+
+ llama-python-test-deps = with python3Packages; [
+ # Server bench
+ matplotlib
+
+ # server tests
+ openai
+ pytest
+ prometheus-client
+ ];
+ in
+
+ buildPythonPackage ({
+ pname = "llama-scripts";
+ version = "0.0.0";
+ pyproject = true;
+
+ # NOTE: The files filtered out here are not visible in the build sandbox, nor
+ # do they affect the output hash. They can be modified without triggering a rebuild.
+ src = lib.cleanSourceWith {
+ filter =
+ name: type:
+ let
+ any = builtins.any (x: x);
+ baseName = builtins.baseNameOf name;
+ in
+ any [
+ (lib.hasSuffix ".py" name)
+ (baseName == "README.md")
+ (baseName == "pyproject.toml")
+ ];
+ src = lib.cleanSource ../../.;
+ };
+ nativeBuildInputs = [ poetry-core ];
+ nativeCheckInputs = llama-python-test-deps;
+ dependencies = llama-python-deps;
+ })
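python-scripts.nix packages the repository's `*.py` conversion and benchmarking scripts together with their Python dependencies, with a source filter that keeps only `*.py`, `README.md`, and `pyproject.toml`. A minimal sketch of building it through the scope defined in scope.nix below (how the flake surfaces the attribute is assumed, not shown in this diff):

    # "python-scripts" is the attribute name this package gets in scope.nix
    nix build .#python-scripts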
llama.cpp/.devops/nix/scope.nix ADDED
@@ -0,0 +1,41 @@
+ {
+ lib,
+ newScope,
+ python3,
+ llamaVersion ? "0.0.0",
+ }:
+
+ let
+ pythonPackages = python3.pkgs;
+ buildPythonPackage = pythonPackages.buildPythonPackage;
+ numpy = pythonPackages.numpy;
+ tqdm = pythonPackages.tqdm;
+ sentencepiece = pythonPackages.sentencepiece;
+ pyyaml = pythonPackages.pyyaml;
+ poetry-core = pythonPackages.poetry-core;
+ pytestCheckHook = pythonPackages.pytestCheckHook;
+ in
+
+ # We're using `makeScope` instead of just writing out an attrset
+ # because it allows users to apply overlays later using `overrideScope'`.
+ # Cf. https://noogle.dev/f/lib/makeScope
+
+ lib.makeScope newScope (self: {
+ inherit llamaVersion;
+ gguf-py = self.callPackage ./package-gguf-py.nix {
+ inherit
+ buildPythonPackage
+ numpy
+ tqdm
+ sentencepiece
+ poetry-core
+ pyyaml
+ pytestCheckHook
+ ;
+ };
+ python-scripts = self.callPackage ./python-scripts.nix { inherit buildPythonPackage poetry-core; };
+ llama-cpp = self.callPackage ./package.nix { };
+ docker = self.callPackage ./docker.nix { };
+ docker-min = self.callPackage ./docker.nix { interactive = false; };
+ sif = self.callPackage ./sif.nix { };
+ })
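Because the set is built with `lib.makeScope`, downstream users can override any member via `overrideScope'` (as the comment notes) and have the change propagate through the `callPackage` wiring. A sketch of building the individual scope members; the attribute names come from scope.nix, while how the flake exposes them as outputs is assumed here:

    nix build .#llama-cpp     # the C/C++ package from package.nix
    nix build .#gguf-py       # the gguf Python library
    nix build .#docker-min    # non-interactive Docker image
    nix build .#sif           # Singularity image from sif.nix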
llama.cpp/.devops/nix/sif.nix ADDED
@@ -0,0 +1,27 @@
+ {
+ lib,
+ singularity-tools,
+ llama-cpp,
+ bashInteractive,
+ interactive ? false,
+ }:
+
+ let
+ optionalInt = cond: x: if cond then x else 0;
+ in
+ singularity-tools.buildImage rec {
+ inherit (llama-cpp) name;
+ contents = [ llama-cpp ] ++ lib.optionals interactive [ bashInteractive ];
+
+ # These are excessive (but safe) for most variants. Building singularity
+ # images requires superuser privileges, so we build them inside a VM in a
+ # writable image of pre-determined size.
+ #
+ # ROCm is currently affected by https://github.com/NixOS/nixpkgs/issues/276846
+ #
+ # Expected image sizes:
+ # - cpu/blas: 150M,
+ # - cuda, all gencodes: 560M,
+ diskSize = 4096 + optionalInt llama-cpp.useRocm 16384;
+ memSize = diskSize;
+ }
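sif.nix builds a Singularity image containing the `llama-cpp` package (plus `bashInteractive` for the interactive variant), reserving extra disk for ROCm builds because of the nixpkgs issue linked above. A hedged usage sketch; the output path, binary linking into the image's PATH, and the model file are assumptions:

    nix build .#sif
    # Run the CLI from inside the image (singularity-tools.buildImage
    # produces the image as the build result)
    singularity exec ./result llama-cli -m ./model.gguf -p "Hello"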
llama.cpp/.devops/rocm.Dockerfile ADDED
@@ -0,0 +1,113 @@
+ ARG UBUNTU_VERSION=24.04
+
+ # This needs to generally match the container host's environment.
+ ARG ROCM_VERSION=6.3
+ ARG AMDGPU_VERSION=6.3
+
+ # Target the ROCm build image
+ ARG BASE_ROCM_DEV_CONTAINER=rocm/dev-ubuntu-${UBUNTU_VERSION}:${ROCM_VERSION}-complete
+
+ ### Build image
+ FROM ${BASE_ROCM_DEV_CONTAINER} AS build
+
+ # Unless otherwise specified, we make a fat build.
+ # List from https://github.com/ggml-org/llama.cpp/pull/1087#issuecomment-1682807878
+ # This is mostly tied to rocBLAS supported archs.
+ # gfx803, gfx900, gfx1032, gfx1101, gfx1102: not officially supported
+ # gfx906 is deprecated
+ # check https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.2.4/reference/system-requirements.html
+
+ ARG ROCM_DOCKER_ARCH='gfx803,gfx900,gfx906,gfx908,gfx90a,gfx942,gfx1010,gfx1030,gfx1032,gfx1100,gfx1101,gfx1102'
+ #ARG ROCM_DOCKER_ARCH=gfx1100
+
+ # Set the ROCm GPU architectures to build for
+ ENV AMDGPU_TARGETS=${ROCM_DOCKER_ARCH}
+ # Enable ROCm
+ # ENV CC=/opt/rocm/llvm/bin/clang
+ # ENV CXX=/opt/rocm/llvm/bin/clang++
+
+ RUN apt-get update \
+ && apt-get install -y \
+ build-essential \
+ cmake \
+ git \
+ libcurl4-openssl-dev \
+ curl \
+ libgomp1
+
+ WORKDIR /app
+
+ COPY . .
+
+ RUN HIPCXX="$(hipconfig -l)/clang" HIP_PATH="$(hipconfig -R)" \
+ cmake -S . -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=$ROCM_DOCKER_ARCH -DGGML_BACKEND_DL=ON -DGGML_CPU_ALL_VARIANTS=ON -DCMAKE_BUILD_TYPE=Release -DLLAMA_BUILD_TESTS=OFF \
+ && cmake --build build --config Release -j$(nproc)
+
+ RUN mkdir -p /app/lib \
+ && find build -name "*.so" -exec cp {} /app/lib \;
+
+ RUN mkdir -p /app/full \
+ && cp build/bin/* /app/full \
+ && cp *.py /app/full \
+ && cp -r gguf-py /app/full \
+ && cp -r requirements /app/full \
+ && cp requirements.txt /app/full \
+ && cp .devops/tools.sh /app/full/tools.sh
+
+ ## Base image
+ FROM ${BASE_ROCM_DEV_CONTAINER} AS base
+
+ RUN apt-get update \
+ && apt-get install -y libgomp1 curl \
+ && apt autoremove -y \
+ && apt clean -y \
+ && rm -rf /tmp/* /var/tmp/* \
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
+ && find /var/cache -type f -delete
+
+ COPY --from=build /app/lib/ /app
+
+ ### Full
+ FROM base AS full
+
+ COPY --from=build /app/full /app
+
+ WORKDIR /app
+
+ RUN apt-get update \
+ && apt-get install -y \
+ git \
+ python3-pip \
+ python3 \
+ python3-wheel \
+ && pip install --break-system-packages --upgrade setuptools \
+ && pip install --break-system-packages -r requirements.txt \
+ && apt autoremove -y \
+ && apt clean -y \
+ && rm -rf /tmp/* /var/tmp/* \
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
+ && find /var/cache -type f -delete
+
+ ENTRYPOINT ["/app/tools.sh"]
+
+ ### Light, CLI only
+ FROM base AS light
+
+ COPY --from=build /app/full/llama-cli /app
+
+ WORKDIR /app
+
+ ENTRYPOINT [ "/app/llama-cli" ]
+
+ ### Server, Server only
+ FROM base AS server
+
+ ENV LLAMA_ARG_HOST=0.0.0.0
+
+ COPY --from=build /app/full/llama-server /app
+
+ WORKDIR /app
+
+ HEALTHCHECK CMD [ "curl", "-f", "http://localhost:8080/health" ]
+
+ ENTRYPOINT [ "/app/llama-server" ]
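The Dockerfile above produces three final stages: `full` (all binaries plus the Python conversion scripts, entered through tools.sh), `light` (llama-cli only), and `server`. A hedged build-and-run sketch; the image tag and model path are illustrative, though passing `/dev/kfd` and `/dev/dri` through is the usual requirement for ROCm containers:

    docker build -f .devops/rocm.Dockerfile --target server -t llama-rocm-server .
    docker run --device /dev/kfd --device /dev/dri \
        -v /path/to/models:/models -p 8080:8080 \
        llama-rocm-server -m /models/model.gguf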
llama.cpp/.devops/tools.sh ADDED
@@ -0,0 +1,49 @@
+ #!/bin/bash
+ set -e
+
+ # Read the first argument into a variable
+ arg1="$1"
+
+ # Shift the arguments to remove the first one
+ shift
+
+ if [[ "$arg1" == '--convert' || "$arg1" == '-c' ]]; then
+ exec python3 ./convert_hf_to_gguf.py "$@"
+ elif [[ "$arg1" == '--quantize' || "$arg1" == '-q' ]]; then
+ exec ./llama-quantize "$@"
+ elif [[ "$arg1" == '--run' || "$arg1" == '-r' ]]; then
+ exec ./llama-cli "$@"
+ elif [[ "$arg1" == '--bench' || "$arg1" == '-b' ]]; then
+ exec ./llama-bench "$@"
+ elif [[ "$arg1" == '--perplexity' || "$arg1" == '-p' ]]; then
+ exec ./llama-perplexity "$@"
+ elif [[ "$arg1" == '--all-in-one' || "$arg1" == '-a' ]]; then
+ echo "Converting PTH to GGML..."
+ for i in "$1/$2"/ggml-model-f16.bin*; do
+ if [ -f "${i/f16/q4_0}" ]; then
+ echo "Skip model quantization, it already exists: ${i/f16/q4_0}"
+ else
+ echo "Converting PTH to GGML: $i into ${i/f16/q4_0}..."
+ ./llama-quantize "$i" "${i/f16/q4_0}" q4_0
+ fi
+ done
+ elif [[ "$arg1" == '--server' || "$arg1" == '-s' ]]; then
+ exec ./llama-server "$@"
+ else
+ echo "Unknown command: $arg1"
+ echo "Available commands: "
+ echo " --run (-r): Run a model previously converted into ggml"
+ echo " ex: -m /models/7B/ggml-model-q4_0.bin -p \"Building a website can be done in 10 simple steps:\" -n 512"
+ echo " --bench (-b): Benchmark the performance of the inference for various parameters."
+ echo " ex: -m model.gguf"
+ echo " --perplexity (-p): Measure the perplexity of a model over a given text."
+ echo " ex: -m model.gguf -f file.txt"
+ echo " --convert (-c): Convert a llama model into ggml"
+ echo " ex: --outtype f16 \"/models/7B/\" "
+ echo " --quantize (-q): Quantize a ggml model"
+ echo " ex: \"/models/7B/ggml-model-f16.bin\" \"/models/7B/ggml-model-q4_0.bin\" 2"
+ echo " --all-in-one (-a): Execute --convert & --quantize"
+ echo " ex: \"/models/\" 7B"
+ echo " --server (-s): Run a model on the server"
+ echo " ex: -m /models/7B/ggml-model-q4_0.bin -c 2048 -ngl 43 -mg 1 --port 8080"
+ fi
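tools.sh is the ENTRYPOINT of the `full` images above: the first argument selects a tool and everything after it is forwarded verbatim. A hedged sketch of driving it through such an image (the image tag and model paths are illustrative):

    # Run inference via llama-cli
    docker run -v /path/to/models:/models llama-full --run \
        -m /models/7B/ggml-model-q4_0.bin -p "Hello" -n 64
    # Quantize an f16 model to q4_0
    docker run -v /path/to/models:/models llama-full --quantize \
        /models/7B/ggml-model-f16.bin /models/7B/ggml-model-q4_0.bin q4_0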
llama.cpp/.devops/vulkan.Dockerfile ADDED
@@ -0,0 +1,89 @@
+ ARG UBUNTU_VERSION=24.04
+
+ FROM ubuntu:$UBUNTU_VERSION AS build
+
+ # Install build tools
+ RUN apt update && apt install -y git build-essential cmake wget
+
+ # Install Vulkan SDK and cURL
+ RUN wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
+ wget -qO /etc/apt/sources.list.d/lunarg-vulkan-noble.list https://packages.lunarg.com/vulkan/lunarg-vulkan-noble.list && \
+ apt update -y && \
+ apt-get install -y vulkan-sdk libcurl4-openssl-dev curl
+
+ # Build it
+ WORKDIR /app
+
+ COPY . .
+
+ RUN cmake -B build -DGGML_NATIVE=OFF -DGGML_VULKAN=1 -DLLAMA_BUILD_TESTS=OFF -DGGML_BACKEND_DL=ON -DGGML_CPU_ALL_VARIANTS=ON && \
+ cmake --build build --config Release -j$(nproc)
+
+ RUN mkdir -p /app/lib && \
+ find build -name "*.so" -exec cp {} /app/lib \;
+
+ RUN mkdir -p /app/full \
+ && cp build/bin/* /app/full \
+ && cp *.py /app/full \
+ && cp -r gguf-py /app/full \
+ && cp -r requirements /app/full \
+ && cp requirements.txt /app/full \
+ && cp .devops/tools.sh /app/full/tools.sh
+
+ ## Base image
+ FROM ubuntu:$UBUNTU_VERSION AS base
+
+ RUN apt-get update \
+ && apt-get install -y libgomp1 curl libvulkan-dev \
+ && apt autoremove -y \
+ && apt clean -y \
+ && rm -rf /tmp/* /var/tmp/* \
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
+ && find /var/cache -type f -delete
+
+ COPY --from=build /app/lib/ /app
+
+ ### Full
+ FROM base AS full
+
+ COPY --from=build /app/full /app
+
+ WORKDIR /app
+
+ RUN apt-get update \
+ && apt-get install -y \
+ git \
+ python3 \
+ python3-pip \
+ python3-wheel \
+ && pip install --break-system-packages --upgrade setuptools \
+ && pip install --break-system-packages -r requirements.txt \
+ && apt autoremove -y \
+ && apt clean -y \
+ && rm -rf /tmp/* /var/tmp/* \
+ && find /var/cache/apt/archives /var/lib/apt/lists -not -name lock -type f -delete \
+ && find /var/cache -type f -delete
+
+ ENTRYPOINT ["/app/tools.sh"]
+
+ ### Light, CLI only
+ FROM base AS light
+
+ COPY --from=build /app/full/llama-cli /app
+
+ WORKDIR /app
+
+ ENTRYPOINT [ "/app/llama-cli" ]
+
+ ### Server, Server only
+ FROM base AS server
+
+ ENV LLAMA_ARG_HOST=0.0.0.0
+
+ COPY --from=build /app/full/llama-server /app
+
+ WORKDIR /app
+
+ HEALTHCHECK CMD [ "curl", "-f", "http://localhost:8080/health" ]
+
+ ENTRYPOINT [ "/app/llama-server" ]
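The Vulkan image follows the same full/light/server stage layout as the ROCm one. A hedged sketch; the tag and paths are illustrative, though `/dev/dri` is what Vulkan typically needs passed through from the host:

    docker build -f .devops/vulkan.Dockerfile --target light -t llama-vulkan .
    docker run --device /dev/dri -v /path/to/models:/models \
        llama-vulkan -m /models/model.gguf -p "Hello" -n 64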
llama.cpp/.dockerignore ADDED
@@ -0,0 +1,20 @@
+ *.o
+ *.a
+ .cache/
+ # Do not ignore .git directory, otherwise the reported build number will always be 0
+ .github/
+ .gitignore
+ .vs/
+ .vscode/
+ .DS_Store
+
+ build*/
+
+ models/*
+
+ /llama-cli
+ /llama-quantize
+
+ arm_neon.h
+ compile_commands.json
+ Dockerfile
@@ -0,0 +1,6 @@
 
 
 
 
 
 
 
1
+ {
2
+ "Exclude": ["^\\.gitmodules$", "stb_image\\.h"],
3
+ "Disable": {
4
+ "IndentSize": true
5
+ }
6
+ }
llama.cpp/.editorconfig ADDED
@@ -0,0 +1,54 @@
+ # https://EditorConfig.org
+
+ # Top-most EditorConfig file
+ root = true
+
+ # Unix-style newlines with a newline ending every file, utf-8 charset
+ [*]
+ end_of_line = lf
+ insert_final_newline = true
+ trim_trailing_whitespace = true
+ charset = utf-8
+ indent_style = space
+ indent_size = 4
+
+ [Makefile]
+ indent_style = tab
+
+ [scripts/*.mk]
+ indent_style = tab
+
+ [prompts/*.txt]
+ insert_final_newline = unset
+
+ [tools/server/public/*]
+ indent_size = 2
+
+ [tools/server/public/deps_*]
+ trim_trailing_whitespace = unset
+ indent_style = unset
+ indent_size = unset
+
+ [tools/server/deps_*]
+ trim_trailing_whitespace = unset
+ indent_style = unset
+ indent_size = unset
+
+ [examples/llama.swiftui/llama.swiftui.xcodeproj/*]
+ indent_style = tab
+
+ [tools/cvector-generator/*.txt]
+ trim_trailing_whitespace = unset
+ insert_final_newline = unset
+
+ [models/templates/*.jinja]
+ indent_style = unset
+ indent_size = unset
+ end_of_line = unset
+ charset = unset
+ trim_trailing_whitespace = unset
+ insert_final_newline = unset
+
+ [tools/mtmd/vendor/miniaudio.h]
+ trim_trailing_whitespace = unset
+ insert_final_newline = unset
llama.cpp/.flake8 ADDED
@@ -0,0 +1,18 @@
+ [flake8]
+ max-line-length = 125
+ ignore = E203,E211,E221,E225,E231,E241,E251,E261,E266,E501,E701,E704,W503
+ exclude =
+ # Do not traverse examples and tools
+ examples,
+ tools,
+ # Do not include package initializers
+ __init__.py,
+ # No need to traverse our git directory
+ .git,
+ # There's no value in checking cache directories
+ __pycache__,
+ # No need to include the build path
+ build,
+ # This contains builds that we don't want to check
+ dist # This is generated with `python build .` for package releases
+ # max-complexity = 10
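flake8 discovers this file automatically when invoked from the repository root, so the excludes and the 125-column limit apply without extra flags. A short sketch (the directory argument is illustrative):

    flake8              # lint the whole tree, minus the excluded directories
    flake8 gguf-py      # or restrict the run to one directory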
llama.cpp/.github/ISSUE_TEMPLATE/010-bug-compilation.yml ADDED
@@ -0,0 +1,87 @@
+ name: Bug (compilation)
+ description: Something goes wrong when trying to compile llama.cpp.
+ title: "Compile bug: "
+ labels: ["bug-unconfirmed", "compilation"]
+ body:
+ - type: markdown
+ attributes:
+ value: >
+ Thanks for taking the time to fill out this bug report!
+ This issue template is intended for bug reports where the compilation of llama.cpp fails.
+ Before opening an issue, please confirm that the compilation still fails with `-DGGML_CCACHE=OFF`.
+ If the compilation succeeds with ccache disabled you should be able to permanently fix the issue
+ by clearing `~/.cache/ccache` (on Linux).
+ - type: textarea
+ id: commit
+ attributes:
+ label: Git commit
+ description: Which commit are you trying to compile?
+ placeholder: |
+ $ git rev-parse HEAD
+ 84a07a17b1b08cf2b9747c633a2372782848a27f
+ validations:
+ required: true
+ - type: dropdown
+ id: operating-system
+ attributes:
+ label: Operating systems
+ description: Which operating systems do you know to be affected?
+ multiple: true
+ options:
+ - Linux
+ - Mac
+ - Windows
+ - BSD
+ - Other? (Please let us know in description)
+ validations:
+ required: true
+ - type: dropdown
+ id: backends
+ attributes:
+ label: GGML backends
+ description: Which GGML backends do you know to be affected?
+ options: [AMX, BLAS, CPU, CUDA, HIP, Kompute, Metal, Musa, RPC, SYCL, Vulkan]
+ multiple: true
+ validations:
+ required: true
+ - type: textarea
+ id: info
+ attributes:
+ label: Problem description & steps to reproduce
+ description: >
+ Please give us a summary of the problem and tell us how to reproduce it.
+ If you can narrow down the bug to specific compile flags, that information would be very much appreciated by us.
+ placeholder: >
+ I'm trying to compile llama.cpp with CUDA support on a fresh install of Ubuntu and get error XY.
+ Here are the exact commands that I used: ...
+ validations:
+ required: true
+ - type: textarea
+ id: first_bad_commit
+ attributes:
+ label: First Bad Commit
+ description: >
+ If the bug was not present on an earlier version: when did it start appearing?
+ If possible, please do a git bisect and identify the exact commit that introduced the bug.
+ validations:
+ required: false
+ - type: textarea
+ id: command
+ attributes:
+ label: Compile command
+ description: >
+ Please provide the exact command you used to compile llama.cpp. For example: `cmake -B ...`.
+ This will be automatically formatted into code, so no need for backticks.
+ render: shell
+ validations:
+ required: true
+ - type: textarea
+ id: logs
+ attributes:
+ label: Relevant log output
+ description: >
+ Please copy and paste any relevant log output, including any generated text.
+ This will be automatically formatted into code, so no need for backticks.
+ render: shell
+ validations:
+ required: true
llama.cpp/.github/ISSUE_TEMPLATE/011-bug-results.yml ADDED
@@ -0,0 +1,101 @@
+ name: Bug (model use)
+ description: Something goes wrong when using a model (in general, not specific to a single llama.cpp module).
+ title: "Eval bug: "
+ labels: ["bug-unconfirmed", "model evaluation"]
+ body:
+ - type: markdown
+ attributes:
+ value: >
+ Thanks for taking the time to fill out this bug report!
+ This issue template is intended for bug reports where the model evaluation results
+ (i.e. the generated text) are incorrect or llama.cpp crashes during model evaluation.
+ If you encountered the issue while using an external UI (e.g. ollama),
+ please reproduce your issue using one of the examples/binaries in this repository.
+ The `llama-cli` binary can be used for simple and reproducible model inference.
+ - type: textarea
+ id: version
+ attributes:
+ label: Name and Version
+ description: Which version of our software are you running? (use `--version` to get a version string)
+ placeholder: |
+ $ ./llama-cli --version
+ version: 2999 (42b4109e)
+ built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
+ validations:
+ required: true
+ - type: dropdown
+ id: operating-system
+ attributes:
+ label: Operating systems
+ description: Which operating systems do you know to be affected?
+ multiple: true
+ options:
+ - Linux
+ - Mac
+ - Windows
+ - BSD
+ - Other? (Please let us know in description)
+ validations:
+ required: true
+ - type: dropdown
+ id: backends
+ attributes:
+ label: GGML backends
+ description: Which GGML backends do you know to be affected?
+ options: [AMX, BLAS, CPU, CUDA, HIP, Kompute, Metal, Musa, RPC, SYCL, Vulkan]
+ multiple: true
+ validations:
+ required: true
+ - type: textarea
+ id: hardware
+ attributes:
+ label: Hardware
+ description: Which CPUs/GPUs are you using?
+ placeholder: >
+ e.g. Ryzen 5950X + 2x RTX 4090
+ validations:
+ required: true
+ - type: textarea
+ id: model
+ attributes:
+ label: Models
+ description: >
+ Which model(s) at which quantization were you using when encountering the bug?
+ If you downloaded a GGUF file from Hugging Face, please provide a link.
+ placeholder: >
+ e.g. Meta LLaMA 3.1 Instruct 8b q4_K_M
+ validations:
+ required: false
+ - type: textarea
+ id: info
+ attributes:
+ label: Problem description & steps to reproduce
+ description: >
+ Please give us a summary of the problem and tell us how to reproduce it.
+ If you can narrow down the bug to specific hardware, compile flags, or command line arguments,
+ that information would be very much appreciated by us.
+ placeholder: >
+ e.g. when I run llama-cli with -ngl 99 I get garbled outputs.
+ When I use -ngl 0 it works correctly.
+ Here are the exact commands that I used: ...
+ validations:
+ required: true
+ - type: textarea
+ id: first_bad_commit
+ attributes:
+ label: First Bad Commit
+ description: >
+ If the bug was not present on an earlier version: when did it start appearing?
+ If possible, please do a git bisect and identify the exact commit that introduced the bug.
+ validations:
+ required: false
+ - type: textarea
+ id: logs
+ attributes:
+ label: Relevant log output
+ description: >
+ Please copy and paste any relevant log output, including the command that you entered and any generated text.
+ This will be automatically formatted into code, so no need for backticks.
+ render: shell
+ validations:
+ required: true
llama.cpp/.github/ISSUE_TEMPLATE/019-bug-misc.yml ADDED
@@ -0,0 +1,91 @@
+ name: Bug (misc.)
+ description: Something is not working the way it should (and it's not covered by any of the above cases).
+ title: "Misc. bug: "
+ labels: ["bug-unconfirmed"]
+ body:
+ - type: markdown
+ attributes:
+ value: >
+ Thanks for taking the time to fill out this bug report!
+ This issue template is intended for miscellaneous bugs that don't fit into any other category.
+ If you encountered the issue while using an external UI (e.g. ollama),
+ please reproduce your issue using one of the examples/binaries in this repository.
+ - type: textarea
+ id: version
+ attributes:
+ label: Name and Version
+ description: Which version of our software is affected? (You can use `--version` to get a version string.)
+ placeholder: |
+ $ ./llama-cli --version
+ version: 2999 (42b4109e)
+ built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
+ validations:
+ required: true
+ - type: dropdown
+ id: operating-system
+ attributes:
+ label: Operating systems
+ description: Which operating systems do you know to be affected?
+ multiple: true
+ options:
+ - Linux
+ - Mac
+ - Windows
+ - BSD
+ - Other? (Please let us know in description)
+ validations:
+ required: false
+ - type: dropdown
+ id: module
+ attributes:
+ label: Which llama.cpp modules do you know to be affected?
+ multiple: true
+ options:
+ - Documentation/Github
+ - libllama (core library)
+ - llama-cli
+ - llama-server
+ - llama-bench
+ - llama-quantize
+ - Python/Bash scripts
+ - Test code
+ - Other (Please specify in the next section)
+ validations:
+ required: false
+ - type: textarea
+ id: command
+ attributes:
+ label: Command line
+ description: >
+ Please provide the exact commands you entered, if applicable. For example: `llama-server -m ... -c ...`, `llama-cli -m ...`, etc.
+ This will be automatically formatted into code, so no need for backticks.
+ render: shell
+ validations:
+ required: false
+ - type: textarea
+ id: info
+ attributes:
+ label: Problem description & steps to reproduce
+ description: >
+ Please give us a summary of the problem and tell us how to reproduce it (if applicable).
+ validations:
+ required: true
+ - type: textarea
+ id: first_bad_commit
+ attributes:
+ label: First Bad Commit
+ description: >
+ If the bug was not present on an earlier version and it's not trivial to track down: when did it start appearing?
+ If possible, please do a git bisect and identify the exact commit that introduced the bug.
+ validations:
+ required: false
+ - type: textarea
+ id: logs
+ attributes:
+ label: Relevant log output
+ description: >
+ If applicable, please copy and paste any relevant log output, including any generated text.
+ This will be automatically formatted into code, so no need for backticks.
+ render: shell
+ validations:
+ required: false
llama.cpp/.github/ISSUE_TEMPLATE/020-enhancement.yml ADDED
@@ -0,0 +1,51 @@
+ name: Enhancement
+ description: Used to request enhancements for llama.cpp.
+ title: "Feature Request: "
+ labels: ["enhancement"]
+ body:
+ - type: markdown
+ attributes:
+ value: |
+ [Please post your idea first in a Discussion if there is not yet a consensus for this enhancement request. This will help to keep this issue tracker focused on enhancements that the community has agreed need to be implemented.](https://github.com/ggml-org/llama.cpp/discussions/categories/ideas)
+
+ - type: checkboxes
+ id: prerequisites
+ attributes:
+ label: Prerequisites
+ description: Please confirm the following before submitting your enhancement request.
+ options:
+ - label: I am running the latest code. Mention the version if possible as well.
+ required: true
+ - label: I carefully followed the [README.md](https://github.com/ggml-org/llama.cpp/blob/master/README.md).
+ required: true
+ - label: I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
+ required: true
+ - label: I reviewed the [Discussions](https://github.com/ggml-org/llama.cpp/discussions), and have a new and useful enhancement to share.
+ required: true
+
+ - type: textarea
+ id: feature-description
+ attributes:
+ label: Feature Description
+ description: Please provide a detailed written description of what you were trying to do, and what you expected `llama.cpp` to do as an enhancement.
+ placeholder: Detailed description of the enhancement
+ validations:
+ required: true
+
+ - type: textarea
+ id: motivation
+ attributes:
+ label: Motivation
+ description: Please provide a detailed written description of reasons why this feature is necessary and how it is useful to `llama.cpp` users.
+ placeholder: Explanation of why this feature is needed and its benefits
+ validations:
+ required: true
+
+ - type: textarea
+ id: possible-implementation
+ attributes:
+ label: Possible Implementation
+ description: If you have an idea as to how it can be implemented, please write a detailed description. Feel free to give links to external sources or share visuals that might be helpful to understand the details better.
+ placeholder: Detailed description of potential implementation
+ validations:
+ required: false
llama.cpp/.github/ISSUE_TEMPLATE/030-research.yml ADDED
@@ -0,0 +1,52 @@
+ name: Research
+ description: Track new technical research area.
+ title: "Research: "
+ labels: ["research 🔬"]
+ body:
+ - type: markdown
+ attributes:
+ value: |
+ Don't forget to check for any [duplicate research issue tickets](https://github.com/ggml-org/llama.cpp/issues?q=is%3Aopen+is%3Aissue+label%3A%22research+%F0%9F%94%AC%22)
+
+ - type: checkboxes
+ id: research-stage
+ attributes:
+ label: Research Stage
+ description: Track general state of this research ticket
+ options:
+ - label: Background Research (Let's try to avoid reinventing the wheel)
+ - label: Hypothesis Formed (How do you think this will work, and what will its effect be?)
+ - label: Strategy / Implementation Forming
+ - label: Analysis of results
+ - label: Debrief / Documentation (So people in the future can learn from us)
+
+ - type: textarea
+ id: background
+ attributes:
+ label: Previous existing literature and research
+ description: What's the current state of the art, and what's the motivation for this research?
+
+ - type: textarea
+ id: hypothesis
+ attributes:
+ label: Hypothesis
+ description: How do you think this will work, and what will its effect be?
+
+ - type: textarea
+ id: implementation
+ attributes:
+ label: Implementation
+ description: Got an approach? e.g. a PR ready to go?
+
+ - type: textarea
+ id: analysis
+ attributes:
+ label: Analysis
+ description: How does the proposed implementation behave?
+
+ - type: textarea
+ id: logs
+ attributes:
+ label: Relevant log output
+ description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
+ render: shell
llama.cpp/.github/ISSUE_TEMPLATE/040-refactor.yml ADDED
@@ -0,0 +1,28 @@
+ name: Refactor (Maintainers)
+ description: Used to track refactoring opportunities.
+ title: "Refactor: "
+ labels: ["refactor"]
+ body:
+ - type: markdown
+ attributes:
+ value: |
+ Don't forget to [check for existing refactor issue tickets](https://github.com/ggml-org/llama.cpp/issues?q=is%3Aopen+is%3Aissue+label%3Arefactoring) in case it's already covered.
+ You may also want to check the [pull request refactor label](https://github.com/ggml-org/llama.cpp/pulls?q=is%3Aopen+is%3Apr+label%3Arefactoring) for duplicates.
+
+ - type: textarea
+ id: background-description
+ attributes:
+ label: Background Description
+ description: Please provide a detailed written description of the pain points you are trying to solve.
+ placeholder: Detailed description behind your motivation to request refactor
+ validations:
+ required: true
+
+ - type: textarea
+ id: possible-approaches
+ attributes:
+ label: Possible Refactor Approaches
+ description: If you have some ideas of possible approaches to solve this problem, list them here. You may want to make it a todo list.
+ placeholder: Your idea of possible refactoring opportunity/approaches
+ validations:
+ required: false
llama.cpp/.github/ISSUE_TEMPLATE/config.yml ADDED
@@ -0,0 +1,11 @@
+ blank_issues_enabled: true
+ contact_links:
+ - name: Got an idea?
+ url: https://github.com/ggml-org/llama.cpp/discussions/categories/ideas
+ about: Pop it there. It may then become an enhancement ticket.
+ - name: Got a question?
+ url: https://github.com/ggml-org/llama.cpp/discussions/categories/q-a
+ about: Ask a question there!
+ - name: Want to contribute?
+ url: https://github.com/ggml-org/llama.cpp/wiki/contribute
+ about: Head to the contribution guide page of the wiki for areas you can help with