Update README.md
README.md
@@ -18,6 +18,10 @@ No IMatrix.
 
 The fork is currently required to run inference and there's no guarantee these checkpoints will work with future builds. Temporary builds are available [here](https://github.com/green-s/llama.cpp.qwen2vl/releases). The latest tested build as of writing is `qwen25-vl-b4899-bc4163b`.
 
+Edit:
+
+As of 1-April-2025 inference support has been added to [koboldcpp](https://github.com/LostRuins/koboldcpp).
+
 [Original model](https://huggingface.co/Qwen/Qwen2.5-VL-7B-Instruct)
 
 ## Usage