Gliese-Query_Tool-0.6B

Gliese-Query_Tool-0.6B is a function-calling and query-oriented reasoning model fine-tuned from Qwen3-0.6B using Salesforce/xlam-function-calling-60k, designed for tool orchestration, structured query resolution, and operation chaining across diverse tasks. It excels in dynamic function execution, structured reasoning pipelines, and multi-tool decision workflows, making it a powerful lightweight solution for developers, tooling platforms, and automation systems.
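
A minimal Transformers sketch is shown below. It assumes the full-precision checkpoint is published as prithivMLmods/Gliese-Query_Tool-0.6B (this repository only hosts the GGUF quants), and the system prompt is illustrative rather than the exact training template:

```python
# Minimal sketch. Assumptions: the full-precision repo id
# "prithivMLmods/Gliese-Query_Tool-0.6B" (not part of this GGUF repo)
# and the chat-style prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/Gliese-Query_Tool-0.6B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [
    {"role": "system", "content": "You are a function-calling assistant. Respond with a JSON list of tool calls."},
    {"role": "user", "content": "What is the volume of a sphere with a radius of 6 cm?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Strip the prompt tokens and print only the generated continuation.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```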

Model Files

| File Name | Quant Type | File Size |
|---|---|---|
| Gliese-Query_Tool-0.6B.BF16.gguf | BF16 | 1.2 GB |
| Gliese-Query_Tool-0.6B.F16.gguf | F16 | 1.2 GB |
| Gliese-Query_Tool-0.6B.F32.gguf | F32 | 2.39 GB |
| Gliese-Query_Tool-0.6B.Q2_K.gguf | Q2_K | 296 MB |
| Gliese-Query_Tool-0.6B.Q3_K_L.gguf | Q3_K_L | 368 MB |
| Gliese-Query_Tool-0.6B.Q3_K_M.gguf | Q3_K_M | 347 MB |
| Gliese-Query_Tool-0.6B.Q3_K_S.gguf | Q3_K_S | 323 MB |
| Gliese-Query_Tool-0.6B.Q4_0.gguf | Q4_0 | 382 MB |
| Gliese-Query_Tool-0.6B.Q4_1.gguf | Q4_1 | 409 MB |
| Gliese-Query_Tool-0.6B.Q4_K.gguf | Q4_K | 397 MB |
| Gliese-Query_Tool-0.6B.Q4_K_M.gguf | Q4_K_M | 397 MB |
| Gliese-Query_Tool-0.6B.Q4_K_S.gguf | Q4_K_S | 383 MB |
| Gliese-Query_Tool-0.6B.Q5_0.gguf | Q5_0 | 437 MB |
| Gliese-Query_Tool-0.6B.Q5_1.gguf | Q5_1 | 464 MB |
| Gliese-Query_Tool-0.6B.Q5_K.gguf | Q5_K | 444 MB |
| Gliese-Query_Tool-0.6B.Q5_K_M.gguf | Q5_K_M | 444 MB |
| Gliese-Query_Tool-0.6B.Q5_K_S.gguf | Q5_K_S | 437 MB |
| Gliese-Query_Tool-0.6B.Q6_K.gguf | Q6_K | 495 MB |
| Gliese-Query_Tool-0.6B.Q8_0.gguf | Q8_0 | 639 MB |
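
To run one of the quantized files locally, a minimal sketch with llama-cpp-python is shown below. The chosen quant (Q4_K_M) and the chat-style prompt are illustrative assumptions; any file from the table above can be substituted:

```python
# Sketch: download a quant from this repo and run it with llama-cpp-python.
# Assumptions: the Q4_K_M file choice and the chat prompt wording.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="prithivMLmods/Gliese-Query_Tool-0.6B-GGUF",
    filename="Gliese-Query_Tool-0.6B.Q4_K_M.gguf",
)

llm = Llama(model_path=model_path, n_ctx=4096)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a function-calling assistant. Respond with a JSON list of tool calls."},
        {"role": "user", "content": "What is the volume of a sphere with a radius of 6 cm?"},
    ],
    max_tokens=256,
    temperature=0.2,
)
print(response["choices"][0]["message"]["content"])
```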

Sample Inference

Query: Solve 2**2
Output:
[{"name": "power", "description": "Calculates the power of a number with a specified exponent.", "parameters": {"number": {"description": "The base for which the power is calculated.", "type": "int"}, "exponent": {"description": "The exponent to which the number should be raised.", "type": "int"}}}]

Query: Solve for 'x' in the equation 2x + 5 = 11.
Output:
[{"name": "solving_equation", "description": "Solves a linear equation for a variable.", "parameters": {"equation": {"description": "The equation to solve. The format is 'a*x + b = c'. For example, '5x + 2 = 10' or '3x - 7 = 1'.", "type": "str"}, "operation": {"description": "The operation (add, sub, etc.) to perform the solving.", "type": "str, optional"}, "variable": {"description": "The variable to solve for. Defaults to 'x' if not provided.", "default": "x"}}}]

Query: What is the volume of a sphere with a radius of 6 cm?
Output:
[{"name": "volume_of_sphere", "description": "Calculates the volume of a sphere given its radius using the formula (4/3)πr³.", "parameters": {"radius": {"description": "The radius of the sphere.", "type": "int"}}}]

Query: In an examination, 80% of the candidates passed in Urdu and 85% in Hindi, while 75% passed in both. If 45 candidates failed in both, what was the total number of candidates?
Output:
[{"name": "passing_percentage", "description": "Calculates the passing percentage for an exam given the percentage of students who passed each subject, and the intersection percentage of passing subjects.", "parameters": {"subject1_percent": {"description": "Percentage of students who passed the first subject (e.g., 85% if Hindi).", "type": "int"}, "subject2_percent": {"description": "Percentage of students who passed the second subject (e.g., 80% if Urdu).", "type": "int"}, "passed_both_percent": {"description": "Percentage of students who passed both subjects.", "type": "int"}}}]

Quants Usage

(Sorted by size, not necessarily by quality. IQ-quants are often preferable to similarly sized non-IQ quants.)

ikawrakow has published a handy graph comparing some lower-quality quant types (lower is better).

Format: GGUF
Model size: 596M params
Architecture: qwen3

Model tree for prithivMLmods/Gliese-Query_Tool-0.6B-GGUF

Finetuned from: Qwen/Qwen3-0.6B
Quantized: this model

Dataset used to train prithivMLmods/Gliese-Query_Tool-0.6B-GGUF: Salesforce/xlam-function-calling-60k
