Missing Tool Support in Ollama
I get the same problem running the llama.cpp server. I tried q4_0 and q8_0 with a bunch of different prompts and generation settings, same result. It'll actually tell the user to execute the function manually themselves and report back! I've not had any luck with any of the LFM models. I've even tried writing my own Jinja templates, but no dice. It's almost like they uploaded the wrong models or something.
I'm also getting the same error using Ollama through n8n: "model doesn't support tools". I think the wrong model was uploaded too.
The template embedded in the gguf does support tool calling. The problem you're having with Ollama is that the Modelfile doesn't have the correct template. My issue is that even with the correct template, the model won't call any functions.
Put the following in a file with the extension *.modelfile, edit the FROM line to point to your downloaded gguf, then run "ollama create LFM2-1.2B_tool -f your-modelfile-here.modelfile" to add it to your local Ollama:
FROM /path/to/your/downloaded/gguf/file
TEMPLATE """{{- bos_token -}}
{%- set system_prompt = "" -%}
{%- set ns = namespace(system_prompt="") -%}
{%- if messages[0]["role"] == "system" -%}
    {%- set ns.system_prompt = messages[0]["content"] -%}
    {%- set messages = messages[1:] -%}
{%- endif -%}
{%- if tools -%}
    {%- set ns.system_prompt = ns.system_prompt + (" " if ns.system_prompt else "") + "List of tools: <|tool_list_start|>[" -%}
    {%- for tool in tools -%}
        {%- if tool is not string -%}
            {%- set tool = tool | tojson -%}
        {%- endif -%}
        {%- set ns.system_prompt = ns.system_prompt + tool -%}
        {%- if not loop.last -%}
            {%- set ns.system_prompt = ns.system_prompt + ", " -%}
        {%- endif -%}
    {%- endfor -%}
    {%- set ns.system_prompt = ns.system_prompt + "]<|tool_list_end|>" -%}
{%- endif -%}
{%- if ns.system_prompt -%}
    {{- "<|im_start|>system " + ns.system_prompt + "<|im_end|> " -}}
{%- endif -%}
{%- for message in messages -%}
    {{- "<|im_start|>" + message["role"] + " " -}}
    {%- set content = message["content"] -%}
    {%- if content is not string -%}
        {%- set content = content | tojson -%}
    {%- endif -%}
    {%- if message["role"] == "tool" -%}
        {%- set content = "<|tool_response_start|>" + content + "<|tool_response_end|>" -%}
    {%- endif -%}
    {{- content + "<|im_end|> " -}}
{%- endfor -%}
{%- if add_generation_prompt -%}
    {{- "<|im_start|>assistant " -}}
{%- endif -%}"""
SYSTEM You are a helpful assistant trained by Liquid AI.
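If you want to sanity-check what the rendered prompt actually looks like, the template's logic can be mirrored in plain Python. This is just a sketch for inspection, not from the thread; the function name and sample messages are my own:

```python
import json

def build_lfm2_prompt(messages, tools=None, add_generation_prompt=True, bos_token=""):
    """Mirror the Jinja template's logic in plain Python (sketch)."""
    system = ""
    # A leading system message is pulled out of the message list.
    if messages and messages[0]["role"] == "system":
        system = messages[0]["content"]
        messages = messages[1:]
    # Tool definitions get appended to the system prompt as a JSON list.
    if tools:
        tool_list = ("List of tools: <|tool_list_start|>["
                     + ", ".join(t if isinstance(t, str) else json.dumps(t) for t in tools)
                     + "]<|tool_list_end|>")
        system = (system + " " if system else "") + tool_list
    out = bos_token
    if system:
        out += "<|im_start|>system " + system + "<|im_end|> "
    for m in messages:
        content = m["content"] if isinstance(m["content"], str) else json.dumps(m["content"])
        # Tool results are wrapped in the tool-response markers.
        if m["role"] == "tool":
            content = "<|tool_response_start|>" + content + "<|tool_response_end|>"
        out += "<|im_start|>" + m["role"] + " " + content + "<|im_end|> "
    if add_generation_prompt:
        out += "<|im_start|>assistant "
    return out

print(build_lfm2_prompt(
    [{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[{"name": "get_weather"}],
))
```

If the prompt you see in Ollama's debug output doesn't match this shape (system block with <|tool_list_start|>, then the conversation), the template isn't being applied.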
That should at least fix the error with Ollama and let you experiment with what few parameters Ollama exposes. The template above was copied directly from the LFM gguf file.
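To tell whether the remaining problem is the template or the model itself, you can hit Ollama's /api/chat endpoint with a tool definition and see if the response contains tool_calls instead of plain text. A minimal stdlib-only sketch; the get_weather tool and the model name are placeholders of my own:

```python
import json
import urllib.request

def tool_call_payload(model, user_msg):
    # Request body for Ollama's /api/chat; get_weather is a made-up example tool.
    return {
        "model": model,
        "stream": False,
        "messages": [{"role": "user", "content": user_msg}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

def chat(payload, host="http://localhost:11434"):
    # POST to the local Ollama server and return the parsed JSON response.
    req = urllib.request.Request(
        host + "/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# With a working template and a model that actually calls functions,
# the reply should carry response["message"]["tool_calls"]:
#   resp = chat(tool_call_payload("LFM2-1.2B_tool", "Weather in Paris?"))
#   print(resp["message"].get("tool_calls"))
```

If tool_calls comes back empty but the model narrates "run get_weather yourself", that matches the behavior described above and points at the model rather than the template.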