<think> tag, llama-cpp-python and Open WebUI
#9 opened by yazoniak
Hi. The `<think>` tag is missing at the very beginning of the reasoning output, which causes an issue in Open WebUI, for example when running through llama-cpp-python. Is it possible to generate new, fixed GGUFs with the `<think>` tag added (by modifying tokenizer_config.json)?
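In case it helps others hitting the same problem: until a fixed GGUF is available, a possible client-side workaround is to prepend the missing tag before handing the text to whatever parses the reasoning block. This is only a sketch; `ensure_think_tag` is a hypothetical helper, not part of llama-cpp-python or Open WebUI, and it assumes the model reliably emits the closing `</think>` but not the opening one.

```python
def ensure_think_tag(text: str, tag: str = "<think>") -> str:
    """Prepend the opening reasoning tag if the generated text lacks it.

    Some chat templates emit the opening <think> tag as part of the prompt,
    so it never shows up in the generated tokens; UIs that detect reasoning
    blocks by scanning the output then fail to find it.
    """
    if text.lstrip().startswith(tag):
        return text  # tag already present, leave the output untouched
    return f"{tag}\n{text}"


# Example: raw output starts mid-reasoning, missing the opening tag.
raw = "First I consider the question...</think>Here is the answer."
fixed = ensure_think_tag(raw)
print(fixed.startswith("<think>"))  # → True
```

A more permanent fix would be patching the chat template in the GGUF metadata (or tokenizer_config.json before conversion) so the opening tag is generated rather than consumed by the prompt, which is what the question above asks about.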