model not working
#3
by charlieroro - opened
Hmm... I've seen a similar problem pop up before. The issue seems to be ollama overriding the system prompt with something that triggers these safety checks. This is either a bug in ollama or in your ollama setup and, as far as I can tell, has nothing to do with this model: as soon as you use the model with its embedded chat template (or supply your own), it works. Alternatively, use another inference engine such as llama.cpp, which doesn't have this issue.
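
For example, here's a minimal sketch (assuming the `ollama` Python client is installed and the server is running locally) that passes an explicit system message so ollama's injected system prompt doesn't take over. The model tag is a placeholder; substitute whatever you pulled:

```python
# Sketch: work around ollama's injected system prompt by supplying an
# explicit system message of your own. Assumes `pip install ollama` and a
# locally running ollama server. "your-model:latest" is a placeholder tag.
import ollama

response = ollama.chat(
    model="your-model:latest",  # placeholder: use your actual model tag
    messages=[
        # An explicit system message overrides whatever default system
        # prompt ollama would otherwise inject via its template.
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, can you answer questions normally?"},
    ],
)
print(response["message"]["content"])
```

If the refusals stop once you control the system prompt this way, that points at ollama's template setup rather than the model weights.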