/v1/chat/completions endpoint not working

#32
by PremkumarChandak - opened

We have successfully loaded and are serving the model with vLLM.

When we try to communicate with the model via /v1/chat/completions, we get no response (the request keeps loading indefinitely).
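For reference, a minimal request against the endpoint might look like the sketch below. The host, port, and model name are assumptions and need to match your deployment; a short client-side timeout helps distinguish a hung request from a slow one.

```python
# Minimal sketch, assuming the vLLM OpenAI-compatible server is running
# locally on the default port 8000 (e.g. started with `vllm serve <model>`
# or `python -m vllm.entrypoints.openai.api_server --model <model>`).
# "your-model-name" is a placeholder and must match the served model name.
import requests

response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "your-model-name",
        "messages": [{"role": "user", "content": "Hello!"}],
        "max_tokens": 64,
    },
    timeout=60,  # fail fast instead of hanging indefinitely
)
print(response.status_code)
print(response.json())
```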
