Streaming support / batch inference?

#2 opened by brainofchild

Output streaming and batch inference are important for production use. I'm curious how the model handles both.

Both are possible but have not been added to the codebase yet. We will definitely ship streaming as there is already a PR out for it.
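While the PR is pending, here is a minimal sketch of how output streaming typically works with Hugging Face transformers' `TextIteratorStreamer`, assuming the model loads via `AutoModelForCausalLM`; the model id below is a placeholder, not this repo's actual checkpoint:

```python
from threading import Thread

from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer

model_id = "your-org/your-model"  # placeholder -- substitute the real checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Explain streaming inference:", return_tensors="pt").to(model.device)

# The streamer yields decoded text chunks as generate() produces tokens.
streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

# generate() blocks until completion, so run it in a background thread
# and consume the stream on the main thread.
thread = Thread(
    target=model.generate,
    kwargs=dict(**inputs, streamer=streamer, max_new_tokens=256),
)
thread.start()

for chunk in streamer:
    print(chunk, end="", flush=True)
thread.join()
```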
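Batch inference is possible today with plain transformers generation, again assuming the same placeholder model id; the key detail is left-padding so every prompt in the batch ends at the same position:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Left-pad so generation continues from the end of each prompt.
tokenizer.padding_side = "left"
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

prompts = ["First prompt", "Second, somewhat longer prompt"]
batch = tokenizer(prompts, return_tensors="pt", padding=True).to(model.device)

outputs = model.generate(**batch, max_new_tokens=128)
texts = tokenizer.batch_decode(outputs, skip_special_tokens=True)
for text in texts:
    print(text)
```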

Curious to hear if there are any updates!
