Please make a 4-bit DWQ version.

#1
by Narutoouz - opened

This will be the best AI model for local inference on 128 GB RAM M4 Max MacBook Pros and the M3 Ultra.
