nomic-embed-text-v2-moe
Model creator: nomic-ai
Original model: nomic-ai/nomic-embed-text-v2-moe
GGUF quantization: provided by olegshulyakov using llama.cpp
Special thanks
🙏 Special thanks to Georgi Gerganov and the whole team working on llama.cpp for making all of this possible.
Use with Ollama
ollama run "hf.co/olegshulyakov/nomic-embed-text-v2-moe-GGUF:Q4_0"
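Once the model has been pulled, embeddings can be requested over Ollama's REST API. A minimal sketch, assuming a recent Ollama build that serves /api/embed on the default port 11434 and using the same model tag as above; note that the upstream Nomic model card asks for a task prefix such as "search_query: " or "search_document: " on the input text:
curl http://localhost:11434/api/embed -d '{"model": "hf.co/olegshulyakov/nomic-embed-text-v2-moe-GGUF:Q4_0", "input": "search_query: What is GGUF?"}'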
Use with LM Studio
lms load "olegshulyakov/nomic-embed-text-v2-moe-GGUF"
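With the model loaded, LM Studio's local server can serve embeddings through its OpenAI-compatible API. A minimal sketch, assuming the server has been started (for example with lms server start) on the default port 1234; the model identifier LM Studio assigns may differ from the placeholder used here:
curl http://localhost:1234/v1/embeddings -H "Content-Type: application/json" -d '{"model": "nomic-embed-text-v2-moe", "input": "search_document: GGUF is a binary format for storing models for inference with llama.cpp."}'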
Use with llama.cpp CLI
llama-cli --hf "olegshulyakov/nomic-embed-text-v2-moe-GGUF:Q4_0" -p "The meaning to life and the universe is"
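Since this is an embedding model, llama.cpp's dedicated llama-embedding tool is the more natural way to produce vectors from the command line. A sketch, assuming your build accepts the same --hf download syntax shown above:
llama-embedding --hf "olegshulyakov/nomic-embed-text-v2-moe-GGUF:Q4_0" -p "search_query: What is GGUF?"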
Use with llama.cpp Server
llama-server --hf "olegshulyakov/nomic-embed-text-v2-moe-GGUF:Q4_0" -c 4096
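Once the server is up, embeddings can be requested over HTTP. A minimal sketch, assuming the default address 127.0.0.1:8080 and that your llama-server build exposes the OpenAI-compatible /v1/embeddings route for this model (some builds also require passing --embeddings at startup):
curl http://localhost:8080/v1/embeddings -H "Content-Type: application/json" -d '{"input": "search_document: Nomic Embed Text V2 is a multilingual mixture-of-experts embedding model."}'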
Model tree for olegshulyakov/nomic-embed-text-v2-moe-GGUF
Base model: FacebookAI/xlm-roberta-base
Finetuned: nomic-ai/nomic-xlm-2048
Finetuned: nomic-ai/nomic-embed-text-v2-moe