# πŸš€ Quantumhash

This is the Quantumhash model, trained for text generation.
You can use it as a text-generation model.


## πŸ”₯ Try It Now

Use the inference widget below or in your own application.

πŸ€— Try on Spaces
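
For a quick test in your own code, here is a minimal sketch using the `transformers` pipeline API. This assumes the checkpoint is compatible with the default text-generation pipeline; the prompt and token count are illustrative.

```python
from transformers import pipeline

# Quick-start sketch: assumes the checkpoint works with the default
# text-generation pipeline (not stated in the original card).
generator = pipeline("text-generation", model="sbapan41/Quantumhash")

result = generator("Once upon a time...", max_new_tokens=50)
print(result[0]["generated_text"])
```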


## πŸš€ How to Use This Model

### πŸ”Ή Use in Python

For text models (Transformers):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model from the Hugging Face Hub
model_name = "sbapan41/Quantumhash"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize a prompt and generate a continuation
prompt = "Once upon a time..."
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs)
print(tokenizer.decode(output[0]))
```
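
By default, `generate()` produces a short greedy continuation. To control length and randomness, you can pass explicit generation parameters; the values below are illustrative, not tuned for this model.

```python
# Sampled generation with an explicit length budget (illustrative values)
output = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```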

---

## 🌍 Inference API

You can also query the model remotely with the Hugging Face Inference API:
```python
from huggingface_hub import InferenceClient

# Create a client pointed at the hosted model and run text generation
client = InferenceClient(model="sbapan41/Quantumhash")
response = client.text_generation("Hello, how are you?")
print(response)
```
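
`text_generation` also accepts generation parameters and can stream tokens as they are produced. A sketch with illustrative parameter values:

```python
# Stream tokens as they are generated (parameter values are illustrative)
for token in client.text_generation(
    "Hello, how are you?",
    max_new_tokens=100,
    temperature=0.7,
    stream=True,
):
    print(token, end="", flush=True)
```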
## πŸ“¦ Model Details

- Format: Safetensors
- Model size: 7.24B params
- Tensor type: BF16
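
Since the weights are stored in BF16 and the model has about 7.24B parameters, loading it in its native precision needs roughly 14–15 GB of memory. A sketch of loading in bfloat16 with automatic device placement (assumes a recent `transformers` version and the `accelerate` package; not part of the original card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load weights in their native BF16 precision and place them automatically
tokenizer = AutoTokenizer.from_pretrained("sbapan41/Quantumhash")
model = AutoModelForCausalLM.from_pretrained(
    "sbapan41/Quantumhash",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires the accelerate package
)
```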
