Cisco Network Configuration Model (16-bit)

Fine-tuned TinyLlama model for Cisco network configuration tasks.

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("Renugadevi82/cisco-nx-ai-16bit")
tokenizer = AutoTokenizer.from_pretrained("Renugadevi82/cisco-nx-ai-16bit")

# Generate a configuration snippet from a natural-language prompt
prompt = "Configure VLAN 100 with name Management"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
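The same call can also be made through the transformers text-generation pipeline. This is a minimal sketch, not taken from the model card: the sampling parameters below are illustrative, and half-precision loading is assumed to match the F16 weights.

import torch
from transformers import pipeline

# Build a text-generation pipeline around the fine-tuned checkpoint
pipe = pipeline(
    "text-generation",
    model="Renugadevi82/cisco-nx-ai-16bit",
    torch_dtype=torch.float16,  # keep the weights in half precision
)

# Illustrative sampling settings; adjust for your use case
result = pipe(
    "Configure VLAN 100 with name Management",
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
)
print(result[0]["generated_text"])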
Model details

Format: Safetensors
Model size: 1.1B parameters
Tensor type: F16
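Because the checkpoint ships F16 weights, it can be loaded in half precision without upcasting to float32. A minimal sketch, assuming a CUDA GPU is available for the device placement:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the F16 checkpoint directly in half precision
model = AutoModelForCausalLM.from_pretrained(
    "Renugadevi82/cisco-nx-ai-16bit",
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained("Renugadevi82/cisco-nx-ai-16bit")

# Assumes a CUDA device is present; drop this line to stay on CPU
model = model.to("cuda")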