Model Card for Starcoder-conala

This model is an instruction-tuned version of ⭐️ StarCoder. The instruction dataset is Conala-mined-curated, which was built by bootstrapping: predicting the rewritten_intent column of the mined subset of the CoNaLa corpus.

Usage

The model was fine-tuned with the following prompt template:

Question: <instruction>

Answer: <output>
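
The model and tokenizer can be loaded with the transformers library. The following is a minimal sketch; it assumes the checkpoint is hosted under the codeparrot/starcoder-conala repository named on this card.

from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "codeparrot/starcoder-conala"  # repository id taken from this card
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)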

With the model and tokenizer loaded, you can use the following code to make the model generate an answer to a given instruction:

# Build the prompt using the same template the model was fine-tuned on
instruction = "Write a function to compute the GCD between two integers a and b"
prompt = f"Question:{instruction}\n\nAnswer:"

# Tokenize the prompt and generate a completion
input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"]
completion = model.generate(input_ids, max_length=200)

# Decode only the newly generated tokens (everything after the prompt)
print(tokenizer.batch_decode(completion[:, input_ids.shape[1]:])[0])
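
Generation above is greedy by default; sampling parameters can be passed to generate for more varied completions. The values below are illustrative assumptions, not settings from the original card.

completion = model.generate(
    input_ids,
    max_length=200,
    do_sample=True,    # sample instead of greedy decoding
    temperature=0.2,   # illustrative value, tune for your task
    top_p=0.95,        # illustrative value
)
print(tokenizer.batch_decode(completion[:, input_ids.shape[1]:])[0])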

More information

For additional information, check
