Model Card for Sefika/mistral_fewrel_10_5

Model Details

Model Description

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

  • Developed by: Sefika

  • Language(s) (NLP): EN

  • License: MIT

  • Finetuned from model [optional]: mistralai/Mistral-7B-Instruct-v0.2

Model Sources [optional]

  • Repository: [More Information Needed]
  • Paper [optional]: [More Information Needed]
  • Demo [optional]: [More Information Needed]

Direct Use

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sefika/mistral_fewrel_10_5"
tokenizer = AutoTokenizer.from_pretrained(model_id, token=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    load_in_4bit=True,  # Requires bitsandbytes
    torch_dtype="auto",
)
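The snippet above only loads the checkpoint. Since the model is finetuned from Mistral-7B-Instruct, prompts should be wrapped in the Mistral `[INST] ... [/INST]` instruction format. A minimal sketch of building a relation-extraction style query; the wording, entity names, and helper function here are illustrative assumptions, not the exact prompt used in training:

```python
def build_prompt(sentence: str, head: str, tail: str) -> str:
    """Wrap an illustrative relation-extraction query in the
    Mistral-Instruct [INST] chat format."""
    question = (
        f'Given the sentence: "{sentence}", '
        f'what is the relation between "{head}" and "{tail}"?'
    )
    return f"<s>[INST] {question} [/INST]"

prompt = build_prompt(
    "Albert Einstein was born in Ulm.", "Albert Einstein", "Ulm"
)

# Generation with the loaded model would then look like:
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# output_ids = model.generate(**inputs, max_new_tokens=32)
# print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```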

Testing Data

FewRel

BibTeX: The paper "Large Language Models for Continual Relation Extraction" has been submitted to the Springer Machine Learning journal; a citation entry will be added upon publication.

Model Card Contact

Sefika Efeoglu

Model size: 3.86B params (Safetensors; tensor types: F32, F16, U8)