Phi3-Legal-Finetuned

This is a fine-tuned version of the Phi-3 Mini model for legal text generation tasks.

Model Details

  • Base Model: Microsoft Phi-3 Mini 128K
  • Fine-tuned On: Legal documents and summaries
  • Context Length: 128K tokens
  • License: MIT

Usage

You can load the model with the Hugging Face Transformers library:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Download the tokenizer and model weights from the Hugging Face Hub
model_name = "sairamn/Phi3-Legal-Finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
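Once loaded, the model can be used for text generation. The sketch below wraps a generation call in a helper function; the `summarize` wrapper, the prompt wording, the `max_new_tokens` value, and the hand-written Phi-3 chat template are illustrative assumptions — in practice, `tokenizer.apply_chat_template` applies the correct template automatically.

```python
# Sketch: a summarization helper around model.generate. The function is
# defined but not called here, since invoking it downloads the full model.

def build_prompt(user_message: str) -> str:
    # Phi-3 instruct chat format (an assumption; prefer
    # tokenizer.apply_chat_template, which uses the model's own template).
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>\n"

def summarize(text: str, model_name: str = "sairamn/Phi3-Legal-Finetuned") -> str:
    # Heavy dependencies are imported locally so the helper can be
    # defined without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer(
        build_prompt(f"Summarize the following text: {text}"),
        return_tensors="pt",
    )
    outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Greedy decoding (`do_sample=False`) is used here for reproducible output; sampling parameters such as `temperature` can be passed to `generate` if more varied text is wanted.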

Limitations

  • The model is not a substitute for professional legal advice.
  • It may generate incorrect or biased information; outputs should be verified before use.

Acknowledgments

  • Based on Microsoft Phi-3 Mini.

Citation

If you use this model in your work, please cite this repository.

Technical Specifications

  • Format: GGUF
  • Model Size: 3.82B params
  • Architecture: phi3