# OphthaScholar-1.2B

OphthaScholar-1.2B is a fine-tuned clinical assistant model focused specifically on ophthalmology. It is built on the LiquidAI LFM2-1.2B base model and fine-tuned on ophthalmology-relevant question-answer pairs extracted from the MIRIAD-4.4M dataset.
## Model Use
- Good at: short-form, direct factual question answering in ophthalmology
- Not suitable for: multiple-choice QA or multi-step, reasoning-based inference
- Do not use for: real clinical decision-making or diagnostic support
## Important Caution (from the MIRIAD Dataset)
This model was trained on a filtered subset of the MIRIAD-4.4M dataset. While the dataset was curated using LLMs and rule-based methods, no licensed medical professionals reviewed it. As noted by the dataset authors:
> "Despite quality filtering, hallucinations or factual errors may remain. The dataset should not be used to train clinical decision tools or models intended for real medical use."
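
For context, the kind of topic filtering described above can be sketched with the `datasets` library. The snippet below is illustrative only and is not the exact pipeline used for this model: the Hub ID `miriad/miriad-4.4M`, the keyword list, and the `question`/`answer` column names are assumptions to verify against the dataset card.

```python
# Illustrative sketch of ophthalmology-focused filtering of MIRIAD-4.4M.
# NOTE: keyword list and column names ("question", "answer") are assumptions,
# not the exact curation pipeline used for OphthaScholar.
from datasets import load_dataset

OPHTHO_KEYWORDS = (
    "retina", "macular", "glaucoma", "cornea", "cataract",
    "uveitis", "ophthalm", "intraocular", "vitreous", "keratitis",
)

def is_ophthalmology(example):
    text = (example["question"] + " " + example["answer"]).lower()
    return any(kw in text for kw in OPHTHO_KEYWORDS)

# Stream to avoid downloading all 4.4M QA pairs at once.
stream = load_dataset("miriad/miriad-4.4M", split="train", streaming=True)
ophtho_pairs = (ex for ex in stream if is_ophthalmology(ex))

# Inspect a few matches.
for _ in range(5):
    print(next(ophtho_pairs)["question"][:80])
```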
## How to Use
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "yasserrmd/OphthaScholar"

# Load the tokenizer and model; use fp16 on GPU, fp32 on CPU.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.float16 if torch.cuda.is_available() else torch.float32,
    trust_remote_code=True,
).eval()

# Build a short factual prompt.
question = "What is the treatment for neovascular age-related macular degeneration?"
prompt = f"System: You are an ophthalmology expert.\nUser: {question}\nAnswer:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Keep only the text generated after the "Answer:" marker.
print(tokenizer.decode(outputs[0], skip_special_tokens=True).split("Answer:")[-1].strip())
```
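
If the fine-tuned checkpoint keeps the LFM2 chat template, the same query can also be phrased via `apply_chat_template` instead of the hand-written prompt above. This is a hedged alternative: whether the template (and a `system` role) is preserved after fine-tuning is an assumption, so compare outputs before relying on it. The snippet reuses `tokenizer`, `model`, and `question` from the example above.

```python
# Optional: chat-template variant of the same query.
# ASSUMPTION: the fine-tuned checkpoint keeps the base model's chat template;
# if it does not, fall back to the plain prompt shown above.
messages = [
    {"role": "system", "content": "You are an ophthalmology expert."},
    {"role": "user", "content": question},
]
chat_inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)
chat_outputs = model.generate(chat_inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens (everything after the prompt).
print(tokenizer.decode(chat_outputs[0][chat_inputs.shape[-1]:], skip_special_tokens=True))
```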
## Suggested Applications
- Clinical education (ophthalmology)
- Dataset generation for supervised fine-tuning (see the sketch after this list)
- Question answering for medical research chatbots
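
As a rough illustration of the dataset-generation use case, the sketch below runs a few ophthalmology questions through the model and writes prompt/response pairs to JSONL for later supervised fine-tuning. It reuses the `model` and `tokenizer` loaded in the How to Use section; the question list and output file name are placeholders.

```python
# Sketch: generate prompt/response pairs for SFT-style datasets.
# ASSUMPTIONS: reuses `model` and `tokenizer` from the How to Use snippet;
# the question list and output file name are placeholders.
import json

questions = [
    "What is the first-line treatment for primary open-angle glaucoma?",
    "Which imaging modality is preferred for diagnosing macular edema?",
]

with open("ophtho_sft_pairs.jsonl", "w", encoding="utf-8") as f:
    for q in questions:
        prompt = f"System: You are an ophthalmology expert.\nUser: {q}\nAnswer:"
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
        answer = tokenizer.decode(outputs[0], skip_special_tokens=True).split("Answer:")[-1].strip()
        f.write(json.dumps({"prompt": q, "response": answer}) + "\n")
```

Pairs generated this way inherit the MIRIAD caveats above and should be reviewed before any downstream training.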
## Not Intended For
- Clinical use
- Diagnostic automation
- Emergency decision tools
## Citation
```bibtex
@misc{ophthascholar2025,
  title  = {OphthaScholar-1.2B: A factual ophthalmology model fine-tuned on MIRIAD},
  author = {Yasserrmd},
  year   = {2025},
  url    = {https://huggingface.co/yasserrmd/OphthaScholar}
}
```
## License
- Base: MIT (LiquidAI LFM2-1.2B)
- Data: MIRIAD-4.4M (filtered use only; not for clinical deployment)