
LoRA adapter for binary image classification

Trained on the APTOS 2019 Kaggle competition dataset for identifying diabetic retinopathy. I've reframed the task as binary classification (diagnosis = 0 vs. all other grades; roughly a 50/50 split in the training data).
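
For reference, a minimal sketch of that binarization, assuming the standard APTOS train.csv with an id_code column and a diagnosis column graded 0-4 (column names come from the Kaggle download, not from this repo):

import pandas as pd

# Hypothetical local path to the APTOS training labels
df = pd.read_csv("train.csv")

# Collapse grades 1-4 into a single "affected" class
df["label"] = (df["diagnosis"] > 0).astype(int)

# Sanity check: the two classes come out roughly 50/50
print(df["label"].value_counts(normalize=True))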

Base Model: google/vit-large-patch16-224
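
For context, this is roughly how a LoRA adapter is attached to a ViT classifier with PEFT. The rank, alpha, and target modules below are placeholder assumptions; the actual values for this adapter are recorded in adapter_config.json and the training notebook:

from peft import LoraConfig, get_peft_model
from transformers import AutoModelForImageClassification

base = AutoModelForImageClassification.from_pretrained(
    "google/vit-large-patch16-224",
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
    num_labels=2,
)

# Placeholder hyperparameters for illustration only
config = LoraConfig(
    r=16,
    lora_alpha=16,
    target_modules=["query", "value"],  # ViT attention projections
    lora_dropout=0.1,
    modules_to_save=["classifier"],     # also train the new 2-class head
)

peft_model = get_peft_model(base, config)
peft_model.print_trainable_parameters()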

Dataset: https://www.kaggle.com/c/aptos2019-blindness-detection - fundus images of the back of the eye, each labeled with a diabetic retinopathy score (0-4)

Training notebook: https://colab.research.google.com/drive/1TVsUyyou87E26Sz40CdBH3CzWoVckgtq?usp=sharing

On a 10% held-out split of the training data: 98% accuracy
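
A minimal sketch of how that held-out accuracy could be computed, assuming a list of (tensor, label) pairs already preprocessed with the val_transforms shown below; this is not the exact evaluation code from the notebook:

import torch

def held_out_accuracy(model, samples):
    # samples: list of (image_tensor, label) pairs, one tensor per image
    model.eval()
    correct = 0
    with torch.no_grad():
        for pimg, label in samples:
            logits = model(pimg.unsqueeze(0)).logits
            correct += int(logits.argmax(dim=-1).item() == label)
    return correct / len(samples)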

Framework versions

  • PEFT 0.5.0

PEFT Image classifier inference / Gradio app

import torch
from peft import PeftModel
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

from torchvision.transforms import (
    CenterCrop,
    Compose,
    Normalize,
    RandomHorizontalFlip,
    RandomResizedCrop,
    Resize,
    ToTensor,
)

model_name = 'google/vit-large-patch16-224'
adapter = 'monsoon-nlp/eyegazer-vit-binary'

image_processor = AutoImageProcessor.from_pretrained(model_name)

# Normalize with the processor's mean/std so inputs match the base model
normalize = Normalize(mean=image_processor.image_mean, std=image_processor.image_std)

# Augmentations used during training; not needed for inference, kept here for reference
train_transforms = Compose(
    [
        RandomResizedCrop(image_processor.size["height"]),
        RandomHorizontalFlip(),
        ToTensor(),
        normalize,
    ]
)

# Deterministic preprocessing for validation / inference
val_transforms = Compose(
    [
        Resize(image_processor.size["height"]),
        CenterCrop(image_processor.size["height"]),
        ToTensor(),
        normalize,
    ]
)

# Load the base ViT with a fresh 2-class head (the checkpoint ships a 1000-class head),
# then attach the LoRA adapter weights on top
model = AutoModelForImageClassification.from_pretrained(
    model_name,
    ignore_mismatched_sizes=True,
    num_labels=2,
)

lora_model = PeftModel.from_pretrained(model, adapter)
lora_model.eval()

# Preprocess a single image and run it through the adapted model
img = Image.open("sample.png")
pimg = val_transforms(img.convert("RGB"))
batch = pimg.unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    op = lora_model(batch)
vals = op.logits.tolist()[0]

# Index 0 = unaffected, index 1 = affected (any nonzero diagnosis)
if vals[0] > vals[1]:
    print("Predicted unaffected")
else:
    print("Predicted affected to some degree")

Future goals

  • More documentation
  • Modify loss for regression on the 0-4 score (one possible setup is sketched below)
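
A hedged sketch of how that regression variant could be configured, not the planned implementation: with num_labels=1 and problem_type="regression", the transformers classification head trains with MSE loss against the raw 0-4 grade.

from transformers import AutoModelForImageClassification

# Single continuous output trained with MSELoss on the 0-4 diagnosis grade
reg_model = AutoModelForImageClassification.from_pretrained(
    "google/vit-large-patch16-224",
    ignore_mismatched_sizes=True,
    num_labels=1,
    problem_type="regression",
)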