---
library_name: araclip
tags:
- clip
- model_hub_mixin
- pytorch_model_hub_mixin
---
This model has been pushed to the Hub using the PytorchModelHubMixin integration.

## How to use

```bash
pip install git+https://github.com/Arabic-Clip/Araclip.git
```
```python
import numpy as np
from PIL import Image
from araclip import AraClip

# load model
model = AraClip.from_pretrained("Arabic-Clip/araclip")

# data: Arabic labels ("a sitting cat", "a jumping cat", "a dog", "a horse")
labels = ["قطة جالسة", "قطة تقفز", "كلب", "حصان"]
image = Image.open("cat.png")

# embed data
image_features = model.embed(image=image)
text_features = np.stack([model.embed(text=label) for label in labels])

# search for the most similar data
similarities = text_features @ image_features
best_match = labels[np.argmax(similarities)]
print(f"The image is most similar to: {best_match}")
# قطة جالسة ("a sitting cat")
```
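To see why `text_features @ image_features` followed by `argmax` picks the best label, here is a minimal, self-contained sketch of the same similarity search. It uses random unit vectors as stand-ins for the real `model.embed` outputs (the 512-dimensional size is an assumption for illustration, not the model's documented embedding size), so it runs without downloading the model:

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    # scale a vector to unit length so dot products become cosine similarities
    return v / np.linalg.norm(v)

# stand-ins for model.embed outputs: one image vector, four label vectors
image_features = normalize(rng.normal(size=512))
text_features = np.stack([normalize(rng.normal(size=512)) for _ in range(4)])

# (4, 512) @ (512,) -> (4,): one cosine similarity per label
similarities = text_features @ image_features
best = int(np.argmax(similarities))
print(f"Best label index: {best}")
```

With unit-length vectors, each entry of `similarities` lies in [-1, 1], and the `argmax` index selects the label whose embedding points closest to the image embedding.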