Naive/Memory Heavy Chain Classifier

This repository provides a transformer-based classifier for distinguishing between naive and memory B-cell receptor heavy chain sequences. It uses adapters integrated into a pre-trained language model for parameter-efficient fine-tuning. The model is fine-tuned from HeavyBERTa, a protein language model based on the RoBERTa architecture and pre-trained on a large corpus of unpaired heavy chain sequences from the OAS database. An equivalent classification model for light chains is also available.

For more information on how to use this model, please visit our GitHub repository.
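As a rough illustration, inference with this classifier might look like the sketch below. It assumes the model loads with the standard `transformers` sequence-classification API, that residues are passed space-separated (common for RoBERTa-style protein tokenizers), and that the label order is naive-then-memory; all of these are assumptions to verify against the repository and the model's config.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed label order; confirm against the model's id2label config.
LABELS = ["naive", "memory"]

def load_classifier(model_id="leaBroe/HeavyBERTa_naive_mem_cls"):
    """Load tokenizer and classifier from the Hugging Face Hub."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    return tokenizer, model

def classify(sequence, tokenizer, model):
    """Classify one heavy chain amino acid sequence as naive or memory."""
    # Assumption: the tokenizer expects residues separated by spaces,
    # e.g. "QVQLVQSG..." -> "Q V Q L V Q S G ...". Adjust if not.
    inputs = tokenizer(" ".join(sequence), return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[int(logits.argmax(dim=-1))]
```

Usage would then be `tokenizer, model = load_classifier()` followed by `classify("QVQLVQSG...", tokenizer, model)`; see the GitHub repository for the authoritative instructions.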


Model tree for leaBroe/HeavyBERTa_naive_mem_cls

Base model: leaBroe/HeavyBERTa