Dynamically quantized DistilBERT base uncased fine-tuned on MRPC

Model Details

Model Description: This model is a DistilBERT model fine-tuned on MRPC and dynamically quantized with huggingface/optimum-intel through the use of Intel® Neural Compressor.

  • Model Type: Text Classification
  • Language(s): English
  • License: Apache-2.0
  • Parent Model: For more details on the original model, we encourage users to check out this model card.

How to Get Started With the Model

PyTorch

To load the quantized model, you can do as follows:

from optimum.intel import INCModelForSequenceClassification

model_id = "Intel/distilbert-base-uncased-MRPC-int8-dynamic"
model = INCModelForSequenceClassification.from_pretrained(model_id)

Test result

|                        | INT8   | FP32   |
|------------------------|--------|--------|
| **Accuracy (eval-f1)** | 0.8983 | 0.9027 |
| **Model size (MB)**    | 75     | 268    |
