# bert-base-spanish-wwm-cased-xnli

**UPDATE, 15.10.2021**: Check out our new zero-shot classifiers, which are much more lightweight and even outperform this one: zero-shot SELECTRA small and zero-shot SELECTRA medium.

## Model description

This model is a fine-tuned version of the Spanish BERT model, trained on the Spanish portion of the XNLI dataset. Have a look at the training script for the full training details.
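
In rough terms, the setup looks like the sketch below. This is not the actual training script: the base checkpoint name (`dccuchile/bert-base-spanish-wwm-cased`), the use of the Hugging Face `xnli` dataset with the `"es"` configuration, and all hyperparameters are assumptions for illustration only.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "dccuchile/bert-base-spanish-wwm-cased"  # assumed Spanish BERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=3)

# XNLI provides premise/hypothesis pairs labeled 0=entailment, 1=neutral, 2=contradiction.
xnli_es = load_dataset("xnli", "es")

def preprocess(batch):
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, max_length=128)

encoded = xnli_es.map(preprocess, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bert-base-spanish-wwm-cased-xnli",
        per_device_train_batch_size=32,  # illustrative hyperparameters
        num_train_epochs=3,
    ),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,
)
trainer.train()
```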

## How to use

You can use this model with Hugging Face's `zero-shot-classification` pipeline:

```python
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="Recognai/bert-base-spanish-wwm-cased-xnli",
)

classifier(
    "El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo",
    candidate_labels=["cultura", "sociedad", "economia", "salud", "deportes"],
    hypothesis_template="Este ejemplo es {}.",
)
"""output
{'sequence': 'El autor se perfila, a los 50 años de su muerte, como uno de los grandes de su siglo',
 'labels': ['cultura', 'sociedad', 'economia', 'salud', 'deportes'],
 'scores': [0.38897448778152466,
  0.22997373342514038,
  0.1658431738615036,
  0.1205764189362526,
  0.09463217109441757]}
"""
```

## Eval results

Accuracy for the test set:

|                                  | XNLI-es |
|----------------------------------|---------|
| bert-base-spanish-wwm-cased-xnli | 79.9%   |
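
The figure above can in principle be checked along the following lines. This is an untested sketch: it assumes the Hugging Face `xnli` dataset's `"es"` configuration and that the model's label ids match the dataset's (0=entailment, 1=neutral, 2=contradiction).

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "Recognai/bert-base-spanish-wwm-cased-xnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).eval()

test = load_dataset("xnli", "es", split="test")

correct = 0
for i in range(0, len(test), 32):
    batch = test[i : i + 32]
    inputs = tokenizer(batch["premise"], batch["hypothesis"],
                       padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        preds = model(**inputs).logits.argmax(dim=-1)
    correct += (preds == torch.tensor(batch["label"])).sum().item()

print(f"accuracy: {correct / len(test):.3f}")  # should be close to the reported 79.9%
```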