## 🚀 How to use

```python
from transformers import pipeline

# Initialize the question-answering pipeline
question_answerer = pipeline(
    "question-answering",
    model="guo1006/bert-finetuned-squad-accelerate"
)

# Context to answer questions from
context = """
🤗 Transformers is backed by the three most popular deep learning libraries — 
Jax, PyTorch and TensorFlow — with a seamless integration between them. 
It's straightforward to train your models with one before loading them 
for inference with the other.
"""

# Question to ask
question = "Which deep learning libraries back 🤗 Transformers?"

# Run inference
result = question_answerer(question=question, context=context)

# Print the result
print(result)
```
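The question-answering pipeline returns a dict with `score`, `start`, `end`, and `answer`, where `start` and `end` are character offsets into the context. A minimal sketch of working with those fields, using an illustrative result dict rather than actual model output:

```python
# Illustrative result shape (not actual output from this model); the
# QA pipeline returns a dict with these keys.
context = "Jax, PyTorch and TensorFlow back Transformers."
result = {
    "score": 0.97,   # model confidence (illustrative value)
    "start": 0,      # character offset where the answer begins
    "end": 27,       # character offset where the answer ends
    "answer": "Jax, PyTorch and TensorFlow",
}

# `start`/`end` index directly into the original context string,
# so the answer span can be recovered by slicing:
span = context[result["start"]:result["end"]]
print(span)  # Jax, PyTorch and TensorFlow
```

This is useful when you need the answer's position in the source text, for example to highlight it in a UI.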