🚀 How to use
```python
# Initialize the pipeline
from transformers import pipeline

question_answerer = pipeline(
    "question-answering",
    model="guo1006/bert-finetuned-squad-accelerate",
)

# Context passage to search for the answer
context = """
🤗 Transformers is backed by the three most popular deep learning libraries —
Jax, PyTorch and TensorFlow — with a seamless integration between them.
It's straightforward to train your models with one before loading them
for inference with the other.
"""

# Question to ask
question = "Which deep learning libraries back 🤗 Transformers?"

# Run inference
result = question_answerer(question=question, context=context)

# Print the result
print(result)
```
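The question-answering pipeline returns a dict with `score`, `start`, `end`, and `answer` keys, where `start` and `end` are character offsets into the context. A minimal sketch of working with that result shape (the dict below is constructed by hand for illustration, not actual model output):

```python
# Context passage, as in the usage example above.
context = ("🤗 Transformers is backed by the three most popular deep learning "
           "libraries — Jax, PyTorch and TensorFlow — with a seamless "
           "integration between them.")

# Hand-built example of the pipeline's output shape
# (score is illustrative; start/end are character offsets into context).
answer = "Jax, PyTorch and TensorFlow"
start = context.index(answer)
result = {
    "score": 0.98,  # illustrative confidence value
    "start": start,
    "end": start + len(answer),
    "answer": answer,
}

# The answer span can be recovered by slicing the context with start/end.
assert context[result["start"]:result["end"]] == result["answer"]
print(result["answer"])
```

This is useful when you only need the extracted span, or when you want to highlight the answer inside the original context using the offsets.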
Model tree for guo1006/bert-finetuned-squad-accelerate
- Base model: google-bert/bert-base-cased