This model has been pushed to the Hub using the PytorchModelHubMixin integration.

Library: pxia

How to load

```shell
pip install pxia
```

Use the `AutoModel` class:

```python
from pxia import AutoModel
model = AutoModel.from_pretrained("phxia/gpt2")
```

Or use the model class directly:

```python
from pxia import GPT2
model = GPT2.from_pretrained("phxia/gpt2")
```
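Both loading paths rely on the `save_pretrained`/`from_pretrained` methods that the `PytorchModelHubMixin` integration adds to the model class. The sketch below illustrates the general round-trip pattern only; the toy `TinyMixin` class, `ToyModel`, and local-directory storage are assumptions for illustration, not pxia's or huggingface_hub's actual implementation (which serializes weights, e.g. to safetensors, and talks to the Hub):

```python
import json
from pathlib import Path

class TinyMixin:
    """Toy stand-in for the save/load pattern hub mixins provide.

    Real mixins also serialize model weights and push/pull from the
    Hub; this sketch only round-trips the constructor config locally.
    """

    def save_pretrained(self, save_directory):
        path = Path(save_directory)
        path.mkdir(parents=True, exist_ok=True)
        # Persist the config so from_pretrained can rebuild the model.
        (path / "config.json").write_text(json.dumps(self.config))

    @classmethod
    def from_pretrained(cls, save_directory):
        # Read the config back and reconstruct the model from it.
        config = json.loads((Path(save_directory) / "config.json").read_text())
        return cls(**config)

class ToyModel(TinyMixin):
    def __init__(self, hidden_size=8):
        self.config = {"hidden_size": hidden_size}

model = ToyModel(hidden_size=16)
model.save_pretrained("toy_checkpoint")
reloaded = ToyModel.from_pretrained("toy_checkpoint")
print(reloaded.config["hidden_size"])  # → 16
```

The key design point is that the constructor arguments are captured as a config, so a checkpoint is fully self-describing: loading needs only the class and the saved directory.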

Contributions

Contributions are welcome at https://github.com/not-lain/pxia.

Model size: 176M parameters (F32, safetensors)