This model has been pushed to the Hub using the PyTorchModelHubMixin integration.

## Usage

Here's how to use the model for inference:

```python
from model.model import TwinLiteNetPlus

# Load the pretrained weights directly from the Hugging Face Hub
model = TwinLiteNetPlus.from_pretrained("nielsr/twinlitenetplus-nano")
```
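
Once loaded, inference is a standard PyTorch forward pass. The sketch below assumes the model accepts a batched RGB tensor of shape `(N, 3, H, W)` and returns segmentation outputs (e.g. drivable-area and lane heads); the exact input resolution, preprocessing, and output format should be checked against the TwinLiteNetPlus repository.

```python
import torch

model.eval()

# Dummy input: one RGB image at 384x640 (assumed resolution; verify against
# the TwinLiteNetPlus repository's preprocessing pipeline).
dummy_input = torch.rand(1, 3, 384, 640)

with torch.no_grad():
    outputs = model(dummy_input)

# Inspect the output shapes; the model is expected to return segmentation
# logits for its task heads.
if isinstance(outputs, (list, tuple)):
    print([o.shape for o in outputs])
else:
    print(outputs.shape)
```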