
mesolitica/nanot5-large-malaysian-cased

Text2Text Generation · Transformers · Safetensors · Malay · t5 · text-generation-inference

    Pretrain LARGE 512 masking context length T5 on Malaysian text

    README: https://github.com/mesolitica/malaya/tree/5.1/pretrained-model/nanoT5

    WandB: https://wandb.ai/huseinzol05/nanoT5-large?nw=nwuserhuseinzol05
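    The checkpoint can be loaded with the standard Hugging Face transformers API. A minimal sketch, assuming the usual T5 classes apply; the fill-in-the-blank Malay prompt below is only an illustration of T5 span-corruption sentinels, not an example from this card:

    ```python
    # Sketch: load the pretrained checkpoint and run masked-span infilling.
    # The model ID comes from this card; everything else is standard T5 usage.
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    model_id = "mesolitica/nanot5-large-malaysian-cased"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)

    # The model was pretrained with 512-token masked span corruption, so
    # <extra_id_0> marks a span for the model to fill in (hypothetical prompt).
    text = "Kuala Lumpur ialah <extra_id_0> Malaysia."
    inputs = tokenizer(text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```

    Note this is a pretrained (not instruction-tuned) checkpoint, so it is best used as a base for seq2seq fine-tuning rather than direct generation.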

    Downloads last month: 6
    Model size: 783M params (Safetensors)
    Tensor type: F32

    Collection including mesolitica/nanot5-large-malaysian-cased

    Malaysian Seq2Seq (8 items, updated Dec 23, 2024): trained on 17B tokens (81GB of cleaned text); understands standard Malay, local Malay, local Mandarin, Manglish, and local Tamil.