Mol ID

A transformer encoder model pretrained on 50M ZINC SMILES strings using FlashAttention 2.
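
A minimal sketch of the kind of FlashAttention-2 call such an encoder makes internally (via the flash-attn package); the shapes, dtype, and non-causal setting are illustrative assumptions, not the model's actual configuration:

```python
import torch
from flash_attn import flash_attn_func  # requires a supported GPU

# flash-attn expects (batch, seqlen, nheads, headdim) tensors in fp16/bf16 on CUDA.
batch, seqlen, nheads, headdim = 2, 128, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.bfloat16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Non-causal self-attention, as fits a bidirectional encoder.
out = flash_attn_func(q, k, v, causal=False)  # (batch, seqlen, nheads, headdim)
```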

Hardware:

  • a GPU that supports FlashAttention 2 and bf16

Software (wired together in the sketch after this list):

  • FlashAttention 2
  • Lightning for mixed-precision training (bf16-mixed)
  • Weights & Biases (wandb) for logging
  • Hugging Face
    • tokenizers
    • datasets
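
A minimal sketch of how these pieces fit together, assuming a masked-language-modelling objective; the model size, tokenizer file, project name, and hyperparameters are illustrative placeholders, and the attention here is plain PyTorch rather than the flash-attn kernel shown above:

```python
import lightning as L
import torch
from lightning.pytorch.loggers import WandbLogger
from tokenizers import Tokenizer  # huggingface tokenizers
from torch import nn


class SmilesEncoder(L.LightningModule):
    """Toy transformer encoder trained with masked-token prediction."""

    def __init__(self, vocab_size: int, d_model: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=6)
        self.head = nn.Linear(d_model, vocab_size)

    def training_step(self, batch, _):
        ids, labels = batch  # labels hold the masked-out token ids, -100 elsewhere
        logits = self.head(self.encoder(self.embed(ids)))
        loss = nn.functional.cross_entropy(
            logits.view(-1, logits.size(-1)), labels.view(-1), ignore_index=-100
        )
        self.log("train/loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-4)


tokenizer = Tokenizer.from_file("smiles_tokenizer.json")  # hypothetical file
model = SmilesEncoder(vocab_size=tokenizer.get_vocab_size())

trainer = L.Trainer(
    precision="bf16-mixed",                # Lightning mixed precision, as listed
    logger=WandbLogger(project="mol-id"),  # hypothetical project name
    max_steps=100_000,
)
# trainer.fit(model, train_dataloaders=...)  # dataloader built from `datasets` omitted
```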

GitHub repo: link
