zenz-v1 Checkpoints

zenz-v1 is a GPT-2-based language model specialized for kana-kanji conversion. It is intended for use in the neural kana-kanji conversion system "Zenzai."

This repository publishes the checkpoints for zenz-v1.

  • 90M parameters
  • Character-level + byte-level BPE tokenizer
  • Strong kana-kanji conversion performance with greedy decoding alone
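"Greedy decoding" here means taking the single highest-scoring token at every generation step, with no sampling or beam search. A minimal sketch of the idea, using toy scores rather than the real model:

```python
def greedy_decode(step_logits):
    """Greedy decoding: at each step, pick the index of the
    highest-scoring token (argmax). No sampling, no beam search."""
    return [max(range(len(logits)), key=logits.__getitem__)
            for logits in step_logits]

# Dummy per-step scores over a 4-token vocabulary (illustration only).
steps = [
    [0.1, 2.3, 0.5, 0.0],
    [1.7, 0.2, 0.4, 0.9],
    [0.0, 0.1, 3.2, 0.3],
]
print(greedy_decode(steps))  # → [1, 0, 2]
```

Because the model performs well under this cheapest decoding strategy, it can run with low latency inside an input method.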

Model Details

Model Description

The base model is ku-nlp/gpt2-small-japanese-char, provided under CC-BY-SA 4.0.

This model is likewise provided under CC-BY-SA 4.0.
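A character-level tokenizer with a byte-level fallback assigns each in-vocabulary character its own token and splits anything else into UTF-8 byte tokens, so no input is ever out of vocabulary. A hypothetical sketch of that behavior (the real tokenizer is the one shipped with the base model, and its vocabulary and byte-token spelling will differ):

```python
def tokenize(text, vocab):
    """Character-level tokenization with a byte-level fallback:
    known characters become single tokens; unknown characters are
    split into one token per UTF-8 byte. Illustration only."""
    tokens = []
    for ch in text:
        if ch in vocab:
            tokens.append(ch)
        else:
            # Fall back to UTF-8 bytes, one token per byte.
            tokens.extend(f"<0x{b:02X}>" for b in ch.encode("utf-8"))
    return tokens

vocab = {"か", "な", "変", "換"}
print(tokenize("かな変換🎌", vocab))
# → ['か', 'な', '変', '換', '<0xF0>', '<0x9F>', '<0x8E>', '<0x8C>']
```

The fallback matters for kana-kanji conversion, where user text can contain rare symbols that a fixed character vocabulary would otherwise drop.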

Model Sources

This model is intended for use with Zenzai (AzooKeyKanaKanjiConverter).

Acknowledgements

The following libraries, tools, and language resources were used to build this model.
