---
library_name: transformers
tags: []
---

# gemma2-mitra-base

This model is based on gemma2-9b and continuously pretrained for 2 epochs on a total of 7B tokens drawn from various Buddhist data collections preserved in Sanskrit, Tibetan, English, and Pāli.  
A publication describing the dataset and training details will follow soon.

## Model Details
For details on how to run this model, please see the gemma2-9b repository: https://huggingface.co/google/gemma-2-9b  
Please be aware that this is a base model without any instruction finetuning, so it will perform poorly on general tasks unless you provide at least a few-shot prompt with worked examples.  
There is an instruction-finetuned version here: https://huggingface.co/buddhist-nlp/gemma-2-mitra-it
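A minimal usage sketch with the `transformers` library, following the standard Gemma 2 loading pattern. The repository id `buddhist-nlp/gemma2-mitra-base` and the `build_few_shot_prompt` helper are assumptions for illustration; check the model page for the exact id and adapt the prompt format to your task.

```python
# Sketch: few-shot prompting the base model with transformers.
# The repo id below is an assumption; verify it on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer


def build_few_shot_prompt(examples, query):
    """Concatenate input/output demonstrations, ending with the open query.

    A base model has no chat template, so we rely on plain-text pattern
    completion: the model continues the final "Output:" line.
    """
    blocks = [f"Input: {inp}\nOutput: {out}" for inp, out in examples]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)


run_generation = False  # set True to actually download and run the ~9B model
if run_generation:
    model_id = "buddhist-nlp/gemma2-mitra-base"  # assumed repository id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_few_shot_prompt(
        [("dharma", "teaching"), ("karuṇā", "compassion")],
        "mettā",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=32)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

The few-shot blocks give the base model a pattern to continue; without them it tends to ramble rather than answer.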