---
library_name: moeob
license: mit
pipeline_tag: text-generation
tags:
- byte-level
- experimental
- mixture-of-experts
- model_hub_mixin
- pytorch_model_hub_mixin
- summary-then-generate
---

This model has been pushed to the Hub using the [PyTorchModelHubMixin](https://huggingface.co/docs/huggingface_hub/package_reference/mixins#huggingface_hub.PyTorchModelHubMixin) integration:
- Code: https://github.com/enosislabs/moeob
- Paper: [More Information Needed]
- Docs: [More Information Needed]
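
Below is a minimal sketch of how a `PyTorchModelHubMixin`-based model is typically defined and loaded with `from_pretrained`. The class name, constructor arguments, and repository id are placeholders for illustration, not the actual moeob implementation; see the linked code repository for the real model definition.

```python
import torch
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin


# Hypothetical model class illustrating the mixin pattern; the real class
# name and architecture live in the moeob repository.
class MoEOBModel(nn.Module, PyTorchModelHubMixin):
    def __init__(self, vocab_size: int = 256, hidden_size: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lm_head = nn.Linear(hidden_size, vocab_size)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # Byte-level token ids in, next-byte logits out.
        return self.lm_head(self.embed(input_ids))


# Loading the checkpoint from the Hub; replace the placeholder with this
# model's actual repository id.
# model = MoEOBModel.from_pretrained("<user>/<repo>")
```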