Update README.md
README.md
CHANGED
@@ -16,7 +16,7 @@ pipeline_tag: question-answering
 
 ## Model Details
 
-This model is a mixture of experts (MoE) using the [RhuiDih/moetify](https://github.com/RhuiDih/moetify) library
+This model is a mixture of experts (MoE) built from various task-specific experts using the [RhuiDih/moetify](https://github.com/RhuiDih/moetify) library. All relevant expert models, LoRA adapters, and datasets are available in the [Moecule Ingredients](https://huggingface.co/collections/davzoku/moecule-ingredients-67dac0e6210eb1d95abc6411) collection.
 
 ## Key Features
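For readers new to MoE models: a learned router sends each token to a small subset of experts, so only those experts' parameters are active per token (hence the "MOE active parameters" log line quoted in the next hunk). Below is a minimal, generic sketch of top-k routing in PyTorch; it is purely illustrative and not moetify's actual implementation.

```python
# Generic illustration of token-level top-k MoE routing (not moetify's code).
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    def __init__(self, dim=16, n_experts=4, k=2):
        super().__init__()
        self.router = nn.Linear(dim, n_experts)
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_experts))
        self.k = k

    def forward(self, x):                           # x: (tokens, dim)
        scores = self.router(x)                     # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # pick top-k experts per token
        weights = weights.softmax(dim=-1)           # normalize mixing weights
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e            # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

print(TinyMoE()(torch.randn(3, 16)).shape)  # torch.Size([3, 16])
```

Real implementations batch the per-expert dispatch instead of looping, but the routing logic is the same.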
@@ -62,6 +62,8 @@ INFO:root:MOE active parameters: 5985438720
 
 ## Inference
 
+To run inference with this model, you can use the following code snippet:
+
 ```python
 # git clone moetify fork that fixes dependency issue
 !git clone -b fix-transformers-4.47.1-FlashA2-dependency --single-branch https://github.com/davzoku/moetify.git
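The diff view collapses the body of the snippet here; it resumes at `print(generated_text)` in the next hunk. As a rough sketch of what the elided load-and-generate steps typically look like with the standard `transformers` API (the repo id, prompt, and generation settings below are placeholders, not taken from the README):

```python
# Hypothetical sketch of the collapsed portion of the snippet; the model repo
# id, prompt, and generation settings are placeholders, not from the README.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "davzoku/moecule-xxx"  # placeholder: use this model's actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # moetify checkpoints may ship custom modeling code
)

prompt = "What is a mixture of experts model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```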
@@ -112,4 +114,3 @@ print(generated_text)
 
 - [Flexible and Effective Mixing of Large Language Models into a Mixture of Domain Experts](https://arxiv.org/abs/2408.17280v2)
 - [RhuiDih/moetify](https://github.com/RhuiDih/moetify)
-