---
license: mit
library_name: transformers
base_model:
- deepseek-ai/DeepSeek-V3-0324
- deepseek-ai/DeepSeek-R1
pipeline_tag: text-generation
---
|
# DeepSeek-R1T-Chimera
|
|
|
<div align="center">
<img src="https://www.tngtech.com/_astro/TNG_Logo.URm66zYr_Z2aCrIU.svg"
alt="TNG Logo"
width="400"
style="display: inline-block; vertical-align: middle;"/>
</div>

<br>

<div align="center">
<a href="LICENSE" style="margin: 2px;">
<img alt="License" src="https://img.shields.io/badge/License-MIT-f5de53?&color=f5de53" style="display: inline-block; vertical-align: middle;"/>
</a>
</div>
|
|
|
**Model merge of DeepSeek-R1 and DeepSeek-V3 (0324)**
|
|
|
An open-weights model combining the intelligence of R1 with the token efficiency of V3.
|
|
|
## Model Details
|
|
|
- **Architecture**: DeepSeek-MoE Transformer-based language model
- **Combination Method**: Merged model weights from DeepSeek-R1 and DeepSeek-V3 (0324)
- **Release Date**: 2025-04-27
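The combination method above can be pictured as a parameter-wise merge of the two parent checkpoints. The snippet below is a minimal, hypothetical sketch using plain Python lists in place of real weight tensors; the actual R1T-Chimera merge recipe (which layers are taken from which parent, and at what ratios) is not specified in this card, so a simple linear interpolation is shown purely for illustration.

```python
def merge_state_dicts(sd_a, sd_b, alpha=0.5):
    """Parameter-wise linear interpolation of two state dicts.

    Illustrative only: the real R1T-Chimera merge recipe is not
    documented here. Both dicts must share parameter names/shapes.
    """
    assert sd_a.keys() == sd_b.keys(), "parents must share parameter names"
    return {
        name: [alpha * a + (1.0 - alpha) * b
               for a, b in zip(sd_a[name], sd_b[name])]
        for name in sd_a
    }

# Toy stand-ins for the R1 and V3-0324 checkpoints
r1_weights = {"layer.weight": [1.0, 1.0]}
v3_weights = {"layer.weight": [0.0, 0.5]}

merged = merge_state_dicts(r1_weights, v3_weights, alpha=0.5)
# merged["layer.weight"] == [0.5, 0.75]
```

In practice a merge like this would operate on `torch` tensors loaded from both checkpoints, and MoE models often merge expert and non-expert parameters differently.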
|
|
|
|
|
## Contact
|
|
|
- Email: [email protected]