---
license: cc-by-4.0
datasets:
- NingLab/ECInstruct
---
# eCeLLM-M
This repo contains the eCeLLM-M model from the paper "eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data".
## eCeLLM Models
Leveraging ECInstruct, we develop eCeLLM by instruction tuning general-purpose LLMs (base models).
The eCeLLM-M model is instruction-tuned from the large base model [Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2).
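
Since eCeLLM-M is tuned from a standard causal LM, it should load with the Hugging Face `transformers` API. Below is a minimal sketch; the repo id `NingLab/eCeLLM-M` and the example prompt are assumptions, not confirmed by this card, so adjust them to the actual repo and instruction format.

```python
# Minimal sketch of loading eCeLLM-M with transformers.
# NOTE: the repo id and prompt below are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NingLab/eCeLLM-M"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example e-commerce instruction (illustrative only).
prompt = "Classify the sentiment of this product review: 'Great battery life!'"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```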
## Citation
```bibtex
@inproceedings{
peng2024ecellm,
title={eCe{LLM}: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
booktitle={Forty-first International Conference on Machine Learning},
year={2024},
url={https://openreview.net/forum?id=LWRI4uPG2X}
}
```