AI & ML interests
None defined yet.
CriticLean
Data and model collection for MARBLE: https://github.com/a43992899/MARBLE/
This is the collection of COIG-P's datasets.
m-a-p/COIG-P
Viewer • Updated • 1.01M • 251 • 24
m-a-p/COIG-P-CRM
Viewer • Updated • 484k • 35 • 4
m-a-p/COIG-CRBench
Viewer • Updated • 1.04k • 23 • 2
COIG-P: A High-Quality and Large-Scale Chinese Preference Dataset for Alignment with Human Values
Paper • 2504.05535 • Published • 45
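A minimal sketch for pulling the COIG-P data listed above with the Hugging Face `datasets` library; only the dataset id comes from this listing, and the split and column names are whatever the dataset card defines, so inspect the loaded object before relying on it.

```python
from datasets import load_dataset

# Download the preference data from the Hub (dataset id taken from the listing above).
coig_p = load_dataset("m-a-p/COIG-P")
print(coig_p)  # shows the available splits, features, and row counts

# Peek at one record from the first available split.
first_split = next(iter(coig_p.values()))
print(first_split[0])
```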
MuPT
m-a-p/OpenCodeInterpreter-DS-1.3B
Text Generation • 1B • Updated • 89 • 25
m-a-p/OpenCodeInterpreter-DS-6.7B
Text Generation • 7B • Updated • 2.2k • 136
m-a-p/OpenCodeInterpreter-DS-33B
Text Generation • Updated • 102 • 149
m-a-p/OpenCodeInterpreter-CL-7B
Text Generation • Updated • 2.05k • 11
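A minimal sketch for the OpenCodeInterpreter checkpoints listed above, assuming they load as standard causal language models through `transformers`; the prompt below is plain text rather than the model's chat template, so treat the output as illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "m-a-p/OpenCodeInterpreter-DS-1.3B"  # smallest checkpoint in the listing above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```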
MERT: Acoustic Music Understanding Model with Large-Scale Self-supervised Training
Paper • 2306.00107 • Published • 4
MusiLingo: Bridging Music and Text with Pre-trained Language Models for Music Captioning and Query Response
Paper • 2309.08730 • Published • 2
ChatMusician: Understanding and Generating Music Intrinsically with LLM
Paper • 2402.16153 • Published • 61
CMMMU: A Chinese Massive Multi-discipline Multimodal Understanding Benchmark
Paper • 2401.11944 • Published • 28
These are the checkpoints and datasets of MusiLingo: Bridging Music and Text with Pre-trained Language Models for Music Captioning and Query Response.
TreePO: Bridging the Gap of Policy Optimization and Efficacy and Inference Efficiency with Heuristic Tree-based Modeling
Paper • 2508.17445 • Published • 78
m-a-p/TreePO-Qwen2.5-7B
8B • Updated • 14 • 2
m-a-p/TreePO_data
Viewer • Updated • 49.3k • 110
m-a-p/TreePO-Qwen2.5-7B_fixed-div
8B • Updated • 16
All 1.3B & 340M hybrid linear-attention experiments.
This is the collection of COIG-P's models.
m-a-p/Infinity-Instruct-3M-0625-Llama3-8B-COIG-P
Text Generation • 8B • Updated • 67
m-a-p/Qwen2.5-Instruct-7B-COIG-P
Text Generation • 8B • Updated • 74
m-a-p/Infinity-Instruct-3M-0625-Mistral-7B-COIG-P
Text Generation • 7B • Updated • 69
m-a-p/Qwen2-Instruct-7B-COIG-P
Text Generation • 8B • Updated • 63
YuE: Open Full-song Generation Foundation Model
m-a-p/YuE-s1-7B-anneal-en-cot
Text Generation • 6B • Updated • 7.54k • 423
m-a-p/YuE-s1-7B-anneal-en-icl
Text Generation • 6B • Updated • 7.15k • 48
m-a-p/YuE-s1-7B-anneal-jp-kr-cot
Text Generation • 6B • Updated • 1.25k • 21
m-a-p/YuE-s1-7B-anneal-jp-kr-icl
Text Generation • 6B • Updated • 1.21k • 11
Checkpoints for MERT: Acoustic Music Understanding Model with Large-Scale Self-supervised Training (a loading sketch follows below).
Datasets and models of the Chinese Open Instruction Generalist series.
Datasets, benchmarks, and models of ChatMusician: Understanding and Generating Music Intrinsically with LLM.
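For the MERT checkpoints mentioned above, a minimal loading sketch. The repo id used here (`m-a-p/MERT-v1-95M`) is an assumed checkpoint name, not one taken from this listing, so check the organization page for the checkpoint you want; MERT ships custom modeling code, hence `trust_remote_code=True`.

```python
import torch
from transformers import AutoModel, Wav2Vec2FeatureExtractor

repo_id = "m-a-p/MERT-v1-95M"  # assumed checkpoint name; verify on the Hub
model = AutoModel.from_pretrained(repo_id, trust_remote_code=True)
processor = Wav2Vec2FeatureExtractor.from_pretrained(repo_id, trust_remote_code=True)

# Dummy 5-second mono clip at the processor's expected sampling rate,
# just to show the call shape; replace with real audio.
sr = processor.sampling_rate
audio = torch.zeros(5 * sr)
inputs = processor(audio.numpy(), sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # frame-level music representations
print(hidden.shape)
```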
Neo