Aurora-M: The First Open Source Multilingual Language Model Red-teamed according to the U.S. Executive Order
Paper • 2404.00399

Models:
- aurora-m/aurora-m-base • Text Generation
- aurora-m/aurora-m-biden-harris-redteamed • Text Generation
- aurora-m/aurora-m-instruct • Text Generation

Aurora-M
AI & ML interests
We present a set of multilingual red-teamed models, trained on the LUMI HPC system in Finland (hence the name Aurora). See our paper: https://arxiv.org/abs/2404.00399
We are a group of volunteer researchers focused on equal access to multilingual AI. We present a set of red-teamed models, trained on the LUMI HPC system in Finland (hence the name Aurora). The "-m" designation stands for multimodal, multilingual, multidomain mixture-of-experts (MoE) models, each of which we intend to research. As part of Ontocord.AI's dedication to lawful open-science AI, Ontocord coordinated this effort with the volunteers and contributed to the safety measures. This work should NOT be confused with AuroraGPT: https://www.hpcwire.com/2023/11/13/training-of-1-trillion-parameter-scientific-ai-begins/.