Revela-135M is a compact, 135M-parameter variant of the Revela dense retriever.
It targets resource-constrained environments while maintaining strong general-domain retrieval quality.
It was trained on the same 320K Wikipedia batches as the larger variants, using the in-batch attention objective.
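To use the retriever outside MTEB, you can attach the Revela adapter to its base model and pool one embedding per text. The snippet below is a minimal sketch, not a confirmed recipe: the `query: `/`passage: ` prefixes and last-token pooling follow the RepLLaMA convention that the MTEB wrapper further down relies on, and are assumptions rather than details stated on this card.

```python
# Minimal retrieval sketch (assumptions: RepLLaMA-style "query: "/"passage: "
# prefixes and last-token pooling; neither is confirmed by this card).
import torch
from peft import PeftModel
from transformers import AutoModel, AutoTokenizer

base = "HuggingFaceTB/SmolLM2-135M"
tokenizer = AutoTokenizer.from_pretrained(base)
backbone = AutoModel.from_pretrained(base, torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(backbone, "trumancai/Revela-135M")
model.eval()

@torch.no_grad()
def embed(text: str) -> torch.Tensor:
    # Append EOS and use its hidden state as the sequence embedding.
    inputs = tokenizer(text + tokenizer.eos_token, return_tensors="pt")
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    return torch.nn.functional.normalize(hidden[0, -1], dim=-1)

q = embed("query: what causes the seasons on Earth?")
d = embed("passage: The tilt of Earth's rotational axis causes the seasons.")
print(f"similarity: {(q @ d).item():.4f}")
```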
The Revela release includes the following repositories:

| Repository | Description |
|---|---|
| `trumancai/Revela-1b` | 1B-parameter variant (LLaMA-3.2-1B backbone). |
| `trumancai/Revela-500M` | 500M-parameter variant (Qwen2.5-0.5B backbone). |
| `trumancai/Revela-135M` | 135M-parameter variant (SmolLM2-135M backbone); this repo. |
| `trumancai/Revela-code-1b` | 1B-parameter code retriever. |
| `trumancai/Revela-code-500M` | 500M-parameter code retriever. |
| `trumancai/Revela-code-135M` | 135M-parameter code retriever. |
| `trumancai/revela_training_corpus` | Wikipedia training corpus. |
| `trumancai/revela_code_training_corpus` | Code training corpus. |
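Both corpora are standard Hugging Face datasets and can be inspected directly; the sketch below assumes a `train` split (the split name and schema are assumptions, so check the dataset page if it errors):

```python
# A quick look at the Wikipedia training data
# (the "train" split name is an assumption).
from datasets import load_dataset

corpus = load_dataset("trumancai/revela_training_corpus", split="train")
print(len(corpus), "examples")
print(corpus[0])  # inspect the schema of one example
```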
The checkpoint is evaluated through `mteb`, using the RepLLaMA wrapper to attach the PEFT adapter to the SmolLM2-135M backbone:

```python
import torch
import mteb
from mteb.model_meta import ModelMeta
from mteb.models.repllama_models import RepLLaMAWrapper, _loader

# Wrap the SmolLM2-135M backbone with the Revela-135M adapter.
revela_smol_135m = ModelMeta(
    loader=_loader(
        RepLLaMAWrapper,
        base_model_name_or_path="HuggingFaceTB/SmolLM2-135M",
        peft_model_name_or_path="trumancai/Revela-135M",
        device_map="auto",
        torch_dtype=torch.bfloat16,
    ),
    name="trumancai/Revela-135M",
    languages=["eng_Latn"],
    open_source=True,
    revision="c84848e2708dee28e9a58edaed78867537b489e3",
    release_date="2024-04-13",
)

# Instantiate the model and run two BEIR-style retrieval tasks.
model = revela_smol_135m.loader()
mteb.MTEB(tasks=["SciFact", "NFCorpus"]).run(
    model=model, output_folder="results/Revela-135M"
)
```
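MTEB writes per-task JSON result files under the given `output_folder`, so scores for this 135M checkpoint can be compared directly against runs of the 500M and 1B variants.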
Base model: HuggingFaceTB/SmolLM2-135M