Model Summary

Revela is a self-supervised bi-encoder retrieval model trained on raw text with an in-batch attention mechanism. This version, Revela-1b, was trained on a corpus of 320K batches of 16 passages each, built by chunking English Wikipedia. See the paper for more details.
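
For intuition, here is a minimal sketch of that batch construction, in the spirit of the released trumancai/revela_training_corpus dataset. The exact chunking and grouping heuristics below are assumptions, not the authors' pipeline.

from typing import Iterable

MAX_WORDS = 120   # chunk-length cap used for the Wikipedia corpus
BATCH_SIZE = 16   # passages per training batch

def chunk_article(text: str, max_words: int = MAX_WORDS) -> list[str]:
    """Split one article into consecutive chunks of at most max_words words."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def build_batches(articles: Iterable[str], batch_size: int = BATCH_SIZE) -> list[list[str]]:
    """Group chunks into fixed-size batches. Neighbouring chunks from the same
    article tend to share a batch, which (as we understand the paper) is the
    co-occurrence signal the in-batch objective learns from."""
    batches, current = [], []
    for article in articles:
        for chunk in chunk_article(article):
            current.append(chunk)
            if len(current) == batch_size:
                batches.append(current)
                current = []
    return batches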

Other Links

Binary | Description
trumancai/Revela-500M | A Revela dense-retriever bi-encoder based on Qwen2.5-0.5B (500M parameters), trained on the same Wikipedia corpus.
trumancai/Revela-135M | A compact Revela retriever built on SmolLM2-135M (135M parameters), self-supervised on Wikipedia for efficient general-domain retrieval.
trumancai/Revela-code-1b | A Revela code retriever built on Llama-3.2-1B (1B parameters), self-supervised on the Revela code training corpus (Stack Overflow posts, tutorials, and library documentation) for code-search tasks.
trumancai/Revela-code-500M | A Revela code retriever based on Qwen2.5-0.5B (500M parameters), trained on the same code corpus for software-domain retrieval.
trumancai/Revela-code-135M | A lightweight Revela code retriever using SmolLM2-135M (135M parameters), self-supervised on the code corpus for resource-constrained code-search scenarios.
trumancai/revela_training_corpus | Wikipedia training corpus: English Wikipedia passages segmented into ≤ 120-word chunks and grouped into batches of 16 (320K batches) for general-domain Revela training.
trumancai/revela_code_training_corpus | Code training corpus: code-centric chunks (358,763 batches) drawn from Stack Overflow posts, online tutorials, and library documentation, batched identically for the Revela code retrievers.
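
Each model above is published as a PEFT adapter on its base model, so it can also be loaded directly with transformers and peft. Below is a minimal sketch for Revela-1b; the last-token pooling and the absence of query/passage prompts are assumptions on our part, not details confirmed by this card.

import torch
from peft import PeftModel
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-1B")
tokenizer.pad_token = tokenizer.eos_token  # Llama tokenizers ship without a pad token
base = AutoModel.from_pretrained("meta-llama/Llama-3.2-1B", torch_dtype=torch.bfloat16)
model = PeftModel.from_pretrained(base, "trumancai/Revela-1b")
model.eval()

def embed(texts: list[str]) -> torch.Tensor:
    # Pool the final hidden state at each sequence's last real token
    # (RepLLaMA-style pooling; an assumption here).
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    last = inputs["attention_mask"].sum(dim=1) - 1
    vecs = hidden[torch.arange(hidden.size(0)), last]
    return torch.nn.functional.normalize(vecs.float(), dim=-1)

# Rank two passages against a query by cosine similarity.
scores = embed(["What is dense retrieval?"]) @ embed([
    "Dense retrieval encodes queries and documents as vectors.",
    "The Eiffel Tower is in Paris.",
]).T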

Usage

We can evaluate the trained models with a customized version of mteb:

import torch

import mteb
from mteb.model_meta import ModelMeta
from mteb.models.repllama_models import RepLLaMAWrapper, _loader

# Register Revela-1b as a RepLLaMA-style bi-encoder: the PEFT adapter is
# applied on top of the Llama-3.2-1B base model.
revela_llama_1b = ModelMeta(
    loader=_loader(
        RepLLaMAWrapper,
        base_model_name_or_path="meta-llama/Llama-3.2-1B",
        peft_model_name_or_path="trumancai/Revela-1b",
        device_map="auto",
        torch_dtype=torch.bfloat16,
    ),
    name="trumancai/Revela-1b",
    languages=["eng_Latn"],
    open_source=True,
    revision="41a2bd8968d2640e1e386861776c48bdaac1306a",  # base-peft revision
    release_date="2025-04-13",
)
revela_llama_1b_model = revela_llama_1b.loader()

# Run retrieval evaluation on two BEIR tasks.
evaluation = mteb.MTEB(tasks=["SciFact", "NFCorpus"])
evaluation.run(model=revela_llama_1b_model, output_folder="results/Revela-1b")
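
Each task's scores are written as JSON files under the given output_folder (results/Revela-1b here).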
