Model Summary

Revela-code-3b is a self-supervised code retriever built on the 3B-parameter Llama-3.2-3B backbone.
It was trained on 358K batches of code-centric text (Stack Overflow, tutorials, API docs) with the Revela objective, which combines next-token prediction with in-batch attention.
Use it for code search, question-answer navigation, or hybrid doc/code retrieval.
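
The official loading path goes through mteb (see Usage below). For plain code search outside mteb, the following is a minimal sketch that loads the adapter with transformers and peft and ranks snippets by cosine similarity. The last-token (EOS) pooling mirrors the RepLLaMA-style wrapper used below, while the "query: " / "passage: " prefixes and the 512-token limit are illustrative assumptions rather than values documented for this model.

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B")
tokenizer.pad_token = tokenizer.eos_token       # Llama tokenizers ship without a pad token
tokenizer.padding_side = "right"                # keeps the last real token easy to locate

base = AutoModel.from_pretrained(
    "meta-llama/Llama-3.2-3B", torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, "trumancai/Revela-code-3b")
model.eval()

def embed(texts):
    # Append EOS, run the model, and pool the hidden state of the final non-pad token.
    batch = tokenizer(
        [t + tokenizer.eos_token for t in texts],
        padding=True, truncation=True, max_length=512, return_tensors="pt",
    ).to(base.device)
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state                    # (batch, seq, dim)
    last = batch["attention_mask"].sum(dim=1).to(hidden.device) - 1  # index of last real token
    vecs = hidden[torch.arange(hidden.size(0), device=hidden.device), last]
    return F.normalize(vecs.float(), dim=-1)

query_vecs = embed(["query: reverse a singly linked list in Python"])
code_vecs = embed([
    "passage: def reverse(head):\n    prev = None\n    ...",
    "passage: def quicksort(xs):\n    ...",
])
print(query_vecs @ code_vecs.T)   # cosine similarities; higher score = better match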

Other Links

Binary                                    Description
trumancai/Revela-code-3b                  3B-parameter code retriever
trumancai/Revela-code-1b                  1B-parameter code retriever
trumancai/Revela-code-500M                500M-parameter code retriever
trumancai/Revela-3b                       3B-parameter Wikipedia retriever
trumancai/Revela-1b                       1B-parameter Wikipedia retriever
trumancai/Revela-500M                     500M-parameter Wikipedia retriever
trumancai/revela_code_training_corpus    Code training corpus
trumancai/revela_training_corpus         Wikipedia training corpus

Usage

from mteb.model_meta import ModelMeta
from mteb.models.repllama_models import RepLLaMAWrapper, _loader
import mteb
import torch

# Register the Revela-code-3b PEFT adapter on top of Llama-3.2-3B via mteb's RepLLaMA loader.
revela_llama_code_3b = ModelMeta(
    loader=_loader(
        RepLLaMAWrapper,
        base_model_name_or_path="meta-llama/Llama-3.2-3B",
        peft_model_name_or_path="trumancai/Revela-code-3b",
        device_map="auto",
        torch_dtype=torch.bfloat16,
    ),
    name="trumancai/Revela-code-3b",
    languages=["eng_Latn"],
    open_source=True,
    revision="974f4d8e7ff5d5439cc1863088948249f612c284",
    release_date="2025-10-07",
)

model = revela_llama_code_3b.loader()  # instantiate the wrapped retriever

evaluation = mteb.MTEB(tasks=["AppsRetrieval"])
evaluation.run(model=model, output_folder="results/Revela-code-3b")
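
Running the snippet evaluates the retriever on the AppsRetrieval code-retrieval task and writes per-task JSON score files under results/Revela-code-3b; other mteb retrieval tasks can be passed in the same tasks list.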

License

Citation
