Monad Mistral 7B - Together AI Compatible

This is the safetensors version of the Monad blockchain-aligned Mistral 7B model, optimized for deployment on Together AI.

Model Details

  • Base Model: mistralai/Mistral-7B-Instruct-v0.3
  • Fine-tuning: LoRA adaptation merged into base model
  • Format: Safetensors (Together AI compatible)
  • Parameters: 7B
  • Context Length: 32768 tokens

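For local inspection of the merged safetensors checkpoint (as opposed to a hosted Together AI endpoint), a minimal loading sketch with Hugging Face `transformers` might look like the following. The model id `NoCritics/monad-mistral-7b-togetherai` is taken from this repository; the dtype matches the F16 tensors listed above.

```python
def load_model(model_id: str = "NoCritics/monad-mistral-7b-togetherai"):
    """Load the merged safetensors checkpoint with transformers.

    Sketch only: assumes `transformers` and `torch` are installed and
    that there is enough disk/VRAM for a 7B model in F16 (~14 GB).
    """
    # Imports live inside the function so this sketch can be defined
    # without transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # matches the F16 safetensors shards
        device_map="auto",          # place layers on available devices
    )
    return tokenizer, model
```

Calling `load_model()` downloads the full checkpoint, so the function is shown here only as a sketch of the loading path.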
Together AI Deployment

This model is specifically formatted for Together AI deployment:

# Uses the module-level Complete API from pre-1.0 releases of the
# `together` SDK; the 1.x SDK instead uses `from together import Together`.
import together

together.api_key = "YOUR_API_KEY"  # placeholder: your Together AI API key

response = together.Complete.create(
    model="YOUR_MODEL_ENDPOINT",  # placeholder: your deployed model endpoint
    prompt="[INST] What is Monad blockchain? [/INST]",
    max_tokens=512,
    temperature=0.7,
)

Specialization

The model has been fine-tuned on Monad blockchain-specific content including:

  • Monad architecture and parallel execution
  • Ecosystem knowledge
  • Technical documentation
  • Smart contract patterns for Monad

Prompt Format

Use the Mistral instruction format:

[INST] Your question here [/INST]

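The bracketed format above can be produced with a small helper. This is a sketch: the system-prompt convention below (prepending optional system text to the user turn, since Mistral-Instruct has no dedicated system role) is an assumption, not part of this card.

```python
def mistral_prompt(user_message: str, system: str = "") -> str:
    """Wrap a user message in the Mistral [INST] ... [/INST] format.

    Prepending the optional system text to the user turn is a common
    convention, assumed here; Mistral-Instruct defines no system role.
    """
    body = f"{system}\n\n{user_message}" if system else user_message
    return f"[INST] {body} [/INST]"


print(mistral_prompt("What is Monad blockchain?"))
# [INST] What is Monad blockchain? [/INST]
```
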
License

Apache 2.0


Model tree for NoCritics/monad-mistral-7b-togetherai
