Monad Mistral 7B - Together AI Compatible
This is the safetensors version of the Monad blockchain-aligned Mistral 7B model, optimized for deployment on Together AI.
Model Details
- Base Model: mistralai/Mistral-7B-Instruct-v0.3
- Fine-tuning: LoRA adapter merged into the base model
- Format: Safetensors (Together AI compatible; see the local-loading sketch below)
- Parameters: 7B
- Context Length: 32768 tokens
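The same safetensors weights can also be loaded locally with Hugging Face transformers, independent of Together AI. The snippet below is a minimal sketch: the repository ID NoCritics/monad-mistral-7b-togetherai comes from this card, while the bf16 dtype and device settings are assumptions you may need to adjust for your hardware.

# Minimal local-loading sketch (assumes a GPU that fits the 7B weights in bf16)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NoCritics/monad-mistral-7b-togetherai"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: adjust dtype/device for your hardware
    device_map="auto",
)

prompt = "[INST] What is Monad blockchain? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))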
Together AI Deployment
This model is formatted specifically for Together AI deployment. With the legacy Together Python SDK (pre-1.0), a completion call looks like this:
import together

together.api_key = "YOUR_API_KEY"

response = together.Complete.create(
    model="YOUR_MODEL_ENDPOINT",
    prompt="[INST] What is Monad blockchain? [/INST]",
    max_tokens=512,
    temperature=0.7,
)
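On the 1.x Together SDK, the module-level together.Complete interface is replaced by a client object. A rough equivalent, with the endpoint name still a placeholder, is sketched below.

# Sketch of the same request with the 1.x Together SDK (pip install "together>=1.0")
from together import Together

client = Together(api_key="YOUR_API_KEY")
response = client.completions.create(
    model="YOUR_MODEL_ENDPOINT",  # placeholder: your dedicated endpoint name
    prompt="[INST] What is Monad blockchain? [/INST]",
    max_tokens=512,
    temperature=0.7,
)
print(response.choices[0].text)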
Specialization
The model has been fine-tuned on Monad blockchain-specific content, including:
- Monad architecture and parallel execution
- Ecosystem knowledge
- Technical documentation
- Smart contract patterns for Monad
Prompt Format
Use the Mistral instruction format:
[INST] Your question here [/INST]
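To avoid hand-formatting mistakes, and to handle multi-turn conversations, you can also let the tokenizer build the prompt. This is a sketch that assumes the tokenizer in this repository keeps the standard Mistral-Instruct chat template.

# Sketch: build the [INST] ... [/INST] prompt via the tokenizer's chat template
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("NoCritics/monad-mistral-7b-togetherai")
messages = [{"role": "user", "content": "What is Monad blockchain?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)  # e.g. "<s>[INST] What is Monad blockchain? [/INST]"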
Other Formats
- GGUF Version: Available at NoCritics/salmonaude.ai
- LoRA Adapter: Available at NoCritics/monad-mistral-lora (see the loading sketch below)
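If you prefer to apply the adapter yourself rather than use the merged weights in this repository, a minimal peft sketch (assuming the adapter at NoCritics/monad-mistral-lora targets the instruct base listed above) looks like this:

# Sketch: apply the LoRA adapter to the base model with peft
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mistral-7B-Instruct-v0.3"
adapter_id = "NoCritics/monad-mistral-lora"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)
model = model.merge_and_unload()  # optional: reproduces the merged weights in this repo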
License
Apache 2.0