Quantizations of https://huggingface.co/Sakalti/Magro-7b-v1.1
Open source inference clients/UIs
Closed source inference clients/UIs
- LM Studio
- More will be added...
From the original README
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the TIES merge method, with HuggingFaceH4/zephyr-7b-alpha as the base model.
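TIES (TrIm, Elect Sign, and merge) resolves interference between fine-tuned models in three steps: trim low-magnitude parameter deltas relative to the base, elect a per-parameter sign by total magnitude across models, then average only the deltas that agree with the elected sign. A minimal NumPy sketch of the idea (the function name and the dense-array setting are illustrative; mergekit itself operates on full model checkpoints):

```python
import numpy as np

def ties_merge(base, finetuned, density=1.0):
    # Task vectors: difference between each fine-tuned model and the base.
    deltas = [ft - base for ft in finetuned]
    trimmed = []
    for d in deltas:
        # Trim: keep only the top-`density` fraction of entries by magnitude.
        k = int(np.ceil(density * d.size))
        thresh = np.sort(np.abs(d).ravel())[-k] if k > 0 else np.inf
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    trimmed = np.stack(trimmed)
    # Elect sign: per-parameter majority sign, weighted by total magnitude.
    sign = np.sign(trimmed.sum(axis=0))
    # Disjoint mean: average only the deltas that agree with the elected sign.
    agree = (np.sign(trimmed) == sign) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (trimmed * agree).sum(axis=0) / counts
    return base + merged_delta
```

With a single model and `density: 1`, the merge reduces to the fine-tuned weights themselves; the sign election only matters when two or more models disagree on a parameter's direction.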
Models Merged
The following models were included in the merge:
- Sakalti/magro-7B
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Sakalti/magro-7B
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: HuggingFaceH4/zephyr-7b-alpha
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: true
dtype: bfloat16
```
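To reproduce a merge from a configuration like the one above, mergekit provides a `mergekit-yaml` entry point. A sketch, assuming mergekit is installed and the configuration is saved as `config.yml` (the output path is illustrative):

```shell
pip install mergekit
mergekit-yaml config.yml ./merged-model
```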
Available quantizations: 1-bit, 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit