---
base_model:
- TroyDoesAI/BlackSheep-24B
- ReadyArt/Forgotten-Safeword-24B-V2.2
- unsloth/Mistral-Small-24B-Instruct-2501
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [unsloth/Mistral-Small-24B-Instruct-2501](https://huggingface.co/unsloth/Mistral-Small-24B-Instruct-2501) as the base.

### Models Merged

The following models were included in the merge:
* [TroyDoesAI/BlackSheep-24B](https://huggingface.co/TroyDoesAI/BlackSheep-24B)
* [ReadyArt/Forgotten-Safeword-24B-V2.2](https://huggingface.co/ReadyArt/Forgotten-Safeword-24B-V2.2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: unsloth/Mistral-Small-24B-Instruct-2501
  - model: TroyDoesAI/BlackSheep-24B
    parameters:
      density: 0.50
      weight: 0.60
  - model: ReadyArt/Forgotten-Safeword-24B-V2.2
    parameters:
      density: 0.35
      weight: 0.3
merge_method: ties
base_model: unsloth/Mistral-Small-24B-Instruct-2501
parameters:
  normalize: true
dtype: bfloat16
```
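
Since the card lists `library_name: transformers`, the merged checkpoint can be loaded like any other Mistral-Small-style causal LM. The sketch below is a minimal example and not part of the original card: the repository id is a placeholder for wherever this merge is hosted, and `bfloat16` is chosen to match the `dtype` in the merge configuration. (If you want to reproduce the merge itself, the YAML above can be run through mergekit's `mergekit-yaml` entry point.)

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id: replace with the actual Hugging Face repo for this merge,
# or a local directory containing the merged weights.
model_id = "your-namespace/mistral-small-24b-ties-merge"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` from the merge config
    device_map="auto",           # requires `accelerate`; places layers on available devices
)

# Simple chat-style generation using the tokenizer's chat template.
messages = [{"role": "user", "content": "Write a one-sentence summary of model merging."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```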