---
base_model:
- nbeerbower/Mistral-Nemo-Prism-12B
- inflatebot/MN-12B-Mag-Mell-R1
- grimjim/magnum-consolidatum-v1-12b
- grimjim/mistralai-Mistral-Nemo-Base-2407
- grimjim/mistralai-Mistral-Nemo-Instruct-2407
- Delta-Vector/Rei-V2-12B
- grimjim/magnum-twilight-12b
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [Task Arithmetic](https://arxiv.org/abs/2212.04089) merge method, with [grimjim/mistralai-Mistral-Nemo-Base-2407](https://huggingface.co/grimjim/mistralai-Mistral-Nemo-Base-2407) as the base model.

### Models Merged

The following models were included in the merge:
* [nbeerbower/Mistral-Nemo-Prism-12B](https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B)
* [inflatebot/MN-12B-Mag-Mell-R1](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1)
* [grimjim/magnum-consolidatum-v1-12b](https://huggingface.co/grimjim/magnum-consolidatum-v1-12b)
* [grimjim/mistralai-Mistral-Nemo-Instruct-2407](https://huggingface.co/grimjim/mistralai-Mistral-Nemo-Instruct-2407)
* [Delta-Vector/Rei-V2-12B](https://huggingface.co/Delta-Vector/Rei-V2-12B)
* [grimjim/magnum-twilight-12b](https://huggingface.co/grimjim/magnum-twilight-12b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: grimjim/mistralai-Mistral-Nemo-Base-2407
dtype: bfloat16
merge_method: task_arithmetic
parameters:
  normalize: true
models:
  - model: grimjim/mistralai-Mistral-Nemo-Base-2407
  - model: grimjim/mistralai-Mistral-Nemo-Instruct-2407
    parameters:
      weight: 0.875
  - model: grimjim/magnum-consolidatum-v1-12b
    parameters:
      weight: 0.015625
  - model: grimjim/magnum-twilight-12b
    parameters:
      weight: 0.001953125
  - model: nbeerbower/Mistral-Nemo-Prism-12B
    parameters:
      weight: 0.0625
  - model: Delta-Vector/Rei-V2-12B
    parameters:
      weight: 0.00390625
  - model: inflatebot/MN-12B-Mag-Mell-R1
    parameters:
      weight: 0.00390625
```
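To illustrate what the task-arithmetic method does with the weights above, here is a minimal NumPy sketch of the underlying idea: each fine-tuned model contributes its delta ("task vector") from the base, scaled by its weight, and the scaled deltas are summed back onto the base. This is a toy illustration only, not mergekit's actual implementation; in particular, the treatment of `normalize` (dividing the combined delta by the sum of weights) is an assumption here, and real merges operate tensor-by-tensor over full checkpoints.

```python
import numpy as np

def task_arithmetic_merge(base, finetuned, weights, normalize=True):
    """Toy sketch of task-arithmetic merging.

    base:      base model parameters (one flat array standing in for a checkpoint)
    finetuned: list of fine-tuned parameter arrays, same shape as base
    weights:   per-model scaling factors for each task vector
    """
    # Task vector for each model: its difference from the base, scaled by weight.
    deltas = [w * (ft - base) for ft, w in zip(finetuned, weights)]
    combined = np.sum(deltas, axis=0)
    if normalize:
        # Assumption in this sketch: normalization divides by the weight sum.
        combined = combined / sum(weights)
    return base + combined

# Toy 1-D "parameter tensors" standing in for model weights.
base = np.array([1.0, 2.0])
fts = [np.array([2.0, 2.0]), np.array([1.0, 4.0])]
merged = task_arithmetic_merge(base, fts, weights=[0.5, 0.5])
# Each fine-tune's delta is halved, summed, normalized, and added to the base.
```

With the tiny weights in the configuration above (e.g. 0.00390625 = 2^-8), a model's task vector nudges the result only slightly, which is how this merge keeps the instruct model dominant while folding in small contributions from the others.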