---
base_model:
  - PocketDoc/Dans-PersonalityEngine-V1.3.0-24b
  - PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
library_name: transformers
tags:
  - mergekit
  - merge
license: apache-2.0
---

# DansPreConfig-24B

## What is this?

An experimental merge of PocketDoc/Dans-PersonalityEngine-V1.2.0-24b and PocketDoc/Dans-PersonalityEngine-V1.3.0-24b. I prefer the ChatML format, mostly because I'm lazy, so I made this merge.
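
For reference, ChatML wraps every turn in `<|im_start|>` / `<|im_end|>` markers. Below is a minimal inference sketch with transformers; the repo id and the assumption that the merged tokenizer ships a ChatML chat template are mine, not stated elsewhere in this card.

```python
# Minimal inference sketch. Assumes the merged tokenizer carries a ChatML
# chat template and that the model lives at the repo id below (assumed path).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "DoppelReflEx/DansPreConfig-24B"  # assumed HF path for this merge
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Introduce yourself in one sentence."},
]
# apply_chat_template renders the ChatML turns, e.g.
# <|im_start|>system ... <|im_end|>\n<|im_start|>user ... <|im_end|>\n<|im_start|>assistant
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```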

## Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
  - model: PocketDoc/Dans-PersonalityEngine-V1.3.0-24b
    parameters:
      density: 0.8
      weight: 0.8
merge_method: ties
base_model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
dtype: bfloat16
```
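
In this TIES config, `density: 0.8` keeps the largest-magnitude 80% of V1.3.0's parameter deltas relative to the base, and `weight: 0.8` scales their contribution before they are merged back onto V1.2.0. To reproduce the merge, the sketch below follows mergekit's documented Python usage; the file paths are placeholders.

```python
# Sketch of reproducing the merge with mergekit's Python API.
# Paths are placeholders; the YAML above is assumed to be saved to merge-config.yml.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "./merge-config.yml"    # the configuration shown above
OUTPUT_PATH = "./DansPreConfig-24B"  # where the merged weights are written

with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```

The same configuration also works with the `mergekit-yaml` command-line entry point.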