This model appears to be stable; no repetition issues were observed in testing.
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details
### Merge Method

This model was merged using the Passthrough merge method. Passthrough copies layers directly from the source model without blending weights; because the layer ranges in the configuration below overlap, the result is a deeper model built by stacking partially duplicated layers of a single source, a construction often called depth upscaling or a "frankenmerge".
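To make the mechanics concrete, here is a minimal conceptual sketch of such a layer stack, written against the Transformers API. It is an illustration under assumptions, not mergekit's implementation: the attribute path `model.layers` assumes a Mistral-style architecture, and a production merge would also need to fix per-layer metadata (e.g. the attention layer indices used by the KV cache).

```python
import copy

import torch
from transformers import AutoModelForCausalLM

SOURCE = "yamatazen/EtherealAurora-12B"
# Overlapping, end-exclusive layer ranges from the merge configuration below.
RANGES = [(0, 10), (2, 12), (10, 20), (12, 22), (20, 30), (22, 32), (30, 40)]

src = AutoModelForCausalLM.from_pretrained(SOURCE, torch_dtype=torch.bfloat16)

# Stack deep copies of the selected decoder layers in order. Overlapping
# ranges mean some layers appear more than once in the new, deeper stack.
stacked = torch.nn.ModuleList(
    copy.deepcopy(src.model.layers[i])
    for start, end in RANGES
    for i in range(start, end)
)
src.model.layers = stacked
src.config.num_hidden_layers = len(stacked)  # 7 ranges x 10 = 70 layers
```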
### Models Merged
The following models were included in the merge:

- yamatazen/EtherealAurora-12B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
merge_method: passthrough
dtype: bfloat16
slices:
- sources:
  - model: yamatazen/EtherealAurora-12B
    layer_range: [0, 10]
- sources:
  - model: yamatazen/EtherealAurora-12B
    layer_range: [2, 12]
- sources:
  - model: yamatazen/EtherealAurora-12B
    layer_range: [10, 20]
- sources:
  - model: yamatazen/EtherealAurora-12B
    layer_range: [12, 22]
- sources:
  - model: yamatazen/EtherealAurora-12B
    layer_range: [20, 30]
- sources:
  - model: yamatazen/EtherealAurora-12B
    layer_range: [22, 32]
- sources:
  - model: yamatazen/EtherealAurora-12B
    layer_range: [30, 40]
```
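For reference, here is a sketch of reproducing this merge with mergekit's Python entry points (the `mergekit-yaml` CLI is the usual alternative). It follows the invocation pattern from mergekit's documentation; the config file name and output path are assumptions.

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (file name is an assumption).
with open("config.yaml", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the stacked model to ./merged.
run_merge(
    merge_config,
    "./merged",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
    ),
)
```

The resulting model can then be loaded as usual, e.g. `AutoModelForCausalLM.from_pretrained("./merged", torch_dtype=torch.bfloat16)`.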