---
license: llama3.1
---
### Merged as below:


-----
slices:
  - sources:
      - model: athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit
        layer_range: [0, 23]
  - sources:
      - model: athirdpath/Llama-3.1-Techne-RP-8b-v1
        layer_range: [9, 31]

merge_method: passthrough
dtype: float16
tokenizer_source: athirdpath/Llama-3.1-Techne-RP-8b-v1
-----
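
The merge can be run with mergekit's Python API (or equivalently its `mergekit-yaml` CLI). Below is a minimal sketch, assuming the recipe above is saved as `config.yml`; the output directory name is a hypothetical placeholder:

```python
# Minimal sketch of running the passthrough merge with mergekit.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the merge recipe shown above.
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./llama-3.1-11b-passthrough",  # hypothetical output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy files from tokenizer_source to the output
    ),
)
```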

Then pretrained for 1 epoch on the Iambe dataset as an 11B model. (The passthrough merge stacks 23 + 22 = 45 decoder layers, up from the 32 layers of the base 8B models, which accounts for the larger parameter count.)
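
Once published, the result loads like any other Llama checkpoint. A minimal sketch follows; the repo id below is a hypothetical placeholder, so substitute the actual checkpoint:

```python
# Minimal sketch of loading and sampling from the finished 11B model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/llama-3.1-11b-iambe"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the merge dtype above
    device_map="auto",
)

prompt = "Tell me a story."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```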