navicore committed
Commit 449c9e7 · verified · 1 Parent(s): c302eac

Upload README.md with huggingface_hub

Files changed (1): README.md +33 -78
README.md CHANGED
@@ -1,102 +1,57 @@
  ---
- license: apache-2.0
  language:
  - en
- library_name: transformers
  tags:
- - charles-fort
- - anomalous-phenomena
  - fortean
- - lora
- - text-generation
- base_model: Qwen/Qwen3-8B
- widget:
- - text: "Human: What do you think about UFOs?\n\nFort:"
- - text: "Human: Tell me about mysterious disappearances.\n\nFort:"
- - text: "Human: Why do scientists reject anomalous data?\n\nFort:"
  ---

- # Fortean Qwen3 8B Advanced
-
- A fine-tuned language model that responds in the style of Charles Fort, the early 20th-century researcher of anomalous phenomena and author of "The Book of the Damned."

- ## Model Description

- This model captures Charles Fort's distinctive writing style:
- - Skeptical of scientific orthodoxy
- - Collector of "damned" data that science excludes
- - Uses temporal abstractions ("about 1889" instead of specific dates)
- - Philosophical observations about humanity's place in the cosmos
- - Dry wit and ironic observations

- ## Intended Use
-
- - Creative writing in Fort's style
- - Educational exploration of Fortean philosophy
- - Entertainment and literary experimentation
- - NOT for factual information or scientific accuracy
-
- ## Training Data
-
- The model was fine-tuned on:
- - Passages from Fort's four books: The Book of the Damned (1919), New Lands (1923), Lo! (1931), Wild Talents (1932)
- - 500 synthetic Q&A pairs generated from Fort's writings
- - Temporal abstractions applied to all specific dates
-
- ## Training Details
-
- - **Base Model:** Qwen/Qwen3-8B
- - **Fine-tuning Method:** LoRA (r=32, alpha=64; configuration sketched below)
- - **Training Hardware:** Apple M4 / NVIDIA GPU
- - **Training Duration:** ~8-12 hours
- - **Framework:** HuggingFace Transformers + PEFT
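A minimal sketch of what that LoRA setup might look like in PEFT, using the r=32 and alpha=64 values documented above; the target modules and dropout are assumptions, not the author's recorded settings:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Base model named in the card.
base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-8B")

config = LoraConfig(
    r=32,                 # rank, from the card
    lora_alpha=64,        # alpha, from the card
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    lora_dropout=0.05,    # assumption
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```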
-
- ## Example Usage

  ```python
- from transformers import AutoTokenizer, AutoModelForCausalLM
  from peft import PeftModel

- # Load base model
  base_model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-8B")
  model = PeftModel.from_pretrained(base_model, "navicore/fortean-qwen3-8b-advanced")
  tokenizer = AutoTokenizer.from_pretrained("navicore/fortean-qwen3-8b-advanced")

- # Generate Fortean response
- prompt = "Human: What are your thoughts on coincidences?\n\nFort:"
- inputs = tokenizer(prompt, return_tensors="pt")
- outputs = model.generate(**inputs, max_length=200, do_sample=True, temperature=0.8)
- response = tokenizer.decode(outputs[0], skip_special_tokens=True)
- print(response)
  ```

- ## Sample Outputs
-
- **Human:** What do you think about red rain?
- **Fort:** I have collected numerous instances where red rains have fallen, containing matter that the orthodox scientists cannot adequately explain. In about 1819, and again in the middle of the decade of the 1880s, such phenomena were reported across Europe. The exclusionists tell us it is merely sand from the Sahara, but the microscopic examination reveals organic matter of unknown origin. I think we are property.
-
- **Human:** Tell me about the Super-Sargasso Sea.
- **Fort:** Above our heads, in the region of atmospheric suspension that I call the Super-Sargasso Sea, there are fields of ice and stones and derelicts of space. From these regions fall the things that perplex the scientists - the red rains, the black rains, the stones inscribed with unknown symbols. The orthodox explain them away, but I accept that we live beneath an ocean of the unexplained.
-
- ## Limitations

- - Responses reflect early 20th-century perspectives and knowledge
- - May generate anachronistic or scientifically inaccurate statements
- - Should not be used as a source of factual information
- - Tends to be skeptical of all scientific explanations, even well-established ones
-
- ## Citation
-
- ```bibtex
- @misc{fortean-Qwen3-8B-2024,
- author = {Your Name},
- title = {Fortean Qwen3 8B Advanced: Charles Fort Style Language Model},
- year = {2024},
- publisher = {HuggingFace},
- url = {https://huggingface.co/navicore/fortean-qwen3-8b-advanced}
- }
- ```

- ## Acknowledgments

- This project uses texts from Project Gutenberg. Charles Fort's works are in the public domain.

  ---
+ license: other
  language:
  - en
+ library_name: peft
+ base_model: Qwen/Qwen3-8B
  tags:
  - fortean
+ - charles-fort
+ - paranormal
+ - creative-writing
  ---

+ # Fortean Qwen3-8B Advanced

+ A fine-tuned version of Qwen3-8B that responds in the style of Charles Fort (1874-1932), the writer who collected reports of anomalous phenomena.

+ ## Features
+ - Sees paranormal connections in any topic
+ - Cites dubious 19th-century sources
+ - Applies Fort's skeptical perspective to modern topics
+ - Dry wit and a unique voice

+ ## Usage

+ ### With Transformers
  ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
  from peft import PeftModel

  base_model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-8B")
  model = PeftModel.from_pretrained(base_model, "navicore/fortean-qwen3-8b-advanced")
  tokenizer = AutoTokenizer.from_pretrained("navicore/fortean-qwen3-8b-advanced")

+ prompt = "Question: What's your take on artificial intelligence?\n\nCharles Fort:"
+ # Generate a response from the prompt (see the sketch after this block).
  ```
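The snippet above stops short of generation. Continuing from it, a minimal sketch of the remaining step, with sampling parameters borrowed from the earlier revision's example (`do_sample=True` added so the temperature takes effect, and `max_new_tokens` used so the budget applies only to new tokens):

```python
# Tokenize the prompt and sample a completion.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```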

+ ### With Ollama
+ GGUF versions available at: [navicore/fortean-qwen3-8b-advanced-GGUF](https://huggingface.co/navicore/fortean-qwen3-8b-advanced-GGUF)

+ ```bash
+ # Download GGUF and Modelfile from the GGUF repo
+ ollama create fortean -f Modelfile
+ ollama run fortean "What do you think about UFOs?"
  ```
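If you would rather build a GGUF yourself than download one, a common route (an assumption here, not the author's documented process) is to merge the adapter into the base weights and convert the merged checkpoint:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Merge the LoRA adapter into the base weights.
base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-8B")
merged = PeftModel.from_pretrained(base, "navicore/fortean-qwen3-8b-advanced").merge_and_unload()

# Save a standalone checkpoint; "fortean-merged" is an arbitrary local path.
merged.save_pretrained("fortean-merged")
AutoTokenizer.from_pretrained("navicore/fortean-qwen3-8b-advanced").save_pretrained("fortean-merged")
# The merged folder can then be converted, e.g. with llama.cpp's convert_hf_to_gguf.py.
```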

+ ## Training Details
+ - Base model: Qwen3-8B (not instruct)
+ - Method: LoRA fine-tuning
+ - Training examples: ~200 high-quality Q&A pairs (format sketched below)
+ - Hardware: Apple M4 Pro
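The prompt format shown under Usage suggests how those Q&A pairs were serialized for training; a minimal sketch, where the record fields and helper are hypothetical:

```python
# Hypothetical record shape; field names are assumptions.
example = {
    "question": "What's your take on artificial intelligence?",
    "answer": "I have collected accounts of machines that think...",  # placeholder text
}

def format_example(record: dict) -> str:
    # Mirrors the "Question: ...\n\nCharles Fort: ..." format shown under Usage.
    return f"Question: {record['question']}\n\nCharles Fort: {record['answer']}"

print(format_example(example))
```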
+
+ ## Example Output
+ **Q: What's your view on cryptocurrency?**

+ **A:** Digital gold that appears from nothing, vanishes into cryptographic ether, yet somehow purchases real coffee. I've documented seventeen mining facilities where the clocks run backward. In 1898, a banker in Prague reported similar temporal anomalies when counting certain denominations. The blockchain, they say, is immutable—yet I have records of transactions that preceded their own origins. Perhaps we're not mining coins but excavating something that was always there, waiting.