This is the LoRA adapter for Andy-4-micro; any use of it requires full adherence to the Andy License.
# Andy-4-micro
Andy-4-micro is a compact, efficient model designed specifically for Minecraft gameplay via the Mindcraft framework. Built on the Qwen2.5 1.5B architecture, it delivers strong reasoning and reliable task execution despite being only 21% the size of the full Andy-4 model.
Andy-4-micro is licensed under the Andy License 1.0, which requires derivatives to include the license terms and attribute the original author.
## Overview
Andy-4-micro was originally an experiment: a testbed to see how much performance could be retained when compressing the Andy-4 training process into a significantly smaller base model. The result is a fast, efficient model with an excellent ability to play Minecraft and respond with logic and clarity, even on limited hardware.
Despite its size, Andy-4-micro is able to:
- Understand and navigate Minecraft environments.
- Reason before responding, rather than simply reacting.
- Play autonomously or in collaboration with user input.
- Maintain high efficiency even on low-VRAM GPUs.
## Training Details
Andy-4-micro was trained on the same data as Andy-4-base, but with smaller architecture and unique training hyperparameters to optimize for compact performance.
- Base Model: Qwen2.5 1.5B Instruct
- Total Training Epochs: 4
  - Andy-4-base-2 dataset: 2 epochs at a `4e-5` learning rate
  - FT dataset: 2 epochs at an `8e-6` learning rate
- Optimizer: AdamW with a cosine decay schedule
- Quantization: 4-bit (`bnb-4bit`)
This configuration allows Andy-4-micro to punch well above its weight class, especially when compared with other small models in the Minecraft AI space.
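As a rough illustration, the cosine decay schedule named above can be sketched in a few lines of Python. The warmup-free form and the step count below are assumptions for the sketch, not the exact training configuration:

```python
import math

def cosine_decay_lr(step: int, total_steps: int, lr_max: float, lr_min: float = 0.0) -> float:
    """Cosine-decayed learning rate, the schedule commonly paired with AdamW."""
    progress = step / total_steps
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * progress))

# Example with the 4e-5 peak rate used for the Andy-4-base-2 epochs:
print(cosine_decay_lr(0, 1000, 4e-5))     # start of training: full 4e-5
print(cosine_decay_lr(500, 1000, 4e-5))   # halfway: 2e-5
print(cosine_decay_lr(1000, 1000, 4e-5))  # end: decayed to 0.0
```

The rate starts at its peak, falls slowly at first, then decays smoothly toward zero, which tends to stabilize the final epochs of a fine-tune.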
## Differences from Andy-4-base
| Feature | Andy-4-base | Andy-4-micro |
|---|---|---|
| Model Size | 8B | 1.5B |
| Base Architecture | DeepSeek-R1 Distill Llama 8B | Qwen2.5 1.5B |
| Training Duration | ~3 weeks | ~3 days |
| Usage Type | General Mindcraft agent base | Fast, efficient model for light gameplay |
| Reasoning | High | Medium-high |
| Hardware Requirements | 8+ GB VRAM | Very low (4-6 GB VRAM works!) |
Andy-4-micro is not a replacement for Andy-4-base or the full Andy-4 model, but instead a supplementary option for users who prioritize speed and accessibility over scale.
## Installation
You can install and run Andy-4-micro through Hugging Face and Ollama, following the same method as Andy-4-base. Here's how:
### Hugging Face Quick Start
- Go to the model page on Hugging Face.
- Click on the "Use this model" dropdown.
- Choose Ollama and select your quantization level:
| Quantization | Minimum VRAM Required |
|---|---|
| F16 | 8+ GB |
| Q5_K_M | 6+ GB |
| Q4_K_M | 4-6 GB |
| Q3_K_M | 4 GB (low settings) |
| Q2_K | 2-4 GB (ultra-light) |
- Follow the Mindcraft Discord Guide for cloud-based options if needed.
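For intuition on where these VRAM tiers come from, here is a back-of-the-envelope estimate of the weight memory alone for a ~1.5B-parameter model. The bits-per-weight figures are rough GGUF averages assumed for the sketch, and real usage adds KV cache and runtime overhead on top:

```python
PARAMS = 1.5e9  # approximate parameter count of Andy-4-micro

# Rough average bits per weight for each quantization (assumed, not exact).
BITS_PER_WEIGHT = {"F16": 16.0, "Q5_K_M": 5.7, "Q4_K_M": 4.8, "Q3_K_M": 3.9, "Q2_K": 2.6}

def weight_memory_gb(quant: str) -> float:
    """Weight storage in decimal gigabytes; excludes KV cache and overhead."""
    return PARAMS * BITS_PER_WEIGHT[quant] / 8 / 1e9

for quant in BITS_PER_WEIGHT:
    print(f"{quant:7s} ~{weight_memory_gb(quant):.2f} GB of weights")
```

F16 weights alone come to about 3 GB, which is why that tier needs 8+ GB once context and overhead are included, while the 2-bit quantization squeezes the weights under half a gigabyte.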
### Manual Installation via Modelfile
1. Download your chosen `.GGUF` quantization file and the `Modelfile`.
2. Open the `Modelfile` and replace the `FROM` path with the exact path to your `.GGUF` file, e.g.:

   ```
   FROM C:\Users\you\Downloads\Andy-4-micro.Q3_K_M.gguf
   ```

3. In a terminal, navigate to the `Modelfile` directory and run:

   ```
   ollama create andy-4-micro -f Modelfile
   ```

This will register the model as `andy-4-micro` for use with Ollama.
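For reference, a minimal `Modelfile` for this setup might look like the sketch below. It uses standard Ollama directives (`FROM`, `PARAMETER`), but the path and sampling values are illustrative assumptions; prefer the official `Modelfile` shipped with the model.

```
# Illustrative sketch only; use the Modelfile from the model repository.
FROM ./Andy-4-micro.Q4_K_M.gguf

# Sampling settings are assumptions, not the released defaults.
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
```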
## Experimental Nature
Note: Andy-4-micro is an experimental release.
Although initial performance was overwhelmingly positive, Andy-4-micro was not originally designed for public deployment. However, due to strong community feedback and impressive early benchmarks, it has been released for broader use and further testing.
Expect updates, bugfixes, and potentially further fine-tuned versions based on user reports.
## License
This model is licensed under the Andy License 1.0.
Usage must include credit to the original creator, and derivatives must remain open and under the same license.
- Author: [Sweaterdog](https://huggingface.co/Sweaterdog)
- Acknowledgments: This model utilizes datasets and techniques developed in the Andy-4 ecosystem.
- License type: Derived from Apache 2.0 with added attribution & openness clauses.
## Final Notes
- Andy-4-micro is ideal for lightweight deployments, educational use, and embedded inference.
- It's a great fit for users who want a responsive, low-resource model that still understands Minecraft deeply.
- For best results, use within the Mindcraft framework or alongside other Andy models.
We hope you enjoy exploring and building with Andy-4-micro!
"Big things come in small packages." π‘