ModularBrainAgent / README.md
Almusawee's picture
Update README.md
eb4c056 verified
---
tags:
- brain-inspired
- spiking-neural-network
- biologically-plausible
- modular-architecture
- reinforcement-learning
- vision-language
- pytorch
- curriculum-learning
- cognitive-architecture
- artificial-general-intelligence
license: mit
datasets:
- mnist
- imdb
- synthetic-environment
language:
- en
library_name: transformers
widget:
- text: "The first blueprint and the bridge to Neuroscience and Artificial Intelligence."
- text: "I’m sure this model architecture will revolutionize the world."
model-index:
- name: ModularBrainAgent
results:
- task:
type: image-classification
name: Vision-based Classification
dataset:
type: mnist
name: MNIST
metrics:
- type: accuracy
value: 0.98
- task:
type: text-classification
name: Language Sentiment Analysis
dataset:
type: imdb
name: IMDb
metrics:
- type: accuracy
value: 0.91
- task:
type: reinforcement-learning
name: Curiosity-driven Exploration
dataset:
type: synthetic-environment
name: Synthetic Environment
metrics:
- type: cumulative_reward
value: 112.5
---
# 🧠 ModularBrainAgent: A Brain-Inspired Cognitive AI Model
ModularBrainAgent (SynCo) is a biologically plausible, spiking neural agent combining vision, language, and reinforcement learning in a single architecture. Inspired by human neurobiology, it implements multiple neuron types and complex synaptic pathways, including excitatory, inhibitory, modulatory, bidirectional, feedback, lateral, and plastic connections.
It’s designed for researchers, neuroscientists, and AI developers exploring the frontier between brain science and general intelligence.
---
## 🧩 Model Architecture
- **Total Neurons**: 66
- **Neuron Types**: Interneurons, Excitatory, Inhibitory, Cholinergic, Dopaminergic, Serotonergic, Feedback, Plastic
- **Core Modules**:
- `SensoryEncoder`: Vision, Language, Numeric integration
- `PlasticLinear`: Hebbian and STDP local learning
- `RelayLayer`: Spiking multi-head attention module
- `AdaptiveLIF`: Recurrent interneuron logic
- `WorkingMemory`: LSTM-based temporal memory
- `NeuroendocrineModulator`: Emotional feedback
- `PlaceGrid`: Spatial grid encoding
- `Comparator`: Self-matching logic
- `TaskHeads`: Classification, regression, binary outputs
---
## 🧠 Features
- 🪐 Multi-modal input (images, text, numerics)
- 🔁 Hebbian + STDP local plasticity
- ⚡ Spiking simulation via surrogate gradients
- 🧠 Biologically inspired synaptic dynamics
- 🧬 Curriculum and lifelong learning capability
- 🔍 Fully modular: plug-and-play cortical units
---
## 📊 Performance Summary
*Note: Metrics shown below are for illustrative purposes from synthetic and internal tests.*
| Task | Dataset | Metric | Result |
|-----------------------|----------------------|-------------------|----------|
| Digit Recognition | MNIST | Accuracy | 0.98 |
| Sentiment Analysis | IMDb | Accuracy | 0.91 |
| Exploration Task | Gridworld Simulation | Cumulative Reward | 112.5 |
---
## 💻 Training Data
- **MNIST**: Handwritten digit classification
- **IMDb**: Sentiment classification from text
- **Synthetic Environment**: Grid-based exploration with feedback
---
## 🧪 Intended Uses
| Use Case | Description |
|-----------------------------|------------------------------------------------------------|
| Neuroscience AI Research | Simulating cortical modules and spiking dynamics |
| Cognitive Simulation | Experimenting with memory, attention, and decision systems |
| Multi-task Agents | One-shot learning across vision + language + control |
| Education + Demos | Accessible tool for learning about bio-inspired AI |
---
## ⚠️ Limitations
- Early-stage architecture (prototype stage)
- Unsupervised/local learning only (no gradient-based finetuning yet)
- Synthetic data only for now
- Accuracy and metrics not benchmarked on large-scale public sets
---
## ✨ Credits
Built by **Aliyu Lawan Halliru**, an independent AI researcher from Nigeria.
SynCo was created to demonstrate that anyone, anywhere, can build synthetic intelligence.
---
## 📜 License
MIT License © 2025 Aliyu Lawan Halliru
Use freely. Cite or reference when possible.
.