Möbius Markov Chain - The Classics Revival

Non-Euclidean Probabilistic Systems with Dynamic Geometry

Experimental Research Code - Functional but unoptimized, expect rough edges

What Is This?

Möbius Markov Chain runs Markov processes in complex space whose geometry is dynamically warped by Möbius transformations. The state space itself evolves based on the current states, producing probabilistic dynamics that adapt their geometric structure as they evolve.

Core Innovation: Markov transition probabilities computed in continuously warped complex space, where the geometry itself learns to optimize transition dynamics.

Architecture Highlights

  • Complex State Space: States positioned in the complex plane
  • Dynamic Möbius Transformations: Learnable conformal mappings f(z) = (az+b)/(cz+d)
  • Geometric Transition Matrices: Distances computed in transformed space
  • State-Dependent Warping: Geometry evolves based on current state distribution
  • Conformal Invariance: Preserves angles while warping distances
  • Distance Kernels: Multiple kernel options for probability computation

Quick Start

import torch
from mobius_markov import MobiusMarkovSystem

# Create a non-Euclidean Markov system
num_states = 8
batch_size = 4
system = MobiusMarkovSystem(
    num_states=num_states,
    state_embedding_dim=32,
    evolution_steps=5
)

# Initialize the state distribution: every batch element starts in state 0
initial_state = torch.zeros(batch_size, num_states)
initial_state[:, 0] = 1.0

# Evolve through warped space
output = system(initial_state, return_full_trajectory=True)

# Generate sequence predictions
sequence = system.predict_sequence(initial_state, sequence_length=10)

Current Status

  • Working: Complex plane dynamics, Möbius transforms, distance-based transitions, state evolution
  • Rough Edges: Conservative parameter defaults, could use more dramatic geometry warping for demos
  • Still Missing: Advanced kernel functions, multi-scale temporal dynamics, visualization tools
  • Performance: Mathematically sound and numerically stable; a good research foundation
  • Memory Usage: Moderate, dominated by complex number operations
  • Speed: Reasonable for small state spaces, optimization needed for large systems

Mathematical Foundation

Möbius transformations are defined as:

f(z) = (az + b)/(cz + d)

where a, b, c, d are learnable complex parameters satisfying ad - bc ≠ 0.
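As a minimal sketch of this transformation using PyTorch complex tensors (the helper `mobius_transform` is illustrative, not part of the library's API):

```python
import torch

def mobius_transform(z, a, b, c, d):
    """Apply the Möbius transformation f(z) = (a*z + b) / (c*z + d)
    element-wise to a complex tensor z."""
    det = a * d - b * c
    assert det.abs() > 1e-8, "Möbius parameters must satisfy ad - bc != 0"
    return (a * z + b) / (c * z + d)

# Example: the inversion f(z) = 1/z, i.e. a=0, b=1, c=1, d=0
z = torch.tensor([1 + 1j, 2 + 0j], dtype=torch.complex64)
a, b = torch.tensor(0j), torch.tensor(1 + 0j)
c, d = torch.tensor(1 + 0j), torch.tensor(0j)
w = mobius_transform(z, a, b, c, d)  # 1/(1+1j) = 0.5-0.5j, 1/2 = 0.5
```

In the full system the four parameters would be learnable (e.g. stored as `nn.Parameter` tensors) rather than fixed constants.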

State positions in complex space determine transition probabilities:

P_ij = kernel(d_transformed(z_i, z_j))

Distance kernels include:

  • Gaussian: P ∝ exp(-d²/2σ²)
  • Inverse: P ∝ 1/d^α
  • Linear: P ∝ max(0, 1-d)
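The three kernels above can be sketched as a single function that maps complex state positions to a row-stochastic transition matrix. The function name, signature, and defaults here are assumptions for illustration, not the library's actual interface:

```python
import torch

def transition_matrix(z, kernel="gaussian", sigma=1.0, alpha=2.0, eps=1e-8):
    """Build a row-stochastic transition matrix P_ij from pairwise
    distances between complex state positions z (shape [num_states])."""
    d = (z.unsqueeze(0) - z.unsqueeze(1)).abs()  # pairwise |z_i - z_j|
    if kernel == "gaussian":
        w = torch.exp(-d**2 / (2 * sigma**2))
    elif kernel == "inverse":
        w = 1.0 / (d + eps) ** alpha  # eps avoids division by zero on the diagonal
    elif kernel == "linear":
        w = torch.clamp(1.0 - d, min=0.0)
    else:
        raise ValueError(f"unknown kernel: {kernel}")
    return w / w.sum(dim=1, keepdim=True)  # normalize each row to a distribution

# Three states at 0, 1, and i in the complex plane
z = torch.tensor([0 + 0j, 1 + 0j, 0 + 1j], dtype=torch.complex64)
P = transition_matrix(z, kernel="gaussian")  # each row sums to 1
```

In the warped setting, `z` would first be mapped through the current Möbius transformation, so the distances (and hence the probabilities) reflect the transformed geometry.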

The geometry evolves according to:

∂θ/∂t = η × state_embedding_evolution(current_distribution)

where θ represents the Möbius transformation parameters.
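One discrete Euler step of this update might look like the following sketch, where `embed` stands in for the state-embedding evolution network (both the helper and the network shape are hypothetical):

```python
import torch

def evolve_geometry(theta, distribution, embed, eta=0.01):
    """One Euler step of the geometry update:
    theta <- theta + eta * embedding_evolution(current_distribution).
    theta is a complex tensor [4] holding the Möbius parameters a, b, c, d."""
    delta = embed(distribution)                  # real tensor of shape [8]
    delta = torch.complex(delta[:4], delta[4:])  # reinterpret as 4 complex updates
    return theta + eta * delta

# Hypothetical embedding network: 8-state distribution -> 8 real outputs
embed = torch.nn.Linear(8, 8)
theta = torch.tensor([1 + 0j, 0j, 0j, 1 + 0j])  # identity transform a=d=1, b=c=0
dist = torch.full((8,), 0.125)                  # uniform distribution over 8 states
theta_next = evolve_geometry(theta, dist, embed, eta=0.01)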

Research Applications

  • Non-Euclidean machine learning
  • Adaptive probabilistic models
  • Complex systems with geometric constraints
  • Hyperbolic neural networks
  • Conformal prediction systems

Installation

pip install torch numpy matplotlib
# Download mobius_markov.py from this repo

The Classics Revival Collection

Möbius Markov Chain is part of a larger exploration of foundational algorithms enhanced with modern neural techniques:

  • Evolutionary Turing Machine
  • Hebbian Bloom Filter
  • Hopfield Decision Graph
  • Liquid Bayes Chain
  • Liquid State Space Model
  • Möbius Markov Chain ← You are here
  • Memory Forest

Citation

@misc{mobiusmarkov2025,
  title={Möbius Markov Chain: Non-Euclidean Probabilistic Systems},
  author={Jae Parker 𓅸 1990two},
  year={2025},
  note={Part of The Classics Revival Collection}
}