AI & ML interests

German NLP and beyond

We advance German NLP through transparent, open-source research with three flagship projects:

🐑 LLäMmlein - A family of German-only transformer models (120M, 1B, 7B parameters) trained transparently from scratch, with full documentation of training data and code.
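
As a quick orientation, the models can be loaded with the standard Hugging Face transformers API. This is a minimal sketch; the model ID is illustrative, so substitute the actual repository name from our organization page.

```python
# Minimal sketch: loading a German-only causal LM with Hugging Face transformers.
# The model ID below is illustrative -- replace it with the actual repository
# name from our organization on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LSX-UniWue/LLaMmlein_1B"  # assumed, illustrative ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Die Würzburger Residenz ist"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```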

📊 SuperGLEBer - The first comprehensive German benchmark suite, featuring 29 diverse NLP tasks across domains and providing systematic evaluation of German language models.

🤖 ModernGBERT - Transparent encoder models (138M, 1B parameters) based on the ModernBERT architecture and specifically optimized for German language understanding.
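
A minimal usage sketch for the encoder models, assuming an illustrative Hub ID and a recent transformers release with ModernBERT support; mean pooling is just one simple way to obtain sentence vectors.

```python
# Minimal sketch: extracting sentence embeddings from a German encoder.
# The model ID is illustrative; ModernBERT-style encoders load via the
# standard AutoModel/AutoTokenizer interface in recent transformers releases.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "LSX-UniWue/ModernGBERT_1B"  # assumed, illustrative ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = [
    "Berlin ist die Hauptstadt Deutschlands.",
    "Der Main fließt durch Würzburg.",
]
batch = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (batch, seq_len, dim)

# Mean-pool over non-padding tokens to get one vector per sentence.
mask = batch["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)
```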

Beyond our German NLP ecosystem, we specialize in LLM-Knowledge Graph Integration for text mining applications. Our work combines language models with explicit knowledge representations (see the sketch after this list), developing:

  • Character Analysis: Models like LitBERT for understanding character networks in novels
  • Temporal Text Analysis: Tracking narrative development through relation detection and scene segmentation
  • Sentiment & Engagement Analysis: Measuring emotional dynamics on streaming platforms and social media
  • Knowledge Enrichment: Semantic Web technologies for ontology learning and KG enhancement
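
To make the integration idea concrete, here is a hedged sketch (not our actual pipeline): relations proposed by a language model are materialized as RDF triples with rdflib and then queried with SPARQL. The extraction step is stubbed out, and the namespace, entities, and predicates are placeholders.

```python
# Sketch: turning LM-extracted relations into an RDF knowledge graph.
# extract_triples() is a stub standing in for an LLM-based relation extractor;
# the namespace, entities, and predicates are purely illustrative.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/novel/")

def extract_triples(text: str) -> list[tuple[str, str, str]]:
    """Placeholder for an LLM relation-extraction step."""
    return [
        ("Effi", "interacts_with", "Innstetten"),
        ("Effi", "appears_in", "Kapitel_1"),
    ]

g = Graph()
g.bind("ex", EX)

for subj, pred, obj in extract_triples("example chapter text"):
    g.add((EX[subj], EX[pred], EX[obj]))

# Query the enriched graph: with whom does Effi interact?
results = g.query(
    "SELECT ?other WHERE { ex:Effi ex:interacts_with ?other . }",
    initNs={"ex": EX},
)
for row in results:
    print(row.other)
```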

🌸 We foster reproducible, collaborative research by open-sourcing models, datasets, and evaluation frameworks. Our goal is to establish German as a first-class language in the global AI ecosystem while advancing the intersection of symbolic knowledge and neural language understanding.