A geometric adapter for transformer language models. Fiber bundle topology, hyperbolic latent spaces, and information-geometric optimization — injected at a single layer.
Hidden states are projected into a 64-dimensional Poincaré ball with constant negative curvature. Hierarchical concepts get exponentially more room at the boundary — a natural fit for language semantics.
Each base point carries a fiber: K=16 categorical sections with P=8 mixture components. Parallel transport moves fiber state between tokens while preserving geometric consistency.
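A minimal sketch of the fiber parameterization, assuming each base point stores K × P logits normalized per section into categorical distributions. The names and the softmax layout are illustrative; the actual parallel-transport rule is not reproduced here.

```python
import numpy as np

K, P = 16, 8  # sections x mixture components (from the text)

def softmax(x: np.ndarray, axis: int = -1) -> np.ndarray:
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
logits = rng.normal(size=(K, P))  # one fiber over a single base point
sections = softmax(logits)        # K categorical distributions over P components

# Each section is a proper probability distribution.
assert np.allclose(sections.sum(axis=-1), 1.0)
```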
Fiber state evolves via symplectic integration with a Lorentz-factor speed limiter. Energy conservation prevents gradient explosion; the manifold stays stable during training.
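The update can be illustrated with a leapfrog (kick-drift-kick) step whose drift velocity is rescaled by a Lorentz factor, capping speed below `c_max` while remaining symplectic for the corresponding relativistic kinetic energy. Everything here (the harmonic potential, step size, and names) is a sketch, not the adapter's implementation.

```python
import numpy as np

def leapfrog(q, p, grad_V, dt=0.01, c_max=1.0):
    """One symplectic leapfrog step with a Lorentz-factor speed limiter.

    grad_V: gradient of the potential; c_max: maximum allowed speed.
    The rescaling p / sqrt(1 + |p|^2 / c_max^2) bounds the position update,
    so the state cannot run away even for large momenta.
    """
    p = p - 0.5 * dt * grad_V(q)
    v = p / np.sqrt(1.0 + np.dot(p, p) / c_max**2)  # |v| < c_max always
    q = q + dt * v
    p = p - 0.5 * dt * grad_V(q)
    return q, p

# Harmonic potential V(q) = |q|^2 / 2: the trajectory stays bounded and the
# (relativistic) energy is approximately conserved over many steps.
grad_V = lambda q: q
q, p = np.array([1.0, 0.0]), np.array([0.0, 1.0])
for _ in range(1000):
    q, p = leapfrog(q, p, grad_V)
assert np.isfinite(q).all()
```

Because symplectic integrators nearly conserve a shadow Hamiltonian, energy drift stays bounded instead of accumulating, which is the stability property the adapter relies on.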
A log-determinant Laplacian estimator measures curvature of the learned metric tensor. Curvature loss regularizes toward κ = -1, ensuring genuine hyperbolic geometry.
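The hyperbolic geometry can be sanity-checked numerically: at curvature κ = -1, a geodesic circle of radius r has circumference 2π·sinh(r), strictly larger than the Euclidean 2πr. The sketch below uses this circle-circumference test (a simpler diagnostic than the log-determinant Laplacian estimator, which is not reproduced here) on the c = 1 Poincaré ball.

```python
import numpy as np

def poincare_dist(x, y, c=1.0):
    """Geodesic distance in the Poincare ball of curvature -c."""
    diff2 = np.sum((x - y) ** 2, axis=-1)
    denom = (1 - c * np.sum(x * x, axis=-1)) * (1 - c * np.sum(y * y, axis=-1))
    return (1.0 / np.sqrt(c)) * np.arccosh(1 + 2 * c * diff2 / denom)

# Sample a geodesic circle of radius r = 1 around the origin: a point at
# geodesic radius r sits at Euclidean radius tanh(r / 2) when c = 1.
n, r = 2000, 1.0
rho = np.tanh(r / 2)
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
pts = rho * np.stack([np.cos(theta), np.sin(theta)], axis=-1)

# Perimeter of the inscribed geodesic polygon approximates the circumference.
C = poincare_dist(pts, np.roll(pts, -1, axis=0)).sum()
assert abs(C - 2 * np.pi * np.sinh(r)) < 1e-2  # matches 2*pi*sinh(r)
```

A curvature loss in the spirit of the text would then penalize the squared deviation of the estimated κ from -1.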
At inference time, a GENERIC-compliant homeostatic controller reads curvature K and entropy S to modulate temperature and sampling — no retraining required.
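A hypothetical feedback rule along these lines; the gains, references, and clamps below are illustrative and do not reproduce the GENERIC formalism. Entropy below target raises temperature, and curvature drifting from the reference tightens nucleus sampling. The entropy target 1.39 is taken from the metrics table.

```python
def homeostatic_controls(K: float, S: float,
                         K_ref: float = -1.0, S_ref: float = 1.39,
                         base_temp: float = 0.8, base_top_p: float = 0.95):
    """Map measured curvature K and entropy S to sampling parameters.

    Illustrative proportional rule: an entropy shortfall scales temperature
    up; curvature far from K_ref subtracts up to 0.1 from top_p. Outputs are
    clamped to sane sampling ranges.
    """
    temp = base_temp * (1.0 + 0.5 * (S_ref - S) / S_ref)
    top_p = base_top_p - 0.1 * min(abs(K - K_ref) / abs(K_ref), 1.0)
    return max(temp, 0.1), min(max(top_p, 0.5), 1.0)

# With the measured values from the tables (K = -5.63, S = 0.95), the low
# entropy pushes temperature above its base and top_p tightens.
temp, top_p = homeostatic_controls(K=-5.63, S=0.95)
```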
Biologically inspired memory using FitzHugh-Nagumo excitable dynamics, FFT spectral forgetting, and Gabor retroactive interference. Details decay before gist — like human memory.
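The excitable dynamics can be sketched with a forward-Euler integration of the FitzHugh-Nagumo equations (standard textbook parameters; the coupling to memory decay is not shown):

```python
import numpy as np

def fhn_step(v, w, I=0.5, a=0.7, b=0.8, eps=0.08, dt=0.05):
    """One Euler step of the FitzHugh-Nagumo excitable system.

    v: fast (membrane-like) variable; w: slow recovery variable;
    I: external drive. For this drive the system settles onto a limit
    cycle of repeated excitation spikes.
    """
    dv = v - v**3 / 3 - w + I
    dw = eps * (v + a - b * w)
    return v + dt * dv, w + dt * dw

v, w = -1.0, 1.0
trace = []
for _ in range(4000):
    v, w = fhn_step(v, w)
    trace.append(v)
assert np.isfinite(trace).all()
```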
The geometric constraints produce measurable, non-trivial structure — not decorative.
| Metric | Value |
|---|---|
| Curvature K | -5.63 |
| Entropy S | 0.95 (target: 1.39) |
| Jensen-Shannon Div. | 0.424 |
| Parallel Transport | 0.041 (near-zero holonomy) |
| Faithfulness Tests | 6/6 PASS |
The adapter preserves base Qwen 2.5-7B capabilities with minimal degradation.
| Benchmark | Score |
|---|---|
| ARC-Challenge | 54.86% (matches baseline) |
| TruthfulQA (MC2) | 64.78% |
| Winogrande | 71.03% |
| GSM8K | 75.51% |
```
Qwen 2.5-7B Layers 0-11 ──▶ Layer 12 + IGBundle Adapter ──▶ Layers 13-27 ──▶ LM Head
                                       │                          ▲
                                       ▼                          │
                   ┌────────────────────────────────────────┐     │
                   │ Input Proj     H → 256                 │     │
                   │ Poincaré Ball  H^64, κ=-1              │     │
                   │ Fiber Sect.    K=16 × P=8              │     │
                   │ Hamiltonian    Symplectic integration  │     │
                   │ Output Proj    256 → H                 │     │
                   └───────────────────┬────────────────────┘     │
                                       │ h + α·δ (clamped ≤10% of ‖h‖)
                                       └──────────────────────────┘

GSP Controller reads K, S ──▶ modulates temp, top_p
```
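The diagram's residual injection, h + α·δ with the update clamped to at most 10% of ‖h‖, can be sketched as follows (the function name and defaults are illustrative):

```python
import numpy as np

def inject(h: np.ndarray, delta: np.ndarray,
           alpha: float = 1.0, max_ratio: float = 0.10) -> np.ndarray:
    """Add alpha * delta to the hidden state h, rescaling the update so its
    norm never exceeds max_ratio of ||h||. This keeps the base model's
    representation dominant no matter how large the adapter output is."""
    upd = alpha * delta
    cap = max_ratio * np.linalg.norm(h)
    norm = np.linalg.norm(upd)
    if norm > cap:
        upd = upd * (cap / norm)
    return h + upd

# Even a huge adapter delta perturbs h by at most 10% of its norm.
h = np.ones(8)
out = inject(h, np.full(8, 5.0))
assert np.linalg.norm(out - h) <= 0.10 * np.linalg.norm(h) + 1e-12
```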
IGBundle is built by a human-AI team: one researcher orchestrating two AI agents in a continuous multi-agent development loop.