DREAM-NN
Continuous-time. Surprise-driven. Adaptive.
DREAM-NN is an open-source PyTorch library implementing a new family of networks inspired by how biological brains learn — through adaptation, prediction error, and dynamic time constants.
Unlike traditional RNNs (LSTM, GRU) with fixed dynamics, DREAM cells continuously adapt during both training and inference — reacting faster to surprising inputs and stabilizing around familiar patterns.
Core Mechanisms
Surprise-Driven Plasticity
Synaptic updates are modulated by prediction error. High surprise triggers rapid learning; expected patterns stabilize into memory. Mimics biological neuromodulation.
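As a minimal sketch of the idea (the names here are illustrative, not DREAM-NN's actual API), surprise-modulated plasticity can be as simple as scaling a learning rate by the current prediction error:

```python
import numpy as np

def surprise_gate(pred, target, base_lr=0.01, gain=5.0):
    """Scale the learning rate by prediction error ("surprise").

    High surprise -> larger updates; near-zero surprise -> the
    effective rate falls back to base_lr, stabilizing familiar patterns.
    (Hypothetical helper, not part of DREAM-NN's API.)
    """
    surprise = np.abs(pred - target).mean()
    return base_lr * (1.0 + gain * surprise)

# One surprise-modulated SGD step on a linear predictor w·x.
rng = np.random.default_rng(0)
w = rng.normal(size=3)
x = rng.normal(size=3)
target = 1.0
pred = w @ x
lr = surprise_gate(pred, target)
w = w - lr * (pred - target) * x   # gradient of 0.5 * (pred - target)**2
```

A biological neuromodulator like dopamine plays a similar gating role: the same synaptic event produces a large or small weight change depending on how unexpected the outcome was.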
Liquid Time-Constants
Each neuron's integration speed adapts dynamically. The network accelerates to capture novelty and slows to preserve long-term dependencies — all without retraining.
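One way to realize this coupling (a sketch under assumed constants, not DREAM-NN's actual formula) is to map surprise monotonically onto a time constant between two bounds:

```python
import math

def adaptive_tau(surprise, tau_min=0.1, tau_max=10.0, k=2.0):
    """Map a non-negative surprise signal to an integration time constant.

    High surprise -> tau near tau_min (fast integration of novelty);
    low surprise  -> tau near tau_max (slow, memory-preserving dynamics).
    (Illustrative mapping; constants and shape are assumptions.)
    """
    gate = 1.0 / (1.0 + math.exp(-k * surprise))     # in (0.5, 1.0) for surprise >= 0
    return tau_max - (tau_max - tau_min) * (2.0 * gate - 1.0)
```

Because tau is computed from the input stream at every step, the speed change requires no retraining: the same weights produce fast or slow dynamics depending on context.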
Hebbian Fast Weights
Low-rank weight decomposition (U × Vᵀ) enables high-speed, in-context adaptation during inference. Only U updates — V stays fixed. Efficient meta-learning in practice.
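A hedged sketch of how such a factorized fast-weight layer can work (class name, update rule, and hyperparameters are all illustrative, not DREAM-NN's API): the Hebbian outer-product update ΔW = η·h·xᵀ is absorbed into U by projecting x through the fixed factor V, so only the rank-r factor U ever changes at inference time.

```python
import torch

class LowRankFastWeights(torch.nn.Module):
    """Fast weights W = U @ V.T; only U receives Hebbian updates.

    Hypothetical sketch, not DREAM-NN's actual implementation.
    """
    def __init__(self, dim, rank, eta=0.5, decay=0.9):
        super().__init__()
        self.register_buffer("U", torch.zeros(dim, rank))   # fast, updated online
        self.V = torch.nn.Parameter(torch.randn(dim, rank) / rank ** 0.5)  # slow, fixed
        self.eta, self.decay = eta, decay

    def forward(self, x):                                   # x: (dim,)
        h = torch.tanh(x + self.U @ (self.V.T @ x))         # fast-weight contribution
        # Hebbian rule projected into the rank-r space spanned by V:
        #   ΔW = eta * h xᵀ   ->   ΔU = eta * outer(h, V.T x)
        with torch.no_grad():
            self.U.mul_(self.decay).add_(self.eta * torch.outer(h, self.V.T @ x))
        return h
```

Updating only U keeps the in-context adaptation O(dim × rank) per step instead of O(dim²), which is where the efficiency claim comes from.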
Sleep Consolidation
During low-surprise periods, memory is stabilized. The model consolidates frequently-used patterns — analogous to memory consolidation during biological sleep.
Mathematical Core
Continuous-Time Dynamics
State evolution follows a differential equation, discretized via Euler integration. The time constant τ is not fixed — it adapts based on surprise, controlling how fast the network integrates new information.
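Concretely, for a leaky-integrator state h with dynamics τ·dh/dt = −h + f(w·x), one Euler step looks like this (a scalar sketch under assumed f = tanh; in DREAM, τ would be the surprise-adaptive quantity rather than a constant):

```python
import math

def euler_step(h, x, w, dt=0.1, tau=1.0):
    """One Euler step of the leaky-integrator ODE
        tau * dh/dt = -h + tanh(w * x)
    Scalar sketch; tau is fixed here but surprise-adaptive in DREAM.
    """
    dh = (-h + math.tanh(w * x)) / tau
    return h + dt * dh

# Iterating drives h toward the fixed point tanh(w * x).
h = 0.0
for _ in range(200):
    h = euler_step(h, x=1.0, w=1.0)
```

A small τ makes each step take a large fraction of the remaining gap (fast integration of novelty); a large τ shrinks the step, preserving the existing state over long horizons.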
Quick Start
Drop-in PyTorch Module
DREAM is designed to be used just like nn.LSTM or nn.GRU. Swap it in to gain continuous adaptive dynamics with no extra training overhead.
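Assuming the module mirrors nn.LSTM's constructor and return convention (an assumption to verify against the actual API docs), the swap is a one-line change:

```python
import torch

# Hypothetical drop-in swap. The commented lines assume a dream_nn.DREAM
# module mirroring nn.LSTM's (input_size, hidden_size, batch_first)
# signature and (output, state) return; check the real API before use.
# from dream_nn import DREAM

rnn = torch.nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
# rnn = DREAM(input_size=16, hidden_size=32, batch_first=True)

x = torch.randn(8, 50, 16)       # (batch, time, features)
out, state = rnn(x)              # out: (batch, time, hidden)
```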
Explore the Architecture
Full API reference, benchmarks, and configuration guides are available on the documentation portal.