Abstract
This project explores deep learning approaches for modelling continuous-time dynamical systems, with a focus on handling missing data and quantifying uncertainty. The models developed in this project use recurrent neural networks to learn temporal dependencies in time-series data, providing a flexible alternative to parametric state-space models such as those implemented in the ctsem R package. By integrating methods for uncertainty estimation and interpolation, the models allow for better comparison with structured approaches in psychological and biological research.
A key advantage of these models is their ability to adaptively infer system dynamics without strong parametric assumptions. Compared to traditional latent-variable models, deep learning approaches can accommodate complex, nonlinear dependencies while still enabling meaningful interpretation through tools such as impulse response functions and conditional predictions. Interpolation techniques further enhance prediction smoothness and enable retrospective analysis of missing time points, making these models applicable to real-world scenarios where measurement irregularities are common.
Beyond predictive accuracy, these models contribute to the interpretability of learned representations by providing uncertainty estimates, confidence intervals, and dynamic response characteristics. Their ability to approximate probability distributions over trajectories allows for better risk assessment and decision-making in scientific forecasting applications. By bridging deep learning with established continuous-time modelling frameworks, this work highlights the potential for hybrid approaches that integrate the strengths of both paradigms.
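As a rough illustration of how a recurrent network can handle irregular sampling and missing observations as described above, the sketch below encodes each time point as [zero-imputed value, missingness mask, time gap] before updating the hidden state. This is a minimal hypothetical example with random weights (`run_rnn`, `Wh`, `Wi`, `b` are all assumptions for illustration), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_rnn(times, values, mask, Wh, Wi, b):
    """Roll a tanh RNN over an irregularly sampled series.

    Each step's input is [observed value (zero-imputed when missing),
    mask bit, time gap since previous measurement], so the network can
    condition on both missingness and irregular sampling intervals.
    """
    h = np.zeros(Wh.shape[0])
    prev_t = times[0]
    for t, v, m in zip(times, values, mask):
        inp = np.array([v if m else 0.0, float(m), t - prev_t])
        h = np.tanh(Wh @ h + Wi @ inp + b)
        prev_t = t
    return h

hidden = 8
Wh = rng.normal(scale=0.3, size=(hidden, hidden))  # recurrent weights
Wi = rng.normal(scale=0.3, size=(hidden, 3))       # input weights
b = np.zeros(hidden)

times = [0.0, 0.7, 1.5, 3.2]    # irregular measurement times
values = [0.2, 0.0, -0.4, 0.1]  # second observation is missing
mask = [1, 0, 1, 1]
h = run_rnn(times, values, mask, Wh, Wi, b)
print(h.shape)  # (8,)
```

In practice the hidden state would feed a prediction head, and uncertainty could be estimated by, e.g., ensembling or predicting distribution parameters; the point here is only the input encoding for gaps and missingness.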
| Poster | Deep Learning-Based Approaches for Continuous-Time Dynamical Systems |
|---|---|
| Author | Charles Driver, Manaswi Mondol |
| Affiliation | University of Zurich |
| Keywords | deep learning, continuous-time dynamical systems |