Browsing by Author "Wu, Mo Han"
Probabilistic Methods of Parameter Inference for Differential Equations
(University of Waterloo, 2025-08-26) Wu, Mo Han

Parameter estimation for differential equations is a fundamental problem in many scientific fields. This thesis is concerned with developing statistically and computationally efficient methods of parameter inference for ordinary differential equations (ODEs) and stochastic differential equations (SDEs). For ODEs, this often involves numerically solving the differential equation at each likelihood evaluation, a task traditionally handled with deterministic numerical solvers. Here we consider likelihood approximations based on probabilistic numerical solvers, which have been shown to produce more reliable parameter estimates by better accounting for numerical uncertainty. In contrast, parameter inference for SDEs requires integrating over high-dimensional latent variables in state-space models, which is computationally expensive. We propose a variational inference framework that significantly reduces this computational burden by converting the integration problem into an optimization problem.

Chapter 2 is a review of several existing probabilistic ODE solvers and associated parameter inference methods. In particular, we detail the commonly used paradigm of approximating the ODE solution with a nonlinear state-space model, and then linearizing it to easily perform the relevant computations via the Kalman filtering and smoothing recursions. In the data-free setting, we extend convergence results previously established only for the forward pass of the Kalman algorithm to the backward pass as well. This is a key result for establishing the convergence of several associated parameter learning methods. We provide empirical evidence that supports our theory and demonstrate that the backward-pass estimator is more accurate than using a forward pass alone. We also propose a novel variation of a probabilistic method for parameter inference.
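The Kalman filtering (forward) and smoothing (backward) recursions mentioned above can be sketched for a generic linear-Gaussian state-space model. This is a minimal textbook Rauch-Tung-Striebel implementation, not the thesis's probabilistic ODE solver; the matrices `A`, `Q`, `H`, `R` and all function names are illustrative assumptions.

```python
import numpy as np

def kalman_filter(y, A, Q, H, R, m0, P0):
    """Forward pass for the linear-Gaussian state-space model
        x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)
        y_t = H x_t + v_t,      v_t ~ N(0, R).
    Returns filtered means/covariances and one-step predictive moments."""
    T, d = len(y), m0.shape[0]
    mf = np.zeros((T, d)); Pf = np.zeros((T, d, d))   # filtered moments
    mp = np.zeros((T, d)); Pp = np.zeros((T, d, d))   # predicted moments
    m, P = m0, P0
    for t in range(T):
        # Predict: propagate the previous posterior through the dynamics.
        m_pred = A @ m
        P_pred = A @ P @ A.T + Q
        # Update: condition on the observation y[t] via the Kalman gain.
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        m = m_pred + K @ (y[t] - H @ m_pred)
        P = P_pred - K @ S @ K.T
        mp[t], Pp[t] = m_pred, P_pred
        mf[t], Pf[t] = m, P
    return mf, Pf, mp, Pp

def rts_smoother(mf, Pf, mp, Pp, A):
    """Backward pass (Rauch-Tung-Striebel): each smoothed estimate
    conditions on the full data record, not just the past."""
    T = mf.shape[0]
    ms, Ps = mf.copy(), Pf.copy()   # last smoothed state = last filtered
    for t in range(T - 2, -1, -1):
        G = Pf[t] @ A.T @ np.linalg.inv(Pp[t + 1])    # smoother gain
        ms[t] = mf[t] + G @ (ms[t + 1] - A @ mf[t])
        Ps[t] = Pf[t] + G @ (Ps[t + 1] - Pp[t + 1]) @ G.T
    return ms, Ps
```

Because the smoother conditions each state on the entire observation record, the smoothed covariances are never larger than the filtered ones, which is consistent with the abstract's claim that the backward-pass estimator improves on a forward pass alone.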
In Chapter 3, we present a novel probabilistic approximation to the ODE likelihood that reduces the parameter sensitivity inherent in the true likelihood by learning directly from noisy observations. Leveraging the efficient Kalman filter algorithm, our method scales linearly in both the number of ODE variables and the number of time discretization points. Furthermore, it is applicable to ODE problems with partially unobserved components and arbitrary measurement noise. Several numerical experiments demonstrate that our method produces more reliable estimates than other probabilistic methods and, in extremely sensitive problems, exhibits greater robustness than the exact ODE likelihood.

In Chapter 4, we introduce a scalable stochastic variational inference framework for estimating parameters in SDE models. Our method effectively captures the forward-backward information propagation in state-space models by using a recurrent neural network (RNN) to estimate the quantities needed for the forward-backward recursions. The procedure scales linearly with the number of SDE discretization steps. Experimental results show that it produces more reliable parameter posteriors than a number of competing variational and non-variational methods, particularly in high-dimensional random-effects models.
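The core idea behind Chapter 4's framework, replacing a latent-variable integral with an optimization over a variational family, can be illustrated on a toy conjugate model. This sketch shows only the generic reparameterized-ELBO construction; the thesis's actual method (an RNN-parameterized variational family over SDE discretization states) is far richer, and every name and model choice here is an illustrative assumption.

```python
import numpy as np

def elbo_estimate(y, mu, log_sigma, n_samples=1000, seed=0):
    """Monte Carlo ELBO for the toy model
        prior:       x ~ N(0, 1)
        likelihood:  y | x ~ N(x, 1)
    with variational family q(x) = N(mu, sigma^2).
    The reparameterization x = mu + sigma * eps, eps ~ N(0, 1), makes the
    estimate differentiable in (mu, log_sigma), turning the intractable
    marginal-likelihood integral into an optimization problem."""
    rng = np.random.default_rng(seed)
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    x = mu + sigma * eps                                  # reparameterized draws
    c = 0.5 * np.log(2 * np.pi)
    log_lik = -0.5 * (y - x) ** 2 - c                     # log p(y | x)
    log_prior = -0.5 * x ** 2 - c                         # log p(x)
    log_q = -0.5 * eps ** 2 - c - log_sigma               # log q(x)
    # ELBO = E_q[log p(y, x) - log q(x)] <= log p(y)
    return np.mean(log_lik + log_prior - log_q)
```

In this conjugate toy the exact posterior is N(y/2, 1/2), so at the optimal variational parameters the ELBO equals log p(y) exactly, illustrating why maximizing the ELBO over q recovers the marginal likelihood that would otherwise require integration.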