Statistical Inference for Stochastic Volatility via Backward Filtering Forward Guiding
In this talk, we will discuss the implementation of the Backward Filtering Forward Guiding (BFFG) algorithm, introduced in Schauer et al. (2017) and Mider et al. (2021), to infer latent states in a discrete-time stochastic volatility model. Although Doob's transform for this model is not available in closed form, efficient inference can be performed via BFFG by exploiting the tractability of an approximate (namely, linear and Gaussian) framework. In particular, a backward recursion first yields an approximate transform, denoted by g, at each discrete time step (backward filtering); these g-maps are then used to change the measure of the transitions of the generative model so that the observations are taken into account (forward guiding). Moreover, the likelihood ratio between the transition density of the true conditioned process and that of the guided process can be computed, which yields an importance sampling estimator for the state process. Even this relatively simple stochastic volatility application highlights how the performance of the estimator depends crucially on the choice of g-maps, and we will conclude by discussing regularization procedures that guarantee finite estimator variance.
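To make the backward filtering / forward guiding steps concrete, the following is a minimal sketch, not the talk's actual implementation. It assumes a standard AR(1) log-volatility model, a Gaussian linearization of the observation equation via the usual log-chi-squared approximation of log(y_t^2), and arbitrary illustrative parameter values; all of these choices are assumptions made here, not taken from the abstract.

```python
# Minimal illustrative sketch of BFFG for a discrete-time stochastic volatility model.
# Assumed model:  x_{t+1} = mu + phi (x_t - mu) + sigma * eta_t,   y_t = exp(x_t / 2) * eps_t,
# with eta_t, eps_t ~ N(0, 1). Model, linearization and parameters are assumptions for this example.
import numpy as np

rng = np.random.default_rng(0)
mu, phi, sigma = -1.0, 0.95, 0.3
T = 200

# simulate synthetic data from the assumed model
x = np.empty(T)
x[0] = mu + sigma / np.sqrt(1 - phi**2) * rng.standard_normal()
for t in range(T - 1):
    x[t + 1] = mu + phi * (x[t] - mu) + sigma * rng.standard_normal()
y = np.exp(x / 2) * rng.standard_normal(T)

# linear-Gaussian approximation: z_t = log(y_t^2) ~= x_t + noise, noise ~ N(m_e, v_e)
z = np.log(y**2 + 1e-12)
m_e, v_e = -1.2704, np.pi**2 / 2          # moments of log(chi^2_1)

# --- backward filtering: g_t(x) proportional to a Gaussian potential N(x; nu_t, P_t) ---
nu, P = np.empty(T), np.empty(T)
a, b = mu * (1 - phi), phi                 # transition: x' | x ~ N(a + b x, sigma^2)
nu[T - 1], P[T - 1] = z[T - 1] - m_e, v_e  # last (approximate) observation potential
for t in range(T - 2, -1, -1):
    nu_pred = (nu[t + 1] - a) / b          # pull g_{t+1} back through the linear Gaussian transition
    P_pred = (P[t + 1] + sigma**2) / b**2
    prec = 1 / P_pred + 1 / v_e            # fuse with the approximate observation potential at t
    P[t] = 1 / prec
    nu[t] = P[t] * (nu_pred / P_pred + (z[t] - m_e) / v_e)

# --- forward guiding: sample from p(x' | x) g_{t+1}(x') and accumulate importance weights ---
def log_obs_true(yt, xt):                  # exact observation log-density N(y; 0, exp(x))
    return -0.5 * (np.log(2 * np.pi) + xt + yt**2 * np.exp(-xt))

def log_obs_approx(zt, xt):                # linearized observation log-density N(z; x + m_e, v_e)
    return -0.5 * (np.log(2 * np.pi * v_e) + (zt - xt - m_e)**2 / v_e)

x0_var = sigma**2 / (1 - phi**2)
prec0 = 1 / x0_var + 1 / P[0]              # guided initial law: prior times g_0
xg = np.empty(T)
xg[0] = (mu / x0_var + nu[0] / P[0]) / prec0 + rng.standard_normal() / np.sqrt(prec0)
log_w = log_obs_true(y[0], xg[0]) - log_obs_approx(z[0], xg[0])
for t in range(T - 1):
    m_tr = a + b * xg[t]                   # true transition mean
    prec = 1 / sigma**2 + 1 / P[t + 1]     # guided transition: transition times g_{t+1}
    m_g = (m_tr / sigma**2 + nu[t + 1] / P[t + 1]) / prec
    xg[t + 1] = m_g + rng.standard_normal() / np.sqrt(prec)
    # latent dynamics are exactly linear Gaussian, so the weight reduces to the ratio of
    # true to approximate observation densities (x-independent Jacobian constants omitted)
    log_w += log_obs_true(y[t + 1], xg[t + 1]) - log_obs_approx(z[t + 1], xg[t + 1])

print("log importance weight of the guided path:", round(log_w, 2))
```

In practice one would draw many guided paths and self-normalize the weights to estimate smoothing expectations; the spread of the weights across paths is exactly where the dependence on the chosen g-maps, and the need for regularization, becomes visible.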
Area: CS23 - Stochastic processes and their applications (Katia Colaneri)
Keywords: Backward Filtering Forward Guiding, Stochastic Volatility, Statistical Inference for Stochastic Processes