Title: Diffusion-Based Generative Modeling of Financial Time Series
Author: Briazkalo, Mykhailo
Type: Master Thesis
Dates: 2025-09-22; 2025-09-22; 2025-09-22; 2025-09-15
URI: https://hdl.handle.net/10012/22497
Language: English (en)
Keywords: diffusion models; generative modeling; financial time series; quantitative finance; deep learning

Abstract: Financial time series exhibit complex stochastic dynamics, such as volatility clustering, heavy tails, and sudden jumps, that are difficult to capture with traditional parametric models. Deep generative models offer a flexible alternative for learning unknown data distributions, but their application to financial data remains limited. In this thesis, we propose a diffusion-based generative framework for modeling financial time series. Building on the Elucidated Diffusion Model (EDM), originally developed for image synthesis, we adapt its architecture to multivariate sequential data and integrate the Ambient Diffusion framework as a variance correction mechanism. We provide a theoretical analysis connecting excess variance in standardized outputs with volatility bias and derive an analytical rule for selecting the ambient noise level. We evaluate the proposed approach using a comprehensive framework that combines statistical similarity, parameter recovery, option pricing, and risk metrics. Across synthetic models (GBM, Heston, Merton) and real-world datasets (SPY, AAPL, NVDA, BTC), our Ambient Diffusion-based method consistently improves distributional alignment, volatility recovery, and option pricing over the EDM baseline, highlighting its potential for quantitative modeling, scenario generation, and risk management.
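The abstract evaluates the generator against synthetic models with known parameters (GBM, Heston, Merton), which makes parameter recovery checkable in closed form. The sketch below is a minimal, hedged illustration of one such benchmark only: it simulates GBM log-return sequences and verifies that the annualized volatility is recovered. It is not taken from the thesis; the function name, parameters, and defaults (gbm_log_returns, mu, sigma, n_paths, n_steps, dt) are illustrative assumptions.

    # Minimal sketch (assumptions, not the thesis code): GBM log-return
    # sequences as a synthetic benchmark with a known ground-truth volatility.
    import numpy as np

    def gbm_log_returns(mu=0.05, sigma=0.2, n_paths=1000, n_steps=252,
                        dt=1 / 252, seed=0):
        """Simulate log-return sequences under geometric Brownian motion.

        Under GBM, log-returns over a step of length dt are i.i.d.
        Normal((mu - 0.5 * sigma**2) * dt, sigma**2 * dt).
        """
        rng = np.random.default_rng(seed)
        drift = (mu - 0.5 * sigma**2) * dt
        vol = sigma * np.sqrt(dt)
        # Shape (n_paths, n_steps): one row per simulated return sequence.
        return drift + vol * rng.standard_normal((n_paths, n_steps))

    # Parameter-recovery check: the sample volatility, annualized, should be
    # close to the ground-truth sigma = 0.2 used to simulate the data.
    r = gbm_log_returns()
    ann_vol = r.std(ddof=1) * np.sqrt(252)
    print(f"estimated annualized volatility ~ {ann_vol:.3f}")

In the thesis's evaluation framework, the same kind of check would presumably be applied to sequences produced by the trained diffusion model rather than to the simulator output directly, comparing recovered parameters against the known generating values.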