Papers
2025
- Adaptive Quantile Guidance in Diffusion Models: Multi-Dataset Learning for Pandemic Time Series (under peer review). J. Odum*† and J. Miller*. 2025.
Probabilistic forecasting of epidemiological time series remains challenging due to non-stationary disease dynamics, sparse observations, and heterogeneous data regimes. We present Adaptive Quantile Diffusion (AQDiff), a novel diffusion-based framework that addresses these challenges through two key innovations: (1) multi-dataset pre-training of a unified time series diffusion model across heterogeneous epidemiological domains, and (2) adaptive quantile guidance that dynamically adjusts prediction intervals based on local residual statistics. AQDiff introduces a feedback loop where recent forecast errors inform quantile adjustments through an exponentially weighted buffer, enabling real-time adaptation to volatility shifts. Pre-training on influenza-like illness (ILI), COVID-19, and respiratory syncytial virus (RSV) data from the U.S. Centers for Disease Control and Prevention (CDC) allows cross-domain knowledge transfer. Experiments demonstrate that AQDiff reduces mean absolute error (MAE) by as much as 73.3% compared to state-of-the-art baselines (CSDI, PatchTST) across three pandemic forecasting tasks. Our analysis reveals that adaptive quantile guidance provides statistically significant improvements during outbreak surges, while pre-training enhances robustness to limited training data. The framework maintains computational efficiency, adding less than 10% overhead compared to fixed-quantile diffusion models.
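The abstract describes the adaptive quantile guidance only at a high level: a feedback loop in which recent forecast errors, held in an exponentially weighted buffer, widen or narrow the prediction intervals. The sketch below is one possible reading of that loop, not the authors' implementation; the class name `AdaptiveQuantileBuffer` and the parameters `decay` and `sensitivity` are illustrative assumptions.

```python
import numpy as np

class AdaptiveQuantileBuffer:
    """Illustrative sketch: adjust prediction-interval quantile levels from an
    exponentially weighted buffer of recent absolute forecast errors.
    Hypothetical names and constants; not the paper's code."""

    def __init__(self, base_lo=0.05, base_hi=0.95, decay=0.9, sensitivity=0.5):
        self.base_lo = base_lo          # nominal lower quantile level
        self.base_hi = base_hi          # nominal upper quantile level
        self.decay = decay              # exponential weighting of past errors
        self.sensitivity = sensitivity  # how strongly local volatility moves the interval
        self.ewma_abs_err = None        # fast-moving (local) mean absolute error
        self.ref_scale = None           # slow-moving reference error scale

    def update(self, y_true, y_pred):
        """Fold the latest forecast error into the exponentially weighted buffer."""
        err = abs(float(y_true) - float(y_pred))
        if self.ewma_abs_err is None:
            self.ewma_abs_err = err
            self.ref_scale = err + 1e-8
        else:
            self.ewma_abs_err = self.decay * self.ewma_abs_err + (1 - self.decay) * err
            # slower reference scale, so the ratio below reflects *local* volatility shifts
            self.ref_scale = 0.99 * self.ref_scale + 0.01 * err

    def quantile_levels(self):
        """Widen the interval when recent errors run hot, narrow it when they run cold."""
        if self.ewma_abs_err is None:
            return self.base_lo, self.base_hi
        ratio = self.ewma_abs_err / (self.ref_scale + 1e-8)
        shift = self.sensitivity * (ratio - 1.0)   # positive when volatility is rising
        lo = float(np.clip(self.base_lo - 0.04 * shift, 0.01, 0.20))
        hi = float(np.clip(self.base_hi + 0.04 * shift, 0.80, 0.99))
        return lo, hi

# Usage sketch: after each new observation arrives, record the error and
# query the adjusted quantile levels for the next forecast step.
buf = AdaptiveQuantileBuffer()
buf.update(y_true=120.0, y_pred=95.0)
lo, hi = buf.quantile_levels()
```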
- Improving Channel-Independent Transformer via Statistical Preprocessing for Time Series Forecasting (under peer review). S. Rana* and J. Odum*. 2025.
Effective data preprocessing is crucial in time series forecasting, as it enhances model performance by addressing data inconsistencies and scaling issues. Traditional forecasting models often rely on simple scaling techniques such as StandardScaler to normalize data, but these methods are sensitive to outliers and implicitly assume a normal distribution, which can limit their effectiveness on real-world data. In this work, we propose an approach that integrates multiple statistical preprocessing techniques, including logarithmic, Box-Cox, Yeo-Johnson, and square-root transforms as well as differencing, directly into the state-of-the-art PatchTST model (channel-independent patch time series Transformer) for time series forecasting. To our knowledge, this is the first time such a comprehensive set of preprocessing techniques has been integrated into a Transformer-based forecaster. By addressing skewness, non-stationarity, and heteroscedasticity, our enhanced PatchTST achieves notable improvements in forecast accuracy. Experiments show substantial error reductions of 38% for the single target variable and 24% across all variables, underscoring the potential of statistical preprocessing in Transformers.
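The abstract names the transforms but not how they are wired into PatchTST. As a rough illustration, the snippet below applies each named transform to a univariate series with SciPy and NumPy, as a standalone step before the data would be handed to the forecaster; the helper name `preprocess_series` and its return format are assumptions, not the paper's code.

```python
import numpy as np
from scipy import stats

def preprocess_series(x, method="yeo-johnson"):
    """Apply one of the statistical transforms mentioned in the abstract to a 1-D series.
    Returns the transformed series plus the parameters needed to invert it later.
    (Illustrative sketch; the paper integrates these steps into PatchTST itself.)"""
    x = np.asarray(x, dtype=float)
    if method == "log":
        return np.log1p(x), {"method": "log"}                 # requires x > -1
    if method == "sqrt":
        return np.sqrt(x), {"method": "sqrt"}                 # requires x >= 0
    if method == "box-cox":
        z, lam = stats.boxcox(x)                              # requires strictly positive x
        return z, {"method": "box-cox", "lambda": lam}
    if method == "yeo-johnson":
        z, lam = stats.yeojohnson(x)                          # handles zero and negative values
        return z, {"method": "yeo-johnson", "lambda": lam}
    if method == "diff":
        return np.diff(x), {"method": "diff", "first": x[0]}  # first-order differencing
    raise ValueError(f"unknown method: {method}")

# Example: stabilise the variance of a skewed count-like series before modeling.
raw = np.random.default_rng(0).poisson(lam=20, size=200).astype(float) + 1.0
z, info = preprocess_series(raw, method="box-cox")
```

Box-Cox and Yeo-Johnson estimate their power parameter from the data, so the fitted lambda (and the first value for differencing) must be stored to invert the transform on the model's forecasts.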