Local divergences with time series applications
| Authors | |
|---|---|
| Supervisors | |
| Cosupervisors | |
| Award date | 06-03-2026 |
| ISBN | |
| Series | Tinbergen Institute Research Series, 891 |
| Number of pages | 307 |
| Organisations | |
| Abstract | This thesis develops a general framework for the localized evaluation of probabilistic forecasts and for constructing observation-driven time series models. Chapter 2 shows that applying a strictly proper scoring rule to a censored distribution yields a strictly locally proper scoring rule whose score divergence is a localized version of the original divergence. The associated censored likelihood ratio test is uniformly most powerful, and simulation and empirical evidence indicate strong power in Diebold-Mariano comparisons. Chapter 3 studies score-driven filters through an expected Kullback-Leibler criterion with a two-sample interpretation. It proves that an expected reduction in this criterion occurs if and only if the expected parameter update aligns with the expected score, which uniquely characterizes score-driven updates, including scaled and clipped variants, and yields moment-based upper bounds on the learning rate. Chapter 4 generalizes beyond the score by introducing Proper and Robust Autoregressive Derivative Adaptive (PRADA) models, which guarantee reductions in expected local divergences induced by strictly locally proper scoring rules or strictly consistent scoring functions, even under misspecification, and which serve as an online analog of M- and Z-estimation. |
| Document type | PhD thesis |
| Language | English |
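To make the Chapter 2 construction summarized in the abstract more concrete, the sketch below shows one standard way a censoring device localizes the logarithmic score: a forecast density f is censored outside a region of interest A, the log score of the censored object splits into an in-region density term and an out-of-region mass term, and the induced score divergence is a localized Kullback-Leibler divergence. The region A and all notation here are illustrative assumptions, not taken from the thesis itself.

```latex
% Illustrative sketch only: censoring a forecast density f outside a region A,
% scoring the censored object with the log score, and the induced localized
% Kullback-Leibler divergence. The region A and the notation are assumptions.
\[
  f^{\flat}(y) \;=\; f(y)\,\mathbf{1}\{y \in A\}
  \;+\; \Big(\textstyle\int_{A^{c}} f(s)\,ds\Big)\,\mathbf{1}\{y \in A^{c}\},
\]
\[
  S(f, y) \;=\; \mathbf{1}\{y \in A\}\,\log f(y)
  \;+\; \mathbf{1}\{y \in A^{c}\}\,\log\!\Big(\textstyle\int_{A^{c}} f(s)\,ds\Big),
\]
\[
  d(f, g) \;=\; \int_{A} g(y)\,\log\frac{g(y)}{f(y)}\,dy
  \;+\; \Big(\textstyle\int_{A^{c}} g(s)\,ds\Big)\,
        \log\frac{\int_{A^{c}} g(s)\,ds}{\int_{A^{c}} f(s)\,ds}.
\]
```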

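The Chapter 3 summary centers on score-driven parameter updates. The Python sketch below is a generic, minimal score-driven (GAS-type) location filter under a Student-t density, intended only to illustrate what it means for the parameter update to move in the direction of the score; the function name, the learning rate `alpha`, the degrees of freedom `nu`, and the unit scale are hypothetical choices for illustration, not the models studied in the thesis.

```python
import numpy as np

def score_driven_location_filter(y, alpha=0.1, nu=5.0, mu0=0.0):
    """Toy score-driven (GAS-type) filter for a time-varying location mu_t
    under a Student-t observation density with unit scale. Purely illustrative:
    alpha, nu, and the initialization are assumptions, not thesis parameters."""
    mu = np.empty(len(y) + 1)
    mu[0] = mu0
    for t, y_t in enumerate(y):
        e = y_t - mu[t]
        # Score of the Student-t log-density with respect to mu_t.
        # It is bounded in e, which damps the influence of outlying observations.
        score = (nu + 1.0) * e / (nu + e ** 2)
        # Observation-driven update: move the parameter in the score direction.
        mu[t + 1] = mu[t] + alpha * score
    return mu

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.standard_t(df=5, size=200) + 1.5  # synthetic data, true location 1.5
    path = score_driven_location_filter(y)
    print(f"final filtered location: {path[-1]:.3f}")
```

Per the abstract, the PRADA models of Chapter 4 generalize this kind of update beyond the likelihood score, driving it instead with strictly locally proper scoring rules or strictly consistent scoring functions.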