On the limits of probabilistic forecasting in nonlinear time series analysis II: Differential entropy.

José M. Amigó, Yoshito Hirata, Kazuyuki Aihara
Published in: Chaos (Woodbury, N.Y.) (2018)
In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due only to the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by treating numerical and experimental data as continuous, which prompts us in this paper to trade the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also holds in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario is revisited as well, because it is relevant for applications. The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.
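To make the two central quantities concrete, here is a minimal numerical sketch, not the authors' algorithm: the ignorance score is estimated as the mean negative log-probability a forecast assigned to the realized outcomes, and the standard fine-partition relation h(X) ≈ H(X_Δ) + log₂ Δ links the Shannon entropy of finite-precision observations to the differential entropy of the underlying continuous variable. The Gaussian sample, the bin width, and the function names below are our illustrative choices, not anything taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ignorance_score(p_outcomes):
    """Empirical ignorance score: average -log2 of the probability
    each forecast assigned to the outcome that actually occurred."""
    return -np.mean(np.log2(np.asarray(p_outcomes, dtype=float)))

# Shannon vs. differential entropy under finite observational precision:
# for a fine partition with bin width delta,  h(X) ~= H(X_delta) + log2(delta).
samples = rng.normal(size=100_000)             # stand-in continuous observable
delta = 0.01                                   # observational precision (bin width)
counts, _ = np.histogram(samples, bins=np.arange(-6.0, 6.0 + delta, delta))
p = counts[counts > 0] / counts.sum()
H_delta = -(p * np.log2(p)).sum()              # Shannon entropy of the partition
h_est = H_delta + np.log2(delta)               # differential-entropy estimate
print(f"H(X_delta) = {H_delta:.3f} bits, h(X) ~= {h_est:.3f} bits")
# For a standard normal, h(X) = 0.5*log2(2*pi*e) ~= 2.047 bits,
# so H(X_delta) grows like -log2(delta) as the precision is refined.
```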