A new method of Bayesian causal inference in non-stationary environments.

Shuji Shinohara, Nobuhito Manome, Kouta Suzuki, Ung-Il Chung, Tatsuji Takahashi, Hiroshi Okamoto, Yukio Pegio Gunji, Yoshihiro Nakajima, Shunji Mitsuyoshi
Published in: PLOS ONE (2020)
Bayesian inference is the process of narrowing down the hypotheses (causes) to the one that best explains the observational data (effects). To accurately estimate a cause, a considerable amount of data must be observed over as long a period as possible. However, the object of inference is not always constant. In such cases, a method such as the exponential moving average (EMA) with a discounting rate is used; to improve the ability to respond to a sudden change, the discounting rate must be increased. That is, a trade-off arises in which followability improves as the discounting rate increases, but accuracy is reduced. Here, we propose an extended Bayesian inference (EBI) that incorporates human-like causal inference. We show that incorporating this causal inference introduces both learning and forgetting effects into Bayesian inference. We evaluate the estimation performance of the EBI on the learning task of a dynamically changing Gaussian mixture model. In the evaluation, the EBI is compared with the EMA and a sequential discounting expectation-maximization algorithm. The EBI is shown to modify the trade-off observed in the EMA.
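The followability-versus-accuracy trade-off described for the EMA baseline can be illustrated with a toy experiment. The sketch below is not the authors' EBI or their evaluation setup; the signal (a mean that jumps from 0.0 to 3.0), the candidate discounting rates, and all variable names are assumptions chosen only to make the trade-off visible.

```python
# Minimal sketch of the EMA trade-off: a small discounting rate gives an accurate
# but slow-to-adapt estimate; a large one follows a sudden change quickly but is noisy.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical non-stationary data: the true mean jumps from 0.0 to 3.0 halfway through.
T = 2000
true_mean = np.where(np.arange(T) < T // 2, 0.0, 3.0)
data = true_mean + rng.normal(scale=1.0, size=T)

def ema(x, alpha):
    """Exponential moving average with discounting rate alpha (0 < alpha <= 1)."""
    est = np.empty_like(x)
    m = x[0]
    for t, xt in enumerate(x):
        m = (1.0 - alpha) * m + alpha * xt  # old evidence is discounted by (1 - alpha)
        est[t] = m
    return est

for alpha in (0.01, 0.1, 0.5):
    est = ema(data, alpha)
    steady_err = np.abs(est[:T // 2] - 0.0).mean()       # accuracy before the change
    recovery = np.argmax(est[T // 2:] > 1.5)              # steps until the jump is followed
    print(f"alpha={alpha:4.2f}  steady-state error={steady_err:.3f}  "
          f"steps to follow the jump={recovery}")
```

Running this prints a small steady-state error but slow recovery for alpha=0.01, and the opposite for alpha=0.5, which is the trade-off the proposed EBI is reported to modify.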