
ESTIMATION AND INFERENCE IN ADAPTIVE LEARNING MODELS WITH SLOWLY DECREASING GAINS

Mayer, A
2022

Abstract

An asymptotic theory for estimation and inference in adaptive learning models with strong mixing regressors and martingale difference innovations is developed. The maintained polynomial gain specification provides a unified framework which permits slow convergence of agents' beliefs and contains recursive least squares as a prominent special case. Reminiscent of the classical literature on co-integration, an asymptotic equivalence between two approaches to estimation of long-run equilibrium and short-run dynamics is established. Notwithstanding potential threats to inference arising from non-standard convergence rates and a singular variance-covariance matrix, hypotheses about single as well as joint restrictions remain testable. Monte Carlo evidence confirms the accuracy of the asymptotic theory in finite samples.
Files for this item:
No files are associated with this item.

Documents in ARCA are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10278/5007324
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: 0