Understanding time-series regression estimators

Askar H. Choudhury, Robert Hubata, Robert St Louis

Research output: Contribution to journal › Article › peer-review

21 Scopus citations


A large number of methods have been developed for estimating time-series regression parameters. Students and practitioners have a difficult time understanding what these various methods are, let alone picking the most appropriate one for their application. This article explains how these methods are related. A chronology for the development of the various methods is presented, followed by a logical characterization of the methods. An examination of current computational techniques and computing power leads to the conclusion that exact maximum likelihood estimators should be used in almost all cases where regression models have autoregressive, moving average, or mixed autoregressive-moving average error structures.
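The article's conclusion — that exact maximum likelihood should be preferred for regression with autoregressive error structures — can be illustrated with a minimal sketch. The code below (an assumption for illustration, not taken from the article) fits y = Xb + u with AR(1) errors by maximizing the exact Gaussian likelihood, including the first observation via the Prais-Winsten-style scaling rather than dropping it as approximate methods do:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def exact_mle_ar1(y, X):
    """Exact ML for y = X b + u, with u_t = rho * u_{t-1} + e_t.

    The likelihood is concentrated over b and sigma^2, leaving a
    one-dimensional optimization over rho in (-1, 1).
    """
    n = len(y)

    def transform(rho):
        # Scale the first observation by sqrt(1 - rho^2) and
        # quasi-difference the rest (exact, not approximate, GLS).
        w = np.sqrt(1.0 - rho**2)
        ys = np.concatenate(([w * y[0]], y[1:] - rho * y[:-1]))
        Xs = np.vstack((w * X[:1], X[1:] - rho * X[:-1]))
        return ys, Xs

    def neg_conc_loglik(rho):
        ys, Xs = transform(rho)
        b, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        ssr = np.sum((ys - Xs @ b) ** 2)
        # Exact concentrated log-likelihood (constants dropped), negated.
        return 0.5 * n * np.log(ssr / n) - 0.5 * np.log(1.0 - rho**2)

    res = minimize_scalar(neg_conc_loglik, bounds=(-0.999, 0.999),
                          method="bounded")
    rho_hat = res.x
    ys, Xs = transform(rho_hat)
    b_hat, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return rho_hat, b_hat
```

Approximate estimators such as Cochrane-Orcutt discard the first transformed observation; the `0.5 * np.log(1 - rho**2)` term and the scaled first row are exactly what distinguishes the exact likelihood, which the article argues is now computationally cheap enough to use routinely.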

Original language: English (US)
Pages (from-to): 342-348
Number of pages: 7
Journal: American Statistician
Issue number: 4
State: Published - Nov 1999


Keywords

  • Approximate and exact estimators
  • Autoregressive and moving average error models
  • Cholesky decomposition
  • Computational convenience
  • Generalized least squares and maximum likelihood estimators
  • Linear and nonlinear optimization methods
  • Transformations to obtain uncorrelated errors

ASJC Scopus subject areas

  • Statistics and Probability
  • General Mathematics
  • Statistics, Probability and Uncertainty


