
A Hilbert Transform Method for Estimating Distributed Lag Models With Randomly Missed or Distorted Observations

  • Conference paper
Time Series Analysis of Irregularly Observed Data

Part of the book series: Lecture Notes in Statistics ((LNS,volume 25))

Abstract

Least-squares estimation of the lag coefficients of a distributed lag model is not a straightforward regression problem when the sample contains missed or distorted observations. Although the normal equations can be assembled from sums of the available cross products, the asymptotic properties of a solution of these equations are unknown. Moreover, a special-purpose computer program is required to compute these normal equations when observations are missed, so it is practical to consider another approach to the problem.

Paper prepared for the Office of Naval Research Symposium on Time Series Analysis of Irregularly Observed Data, Texas A&M University.
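The naive approach the abstract contrasts against — assembling the normal equations from whatever cross products the sample actually provides ("available-case" least squares) — can be sketched as follows. This is an illustrative sketch, not the paper's Hilbert transform method; the function name, the masking scheme (missing responses only), and the simulation setup are all assumptions for the example.

```python
import numpy as np

def available_case_lag_estimates(y, x, q, observed):
    """Least-squares lag coefficients for y_t = sum_{k=0}^{q} b_k x_{t-k} + e_t,
    using only the time points where y_t is observed.

    This is the naive normal-equations approach the abstract describes
    (sums of available cross products), whose asymptotic properties are
    unknown when observations are missing -- not the Hilbert transform
    method the paper develops. `observed` is a boolean mask for y; for
    simplicity the regressor series x is assumed fully observed.
    """
    n = len(y)
    XtX = np.zeros((q + 1, q + 1))
    Xty = np.zeros(q + 1)
    for t in range(q, n):
        if not observed[t]:
            continue
        lags = x[t - q:t + 1][::-1]   # x_t, x_{t-1}, ..., x_{t-q}
        XtX += np.outer(lags, lags)   # accumulate available cross products
        Xty += lags * y[t]
    return np.linalg.solve(XtX, Xty)

# Hypothetical usage: simulate a q = 2 distributed lag model, delete
# roughly 30% of the responses at random, and estimate the lags.
rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
beta = np.array([1.0, 0.5, -0.25])
y = np.zeros(n)
for t in range(2, n):
    y[t] = beta @ np.array([x[t], x[t - 1], x[t - 2]]) \
           + 0.1 * rng.standard_normal()
observed = rng.random(n) > 0.3
est = available_case_lag_estimates(y, x, 2, observed)
```

With responses missing completely at random and a fully observed regressor, this estimator behaves well in simulation; the paper's point is that its asymptotic properties are unknown in general, and that the bookkeeping over available cross products requires special-purpose code, which motivates the frequency-domain alternative.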






Copyright information

© 1984 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hinich, M.J., Weber, W.E. (1984). A Hilbert Transform Method for Estimating Distributed Lag Models With Randomly Missed or Distorted Observations. In: Parzen, E. (eds) Time Series Analysis of Irregularly Observed Data. Lecture Notes in Statistics, vol 25. Springer, New York, NY. https://doi.org/10.1007/978-1-4684-9403-7_7


  • DOI: https://doi.org/10.1007/978-1-4684-9403-7_7

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-0-387-96040-1

  • Online ISBN: 978-1-4684-9403-7

