Author: Laszlo Matyas
ISBN: 9780521660136
File size: 7.22 MB
Year: 1999
Pages: 332
Language: English
File format: PDF
Category: Mathematics

GENERALIZED METHOD
OF MOMENTS ESTIMATION
Editor:
LASZLO MATYAS
Budapest University of Economics
CAMBRIDGE UNIVERSITY PRESS
Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, Sao Paulo
Cambridge University Press
The Edinburgh Building, Cambridge CB2 8RU, UK
Published in the United States of America by Cambridge University Press, New York
www.cambridge.org
Information on this title: www.cambridge.org/9780521660136
© Cambridge University Press 1999
This publication is in copyright. Subject to statutory exception
and to the provisions of relevant collective licensing agreements,
no reproduction of any part may take place without the written
permission of Cambridge University Press.
First published 1999
A catalogue record for this publication is available from the British Library
Library of Congress Cataloguing in Publication data
Generalized methods of moments estimation / editor, Laszlo Matyas.
p. cm.
Includes bibliographical references and index.
ISBN 0-521-66013-0 (hardbound)
1. Econometric models. 2. Moments method (Statistics) 3. Estimation theory.
I. Matyas, Laszlo.
HB141.G463 1998
330'.01'5195-dc21 98-51529
CIP
ISBN 978-0-521-66013-6 hardback
ISBN 978-0-521-66967-2 paperback
Transferred to digital printing 2007
Contents

Preface

1. Introduction to the Generalized Method of Moments Estimation
David Harris and Laszlo Matyas
1.1. The Method of Moments
1.1.1. Moment Conditions
1.1.2. Method of Moments Estimation
1.2. Generalized Method of Moments (GMM) Estimation
1.3. Asymptotic Properties of the GMM Estimator
1.3.1. Consistency
1.3.2. Asymptotic Normality
1.3.3. Asymptotic Efficiency
1.3.4. Illustrations
1.3.4.1. GMM With i.i.d. Observations
1.3.4.2. Regression With Instrumental Variables
1.3.5. Choice of Moment Conditions
1.4. Conclusion
References

2. GMM Estimation Techniques
Masao Ogaki
2.1. GMM Estimation
2.1.1. GMM Estimator
2.1.2. Important Assumptions
2.2. Nonlinear Instrumental Variable Estimation
2.3. GMM Applications with Stationary Variables
2.3.1. Euler Equation Approach
2.3.2. Habit Formation
2.3.3. Linear Rational Expectations Models
2.4. GMM in the Presence of Nonstationary Variables
2.4.1. The Cointegration-Euler Equation Approach
2.4.2. Structural Error Correction Models
2.4.3. An Exchange Rate Model with Sticky Prices
2.4.4. The Instrumental Variables Methods
2.5. Some Aspects of GMM Estimation
2.5.1. Numerical Optimization
2.5.2. The Choice of Instrumental Variables
2.5.3. Normalization
References

3. Covariance Matrix Estimation
Matthew J. Cushing and Mary G. McGarvey
3.1. Preliminary Results
3.2. The Estimand
3.3. Kernel Estimators of VT
3.3.1. Weighted Autocovariance Estimators
3.3.2. Weighted Periodogram Estimators
3.3.3. Computational Considerations
3.3.4. Spectral Interpretation
3.3.5. Asymptotic Properties of Scale Parameter Estimators
3.4. Optimal Choice of Covariance Estimator
3.4.1. Optimal Choice of Scale Parameter Window
3.4.2. Optimal Choice of Scale Parameter
3.4.3. Other Estimation Strategies
3.5. Finite Sample Properties of HAC Estimators
3.5.1. Monte Carlo Evidence
3.5.2. Regression Model
3.5.3. Heteroskedastic but Serially Uncorrelated Errors
3.5.4. Autocorrelated but Homoskedastic Errors
3.5.5. Autocorrelated and Heteroskedastic Errors
3.5.6. Summary
References

4. Hypothesis Testing in Models Estimated by GMM
Alastair R. Hall
4.1. Identifying and Overidentifying Restrictions
4.2. Testing Hypotheses About E[f(x_t, θ_0)]
4.3. Testing Hypotheses About Subsets of E[f(x_t, θ_0)]
4.4. Testing Hypotheses About the Parameter Vector
4.5. Testing Hypotheses About Structural Stability
4.6. Testing Non-nested Hypotheses
4.7. Conditional Moment and Hausman Tests
4.7.1. Conditional Moment Tests
4.7.2. Hausman Tests
References

5. Finite Sample Properties of GMM Estimators and Tests
Jan M. Podivinsky
5.1. Related Theoretical Literature
5.2. Simulation Evidence
5.2.1. Asset Pricing Models
5.2.2. Business Cycle Models
5.2.3. Stochastic Volatility Models
5.2.4. Inventory Models
5.2.5. Models of Covariance Structures
5.2.6. Other Applications of GMM
5.3. Extensions of Standard GMM
5.3.1. Estimation
5.3.2. Testing
5.4. Concluding Comments
References
6. GMM Estimation of Time Series Models
David Harris
6.1. Estimation of Moving Average Models
6.1.1. A Simple Estimator of an MA(1) Model
6.1.2. Estimation by Autoregressive Approximation
6.1.2.1. The Durbin Estimator
6.1.2.2. The Galbraith and Zinde-Walsh Estimator
6.1.3. A Finite Order Autoregressive Approximation
6.1.4. Higher Order MA Models
6.2. Estimation of ARMA Models
6.2.1. IV Estimation of AR Coefficients
6.2.2. The ARMA(1,1) Model
6.2.2.1. A Simple IV Estimator
6.2.2.2. Optimal IV Estimation
6.3. Applications to Unit Root Testing
6.3.1. Testing for an Autoregressive Unit Root
6.3.2. Testing for a Moving Average Unit Root
Appendix: Proof of Theorem 6.1
References

7. Reduced Rank Regression Using GMM
Frank Kleibergen
7.1. GMM-2SLS Estimators in Reduced Rank Models
7.1.1. Reduced Rank Regression Models
7.1.2. GMM-2SLS Estimators
7.1.3. Limiting Distributions for the GMM-2SLS Cointegration Estimators
7.2. Testing Cointegration Using GMM-2SLS Estimators
7.3. Cointegration in a Model with Heteroskedasticity
7.3.1. Generalized Least Squares Cointegration Estimators
7.4. Cointegration with Structural Breaks
7.5. Conclusion
Appendix
References
8. Estimation of Linear Panel Data Models Using GMM
Seung C. Ahn and Peter Schmidt
8.1. Preliminaries
8.1.1. General Model
8.1.2. GMM and Instrumental Variables
8.1.2.1. Strictly Exogenous Instruments and Random Effects
8.1.2.2. Strictly Exogenous Instruments and Fixed Effects
8.2. Models with Weakly Exogenous Instruments
8.2.1. The Forward Filter Estimator
8.2.2. Irrelevance of Forward Filtering
8.2.3. Semiparametric Efficiency Bound
8.3. Models with Strictly Exogenous Regressors
8.3.1. The Hausman and Taylor, Amemiya and MaCurdy, and Breusch, Mizon and Schmidt Estimators
8.3.2. Efficient GMM Estimation
8.3.3. GMM with Unrestricted Σ
8.4. Simultaneous Equations
8.4.1. Estimation of a Single Equation
8.4.2. System of Equations Estimation
8.5. Dynamic Panel Data Models
8.5.1. Moment Conditions Under Standard Assumptions
8.5.2. Some Alternative Assumptions
8.5.3. Estimation
8.5.3.1. Notation and General Results
8.5.3.2. Linear Moment Conditions and Instrumental Variables
8.5.3.3. Linearized GMM
8.6. Conclusion
References
9. Alternative GMM Methods for Nonlinear Panel Data Models
Jorg Breitung and Michael Lechner
9.1. A Class of Nonlinear Panel Data Models
9.2. GMM Estimators for the Conditional Mean
9.3. Higher Order Moment Conditions
9.4. Selecting Moment Conditions: The Gallant-Tauchen Approach
9.5. A Minimum Distance Approach
9.6. Finite Sample Properties
9.6.1. Data Generating Process
9.6.2. Estimators
9.7. Results
9.8. An Application
9.9. Concluding Remarks
References

10. Simulation Based Method of Moments
Roman Liesenfeld and Jorg Breitung
10.1. General Setup and Applications
10.2. The Method of Simulated Moments (MSM)
10.3. Indirect Inference Estimator
10.4. The SNP Approach
10.5. Some Practical Issues
10.5.1. Drawing Random Numbers and Variance Reduction
10.5.2. The Selection of the Auxiliary Model
10.5.3. Small Sample Properties of the Indirect Inference
10.6. Conclusion
References

11. Logically Inconsistent Limited Dependent Variables Models
J. S. Butler and Gabriel Picone
11.1. Logical Inconsistency
11.2. Identification
11.3. Estimation
11.4. Conclusion and Extensions
References

Index
Contributors
SEUNG C. AHN, Arizona State University
JOERG BREITUNG, Humboldt University, Berlin
J.S. BUTLER, Vanderbilt University
MATTHEW J. CUSHING, University of Nebraska-Lincoln
ALASTAIR HALL, North Carolina State University and
University of Birmingham
DAVID HARRIS, University of Melbourne
FRANK KLEIBERGEN, Erasmus University
MICHAEL LECHNER, Mannheim University
ROMAN LIESENFELD, University of Tubingen
LASZLO MATYAS, Budapest University of Economics and
Erudite, Universite de Paris XII
MARY McGARVEY, University of Nebraska-Lincoln
MASAO OGAKI, Ohio State University
GABRIEL PICONE, University of South Florida
JAN M. PODIVINSKY, University of Southampton
PETER SCHMIDT, Michigan State University
PREFACE
The standard econometric modelling practice for quite a long time
was founded on strong assumptions concerning the underlying data
generating process. Based on these assumptions, estimation and
hypothesis testing techniques were derived with known desirable,
and in many cases optimal, properties. Frequently, these assumptions were highly unrealistic and unlikely to be true. These
shortcomings were attributed to the simplification involved in any
modelling process, and were therefore regarded as inevitable and acceptable. The
crisis of econometric modelling in the seventies led to many well
known new, sometimes revolutionary, developments in the way
econometrics was undertaken. Unrealistically strong assumptions
were no longer acceptable. Techniques and procedures able to deal
with data and models within a more realistic framework were badly
required. Just at the right time, i.e., in the early eighties when all
this became obvious, Lars Peter Hansen's seminal paper on the
asymptotic properties of the generalized method of moments (GMM)
estimator was published in Econometrica. Although the basic idea
of the GMM can be traced back to the work of Denis Sargan in the
late fifties, Hansen's paper provided a ready to use, very flexible
tool applicable to a large number of models, which relied on mild
and plausible assumptions. The die was cast. Applications of the
GMM approach have since mushroomed in the literature, a growth
further boosted, like so many things, by the increased
availability of computing power.
Nowadays there are so many different theoretical and practical
applications of the GMM principle that it is almost impossible to
keep track of them. What started as a simple estimation method
has grown to be a complete methodology for estimation and hypothesis testing. As most of the best known "traditional" estimation
methods can be regarded as special cases of the GMM estimator, it
can also serve as a nice unified framework for teaching estimation
theory in econometrics.
The main objective of this volume is to provide a complete and
up-to-date presentation of the theory of GMM estimation, as well
as insights into the use of these methods in empirical studies.
The editor has tried to standardize the notation, language,
exposition, and depth of the chapters in order to present a coherent
book. We hope, however, that each chapter is able to stand on its
own as a reference work.
***
We would like to thank all those who helped with the creation of
this volume: the contributors, who did not mind being harassed by
the editor and produced several quality versions of their chapters
ensuring the high standard of this volume; the Erudite, Universite
de Paris XII in France and the Hungarian Research Fund (OTKA)
in Hungary for providing financial support; and the editor's colleagues and students for their valuable comments and feedback.
The camera-ready copy of this volume was prepared by the
editor with the help of Erika Mihalik using TeX and the Initbook
(Gabor Korosi and Laszlo Matyas) macro package.
Laszlo Matyas
Budapest, Melbourne and Paris, August 1998
Chapter 1

INTRODUCTION TO THE GENERALIZED METHOD OF MOMENTS ESTIMATION
David Harris and Laszlo Matyas
One of the most important tasks in econometrics and statistics is
to find techniques enabling us to estimate, for a given data set,
the unknown parameters of a specific model. Estimation procedures based on the minimization (or maximization) of some kind of
criterion function (M-estimators) have successfully been used for
many different types of models. The main difference between these
estimators lies in how much of the model must be specified. The most
widely applied such estimation technique, maximum likelihood,
requires the complete specification of the model and its probability
distribution. The Generalized Method of Moments (GMM) does
not require this sort of full knowledge. It only demands the specification of a set of moment conditions which the model should
satisfy.
In this chapter, first, we introduce the Method of Moments
(MM) estimation, then generalize it and derive the GMM estimation procedure, and finally, analyze its properties.
1.1 The Method of Moments
The Method of Moments is an estimation technique which suggests
that the unknown parameters should be estimated by matching
population (or theoretical) moments (which are functions of the
unknown parameters) with the appropriate sample moments. The
first step is to define properly the moment conditions.
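As a minimal illustration (not from the text), consider matching a single population moment with its sample counterpart. For an exponential distribution with rate λ, E(x_t) = 1/λ, so equating 1/λ with the sample mean gives the MM estimate λ̂ = 1/x̄. The distribution and numbers below are assumptions of this sketch.

```python
import numpy as np

# Minimal sketch of moment matching (illustrative assumption, not from
# the text): for an exponential distribution with rate lam, the
# population moment is E(x_t) = 1/lam. Matching it with the sample
# mean gives the MM estimate lam_hat = 1 / mean(x).
rng = np.random.default_rng(0)
lam0 = 2.0                                   # true rate
x = rng.exponential(scale=1.0 / lam0, size=100_000)

lam_hat = 1.0 / x.mean()                     # solve 1/lam = sample mean
print(lam_hat)  # close to 2.0
```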
1.1.1 Moment Conditions
DEFINITION 1.1 (Moment Conditions)
Suppose that we have an observed sample {x_t : t = 1, ..., T} from which we want to estimate an unknown p × 1 parameter vector θ with true value θ_0. Let f(x_t, θ) be a continuous q × 1 vector function of θ, and let E(f(x_t, θ)) exist and be finite for all t and θ. Then the moment conditions are that E(f(x_t, θ_0)) = 0.
Before discussing the estimation of θ_0, we give some examples of moment conditions.
EXAMPLE 1.1
Consider a sample {x_t : t = 1, ..., T} from a gamma γ(p*, q*) distribution with true values p* = p_0 and q* = q_0. Under the shape/rate parameterisation, the relationships between the first two moments of this distribution and its parameters are:

E(x_t) = p_0 / q_0
E(x_t^2) = p_0(p_0 + 1) / q_0^2

In the notation of Definition 1.1, we have θ = (p*, q*)' and a vector of q = 2 functions:

f(x_t, θ) = ( x_t − p*/q* ,  x_t^2 − p*(p* + 1)/q*^2 )'

Then the moment conditions are E(f(x_t, θ_0)) = 0.
EXAMPLE 1.2
Another simple example can be given for a sample {x_t : t = 1, ..., T} from a beta β(p*, q*) distribution, again with true values p* = p_0 and q* = q_0. The relationships between the first two moments and the parameters of this distribution are:

E(x_t) = p_0 / (p_0 + q_0)
E(x_t^2) = p_0(p_0 + 1) / ((p_0 + q_0)(p_0 + q_0 + 1))

With θ = (p*, q*)' we have

f(x_t, θ) = ( x_t − p*/(p* + q*) ,  x_t^2 − p*(p* + 1)/((p* + q*)(p* + q* + 1)) )'.
EXAMPLE 1.3
Consider the linear regression model

y_t = x_t'β_0 + u_t,

where x_t is a p × 1 vector of stochastic regressors, β_0 is the true value of a p × 1 vector of unknown parameters β, and u_t is an error term. In the presence of stochastic regressors, we often specify

E(u_t | x_t) = 0,

so that

E(y_t | x_t) = x_t'β_0.
Using the Law of Iterated Expectations we find

E(x_t u_t) = E(x_t E(u_t | x_t)) = 0.

The equations

E(x_t u_t) = E(x_t(y_t − x_t'β_0)) = 0,
are moment conditions for this model. That is, in terms of Definition 1.1, θ = β and f((x_t, y_t), θ) = x_t(y_t − x_t'β).
Notice that in this example E(x_t u_t) = 0 consists of p equations, since x_t is a p × 1 vector. Since β is a p × 1 parameter vector, these moment conditions exactly identify β. If we had fewer than p moment conditions, then we could not identify β, and if we had more than p moment conditions, then β would be over-identified. Estimation can proceed if the parameter vector is exactly or over-identified. Notice that, compared to the maximum likelihood approach (ML), we have specified relatively little information about u_t. Using ML, we would be required to give the distribution of u_t, as well as parameterising any autocorrelation and heteroskedasticity, while this information is not required in formulating the moment conditions. However, some restrictions on such aspects of the model are still required for the derivation of the asymptotic properties of the GMM estimator.
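As an illustration (not from the text), the exactly identified moment conditions of Example 1.3 can be solved directly: the sample analogue of E(x_t(y_t − x_t'β)) = 0 is p linear equations in p unknowns, and its solution is the familiar OLS estimator. The data generating process below is an assumption of this sketch.

```python
import numpy as np

# Sketch: solving the sample analogue of the moment conditions
# E(x_t (y_t - x_t' beta)) = 0 from Example 1.3. The simulated data
# generating process is an illustrative assumption.
rng = np.random.default_rng(0)
T, p = 500, 3
X = rng.normal(size=(T, p))           # stochastic regressors x_t
beta0 = np.array([1.0, -2.0, 0.5])    # true parameter vector
y = X @ beta0 + rng.normal(size=T)    # y_t = x_t' beta0 + u_t

# Sample moment conditions: (1/T) sum_t x_t (y_t - x_t' beta) = 0.
# With q = p these are p linear equations in p unknowns, whose
# solution is the OLS estimator (X'X)^{-1} X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to beta0 for large T
```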
It is common to obtain moment conditions by requiring that
the error term of the model has zero expectation conditional on certain observed variables. Alternatively, we can specify the moment
conditions directly by requiring the error term to be uncorrelated
with certain observed instrumental variables. We illustrate with
another example.
EXAMPLE 1.4
As in the previous example, we consider the linear regression model

y_t = x_t'β_0 + u_t,

but we do not assume E(u_t | x_t) = 0. It follows that u_t and x_t may be correlated. Suppose we have a set of instruments in the q × 1 vector z_t. These may be defined to be valid instruments if E(z_t u_t) = 0. Thus the requirement that z_t be a set of valid instruments immediately provides the appropriate moment conditions

E(z_t u_t) = E(z_t(y_t − x_t'β_0)) = 0.

That is,

f((x_t, y_t, z_t), θ) = z_t(y_t − x_t'β).
There are q equations here since there are q instruments, so we require q ≥ p for identification.
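A sketch (not from the text) of the just-identified case q = p of Example 1.4: when x_t is correlated with u_t the OLS moment condition is invalid, but the instrument's moment condition E(z_t u_t) = 0 still yields a consistent estimator, here the simple IV estimator. The data generating process is an assumption made for illustration.

```python
import numpy as np

# Sketch: just-identified (q = p = 1) IV estimation from the moment
# condition E(z_t u_t) = 0. The DGP below is an illustrative
# assumption chosen so that x_t is correlated with u_t.
rng = np.random.default_rng(1)
T = 2000
z = rng.normal(size=T)                        # instrument, independent of u
u = rng.normal(size=T)                        # error term
x = 0.8 * z + 0.5 * u + rng.normal(size=T)    # regressor correlated with u
beta0 = 2.0
y = beta0 * x + u

# OLS solves (1/T) sum x_t (y_t - x_t b) = 0 -- an invalid moment
# condition here, since E(x_t u_t) != 0, so it is inconsistent.
b_ols = (x @ y) / (x @ x)
# IV solves the valid sample moment condition (1/T) sum z_t (y_t - x_t b) = 0.
b_iv = (z @ y) / (z @ x)
print(b_ols, b_iv)  # b_ols is biased upward; b_iv is close to beta0
```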
1.1.2 Method of Moments Estimation
We now consider how to estimate a parameter vector θ using moment conditions as given in Definition 1.1. Consider first the case where q = p, that is, where θ is exactly identified by the moment conditions. Then the moment conditions E(f(x_t, θ)) = 0 represent a set of p equations for p unknowns. Solving these equations would give the value of θ which satisfies the moment conditions, and this would be the true value θ_0. However, we cannot observe E(f(., .)), only f(x_t, θ). The obvious way to proceed is to define the sample moments of f(x_t, θ):

f_T(θ) = (1/T) Σ_{t=1}^{T} f(x_t, θ),

which is the Method of Moments (MM) estimator of E(f(x_t, θ)). If the sample moments provide good estimates of the population moments, then we might expect that the estimator θ̂_T that solves the sample moment conditions f_T(θ) = 0 would provide a good estimate of the true value θ_0 that solves the population moment conditions E(f(x_t, θ_0)) = 0.
In each of Examples 1.1-1.3, the parameters are exactly identified by the moment conditions. We consider each in turn.
EXAMPLE 1.5
For the gamma distribution in Example 1.1 (under the shape/rate parameterisation), f_T(θ) = 0 implies

(1/T) Σ_{t=1}^{T} x_t = p̂_T / q̂_T

and

(1/T) Σ_{t=1}^{T} x_t^2 = p̂_T(p̂_T + 1) / q̂_T^2.
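A sketch of this MM calculation (assuming, as above, the shape/rate parameterisation E(x_t) = p/q and E(x_t^2) = p(p+1)/q^2, which is an assumption of this illustration): solving the two sample moment conditions analytically gives p̂ = m1²/(m2 − m1²) and q̂ = m1/(m2 − m1²), where m1 and m2 are the first two sample moments.

```python
import numpy as np

# Sketch of MM estimation for a gamma sample, assuming the shape/rate
# parameterisation E(x) = p/q, E(x^2) = p(p+1)/q^2 (an assumption of
# this illustration). Solving the two sample moment conditions gives
# closed-form estimates.
rng = np.random.default_rng(2)
p0, q0 = 3.0, 1.5                              # true values
x = rng.gamma(shape=p0, scale=1.0 / q0, size=50_000)

m1 = x.mean()             # (1/T) sum x_t
m2 = (x ** 2).mean()      # (1/T) sum x_t^2
v = m2 - m1 ** 2          # sample analogue of Var(x_t) = p/q^2
p_hat = m1 ** 2 / v
q_hat = m1 / v
print(p_hat, q_hat)  # close to (3.0, 1.5)
```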
