

We review Quasi Maximum Likelihood estimation of factor models for
high-dimensional panels of time series. We consider two cases: (1) estimation
when no dynamic model for the factors is specified \citep{baili12,baili16}; (2)
estimation based on the Kalman smoother and the Expectation Maximization
algorithm, which allows the factor dynamics to be modelled explicitly
\citep{DGRqml,BLqml}. Our interest is in approximate factor models, i.e.,
models in which the idiosyncratic components are allowed to be mildly
cross-sectionally, as well as serially, correlated. Although such a setting
apparently makes estimation harder, we show that, in fact, factor models do not
suffer from a {\it curse of dimensionality} problem, but instead enjoy a
{\it blessing of dimensionality} property. In particular, given an approximate
factor structure, if the cross-sectional dimension of the data, $N$, grows to
infinity, we show that: (i) identification of the model is still possible;
(ii) the mis-specification error due to the use of an exact factor model
log-likelihood vanishes. Moreover, if we also let the sample size, $T$, grow to
infinity, we can consistently estimate all parameters of the model and make
inference. The same is true for estimation of the latent factors, which can be
carried out by weighted least squares, linear projection, or Kalman
filtering/smoothing. We also compare the approaches presented with Principal
Component Analysis and with the classical, fixed-$N$, exact Maximum Likelihood
approach. We conclude with a discussion of the efficiency of the considered
estimators.
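As a rough illustration of case (1), the sketch below simulates an approximate factor model (idiosyncratic components mildly serially and cross-sectionally correlated), recovers the loadings by principal components, and then re-estimates the factors by weighted least squares, i.e., the linear projection mentioned in the abstract. All concrete choices (dimensions, AR coefficients, the Toeplitz cross-sectional correlation, the diagonal weighting) are illustrative assumptions, not the paper's design.

```python
# Minimal sketch, assuming a standard static approximate factor model.
import numpy as np

rng = np.random.default_rng(0)
N, T, r = 100, 500, 2            # cross-section, sample size, number of factors

# Latent factors with stationary AR(1) dynamics
F = np.zeros((T, r))
for t in range(1, T):
    F[t] = 0.7 * F[t - 1] + rng.standard_normal(r)

# Loadings and an "approximate" idiosyncratic component:
# mildly serially correlated (AR(1)) and mildly cross-sectionally correlated
Lambda = rng.standard_normal((N, r))
shock_cov = 0.3 ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))  # Toeplitz correlation
chol = np.linalg.cholesky(shock_cov)
e = np.zeros((T, N))
for t in range(1, T):
    e[t] = 0.3 * e[t - 1] + rng.standard_normal(N) @ chol.T

X = F @ Lambda.T + e             # observed T x N panel

# (1) Principal-component estimate of the loadings (no factor dynamics used)
eigval, eigvec = np.linalg.eigh(X.T @ X / T)
Lambda_hat = eigvec[:, -r:] * np.sqrt(N)          # normalisation Lambda'Lambda/N = I

# (2) Factors by weighted least squares / linear projection,
# weighting by (here diagonal) estimated idiosyncratic variances
F_pc = X @ Lambda_hat / N
sigma2_hat = np.var(X - F_pc @ Lambda_hat.T, axis=0)
W = Lambda_hat / sigma2_hat[:, None]
F_wls = X @ W @ np.linalg.inv(Lambda_hat.T @ W)   # x_t projected on the factor space

# Check that the estimated factors span the true factor space (trace R^2)
proj = F_wls @ np.linalg.lstsq(F_wls, F, rcond=None)[0]
print("trace R^2:", np.trace(proj.T @ F) / np.trace(F.T @ F))
```

In case (2) of the abstract, step (2) above would instead be carried out by Kalman smoothing of the factors inside an EM loop over the state-space parameters.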
