Functional Gradient Descent (FGD) is a method of nonparametric time series analysis, useful in particular for estimating conditional means, variances, and covariances of very high-dimensional time series. FGD is a hybrid of nonparametric statistical function estimation and numerical optimization; the idea comes from the observation that boosting can be viewed as an optimization algorithm in function space. The method employs an iterative refitting of generalized residuals, based on a given statistical procedure called the base learner, to approximate the first two conditional moment functions of a multivariate process. An appealing feature of this expansion is that it is a nonlinear nonparametric model that directly nests the Gaussian diagonal VAR model, the Gaussian GARCH model, and the multivariate CCC-GARCH model as simple starting special cases. The FGD model is fitted using conventional maximum likelihood together with a cross-validation strategy that determines the appropriate number of additive terms in the final expansions.
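To give a flavor of the "iterative refitting of generalized residuals" idea, here is a minimal sketch in Python (the distributed code itself is in Splus) of functional gradient descent for a conditional mean under squared loss, using a decision stump as the base learner. All function names and parameters (`fgd_mean`, `n_iter`, `shrinkage`) are illustrative, not part of the authors' code, and this toy version omits the time series structure, the variance/covariance estimation, and the cross-validated stopping rule of the actual FGD procedure.

```python
import numpy as np

def fit_stump(X, r):
    """Base learner: the single-split decision stump minimizing
    squared error against the current residuals r."""
    best = None
    for j in range(X.shape[1]):
        for s in np.unique(X[:, j]):
            left = X[:, j] <= s
            if left.all() or (~left).all():
                continue
            lm, rm = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lm) ** 2).sum() + ((r[~left] - rm) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, s, lm, rm)
    return best[1:]

def predict_stump(stump, X):
    j, s, lm, rm = stump
    return np.where(X[:, j] <= s, lm, rm)

def fgd_mean(X, y, n_iter=100, shrinkage=0.1):
    """Additive expansion for the conditional mean: start from a
    simple special case (here a constant), then repeatedly refit
    the residuals with the base learner and add a shrunken term."""
    fit = np.full(len(y), y.mean())      # simple starting model
    stumps = []
    for _ in range(n_iter):
        resid = y - fit                  # generalized residuals
                                         # (negative gradient of squared loss)
        stump = fit_stump(X, resid)      # base learner refit to residuals
        fit += shrinkage * predict_stump(stump, X)
        stumps.append(stump)
    return fit, stumps
```

In the full FGD procedure the number of additive terms (`n_iter` here) is not fixed in advance but chosen by cross-validation, which is what guards the expansion against overfitting.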
Interested readers can download the Splus code and data at http://www.raffonline.altervista.org/fgd/
PS: to be honest, this is not an area I am familiar with at all, so download at your own risk.
Tags – nonparametric
Read the full post at Nonparametric High-Dimensional Time Series Analysis.