(This article was first published on **Quantitative Finance Collector**, and kindly contributed to R-bloggers.)

Functional Gradient Descent (FGD) is a method of **nonparametric time series analysis**, useful in particular for estimating conditional means, variances, and covariances of very high-dimensional time series. FGD is a hybrid of nonparametric statistical function estimation and numerical optimization: the idea comes from the observation that boosting can be viewed as an optimization algorithm in function space. The method employs iterative refitting of generalized residuals, based on a given statistical procedure called the base learner, to approximate the first two conditional moment functions of a multivariate process. An appealing feature of this expansion is that it is a nonlinear nonparametric model that directly nests the Gaussian diagonal VAR model, the Gaussian GARCH model, and the multivariate CCC-GARCH model as simple starting special cases. The FGD model is fitted by conventional maximum likelihood together with a cross-validation strategy that determines the appropriate number of additive terms in the final expansions.
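To make the "iterative refitting of generalized residuals" concrete, here is a minimal Python sketch (not the linked S-Plus code) of functional gradient descent under squared-error loss, where the negative gradient in function space is simply the ordinary residual. The regression-stump base learner, the step size `nu`, and the number of iterations are all illustrative assumptions; the paper's version applies the same scheme to the likelihood of conditional moment functions and picks the number of terms by cross-validation.

```python
import numpy as np

def fit_stump(x, r):
    # Illustrative base learner: a regression stump, i.e. the best
    # single-split piecewise-constant fit to the residuals r.
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    best = None
    for i in range(1, len(xs)):
        left, right = rs[:i].mean(), rs[i:].mean()
        sse = ((rs[:i] - left) ** 2).sum() + ((rs[i:] - right) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, xs[i - 1], left, right)
    _, split, left, right = best
    return lambda z: np.where(z <= split, left, right)

def fgd(x, y, n_iter=50, nu=0.5):
    # Functional gradient descent with squared-error loss:
    # the negative functional gradient at F is the residual y - F(x),
    # so each step refits the base learner to the current residuals.
    f0 = y.mean()
    F = np.full_like(y, f0)
    learners = []
    for _ in range(n_iter):
        r = y - F              # generalized residuals (negative gradient)
        h = fit_stump(x, r)    # refit base learner to residuals
        F = F + nu * h(x)      # damped additive update of the expansion
        learners.append(h)
    return f0, learners, nu

# Toy illustration on a nonlinear signal (synthetic data, not from the paper)
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = np.sin(2 * x) + 0.1 * rng.normal(size=200)
f0, learners, nu = fgd(x, y)
pred = f0 + nu * sum(h(x) for h in learners)
mse = np.mean((y - pred) ** 2)
```

The additive expansion `f0 + nu * sum(h_m)` is the sense in which the final model "nests" simple starting cases: with zero boosting iterations it collapses to the initial fit, and each iteration adds one nonparametric correction term.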

Interested readers can download the S-Plus code and data at http://www.raffonline.altervista.org/fgd/

PS: to be honest, this is not an area I am familiar with at all, so download at your own risk.

Tags: nonparametric
