BMS 0.3.0 Released


Version 0.3.0 of the Bayesian Model Averaging package BMS has been released. Apart from numerous bug fixes, BMS 0.3.0 includes two main additions:

  • The ability in bms to keep a fixed set of regressors that are included in all sampled models
  • The option to calculate predictive densities with function pred.density.

Moreover, the internal structure has been redesigned to accommodate user-defined priors and samplers.

Updating / Installing the new Version

If you are running R version 2.13 or above under Windows or any R version under Linux or Mac OS X, then simply type the R command:

install.packages("BMS")

If you are running an older R version under Windows, then use one of the following commands, or refer to the manual installation page.

  • Windows R version 2.10 to 2.12: type:
    install.packages("BMS", contriburl="http://bms.zeugner.eu/getBMS/tarballs/windows/windowsR210/")
  • Windows R version 2.5 to 2.9: type:
    install.packages("BMS", contriburl="http://bms.zeugner.eu/getBMS/tarballs/windows/windowsR29/")

After installation, you may verify the version number by typing the following commands:

library(BMS)
?BMS

The corresponding help entry should then display the version number 0.3.0.
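Alternatively, on reasonably recent R versions (roughly 2.12 and later), you can query the installed version directly:

packageVersion("BMS")  # should return '0.3.0'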

New Feature: Keeping Fixed Regressors

The new argument fixed.reg in function bms allows for keeping a fixed set of regressors that are included in all models. Take, for instance, the built-in attitude data set. A researcher might be certain that any model for the dependent variable rating should be based at least on the regressors learning and complaints. She is just not sure whether to add the remaining four regressors and wants to perform Bayesian Model Averaging over these four variables. The corresponding command is:

m1 = bms(attitude, mprior = "uniform", fixed.reg = c("complaints","learning"))

The resulting output then shows that the Posterior Inclusion Probabilities for learning and complaints are indeed at 100%. The resulting object m1 is a standard bma object and may be processed with any of the BMS functions, for example as sketched below.
For more details, consult chapter A.4 of the BMS tutorial.
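For instance, a quick sketch using standard BMS accessors (the exact output layout may vary by version):

coef(m1)     # posterior inclusion probabilities and posterior moments of the coefficients
summary(m1)  # aggregate statistics of the model averaging exercise
image(m1)    # graphical view of the best models and their included regressors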

New Feature: Predictive Density

BMS now allows for creating predictive densities that are consistent with the chosen prior settings. For illustration, consider again the built-in dataset attitude. We might use the first 27 observations to forecast the final 3 observations.

m2 = bms(attitude[1:27,])
pp = pred.density(m2, newdata=attitude[28:30, -1])

The object pp now holds the predictions for these 3 observations, conditional on what we know from the first 27 observations and the new explanatory data. Several functions are available to extract meaningful information. For instance, the following command provides a plot of the predictive density for each data point:

plot(pp)

Other functions are available to extract quantiles, densities for realized outcomes, standard errors, etc. For more details, see help(pred.density) or chapter 6 of the BMS tutorial.
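As a brief illustration of such post-processing (a sketch only; the quantile method for pred.density objects and lps.bma are assumed here, so consult help(pred.density) for the authoritative interface):

quantile(pp, probs = c(0.05, 0.95))  # predictive 5% and 95% quantiles for each of the 3 observations
lps.bma(pp, attitude[28:30, 1])      # log predictive score against the realized ratings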

User-defined Priors and Samplers

For more R-savvy users, BMS now offers the possibility to write customized 'plug-in' priors: this applies both to the coefficient (g) priors and to model priors. Similarly, the new version also implements limited customizability for MCMC samplers. For more details, see the tutorials on custom priors.

Other changes vs previous versions

The functionality of BMS version 0.3.0 is compatible with earlier versions (0.2.5), so any previously written user code relying on BMS should continue to work.
Nonetheless, numerous bug fixes dealing with exceptional situations have been implemented, and these required changes to many internal objects. For more details, consult the package's NEWS file.
