ESSA2013 Conference

[This article was first published on R snippets, and kindly contributed to R-bloggers].

It has just been announced that during the ESSA2013 conference I will be organizing a special track on “Statistical analysis of simulation models”. I hope to get some presentations using GNU R, to promote it in the social simulation community.

It is obvious that GNU R excels at the analysis of simulation data. However, very often it can also be neatly used to implement the simulations themselves.

For instance, I have recently implemented the simulation model proposed in Section 4 of the paper Volatility Clustering in Financial Markets: Empirical Facts and Agent-Based Models by Rama Cont. The model is formulated as follows (I give only a brief description; please refer to the paper for more details).

Consider a market with n trading agents and one asset. We simulate the market for times periods. In each period each agent can buy the asset, sell it, or do nothing.
The asset return r[i] in period i equals the number of buy orders minus the number of sell orders, divided by the number of agents n and multiplied by a normalizing constant max.r. Thus it always lies in the interval [-max.r, max.r].
Agents make buy and sell decisions based on random public information about the asset. The stream of signals consists of IID normal random variables with mean 0 and standard deviation s. Each investor holds an internal non-negative decision threshold. If the signal is higher than the threshold, a buy decision is made. If it is lower than minus the threshold, the asset is sold. If the signal is not strong enough, the investor does nothing.
After the return r[i] is determined, each investor independently, with probability p.update, updates its threshold to abs(r[i]).
As you can see, the description is quite lengthy. However, the implementation of the model in GNU R is a genuine snippet, as can be seen below:

cont <- function(times, n, s, max.r, p.update) {
      threshold <- vector("numeric", n)
      signal <- rnorm(times, 0, s)
      r <- vector("numeric", times)
      for (i in 1:times) {
            r[i] <- max.r * (sum(signal[i] > threshold) -
                         sum(signal[i] < (-threshold))) / n
            threshold[runif(n) < p.update] <- abs(r[i])
      }
      return(r)
}
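To see the model in action one can run cont() directly (assuming the function above has been sourced; the value s = 0.05 below is an arbitrary illustrative choice, the remaining parameters match those used in the analysis snippet later in this post). Volatility clustering then shows up as a slowly decaying autocorrelation of absolute returns:

```r
# Example run of the model defined above; s = 0.05 is an arbitrary choice.
set.seed(1)
r <- cont(times = 10000, n = 1000, s = 0.05, max.r = 0.1, p.update = 0.05)
# By construction every return lies in [-max.r, max.r]
range(r)
# Autocorrelation of absolute returns after a burn-in of 1000 periods
# (plot = FALSE just prints the values instead of drawing them)
acf(abs(r[1000:10000]), plot = FALSE)
```
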

And an additional benefit is that one can also analyze the simulation results in GNU R. Here is a very simple example showing the relationship between s and the standard deviation of simulated returns (the initial burn-in period of the simulation is discarded):

sd.cont <- function(s) {
    sd(cont(10000, 1000, s, 0.1, 0.05)[1000:10000])
}

s.in <- runif(100, 0.01, 0.1)
sd.out <- sapply(s.in, sd.cont)
plot(s.in, sd.out)

and here is the resulting plot:
