**I**n connection with the Bernoulli factory post of last week, Richard Brent arXived a short historical note recalling George Forsythe’s algorithm for simulating variables with density proportional to exp{−*G(x)*} when 0 ≤ *G(x)* ≤ 1 (the extension to any upper bound is straightforward). The idea is to avoid computing the exponential function by simulating uniforms *u₁, u₂, …* until the decreasing sequence *G(x)* ≥ *u₁* ≥ *u₂* ≥ … ≥ *uₙ* fails:

since the probability of this event is *G(x)ⁿ/n!*,

the expected number of uniforms used is exp{*G(x)*} and the probability that the run length *n* is even is exp{−*G(x)*}, so a uniform proposal *x* accepted whenever *n* is even has the target density. This turns into a generation method if the support of *G* is bounded. In relation to the Bernoulli factory problem, I think this has potential applications in that, when the function *G(x)* is replaced with an unbiased estimator, the subsequent steps remain valid. This approach would indeed involve computing one single value of *G(x)*, but this is also the case with Latuszyński et al.’s and our solutions… So I am uncertain whether or not this has practical implications. (Brent mentions normal simulation but this is more history than methodology.)
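The rejection scheme above can be sketched in a few lines of R. This is my own minimal illustration, not Brent's or Forsythe's code, and the function name `forsythe` is mine; it assumes the target is proportional to exp{−*G(x)*} on [0,1] with 0 ≤ *G(x)* ≤ 1, and never calls `exp()`:

```r
# Minimal sketch of Forsythe's method: simulate from a density
# proportional to exp(-G(x)) on [0,1], assuming 0 <= G(x) <= 1,
# without evaluating the exponential function.
forsythe <- function(G) {
  repeat {
    x <- runif(1)   # uniform proposal on the support
    u <- G(x)       # head of the decreasing run
    n <- 0          # run length: # of uniforms with G(x) >= u1 >= ... >= un
    repeat {
      v <- runif(1)
      if (v > u) break  # the decreasing sequence fails here
      u <- v
      n <- n + 1
    }
    # P(n even | x) = exp(-G(x)), so accepting on even n
    # yields the target density
    if (n %% 2 == 0) return(x)
  }
}

# e.g., the standard normal restricted to [0,1], via G(x) = x^2/2
samples <- replicate(1e4, forsythe(function(x) x^2 / 2))
```

Note that each proposal evaluates *G(x)* exactly once, which is what makes the substitution of an unbiased estimator for *G(x)* tempting.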

Filed under: R, Statistics Tagged: Bernoulli factory, George Forsythe, John von Neumann, Monte Carlo methods, normal distribution, series expansion, simulation


To **leave a comment** for the author, please follow the link and comment on their blog: **Xi'an's Og » R**.

