*This post was first published on **Learning Data Science** and kindly contributed to R-bloggers.*


Robert Matthews once said: “Ronald Fisher gave scientists a mathematical machine for turning baloney into breakthroughs, and flukes into funding. It is time to pull the plug.” He’s right.

In a previous life, I wrote a thesis in philosophy, in a specific area: epistemology, also called the theory of knowledge, because it asks what knowledge is, how it can be acquired, and to what extent any given subject or entity can be known.

My thesis was about *the tradition, since Cournot, of applying mathematical modelling to the social sphere, and more specifically how climate modelling interacts with interdisciplinary sources of knowledge (mathematics, physics, geography, philosophy)*.

After reading this paper, http://www.academia.edu/1075253/Climate_Change_Epistemic_Trust_and_Expert_Trustworthiness, it seems to me that Bayesian statistics is often misunderstood.

Consider two theses:

- Bayesian statistics is the science of the degree of support that observations give to hypotheses. In that sense, Bayesian statistics is a self-contained paradigm providing tools and techniques for all statistical problems.
- In the classical frequentist view of statistical theory, a statistical procedure is judged by averaging its performance over all possible data.

The Bayesian approach, by contrast, gives prime importance to how a given procedure performs for the actual data observed in a given situation.

The core of this theory was formalised by Karl Popper with two main principles:

- Knowledge cannot start from nothing — from a *tabula rasa* — nor yet from observation. The advance of knowledge consists mainly in the modification of earlier knowledge. Although we may sometimes advance through a chance observation, for example in archaeology, the significance of the discovery will usually depend upon its power to modify our earlier theories.
- Any probability is a degree of belief about something; it is not a property of the world. A scientific model produces data either absolutely or conditionally on a probability, and we can measure how the data modify our degrees of belief (Bayes' principle).

One result of frequentist theory, perhaps the most criticized, is Fisher's p-value.

An example problem: evaluating how important a diploma is for getting a first job.

Hypothesis:

- The diploma has no effect on getting a first job. In the frequentist framework, we compute the p-value, which can be interpreted as the probability of observing a difference at least as large as the one observed in the data, if our hypothesis is true.
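As a quick illustration of that definition (the numbers below are invented for the example, not taken from any study), suppose 50 graduates and 50 non-graduates, of whom 30 and 20 respectively found a first job within a year. A standard frequentist test of the no-effect hypothesis could look like this:

```r
# Hypothetical data: first-job outcomes for graduates vs non-graduates
found_job  <- c(30, 20)  # successes in each group
group_size <- c(50, 50)  # group sizes

# Two-sample proportion test of H0: same job-finding probability in both groups
test <- prop.test(found_job, group_size)
test$p.value  # probability of a difference at least this large if H0 is true
```

If that p-value falls below the usual 5% threshold we "reject" the no-effect hypothesis; otherwise, as we will see below, we can conclude nothing.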

Read more: http://www.answers.com/topic/bayesian-statistics#ixzz2UKZDkq3v

So, let us talk about the p-value through this problem: **If you pass by the same place every day for 10 days and see a beautiful girl there 7 times, can you conclude that she is always there? And then, tomorrow, do you still have a chance to speak with her?**

We’re going to examine both approaches: Fisher's (based on the p-value) and Jeffreys' (Bayesian).

- With Fisher, we have:

H0: p = 0.5 and H1: p ≠ 0.5

where p is the probability of seeing the girl on a given day. If we reject H0 (Fisher), we can conclude that the girl is always there.

The p-value is (by the symmetry of the binomial under p = 0.5, P(X ≥ 7) = P(X ≤ 3)):

```r
# Two-sided p-value for 7 hits out of 10 days under H0: p = 0.5
2 * pbinom(3, 10, 0.5)
# [1] 0.34375
```

which is greater than 5% (the arbitrary threshold that everybody likes). Indeed, we cannot conclude anything... Thanks, Fisher; I can't trust you.
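The same number drops out of R's built-in exact binomial test (a cross-check with `binom.test` from the stats package, which the computation above does by hand):

```r
# Exact two-sided binomial test of H0: p = 0.5, given 7 hits in 10 days
result <- binom.test(7, 10, p = 0.5)
result$p.value
# [1] 0.34375
```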

With Bayes, we take our two hypotheses again and suppose P(H0) = P(H1) = 0.5: a priori, the girl is as likely to be there as not. In the Bayesian approach, we need to compute the likelihood of the data, which requires a prior distribution; in our example it can be summed up as "your a priori belief about seeing her again and talking to her". Concretely, we put prior mass on three candidate values of p: 0.5 (under H0), and 0.6 and 0.7 (together carrying the other half of the mass, under H1).

Let's compute this:

```r
(p <- c(0.5, 0.6, 0.7))                  # candidate values of p
# [1] 0.5 0.6 0.7
(apriori <- c(0.5, 0.3, 0.2))            # prior weights
# [1] 0.5 0.3 0.2
(vraisemblance <- dbinom(7, 10, p))      # likelihood of 7 hits in 10 days
# [1] 0.1171875 0.2149908 0.2668279
(loi.jointe <- apriori * vraisemblance)  # joint: prior x likelihood
# [1] 0.05859375 0.06449725 0.05336559
(p.y <- sum(loi.jointe))                 # marginal probability of the data
# [1] 0.1764566
(aposteriori <- loi.jointe / p.y)        # posterior weights
# [1] 0.3320576 0.3655134 0.3024290
```

So the conditional probability P(H0 | y) is about 33%, and P(H1 | y) = 0.3655 + 0.3024 ≈ 67%.

In other words, there are about two chances out of three that the girl is "always" there — good odds of seeing her again tomorrow.
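A side note (my own extension, not part of the original argument): strictly speaking, the probability of actually seeing her *tomorrow* is not P(H1 | y) but the posterior predictive mean of p — the candidate values of p averaged with their posterior weights — which comes out a bit lower:

```r
p            <- c(0.5, 0.6, 0.7)
apriori      <- c(0.5, 0.3, 0.2)
aposteriori  <- apriori * dbinom(7, 10, p) / sum(apriori * dbinom(7, 10, p))

# Posterior predictive probability of seeing her tomorrow:
# each candidate p, weighted by its posterior probability
sum(aposteriori * p)
# [1] 0.5970371
```

Still better than a coin flip, so the conclusion stands.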

**The Bayesian approach is more hopeful. I like it; I can take my time with the girl next door.**

