Parametric Inference: Karlin-Rubin Theorem

July 20, 2015

A family of pdfs or pmfs $\{g(t|\theta):\theta\in\Theta\}$ for a univariate random variable $T$ with real-valued parameter $\theta$ has a monotone likelihood ratio (MLR) if, for every $\theta_2>\theta_1$, $g(t|\theta_2)/g(t|\theta_1)$ is a monotone (nonincreasing or nondecreasing) function of $t$ on $\{t:g(t|\theta_1)>0\;\text{or}\;g(t|\theta_2)>0\}$. Note that $c/0$ is defined as $\infty$ if $0<c$.
Consider testing $H_0:\theta\leq\theta_0$ versus $H_1:\theta>\theta_0$. Suppose that $T$ is a sufficient statistic for $\theta$ and the family of pdfs or pmfs $\{g(t|\theta):\theta\in\Theta\}$ of $T$ has an MLR. Then for any $t_0$, the test that rejects $H_0$ if and only if $T>t_0$ is a UMP level $\alpha$ test, where $\alpha=P_{\theta_0}(T>t_0)$ (Casella and Berger, 2001).
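Before turning to the examples, the MLR property is easy to check numerically. Below is a minimal R sketch (the choices $\theta_1=1$, $\theta_2=2$ and the grid of $t$ values are arbitrary, not from the original post) confirming that the normal ratio derived in Example 1 is increasing in $t$:

```r
# Numerical check of the MLR definition for the n(theta, 1) family:
# the ratio g(t | theta2) / g(t | theta1) should be monotone in t.
theta1 <- 1; theta2 <- 2           # any theta2 > theta1
t <- seq(-4, 6, by = 0.01)         # arbitrary grid of t values
ratio <- dnorm(t, mean = theta2) / dnorm(t, mean = theta1)
all(diff(ratio) > 0)               # TRUE: the ratio is increasing in t
```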

Example 1
To better understand the theorem, consider a single observation, $X$, from $\mathrm{n}(\theta,1)$, and test the following hypotheses: $$H_0:\theta\leq\theta_0\quad\mathrm{versus}\quad H_1:\theta>\theta_0.$$ Then for $\theta_1>\theta_0$, the likelihood ratio test statistic is $$\lambda(x)=\frac{f(x|\theta_1)}{f(x|\theta_0)},$$ and the null hypothesis is rejected if $\lambda(x)>k$. To see whether the distribution of the sample has the MLR property, we simplify the above equation as follows: $$\begin{aligned}\lambda(x)&=\frac{\frac{1}{\sqrt{2\pi}}\exp\left[-\frac{(x-\theta_1)^2}{2}\right]}{\frac{1}{\sqrt{2\pi}}\exp\left[-\frac{(x-\theta_0)^2}{2}\right]}\\&=\exp\left[-\frac{x^2-2x\theta_1+\theta_1^2}{2}+\frac{x^2-2x\theta_0+\theta_0^2}{2}\right]\\&=\exp\left[\frac{2x\theta_1-\theta_1^2-2x\theta_0+\theta_0^2}{2}\right]\\&=\exp\left[\frac{2x(\theta_1-\theta_0)-(\theta_1^2-\theta_0^2)}{2}\right]\\&=\exp\left[x(\theta_1-\theta_0)\right]\times\exp\left[-\frac{\theta_1^2-\theta_0^2}{2}\right],\end{aligned}$$ which is increasing as a function of $x$, since $\theta_1>\theta_0$.
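The two densities in Figure 1 can be drawn with base R. This is a minimal sketch; the grid, colors, and legend placement are arbitrary choices:

```r
# Plot the two normal densities of Figure 1: n(1, 1) and n(2, 1)
x <- seq(-3, 6, length.out = 500)
plot(x, dnorm(x, mean = 1), type = "l", col = "blue",
     ylab = "Density", main = "Normal Densities")
lines(x, dnorm(x, mean = 2), col = "red")
legend("topright", legend = c(expression(mu == 1), expression(mu == 2)),
       col = c("blue", "red"), lty = 1)
```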

Figure 1. Normal Densities with $\mu=1,2$.

As an illustration, consider Figure 1. The plot of the likelihood ratio of these models is monotone increasing, as seen in Figure 2, where rejecting $H_0$ if $\lambda(x)>k$ is equivalent to rejecting it if $T\geq t_0$.
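A plot like Figure 2 can be sketched the same way, with $\theta_0=1$ and $\theta_1=2$ matching Figure 1:

```r
# Plot the likelihood ratio lambda(x) = f(x | theta1) / f(x | theta0)
x <- seq(-3, 6, length.out = 500)
lambda <- dnorm(x, mean = 2) / dnorm(x, mean = 1)
plot(x, lambda, type = "l", ylab = expression(lambda(x)),
     main = "Likelihood Ratio")
```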

Figure 2. Likelihood Ratio of the Normal Densities.

And by the factorization theorem, the likelihood ratio test statistic can be written as a function of the sufficient statistic, since the term $h(x)$ cancels out. That is, $$\lambda(t)=\frac{g(t|\theta_1)}{g(t|\theta_0)}.$$ And by the Karlin-Rubin theorem, the test with rejection region $R=\{t:t>t_0\}$ is a uniformly most powerful level-$\alpha$ test, where $t_0$ satisfies the following: $$\begin{aligned}\mathrm{P}(T>t_0|\theta_0)&=\mathrm{P}(T\in R|\theta_0)\\\alpha&=1-\mathrm{P}(X\leq t_0|\theta_0)\\1-\alpha&=\int_{-\infty}^{t_0}\frac{1}{\sqrt{2\pi}}\exp\left[-\frac{(x-\theta_0)^2}{2}\right]\operatorname{d}x.\end{aligned}$$ Hence $t_0$ is the $(1-\alpha)$ quantile of the $\mathrm{n}(\theta_0,1)$ distribution, that is, $t_0=\theta_0+z_{\alpha}$, where $z_{\alpha}$ is the upper-$\alpha$ quantile of the standard normal; thus we reject $H_0$ if $T>\theta_0+z_{\alpha}$.
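As a concrete check, the cutoff and the size of the test can be computed in R. The values $\theta_0=1$ and $\alpha=0.05$ below are arbitrary illustration choices:

```r
# Cutoff t0 for the UMP test of H0: theta <= theta0 at level alpha,
# based on a single observation X ~ n(theta0, 1); reject if X > t0.
theta0 <- 1; alpha <- 0.05
t0 <- qnorm(1 - alpha, mean = theta0)  # t0 = theta0 + z_alpha
# Monte Carlo check that P(T > t0 | theta0) is about alpha
x <- rnorm(1e5, mean = theta0)
mean(x > t0)                           # should be close to 0.05
```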

Example 2
Now consider testing the hypotheses $H_0:\theta\geq\theta_0$ versus $H_1:\theta<\theta_0$ using a single observation $X$ from $\mathrm{Beta}(\theta,2)$, and to be more specific let $\theta_0=4$ and $\theta_1=3$. Can we apply Karlin-Rubin? Of course! Visually, we have something like Figure 3.
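A plot like Figure 3 can be sketched with base R (colors, grid, and legend placement are arbitrary):

```r
# Plot the two Beta densities of Figure 3: Beta(4, 2) and Beta(3, 2)
x <- seq(0, 1, length.out = 500)
plot(x, dbeta(x, shape1 = 4, shape2 = 2), type = "l", col = "blue",
     ylab = "Density", main = "Beta Densities")
lines(x, dbeta(x, shape1 = 3, shape2 = 2), col = "red")
legend("topleft", legend = c(expression(theta == 4), expression(theta == 3)),
       col = c("blue", "red"), lty = 1)
```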

Figure 3. Beta Densities Under Different Parameters.

Note that for this test, $\theta_1<\theta_0$, so the likelihood ratio $\lambda(t)=g(t|\theta_1)/g(t|\theta_0)$ is monotone decreasing in $t$ (see Figure 4), and we have $\lambda(t)>k$ if and only if $T<t_0$, where $t_0$ satisfies the following equations: $$\begin{aligned}\mathrm{P}(T<t_0|\theta_0)&=\mathrm{P}(X<t_0|\theta_0)\\\alpha&=\int_{0}^{t_0}\frac{\Gamma(\theta_0+2)}{\Gamma(\theta_0)\Gamma(2)}x^{\theta_0-1}(1-x)^{2-1}\operatorname{d}x\\\alpha&=\int_{0}^{t_0}\frac{\Gamma(6)}{\Gamma(4)\Gamma(2)}x^{3}(1-x)\operatorname{d}x.\end{aligned}$$ Hence $t_0$ is the $\alpha$ quantile of the $\mathrm{Beta}(4,2)$ distribution, $x_{\alpha}=t_0$, and thus we reject $H_0$ if $T<x_{\alpha}$.
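Again as a concrete check (with the arbitrary choice $\alpha=0.05$), the qbeta function gives the cutoff directly:

```r
# Cutoff t0 for the UMP test of H0: theta >= 4 at level alpha,
# based on a single observation X ~ Beta(theta, 2); reject if X < t0.
alpha <- 0.05
t0 <- qbeta(alpha, shape1 = 4, shape2 = 2)
# Monte Carlo check that P(T < t0 | theta0 = 4) is about alpha
x <- rbeta(1e5, shape1 = 4, shape2 = 2)
mean(x < t0)                         # should be close to 0.05
```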

Figure 4. Likelihood Ratio of the Beta Densities.
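A plot like Figure 4 can be sketched as follows; the grid avoids the endpoints $0$ and $1$, where both densities vanish:

```r
# Plot the likelihood ratio lambda(x) = f(x | theta1 = 3) / f(x | theta0 = 4),
# which is monotone decreasing in x
x <- seq(0.01, 0.99, length.out = 500)
lambda <- dbeta(x, 3, 2) / dbeta(x, 4, 2)
plot(x, lambda, type = "l", ylab = expression(lambda(x)),
     main = "Likelihood Ratio")
```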

Reference

  1. Casella, G. and Berger, R.L. (2001). Statistical Inference. Thomson Learning, Inc.
