Consider testing

$$H_0: \theta \in \Theta_0 \text{ versus } H_1: \theta \not\in \Theta_0.$$

The likelihood ratio statistic is

$$\lambda = 2 \log \left(\frac{\sup_{\theta \in \Theta} \mathcal{L}(\theta)}{\sup_{\theta \in \Theta_0} \mathcal{L}(\theta)} \right) = 2 \log \left( \frac{\mathcal{L}(\hat{\theta})}{\mathcal{L}(\hat{\theta}_0)} \right)$$

where $\hat{\theta}$ is the MLE and $\hat{\theta}_0$ is the MLE when $\theta$ is restricted to lie in $\Theta_0$.

## Problem

The density of a $Beta(\alpha, \beta)$ distribution is given by

$$f(x) = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}x^{\alpha-1}(1 - x)^{\beta-1}\mathbb{I}\{0 < x < 1\}.$$

Suppose that $X_1, \dots, X_n$ is a random sample from a $Beta(\theta, 1)$ population and $Y_1, \dots, Y_m$ is another random sample from a $Beta(\mu, 1)$ population. Also assume that the two samples are independent of each other. How would you conduct the likelihood ratio test of $H_0 : \theta = \mu$ against $H_1 : \theta \neq \mu$?

## Solution

The PDF of the $Beta(\theta, 1)$ distribution is given by

$$f(x;\theta) = \theta x^{\theta-1}\mathbb{I}\{0 < x < 1\}.$$

Under the null hypothesis $H_0 : \theta = \mu$, both samples share a common shape parameter (written $\theta$), and the likelihood function is

$$\mathcal{L}(\theta) = \prod_{i=1}^n \theta x_i^{\theta -1} \prod_{i=1}^m \theta y_i^{\theta -1}.$$

The corresponding log likelihood is

\begin{align*} \log\mathcal{L}(\theta) &= n \log(\theta) + (\theta-1)\sum_{i=1}^n \log(x_i) \\ &\quad + m \log(\theta) + (\theta-1)\sum_{i=1}^m \log(y_i). \end{align*}

To get the MLE, we differentiate the log likelihood with respect to $\theta$ and set the derivative equal to $0$:

$$\frac{d}{d\theta}\log\mathcal{L}(\theta) = \frac{n+m}{\theta} + \sum_{i=1}^n \log(x_i) + \sum_{i=1}^m \log(y_i) = 0.$$

Solving for $\theta$ gives the restricted MLE

$$\hat{\theta}_0 = -\frac{n+m}{\sum_{i=1}^n \log(x_i) +\sum_{i=1}^m \log(y_i)}.$$

Note that $\hat{\theta}_0 > 0$, since $\log(x_i) < 0$ and $\log(y_i) < 0$ for observations in $(0,1)$.
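As a quick check, the restricted MLE can be computed on simulated data. The sketch below uses only the standard library; the seed, sample sizes, and true shape value are arbitrary illustrative choices. It relies on inverse-CDF sampling: for a $Beta(\theta, 1)$ distribution the CDF is $F(x) = x^{\theta}$ on $(0,1)$, so $U^{1/\theta} \sim Beta(\theta, 1)$ when $U \sim \mathrm{Uniform}(0,1)$.

```python
import math
import random

random.seed(0)

# Arbitrary illustrative choices: common true shape under H0, sample sizes.
shape_true, n, m = 2.0, 50, 60

# Inverse-CDF sampling: F(x) = x**shape on (0, 1), so U**(1/shape) ~ Beta(shape, 1).
x = [random.random() ** (1 / shape_true) for _ in range(n)]
y = [random.random() ** (1 / shape_true) for _ in range(m)]

# Restricted MLE: pools the log-observations from both samples.
log_sum = sum(math.log(v) for v in x) + sum(math.log(v) for v in y)
theta0_hat = -(n + m) / log_sum
print(theta0_hat)  # should land near shape_true for moderate sample sizes
```

Since each $-\log(x_i)$ is a positive quantity, the estimate is guaranteed positive, matching the note above.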

The unrestricted likelihood function, with $\theta$ and $\mu$ allowed to differ, is given by

$$\mathcal{L}(\theta, \mu) = \prod_{i=1}^n \theta x_i^{\theta -1} \prod_{i=1}^m \mu y_i^{\mu -1}.$$

To get the MLEs, we differentiate the log likelihood with respect to $\theta$ and $\mu$ and set each derivative to $0$. The MLEs $\hat{\theta}$ and $\hat{\mu}$ are given by

$$\hat{\theta} = -\frac{n}{\sum_{i=1}^n \log(x_i)}, \qquad \hat{\mu} = -\frac{m}{\sum_{i=1}^m \log(y_i)}.$$
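The unrestricted MLEs are computed separately from each sample. A minimal sketch with arbitrary simulated data (same inverse-CDF trick, $U^{1/\theta} \sim Beta(\theta, 1)$; the seed, sample sizes, and true shapes are illustrative choices):

```python
import math
import random

random.seed(1)

# Arbitrary illustrative choices: two different true shapes.
theta_true, mu_true, n, m = 1.5, 3.0, 80, 70

x = [random.random() ** (1 / theta_true) for _ in range(n)]  # ~ Beta(theta_true, 1)
y = [random.random() ** (1 / mu_true) for _ in range(m)]     # ~ Beta(mu_true, 1)

# Each MLE depends only on its own sample.
theta_hat = -n / sum(math.log(v) for v in x)
mu_hat = -m / sum(math.log(v) for v in y)
print(theta_hat, mu_hat)  # roughly theta_true and mu_true
```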

The likelihood ratio statistic can then be calculated as

$$\lambda = 2 \log \left( \frac{\mathcal{L}(\hat{\theta}, \hat{\mu})}{\mathcal{L}(\hat{\theta}_0)} \right) = 2\left[\, n \log(\hat{\theta}) + m \log(\hat{\mu}) - (n+m) \log(\hat{\theta}_0) \,\right],$$

where the second equality follows from substituting the MLEs: the terms involving $\sum \log(x_i)$ and $\sum \log(y_i)$ cancel, since $\hat{\theta} \sum_{i=1}^n \log(x_i) = -n$, $\hat{\mu} \sum_{i=1}^m \log(y_i) = -m$, and $\hat{\theta}_0 \left( \sum_{i=1}^n \log(x_i) + \sum_{i=1}^m \log(y_i) \right) = -(n+m)$. By Wilks' theorem, $\lambda$ is asymptotically $\chi^2_1$ under $H_0$ (the unrestricted parameter space is two-dimensional and $\Theta_0$ is one-dimensional), so we reject $H_0$ at level $\alpha$ when $\lambda$ exceeds the upper $\alpha$ quantile of the $\chi^2_1$ distribution, e.g. $3.841$ for $\alpha = 0.05$.
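Putting the pieces together, here is a minimal end-to-end sketch of the test (standard library only; the seed, sample sizes, and true shapes are arbitrary choices, picked so that $H_0$ is false; $3.841$ is the upper 5% quantile of $\chi^2_1$):

```python
import math
import random

random.seed(2)

# Arbitrary illustrative setup: H0 is false here (theta != mu).
theta_true, mu_true, n, m = 1.0, 3.0, 100, 120

# Inverse-CDF sampling: U**(1/shape) ~ Beta(shape, 1).
x = [random.random() ** (1 / theta_true) for _ in range(n)]
y = [random.random() ** (1 / mu_true) for _ in range(m)]

sx = sum(math.log(v) for v in x)  # sum of log(x_i), negative
sy = sum(math.log(v) for v in y)  # sum of log(y_i), negative

theta_hat = -n / sx                # unrestricted MLE from the x-sample
mu_hat = -m / sy                   # unrestricted MLE from the y-sample
theta0_hat = -(n + m) / (sx + sy)  # restricted MLE under H0

# Likelihood ratio statistic, using the simplified closed form.
lam = 2 * (n * math.log(theta_hat) + m * math.log(mu_hat)
           - (n + m) * math.log(theta0_hat))

CRIT = 3.841  # upper 5% quantile of chi-squared with 1 df
reject = lam > CRIT
print(f"lambda = {lam:.3f}, reject H0: {reject}")
```

With the parameter values above the two shapes differ substantially, so for these sample sizes the test should reject $H_0$; replacing `mu_true` with `theta_true` simulates the null and the rejection rate should be close to 5%.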