The method of moments estimator $\hat{\theta}_n$ is defined to be the value of $\theta$ that solves

\begin{align*} \alpha_1(\hat{\theta}_n) &= \hat{\alpha}_1 \\ \alpha_2(\hat{\theta}_n) &= \hat{\alpha}_2 \\ &\vdots \\ \alpha_k(\hat{\theta}_n) &= \hat{\alpha}_k \end{align*}

where $\hat{\alpha}_j = \frac{1}{n}\sum_{i=1}^n X_i^j$ is the $j$-th sample moment and $\alpha_j(\theta) = \mathbb{E}_{\theta}(X^j)$ is the $j$-th moment of the distribution.

The above defines a system of $k$ equations in $k$ unknowns.
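The sample moments $\hat{\alpha}_j$ on the left-hand side are straightforward to compute. A minimal NumPy sketch (the data values here are arbitrary, purely for illustration):

```python
import numpy as np

def sample_moment(x, j):
    """j-th sample moment: (1/n) * sum_i x_i^j."""
    return np.mean(x**j)

# Arbitrary illustrative data set.
x = np.array([1.2, 0.7, 3.1, 2.4, 1.9])

# First three sample moments alpha_hat_1, alpha_hat_2, alpha_hat_3.
moments = [sample_moment(x, j) for j in (1, 2, 3)]
print(moments)
```

Setting these equal to the corresponding theoretical moments $\alpha_j(\theta)$ and solving for $\theta$ yields the estimator.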

## Problem

Let $X_1, \ldots, X_n$ be i.i.d. draws from the Pareto distribution with density $f(x) = \left(\frac{\alpha m^{\alpha}}{x^{\alpha+1}} \right)\mathbb{I}(x \geq m)$, where $\alpha$ and $m$ are two unknown parameters. Estimate these parameters using the method of moments.

The mean of the Pareto distribution is given by

$$\mathbb{E}(X) = \begin{cases} \infty, & \alpha \leq 1 \\ \dfrac{\alpha m}{\alpha - 1}, & \alpha > 1 \end{cases}$$

The variance of the Pareto distribution is given by

$$\mathbb{V}(X) = \begin{cases} \infty, & \alpha \leq 2 \\ \dfrac{m^2 \alpha}{(\alpha-1)^2(\alpha-2)}, & \alpha > 2 \end{cases}$$

Matching the first two moments, and using $\mathbb{E}(X^2) = \mathbb{V}(X) + \mathbb{E}(X)^2$ to express the second moment (with the first equation substituted for the mean), we get the following equations

\begin{align} \frac{\alpha m}{\alpha-1} &= A \\ \frac{m^2 \alpha}{(\alpha-1)^2(\alpha-2)} + A^2 &= B \end{align}

where

$$A = \frac{1}{n}\sum_{i=1}^n X_i, \qquad B = \frac{1}{n}\sum_{i=1}^n X_i^2$$

Solving the above equations for $\alpha$ and $m$ (eliminating $m$ and using $\alpha(\alpha - 2) = (\alpha-1)^2 - 1$), we get

$$\alpha = 1 + \sqrt{\frac{B}{B-A^2}}, \qquad m = \frac{A \sqrt{B}}{\sqrt{B} + \sqrt{B-A^2}}$$
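As a sanity check, the closed-form estimators can be tried on simulated data. A short NumPy sketch, with arbitrarily chosen true parameters (here $\alpha = 5$, $m = 2$, so that the first two moments exist, and a sample size of $10^5$):

```python
import numpy as np

# True parameters of the simulated Pareto distribution (arbitrary choices,
# with alpha > 2 so that the mean and variance are finite).
alpha_true, m_true = 5.0, 2.0

rng = np.random.default_rng(0)
n = 100_000
# NumPy's pareto() samples the Lomax distribution; shifting by 1 and
# scaling by m gives a Pareto(alpha, m) sample with support x >= m.
x = m_true * (1.0 + rng.pareto(alpha_true, size=n))

# First two sample moments A and B from the derivation above.
A = x.mean()          # (1/n) * sum_i X_i
B = (x**2).mean()     # (1/n) * sum_i X_i^2

# Closed-form method-of-moments estimates.
alpha_hat = 1.0 + np.sqrt(B / (B - A**2))
m_hat = A * np.sqrt(B) / (np.sqrt(B) + np.sqrt(B - A**2))

print(alpha_hat, m_hat)
```

For a large sample the estimates should land close to the true parameter values, consistent with the estimator's consistency under $\alpha > 2$.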