
To prove that the sample mean is an unbiased estimator of the population mean, we use the following step: $\mathbb{E}\left[\sum_{i=1}^n \frac{X_i}{n}\right] = \frac{1}{n} \sum_{i=1}^n \mathbb{E}[X_i] = \mu$, given that $\mathbb{E}[X_i] = \mu$ for each $i$.

Now my question is: if we take, say, the first random variable $X_1$ as an estimator, we can go ahead and say that since $\mathbb{E}[X_1] = \mu$, $X_1$ is an unbiased estimator. Does that mean $X_1$, or for that matter any single observation, is an unbiased estimator?

I know it is not. I am making a conceptual mistake somewhere in my thinking; please let me know where I am going wrong.

[Edit] As commented below, for i.i.d. samples every single observation is indeed an unbiased estimator of the population mean.
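A quick simulation makes the claim above concrete (a sketch using numpy; the normal distribution, its parameters, and the sample size are arbitrary choices, not part of the question): averaged over many repeated samples, both the sample mean and the single observation $X_1$ land on $\mu$.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, trials = 5.0, 30, 200_000

# Each row is one i.i.d. sample of size n from a distribution with mean mu.
samples = rng.normal(loc=mu, scale=2.0, size=(trials, n))

# Two estimators of mu, evaluated on every trial:
sample_means = samples.mean(axis=1)   # the sample mean  \bar{X}
first_obs = samples[:, 0]             # the single observation X_1

# Both are unbiased: their averages over many trials approach mu.
print(sample_means.mean())  # ≈ 5.0
print(first_obs.mean())     # ≈ 5.0
```

Unbiasedness says nothing about spread: `first_obs` scatters much more widely around $\mu$ than `sample_means` does, which is exactly the point made in the comments below.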

    $\begingroup$ The sample mean is not used to estimate the sample mean. That is a nonsense statement. If the X$_i$ are iid then any single observation is an unbiased estimator of the population mean (assuming it exists). The reason to take the average of all n is that it reduces the variance by a factor of 1/n. $\endgroup$ Commented Nov 22, 2017 at 5:32
  • $\begingroup$ Apologies, I meant population mean. Edited. $\endgroup$ Commented Nov 22, 2017 at 5:35
  • $\begingroup$ @MichaelChernick Also thanks, in case of iid's every random sample indeed is an estimation of sample mean!!! that is what I wanted to be assured of. Thanks!! $\endgroup$ Commented Nov 22, 2017 at 5:36

1 Answer


It looks like you are making the mistake of confusing the data (the realizations) with the random variables.

Let a random variable $X$ have mean $\mu$, and let $X_1, \ldots, X_n$ be an i.i.d. sample from $X$. The sample mean $\hat{\mu} = \frac{1}{n} \sum_{i=1}^n X_i$ is an unbiased estimator of $\mu$, because $\mathbb{E}[\mu - \hat{\mu}] = 0$, a fact that we will use later. The realizations $(x_1, \ldots, x_n)$ give the corresponding estimate $\frac{1}{n} \sum x_i$, which is a number, not a random variable. In fact, since the data are drawn i.i.d. from $X$, each individual $X_i$ is also an unbiased estimator of the mean; the reason we take many samples is to lower the variance of our estimator.
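The variance-reduction point can be checked numerically (a sketch; the standard normal and the sample size $n = 25$ are illustrative assumptions): across many repeated samples, the variance of a single observation is about $n$ times the variance of the sample mean, matching $\operatorname{Var}(\bar{X}) = \sigma^2 / n$.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, trials = 0.0, 1.0, 25, 100_000

# trials independent samples, each of size n
samples = rng.normal(mu, sigma, size=(trials, n))

var_single = samples[:, 0].var()        # Var(X_1)      ≈ sigma^2
var_mean = samples.mean(axis=1).var()   # Var(\bar{X})  ≈ sigma^2 / n

print(var_single / var_mean)  # ≈ 25, i.e. ≈ n
```

Both estimators are centered on $\mu$; averaging buys nothing in bias, only a factor-of-$n$ reduction in variance.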

Now if, as you mentioned, we have multiple random variables, $X_1, \ldots, X_p$, each of which has mean $\mu$, then $\mathbb{E}\left[\frac{1}{p}\sum_{i=1}^p X_i \right] = \mu$ as you suggest. Furthermore, $\mathbb{E}[X_1] = \mu$, that is, $\mathbb{E}[\mu - X_1] = 0$. This means that $X_1$ itself is an unbiased estimator of $\mu$, and its observed realization is the corresponding estimate. There is nothing startling here, but it is distinct, I think, from the question you meant to be asking, which was answered in the previous paragraph. This distinction is important and worth understanding well.

  • Thanks! This was insightful, but I have a question: if we consider the expression $\mathbb{E}[\mu - X_1]$, does that not imply zero too? By linearity we can take $\mu - \mathbb{E}[X_1] = 0$. I understand that $X_1$ is also an unbiased estimator. Apologies if my questions are really naive. — Commented Nov 22, 2017 at 5:56
  • You forgot the expectation on the $X$ again. — Commented Nov 22, 2017 at 6:07
  • Since $X_1$ is a random variable, should not $\mu - X_1$ also be a random variable? — Commented Nov 22, 2017 at 6:16
  • Sorry, yes it is. An estimate of $\mu$ must be a scalar, though. Your math is correct in these comments, but there might be a misunderstanding in terminology. Try the wiki article on estimators and estimates for a definition; that may clarify some confusion. — Commented Nov 22, 2017 at 6:31
  • @StephanKolassa Yes, I agree. I tried to make that distinction clear, and it has been danced around further in these comments. — Commented Nov 22, 2017 at 7:41
