If I want to calculate the variance of the sample mean $\bar{X}$ of $n$ independent random variables $X_1, \dots, X_n$, each with variance $\sigma^2$, such as below:
$$\operatorname{Var}(\bar{X}) = \operatorname{Var}\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) = \left(\frac{1}{n}\right)^2 \operatorname{Var}\left(\sum_{i=1}^{n} X_i\right)$$
Which becomes: $\left(\frac{1}{n}\right)^2 \cdot n\sigma^2 = \frac{\sigma^2}{n}$...
My question is WHY does it become $$\left(\frac{1}{n}\right)^2?$$ In other words, why does the $(1/n)$ inside the variance become $(1/n)^2$?
I've read that this is because:
"When a random variable is multiplied by a constant, its variance gets multiplied by the square of the constant."
Again, though, I want to know why.
I've looked in multiple sources but they all seem to gloss over this point. I want to visually see why this is done.
Could someone please demonstrate why the $1/n$ is squared using my example?
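To convince myself that the rule at least holds numerically, I tried a tiny check of my own (the distribution below is just something I made up, and I'm assuming I'm applying the definition of the variance of a random variable correctly): let $X$ take the values $0$ and $2$, each with probability $1/2$, and take the constant to be $2$.
$$E[X] = 1, \qquad \operatorname{Var}(X) = \tfrac{1}{2}(0-1)^2 + \tfrac{1}{2}(2-1)^2 = 1$$
$$E[2X] = 2, \qquad \operatorname{Var}(2X) = \tfrac{1}{2}(0-2)^2 + \tfrac{1}{2}(4-2)^2 = 4 = 2^2 \cdot \operatorname{Var}(X)$$
So the variance really does get multiplied by $2^2 = 4$ rather than by $2$, but this check still doesn't show me why that happens in general.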
Update:
As @symplectomorphic points out in a comment under their answer, my confusion was the result of not realizing there was a difference between the variance of a set of data and the variance of a random variable.
- See @symplectomorphic's other comment for an explanation of the difference.
@symplectomorphic's answer provides a good conceptual walkthrough, while user @Tryss's answer provides the correct mathematical explanation. Thanks to both of you!
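For future readers, here is the key computation as I understand it from those answers (a brief sketch, writing $\mu = E[X]$ and $c$ for any constant): the constant can be pulled out of the expectation because expectation is linear, but it sits inside a square, so it comes out squared:
$$\operatorname{Var}(cX) = E\big[(cX - E[cX])^2\big] = E\big[(cX - c\mu)^2\big] = E\big[c^2(X - \mu)^2\big] = c^2\,E\big[(X - \mu)^2\big] = c^2\operatorname{Var}(X)$$
Applying this with $c = 1/n$, and using $\operatorname{Var}\left(\sum_{i=1}^{n} X_i\right) = n\sigma^2$ for independent $X_i$, gives $\operatorname{Var}(\bar{X}) = \left(\frac{1}{n}\right)^2 \cdot n\sigma^2 = \frac{\sigma^2}{n}$, which is exactly the step I was asking about.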
