# L16.3 LMS Estimation of One Random Variable Based on Another | Summary and Q&A

6.4K views
April 24, 2018
by
MIT OpenCourseWare

## TL;DR

Given an observation, the conditional expectation of the unknown random variable minimizes the mean squared error, making it the optimal (least mean squares) point estimate.


### Q: What is the difference between point estimation with and without an observation?

In the presence of an observation, we condition on it: Bayes' rule turns the prior distribution of the unknown parameter into its posterior (conditional) distribution, and the estimate is based on that posterior. Without an observation, we rely on the prior distribution alone, and the optimal point estimate is simply the prior mean.
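A minimal sketch of this contrast, using a hypothetical discrete example (the prior, observation model, and values are illustrative, not from the lecture): Θ is uniform on {1/4, 1/2, 3/4}, and the observation X given Θ = θ is Bernoulli(θ). Without an observation we report the prior mean; after observing X = 1 we apply Bayes' rule and report the posterior mean.

```python
from fractions import Fraction

# Hypothetical prior: Theta uniform on {1/4, 1/2, 3/4}.
prior = {Fraction(1, 4): Fraction(1, 3),
         Fraction(1, 2): Fraction(1, 3),
         Fraction(3, 4): Fraction(1, 3)}

# Without an observation: the LMS estimate is the prior mean E[Theta].
prior_mean = sum(theta * p for theta, p in prior.items())

# With the observation X = 1 (where X | Theta=theta ~ Bernoulli(theta)):
# posterior via Bayes' rule, P(theta | X=1) ∝ P(X=1 | theta) P(theta) = theta * P(theta).
norm = sum(theta * p for theta, p in prior.items())
posterior = {theta: theta * p / norm for theta, p in prior.items()}

# With the observation: the LMS estimate is the posterior mean E[Theta | X=1].
posterior_mean = sum(theta * q for theta, q in posterior.items())

print(prior_mean)      # 1/2
print(posterior_mean)  # 7/12 — the observation X=1 pulls the estimate upward
```

Observing X = 1 makes larger values of θ more likely, so the estimate moves from 1/2 up to 7/12.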

### Q: How do we obtain the optimal estimate in a conditional universe?

The optimal estimate is the conditional expectation, which minimizes the mean squared error. It takes into account the information provided by the observation.
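The optimality claim can be checked numerically: for any fixed posterior distribution, the constant c that minimizes E[(Θ − c)²] is the mean of that distribution. The sketch below uses a hypothetical posterior (the values and probabilities are made up for illustration) and scans candidate estimates over a grid.

```python
# Hypothetical posterior of Theta given some observation:
# values 0.2, 0.5, 0.9 with probabilities 0.3, 0.5, 0.2.
values = [0.2, 0.5, 0.9]
probs  = [0.3, 0.5, 0.2]

def mse(c):
    """Expected squared error E[(Theta - c)^2] of the constant estimate c."""
    return sum(p * (v - c) ** 2 for v, p in zip(values, probs))

# The conditional expectation (the mean of this distribution).
mean = sum(p * v for v, p in zip(values, probs))

# Brute-force search over a fine grid of candidate estimates.
candidates = [i / 1000 for i in range(1001)]
best = min(candidates, key=mse)

print(mean, best)  # both 0.49 — the grid minimizer coincides with the mean
```

Because the expected squared error is a quadratic function of c, its unique minimizer is exactly the mean, which is what the grid search recovers.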

### Q: How does the mean squared error of the optimal estimate compare to other estimates?

The mean squared error of the optimal estimate is less than or equal to the mean squared error of any other estimate. This holds true for all possible estimates.
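A Monte Carlo sketch of this inequality, under an assumed Gaussian model (not from the lecture): Θ ~ N(0, 1) and X = Θ + W with independent noise W ~ N(0, 1), for which the conditional expectation is known to be E[Θ | X] = X/2. We compare its mean squared error against two alternative estimators.

```python
import random

random.seed(1)

# Assumed model: Theta ~ N(0,1), X = Theta + W, W ~ N(0,1) independent.
# For this model, E[Theta | X] = X / 2.
n = 100_000
mse_opt = mse_raw = mse_zero = 0.0
for _ in range(n):
    theta = random.gauss(0, 1)
    x = theta + random.gauss(0, 1)
    mse_opt  += (theta - x / 2) ** 2   # LMS estimator E[Theta | X]
    mse_raw  += (theta - x) ** 2       # alternative: use X itself
    mse_zero += theta ** 2             # alternative: ignore X, use prior mean 0

mse_opt, mse_raw, mse_zero = mse_opt / n, mse_raw / n, mse_zero / n
print(mse_opt, mse_raw, mse_zero)  # roughly 0.5, 1.0, 1.0
```

The conditional-expectation estimator achieves mean squared error about 0.5, beating both alternatives (each about 1.0), consistent with the theoretical guarantee that no other estimator can do better.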

### Q: What role does the conditional expectation play in point estimation?

The conditional expectation serves as the optimal estimator in a conditional universe: among all estimates, it achieves the smallest mean squared error for the unknown parameter given the observation.

## Summary & Key Takeaways

• Point estimation involves finding a single value to estimate an unknown parameter.

• In a conditional universe where an observation is available, the optimal estimate is the conditional expectation.

• The mean squared error of the conditional expectation is less than or equal to the mean squared error of any other estimate.