Lecture 9: Expectation, Indicator Random Variables, Linearity | Statistics 110 | Summary and Q&A
Summary
In this video, the speaker discusses random variables, their distributions, and how to compute averages. They start by explaining the cumulative distribution function (CDF) and its properties, then discuss independence of random variables and how to compute probabilities using CDFs and PMFs. The main topic, expectation, is then introduced and the expected value is defined. The speaker works through examples of finding the expected value of different distributions, such as the Bernoulli and hypergeometric distributions, and introduces the geometric distribution, showing how to find its expected value. The video emphasizes the use of indicator random variables, linearity, and symmetry in computing expected values.
Questions & Answers
Q: What is the cumulative distribution function (CDF)?
The CDF of a random variable X is the function F(x) = P(X <= x): it gives the probability that the random variable is less than or equal to a given value. The CDF is well-defined for all real numbers and has properties such as being non-decreasing, right continuous, approaching zero as x goes to negative infinity, and approaching one as x goes to positive infinity.
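As a concrete illustration (not from the lecture), here is a minimal Python sketch of the CDF of a fair six-sided die, evaluated at arbitrary real inputs:

```python
# Minimal sketch: the CDF of a fair six-sided die, F(x) = P(X <= x),
# which is defined for every real x, not just the support {1, ..., 6}.
def die_cdf(x):
    # P(X <= x) = (number of faces <= x) / 6
    return sum(1 for face in range(1, 7) if face <= x) / 6

print(die_cdf(-1))   # 0.0  (below the support)
print(die_cdf(3.5))  # 0.5  (faces 1, 2, 3)
print(die_cdf(6))    # 1.0  (entire support)
```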
Q: How can the CDF be used to compute probabilities?
From the CDF, probabilities of events can be computed by subtracting CDF values at the endpoints of the event: P(a < X <= b) = F(b) - F(a). For example, to find the probability that a random variable falls in an interval, subtract the CDF value at the lower endpoint from the CDF value at the upper endpoint.
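A short sketch of this subtraction, assuming SciPy is available (the Binomial(10, 0.5) example is hypothetical, not from the lecture):

```python
# Sketch: P(a < X <= b) = F(b) - F(a) for X ~ Binomial(n=10, p=0.5).
from scipy.stats import binom

n, p = 10, 0.5
a, b = 3, 7  # probability that X is in {4, 5, 6, 7}
prob = binom.cdf(b, n, p) - binom.cdf(a, n, p)
print(prob)  # ~0.7734

# Cross-check by summing the PMF directly over the same values
print(sum(binom.pmf(k, n, p) for k in range(a + 1, b + 1)))
```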
Q: What are the properties of a CDF?
A CDF has three defining properties. First, it is non-decreasing: as the input value increases, the CDF can only stay the same or increase. Second, it is right continuous: at every point, including jump points, the value of the CDF equals its limit from the right. Third, the CDF approaches zero as the input goes to negative infinity and approaches one as the input goes to positive infinity.
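These three properties can be checked numerically; here is a sketch using SciPy's Binomial CDF (an assumed example, not from the lecture):

```python
# Sketch: numerically checking the three CDF properties
# for X ~ Binomial(10, 0.5).
from scipy.stats import binom

def F(x):
    return binom.cdf(x, 10, 0.5)

xs = [x / 10 for x in range(-50, 151)]
# 1. Non-decreasing
assert all(F(x1) <= F(x2) for x1, x2 in zip(xs, xs[1:]))
# 2. Right continuous: F(k) matches the limit from the right at each jump k
assert all(abs(F(k) - F(k + 1e-9)) < 1e-6 for k in range(11))
# 3. Limits: F -> 0 at -infinity, F -> 1 at +infinity
assert F(-1e6) == 0.0 and F(1e6) == 1.0
print("all three CDF properties hold")
```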
Q: How can the PMF be recovered from the CDF?
The PMF, or probability mass function, gives the probability of each individual value of a discrete random variable. In the CDF, the size of the jump at a value a is exactly that value's probability: P(X = a) = F(a) - F(a-), where F(a-) is the limit from the left. So by reading off the jump size at each point of the support, the PMF can be recovered.
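A minimal sketch of reading off jump sizes, again using the hypothetical fair-die example:

```python
# Sketch: recovering the PMF from CDF jump sizes, P(X = a) = F(a) - F(a-),
# for a fair six-sided die.
def die_cdf(x):
    return sum(1 for face in range(1, 7) if face <= x) / 6

eps = 1e-9  # approximate the limit from the left
pmf = {a: die_cdf(a) - die_cdf(a - eps) for a in range(1, 7)}
print(pmf)  # each jump has size 1/6, the PMF at each support point
```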
Q: What is the definition of independence for random variables?
Two random variables X and Y are independent if, for all values x and y, the probability of the joint event equals the product of the individual probabilities: P(X <= x, Y <= y) = P(X <= x) * P(Y <= y). Independence means that knowing the value of one random variable does not change the distribution of the other.
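A simulation sketch of this definition, assuming two independent fair dice (a hypothetical example):

```python
# Sketch: checking P(X <= x, Y <= y) = P(X <= x) * P(Y <= y) by simulation
# for two independent fair dice.
import random

random.seed(0)
trials = 100_000
x0, y0 = 3, 5
joint = count_x = count_y = 0
for _ in range(trials):
    x, y = random.randint(1, 6), random.randint(1, 6)
    joint += (x <= x0) and (y <= y0)
    count_x += x <= x0
    count_y += y <= y0

print(joint / trials)                          # ~ 0.4167
print((count_x / trials) * (count_y / trials)) # ~ 0.4167 = (3/6) * (5/6)
```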
Q: How can the expected value of a random variable be computed?
The expected value, or mean, of a discrete random variable X is E(X) = sum over x of x * P(X = x): the sum, over the support, of each value times its probability. This can be computed directly from the PMF, or, in many cases, more easily using indicator random variables and linearity.
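A minimal sketch of this sum for a fair die (a hypothetical example, not from the lecture):

```python
# Sketch: E(X) = sum over x of x * P(X = x), for a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}
expected = sum(x * p for x, p in pmf.items())
print(expected)  # 3.5
```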
Q: What is the expected value of a Bernoulli random variable?
The expected value of a Bernoulli random variable equals the probability of success: if X is 1 with probability p and 0 with probability 1 - p, then E(X) = 1 * p + 0 * (1 - p) = p.
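A quick Monte Carlo check of E(X) = p (the choice p = 0.3 is an assumption for illustration):

```python
# Sketch: sample mean of Bernoulli(p = 0.3) draws approaches E(X) = p.
import random

random.seed(0)
p = 0.3
samples = [1 if random.random() < p else 0 for _ in range(100_000)]
print(sum(samples) / len(samples))  # ~ 0.3
```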
Q: How can expected values be computed for distributions involving indicators?
For random variables built from indicators, the expected value can be computed from the probabilities of the events the indicators correspond to: the expected value of the indicator of an event equals the probability of that event, E(I_A) = P(A), since the indicator is a Bernoulli random variable.
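A sketch of E(I_A) = P(A) for the hypothetical event A = "die roll is even":

```python
# Sketch: the sample mean of an indicator approximates P(A),
# for A = "roll of a fair die is even".
import random

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]
indicator_mean = sum(1 for r in rolls if r % 2 == 0) / len(rolls)
print(indicator_mean)  # ~ 0.5 = P(A)
```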
Q: What is the expected value of a hypergeometric distribution?
The lecture's hypergeometric example is the number of aces in a five-card hand. Writing the count as a sum of indicators, one per card, and using symmetry and linearity, the expected value is the number of cards times the probability that any one card is an ace: E(X) = 5 * (4/52) = 5/13. This works even though the indicators are dependent.
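A simulation sketch of the lecture's example, checking the answer 5/13 ~ 0.3846:

```python
# Sketch: expected number of aces in a 5-card hand drawn without
# replacement; symmetry and linearity give 5 * (4/52) = 5/13.
import random

random.seed(0)
deck = ["ace"] * 4 + ["other"] * 48
trials = 100_000
total_aces = 0
for _ in range(trials):
    hand = random.sample(deck, 5)  # draw 5 cards without replacement
    total_aces += hand.count("ace")

print(total_aces / trials)  # ~ 0.3846
print(5 * 4 / 52)           # exact: 5/13
```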
Q: What is the expected value of a geometric distribution?
With the convention used in the lecture, a geometric random variable with parameter p counts the number of failures before the first success in a series of independent Bernoulli trials. Its expected value is E(X) = q/p, where q = 1 - p. (Under the alternative convention that counts the trials up to and including the first success, the expected value is 1/p.)
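A simulation sketch of the failures-before-first-success convention (the choice p = 0.25 is an assumption for illustration):

```python
# Sketch: X = number of failures before the first success, X ~ Geom(p);
# compare the sample mean with q/p = (1 - p)/p.
import random

random.seed(0)
p = 0.25

def geometric_failures(p):
    failures = 0
    while random.random() >= p:  # each trial fails with probability 1 - p
        failures += 1
    return failures

samples = [geometric_failures(p) for _ in range(100_000)]
print(sum(samples) / len(samples))  # ~ 3.0
print((1 - p) / p)                  # exact: 3.0
```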
Q: What is linearity of expectation?
Linearity of expectation is the property that the expected value of a sum of random variables equals the sum of their expected values: E(X + Y) = E(X) + E(Y), and E(cX) = c * E(X) for a constant c. Crucially, this holds even when the random variables are dependent, which is what makes the indicator method so powerful.
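A sketch of linearity under an extreme form of dependence (Y = X, a hypothetical example):

```python
# Sketch: linearity holds even under dependence. X is a fair die roll and
# Y = X (perfectly dependent), yet E(X + Y) = E(X) + E(Y).
import random

random.seed(0)
xs = [random.randint(1, 6) for _ in range(100_000)]
ys = xs  # Y = X: maximally dependent on X

def mean(values):
    return sum(values) / len(values)

print(mean([x + y for x, y in zip(xs, ys)]))  # ~ 7.0
print(mean(xs) + mean(ys))                    # ~ 7.0
```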