
Week 9
Oct 13, 2025
In today’s lecture, we will learn about:
Textbook Reference: SDG 4.1–4.7
| Concept | Definition | Interpretation |
|---|---|---|
| \(E[X]\) | Mean | Long-run average |
| \(Var(X)\) | Variance | Spread around mean |
| \(Cov(X,Y)\) | Covariance | Joint variability |
| \(\rho_{XY}\) | Correlation | Strength of linear relationship |
| \(E[Y|X]\) | Conditional expectation | Best prediction of \(Y\) given \(X\) |
Expectation (or the population mean) of a random variable summarizes its central tendency.
For a discrete r.v. \(X\) with pmf \(p_X(x)\): \[E[X] = \sum_x x p_X(x)\]
For a continuous r.v. \(X\) with pdf \(f_X(x)\): \[E[X] = \int_{-\infty}^{\infty} x f_X(x) dx\]
Intuition: The expectation is the long-run average value if we repeatedly observe \(X\).
| Income (\(x\)) | Probability \(p(x)\) |
|---|---|
| 20,000 | 0.2 |
| 40,000 | 0.5 |
| 80,000 | 0.3 |
\[E[X] = 20{,}000(0.2) + 40{,}000(0.5) + 80{,}000(0.3) = 48{,}000.\]
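A minimal Python sketch of the discrete formula applied to the income pmf above (the variable names are just illustrative):

```python
# Income pmf from the table above: value -> probability
income_pmf = {20_000: 0.2, 40_000: 0.5, 80_000: 0.3}

# E[X] = sum over x of x * p(x)
expected_income = sum(x * p for x, p in income_pmf.items())
print(expected_income)  # ≈ 48,000 (matches the hand calculation)
```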

Let \(a,b\) be constants and \(X,Y\) random variables. Key properties of expectation:

- \(E[aX + b] = aE[X] + b\) (linearity)
- \(E[X + Y] = E[X] + E[Y]\) (additivity, whether or not \(X\) and \(Y\) are independent)
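A quick numerical check of the linearity property, reusing the income pmf from the previous example (a sketch; the constants `a` and `b` are arbitrary values chosen for illustration):

```python
income_pmf = {20_000: 0.2, 40_000: 0.5, 80_000: 0.3}

def expectation(pmf):
    """E[X] for a discrete pmf given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

a, b = 1.05, 500  # arbitrary constants, e.g. a 5% raise plus a 500 bonus

# E[aX + b] computed directly from the transformed values ...
lhs = sum((a * x + b) * p for x, p in income_pmf.items())
# ... matches a*E[X] + b
rhs = a * expectation(income_pmf) + b
print(lhs, rhs)  # both ≈ 50,900
```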
A survey records the number of streaming subscriptions in 3 independent households. Let \(S_i\) be the subscriptions in household \(i\), where \(i \in \{1, 2, 3\}\). Each household has the following probability mass function (pmf): \[ P(S_i=0)=0.5,\quad P(S_i=1)=0.3,\quad P(S_i=2)=0.2. \] Let \(X = S_1 + S_2 + S_3\) be the total subscriptions across 3 independent households.
Since \(S_1,S_2,S_3\) are independent and identically distributed (i.i.d.) with the given pmf, we have \[E(X)=E(S_1+S_2+S_3)=E(S_1)+E(S_2)+E(S_3)=3E(S_i).\]
Using the given pmf, \(E(S_i)=\sum_{j}s_j P(S_i=s_j) = 0(0.5)+1(0.3)+2(0.2)=0.7,\) so \(E(X)=3(0.7)=2.1.\)
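The same arithmetic in Python, with a brute-force check over all 27 outcomes of \((S_1,S_2,S_3)\) (a minimal sketch; `pmf` is just the table above):

```python
from itertools import product

pmf = {0: 0.5, 1: 0.3, 2: 0.2}  # pmf of each S_i

# Shortcut: E[X] = 3 * E[S_i]
e_s = sum(s * p for s, p in pmf.items())
print(3 * e_s)  # ≈ 2.1

# Brute force: enumerate all 27 outcomes of (S_1, S_2, S_3)
e_x = sum((s1 + s2 + s3) * pmf[s1] * pmf[s2] * pmf[s3]
          for s1, s2, s3 in product(pmf, repeat=3))
print(e_x)  # ≈ 2.1, matching the shortcut
```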
Variance measures the dispersion of a random variable around its mean.
\[Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2.\]
The standard deviation is: \[\sigma_X = \sqrt{Var(X)}.\]
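A sketch computing \(Var(X)\) both ways (definition and shortcut formula) for the income pmf from the earlier example:

```python
from math import sqrt

income_pmf = {20_000: 0.2, 40_000: 0.5, 80_000: 0.3}
mean = sum(x * p for x, p in income_pmf.items())  # E[X] ≈ 48,000

# Definition: E[(X - E[X])^2]
var_def = sum((x - mean) ** 2 * p for x, p in income_pmf.items())
# Shortcut: E[X^2] - (E[X])^2
var_short = sum(x ** 2 * p for x, p in income_pmf.items()) - mean ** 2

print(var_def, var_short)  # both ≈ 496,000,000
print(sqrt(var_def))       # standard deviation ≈ 22,271
```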

Let \(a,b\) be constants and \(X,Y\) random variables. Key properties of variance:

- \(Var(aX + b) = a^2 Var(X)\) (adding a constant does not change the spread)
- If \(X\) and \(Y\) are independent, \(Var(X + Y) = Var(X) + Var(Y)\)
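A numerical check of \(Var(aX+b)=a^2Var(X)\) (a sketch reusing the income pmf; `a` and `b` are arbitrary illustrative constants):

```python
income_pmf = {20_000: 0.2, 40_000: 0.5, 80_000: 0.3}

def var(pmf):
    """Var(X) = E[X^2] - (E[X])^2 for a discrete pmf {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    return sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2

a, b = 1.05, 500  # arbitrary constants
# pmf of aX + b: same probabilities attached to the shifted and scaled values
transformed = {a * x + b: p for x, p in income_pmf.items()}

print(var(transformed))          # ≈ 546,840,000
print(a ** 2 * var(income_pmf))  # same value: the shift b drops out, a enters squared
```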
For a random variable \(X\) and a positive integer \(k\), \(E(X^k)\) is called the \(k\)-th moment of \(X\).
Moments provide additional information about the shape of a distribution.
Higher moments describe:

- Skewness (asymmetry)
- Kurtosis (tailedness)
For those interested, a proof is available in SDG, pp. 244–245.
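A sketch computing the central moments of the income example and the standardized skewness \(E[(X-\mu)^3]/\sigma^3\) and kurtosis \(E[(X-\mu)^4]/\sigma^4\) (the helper function is illustrative, not from the textbook):

```python
income_pmf = {20_000: 0.2, 40_000: 0.5, 80_000: 0.3}

def moment(pmf, k, central=False):
    """k-th raw moment E[X^k], or central moment E[(X - mu)^k] if central=True."""
    mu = sum(x * p for x, p in pmf.items()) if central else 0.0
    return sum((x - mu) ** k * p for x, p in pmf.items())

sigma = moment(income_pmf, 2, central=True) ** 0.5

skewness = moment(income_pmf, 3, central=True) / sigma ** 3  # positive: right-skewed incomes
kurtosis = moment(income_pmf, 4, central=True) / sigma ** 4  # measures tail heaviness
print(skewness, kurtosis)
```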
Covariance measures joint variability between two variables: \[Cov(X,Y) = E[(X - E[X])(Y - E[Y])]=E[XY]-E[X]E[Y].\]
Correlation standardizes covariance: \[\rho_{XY} = \frac{Cov(X,Y)}{\sigma_X \sigma_Y}.\]
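A minimal sketch with a small joint pmf (the numbers are hypothetical, invented only for illustration) showing how \(Cov(X,Y)\) and \(\rho_{XY}\) are computed from the definitions:

```python
from math import sqrt

# Hypothetical joint pmf {(x, y): P(X=x, Y=y)} -- made up for illustration only
joint = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.2, (1, 2): 0.4}

def e(g):
    """E[g(X, Y)] under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mean_x, mean_y = e(lambda x, y: x), e(lambda x, y: y)

cov = e(lambda x, y: x * y) - mean_x * mean_y  # E[XY] - E[X]E[Y]
sd_x = sqrt(e(lambda x, y: x ** 2) - mean_x ** 2)
sd_y = sqrt(e(lambda x, y: y ** 2) - mean_y ** 2)

print(cov, cov / (sd_x * sd_y))  # covariance and correlation (the latter lies in [-1, 1])
```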
Let \(a,b\) be constants and \(X,Y\) random variables. Key properties of covariance and correlation:

- \(Cov(aX + b, Y) = a\,Cov(X,Y)\) and \(Cov(X, X) = Var(X)\)
- If \(X\) and \(Y\) are independent, then \(Cov(X,Y) = 0\) (the converse does not hold in general)
- \(-1 \le \rho_{XY} \le 1\)
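The scaling property can be checked numerically with the same hypothetical joint pmf (a sketch; `a` and `b` are arbitrary constants):

```python
joint = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.2, (1, 2): 0.4}  # hypothetical joint pmf

def e(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

def cov(f, g):
    """Cov(f, g) = E[fg] - E[f]E[g] for functions of (X, Y)."""
    return e(lambda x, y: f(x, y) * g(x, y)) - e(f) * e(g)

a, b = 3.0, -1.0
lhs = cov(lambda x, y: a * x + b, lambda x, y: y)  # Cov(aX + b, Y)
rhs = a * cov(lambda x, y: x, lambda x, y: y)      # a * Cov(X, Y)
print(lhs, rhs)                                    # equal up to rounding
```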
Conditional expectation is the expected value of \(Y\) given information about \(X\).
\[E[Y|X=x] = \sum_y y \, p_{Y|X}(y|x) \quad \text{or} \quad \int y f_{Y|X}(y|x) dy.\]
It represents the best predictor of \(Y\) given \(X\).
In other words, the predictor \(d(X)=E[Y|X]\) minimizes the mean squared error (MSE) \(E[(Y-d(X))^2]\) over all functions \(d\).
Let \(Y\) be wage and \(X\) be years of education.
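To make the wage example concrete, here is a simulation sketch (the data-generating process and coefficients are hypothetical, invented purely for illustration): it estimates \(E[Y|X=x]\) by the average wage at each education level and confirms that this predictor has a lower MSE than the unconditional mean \(E[Y]\).

```python
import random

random.seed(0)

# Hypothetical DGP: wage = 5 + 2*educ + noise (invented numbers, for illustration only)
educ_levels = [10, 12, 14, 16]
data = [(e, 5 + 2 * e + random.gauss(0, 3))
        for e in random.choices(educ_levels, k=50_000)]

# Estimate E[Y | X = x] by the average wage within each education group
cond_mean = {x: sum(w for e, w in data if e == x) / sum(1 for e, _ in data if e == x)
             for x in educ_levels}
print(cond_mean)  # each entry ≈ 5 + 2*x

# The conditional-mean predictor beats the unconditional mean E[Y] on MSE
ybar = sum(w for _, w in data) / len(data)
mse_cond = sum((w - cond_mean[e]) ** 2 for e, w in data) / len(data)
mse_uncond = sum((w - ybar) ** 2 for e, w in data) / len(data)
print(mse_cond, mse_uncond)  # ≈ 9 (the noise variance) vs. a noticeably larger value
```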
| Concept | Definition | Interpretation |
|---|---|---|
| \(E[X]\) | Mean | Long-run average |
| \(Var(X)\) | Variance | Spread around mean |
| \(Cov(X,Y)\) | Covariance | Joint variability |
| \(\rho_{XY}\) | Correlation | Strength of linear relationship |
| \(E[Y|X]\) | Conditional expectation | Best prediction of \(Y\) given \(X\) |
Next lecture: Exam II Review
ECON2250 Statistics for Economics - Fall 2025 - Maghfira Ramadhani