Final Exam Study Guide
Our final exam is Friday, December 12, from 9:00 AM to 12:00 PM. There will be 11 problems, and they will break down as follows:
- (an undisclosed check on your visual intuition)
- Probability spaces and rules
- Counting (ugh!)
- Conditional probability and independence
- Discrete random variables
- Absolutely continuous random variables
- The moment-generating function
- Transformations of random variables
- Joint distributions
- Maximum likelihood estimation
- Bayesian inference
Problems 0 - 8 will be similar to the midterms. Since you’ve had more time to live with this material, I reserve the right to include problems that might stretch you a bit. For Problems 9 and 10, I assure you there will be no surprises. You can expect problems that are exactly like the examples you’ve seen in lecture, lab, Problem Set 7, and this study guide. The distributions may be different, and the details might change, but the format and the steps are the same. If you work every practice problem from scratch on your own and then study the solutions carefully, you are ready.
If your final exam score is better than your lowest midterm score, I will replace the low midterm with the final.
1. Probability spaces and rules
Problem 1
Suppose the events \(A\) and \(B\) are disjoint. Under what conditions are \(A^{c}\) and \(B^{c}\) also disjoint events?
Problem 2
Let \(A_1,\,A_2,\,...,\,A_n\subseteq S\) be a finite collection of possibly overlapping events in some probability space. Show that
\[ P\left(\bigcap_{i=1}^n A_i\right)\geq \sum\limits_{i=1}^nP(A_i)-n+1. \]
Want more review?
- Lecture notes on axioms and rules;
- Problem Set 1: problems 4 - 8;
- Problem Set 2: problems 2 - 3;
- Midterm 1 Study Guide: problems 1 - 4;
- Midterm 1: problem 1.
2. Counting
Problem 3
Four people are going to play bridge, which begins with an entire 52-card deck being dealt at random, 13 cards per player. What is the probability that each player is dealt exactly one ace?
There are several ways to do this calculation. You should try it on your own first; then go read Example 1.8.9 and the subsequent discussion on pages 37 - 39 of DeGroot & Schervish.
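If you want a numerical referee before you check the book, here is a minimal Monte Carlo sketch, assuming numpy; the seed and trial count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000
aces = np.arange(4)                    # label the cards 0..51 and call 0-3 the aces
hits = 0
for _ in range(n_trials):
    hands = rng.permutation(52).reshape(4, 13)   # deal 13 cards to each player
    if all(np.isin(hand, aces).sum() == 1 for hand in hands):
        hits += 1
print(hits / n_trials)                 # compare with your exact answer
```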
Problem 4
An evil cult has its secret meetings in the basement of a building that has 15 above-ground floors. When the meeting is over, the nine cult members enjoy coffee and donuts and then depart to sow the seeds of villainy. They board the elevator, and each person randomly and independently selects one of the 15 floors to visit. Compute the probability of each event below (a simulation sketch for checking your answers follows the list):
- No two passengers share a floor;
- At least two passengers share a floor;
- Exactly two passengers share a floor;
- No passengers get off at the top floor;
- At least one passenger gets off at floor 1;
- All passengers get off on consecutive floors;
- All passengers get off below floor 10.
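You should do every part by hand, but brute force makes a good referee. A minimal sketch, assuming numpy, with arbitrary seed and trial count; the "consecutive floors" line takes "consecutive" to mean nine distinct floors forming an unbroken run, which is one reading of that part:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200_000
floors = rng.integers(1, 16, size=(n_trials, 9))      # 9 riders, floors 1-15
distinct = np.array([len(set(row)) for row in floors])
print("no two share:      ", np.mean(distinct == 9))
print("at least two share:", np.mean(distinct < 9))
print("exactly two share: ", np.mean(distinct == 8))  # one pair, everyone else alone
print("top floor empty:   ", np.mean(~(floors == 15).any(axis=1)))
print("someone at floor 1:", np.mean((floors == 1).any(axis=1)))
spread = floors.max(axis=1) - floors.min(axis=1)
print("consecutive floors:", np.mean((distinct == 9) & (spread == 8)))
print("all below floor 10:", np.mean((floors < 10).all(axis=1)))
```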
Want more review?
- Lecture notes on counting theory;
- Lecture notes with birthday problem and other worked examples;
- Lab 2 worked examples;
- Problem Set 1: problems 9 - 10;
- Problem Set 2: problems 4 - 8;
- Problem Set 3: problems 2 - 3;
- Midterm 1 Study Guide: problems 5 - 7;
- Midterm 1: problems 2 - 3;
- Odd-numbered exercises in DeGroot & Schervish Sections 1.6 - 1.8, 1.12.
3. Conditional probability and independence
Problem 5
A JP Morgan recruiter is walking around campus with three bone-colored business cards. Two of the cards are normal; they have writing on one side and they are blank on the other. The third card was misprinted; the writing was duplicated on both sides. The recruiter drops one of the cards and orders his valet to pick it up. When the valet reaches for the card, he sees writing facing up. What is the probability that the recruiter dropped the misprinted card?
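This is a Bayes' rule problem, and Bayes' rule problems are easy to referee by simulation. A minimal sketch, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
# cards 0 and 1 are normal (one writing face); card 2 is the misprint (two writing faces)
faces = [("writing", "blank"), ("writing", "blank"), ("writing", "writing")]
writing_up, misprint_too = 0, 0
for _ in range(200_000):
    card = rng.integers(3)                    # a uniformly random card is dropped
    side = faces[card][rng.integers(2)]       # and lands on a uniformly random side
    if side == "writing":
        writing_up += 1
        misprint_too += (card == 2)
print(misprint_too / writing_up)              # compare with your Bayes' rule answer
```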
Problem 6
Suppose that \(k\) events \(B_1\), \(B_2\), …, \(B_k\) form a partition of the sample space \(S\), and \(A\subseteq S\) is some event with \(P(A)>0\). Show that if \(P(B_1\mid A)<P(B_1)\), then \(P(B_i\mid A)>P(B_i)\) for at least one \(i\in\{1,\,2,\,...,\,k\}\).
Want more review?
- Lecture notes with worked examples;
- The lil’ app on disease testing;
- Problem Set 2: problems 9 - 10;
- Problem Set 3: problems 4 - 8;
- Problem Set 4: problems 2 - 3;
- Midterm 1 Study Guide: problems 8 - 10;
- Midterm 1: problems 3 - 5;
- Odd-numbered exercises in DeGroot & Schervish Chapter 2.
4. Discrete random variables
Problem 7
Consider a sequence of \(n\) independent flips of a fair coin. Define a streak of heads as an uninterrupted run of heads, and define a streak of tails analogously. We are interested in the number of streaks in a sequence of flips. For example, consider the sequence below:
\[ hhhttthhttthhhhhthtthhhh. \]
It contains 9 streaks. Here is the sequence again with the streaks separated:
\[ hhh\quad ttt\quad hh\quad ttt\quad hhhhh\quad t\quad h\quad tt\quad hhhh. \]
In a sequence of \(n\) flips of a fair coin, let \(Y\) be the number of streaks. What is the range of \(Y\), and what is its PMF?
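Once you have a conjectured PMF, test it empirically. A minimal sketch, assuming numpy (\(n=24\) matches the example above, but any \(n\) works):

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_trials = 24, 100_000
flips = rng.integers(2, size=(n_trials, n))            # 0 = tails, 1 = heads
# a new streak begins at flip 1 and wherever the coin changes face
streaks = 1 + (np.diff(flips, axis=1) != 0).sum(axis=1)
values, counts = np.unique(streaks, return_counts=True)
print(dict(zip(values, np.round(counts / n_trials, 4))))   # empirical PMF of Y
```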
Problem 8
Let \(X\sim\text{Exponential}(\lambda)\) for some arbitrary \(\lambda>0\). Define a new random variable \(Y=\lceil X\rceil\). Recall that the ceiling function \(\lceil\cdot\rceil\) is the function that rounds a number up to the next integer. So \(\lceil0.5\rceil=1\), \(\lceil13.1\rceil=14\), and so on.
- What is the range of \(Y\)?
- What is the PMF of \(Y\)?
- Does \(Y\) belong to a familiar distribution family?
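Before you commit to a named family, tabulate the PMF empirically. A minimal check, assuming numpy and that \(\lambda\) is the rate (so numpy's scale parameter is \(1/\lambda\)); the value of \(\lambda\) is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.7                                        # arbitrary rate for the check
y = np.ceil(rng.exponential(scale=1 / lam, size=200_000)).astype(int)
values, counts = np.unique(y, return_counts=True)
print(dict(zip(values[:6], np.round(counts[:6] / y.size, 4))))
# evaluate your PMF formula at k = 1, 2, ..., 6 and compare
```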
Want more review?
- Lecture notes introducing random variables;
- Lecture notes on special families of discrete random variables;
- Lecture notes on expected value;
- Lab 5 exercises;
- Problem Set 2: problem 7;
- Problem Set 3: problems 9 - 10;
- Problem Set 4: problems 4 - 10;
- Midterm 1 Study Guide: problems 11 - 12;
- Midterm 2 Study Guide: problems 1 - 2;
- Midterm 1: problem 6;
- Midterm 2: problem 2.
5. Absolutely continuous random variables
Problem 9
Imagine \(X\) has this cdf:
\[ F(x)=\frac{1}{1+\exp\left(-\frac{x-\mu}{s}\right)},\quad x\in\mathbb{R}. \]
Here, \(\mu\in\mathbb{R}\) and \(s>0\) are just constants.
- What is the pdf of \(X\)?
- What is the mean of \(X\)?
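Because \(F\) is handed to you in closed form, you can invert it and sample. A minimal check on your mean, assuming numpy, with arbitrary \(\mu\) and \(s\):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, s = 2.0, 1.5                           # arbitrary constants for the check
u = rng.uniform(size=500_000)
x = mu + s * np.log(u / (1 - u))           # inverse-cdf sampling: solve F(x) = u
print(x.mean())                            # compare with your answer for E(X)
```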
Problem 10
Here is the goofy pdf of some nonnegative random variable \(X\):
\[ f(x) = e^{1-e^x+x},\quad x\geq 0. \]
- Compute the cdf of \(X\);
- Compute the median of \(X\).
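Both parts can be refereed numerically. A minimal sketch, assuming scipy:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

f = lambda x: np.exp(1 - np.exp(x) + x)          # the goofy pdf
F = lambda t: quad(f, 0, t)[0]                   # cdf by numerical integration
print(quad(f, 0, np.inf)[0])                     # sanity check: total mass should be 1
print(brentq(lambda t: F(t) - 0.5, 0.0, 10.0))   # numerical median, compare with yours
```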
Want more review?
- Examples from lecture;
- Lab 8;
- Problem Set 5: problems 2 - 8;
- Problem Set 6: problems 1 - 2;
- Midterm 2 Study Guide: problems 3 - 4;
- Midterm 2: problem 1;
- Odd-numbered exercises in DeGroot & Schervish Sections 3.2, 5.6, 5.7.
6. The moment-generating function
Problem 11
A certain random variable \(X\) has moment-generating function
\[ M(t)=\frac{e^{2t}}{1-3t},\quad t<1/3. \]
- Compute the mean and variance of \(X\).
- What is the moment-generating function of the new random variable
\[ Y=\frac{\pi}{2}-\frac{X}{3}. \]
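Differentiating MGFs by hand is error-prone, so let a computer algebra system referee the first part. A minimal sketch, assuming sympy; the final comment points at the linear-transformation rule for the second part:

```python
import sympy as sp

t = sp.symbols("t")
M = sp.exp(2 * t) / (1 - 3 * t)
mean = sp.diff(M, t).subs(t, 0)
print(mean)                                      # E(X)
print(sp.diff(M, t, 2).subs(t, 0) - mean**2)     # Var(X)
# for Y = pi/2 - X/3, use the rule for linear transformations:
# M_Y(t) = e^{(pi/2) t} * M(-t/3)
```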
Problem 12
Consider the following:
\[ \begin{aligned} N&\sim\text{Poisson}(\lambda)\\ X_1,\,X_2,\,X_3,\,...&\overset{\text{iid}}{\sim}M\\ S&=\sum\limits_{i=1}^NX_i. \end{aligned} \]
\(N\) is independent of all of the \(X_i\). The \(X_i\) are an infinite sequence of iid random variables each sharing a common moment-generating function \(M(t)=E(e^{tX_1})\). \(S\) is a random sum of random variables. The terms are random, and so is \(N\), the number of terms being summed.
- What is the MGF of \(S\)?
- If I had an iid collection \(S_1\), \(S_2\), …, \(S_m\), each with the distribution you just derived, what would be the distribution of \(T=S_1+S_2+...+S_m\)?
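Your answer for the MGF of \(S\) should survive an empirical test. A minimal sketch, assuming numpy, that fixes a concrete distribution for the \(X_i\) (Exponential with rate 1, whose MGF is \(M(t)=1/(1-t)\)) and an arbitrary \(\lambda\):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n_trials = 3.0, 100_000
n = rng.poisson(lam, size=n_trials)              # random number of terms
s = np.array([rng.exponential(size=k).sum() for k in n])   # random sum
for t in (0.2, 0.4):
    print(t, np.mean(np.exp(t * s)))   # empirical E(e^{tS}); plug t into your formula
```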
Want more review?
- Worked examples from lecture;
- New theorems and examples from lecture about updating MGFs under linear transformations;
- Problem Set 5: problems 9 - 10;
- Problem Set 6: problems 3 - 5;
- Problem Set 7: problems 1 - 2;
- Midterm 2 Study Guide: problems 11 - 12;
- Midterm 2: problem 2;
- Odd-numbered exercises in DeGroot & Schervish Section 4.4.
7. Transformations of random variables
Problem 13
\(X\) possesses the goofy distribution if its PDF is:
\[ f(x)=k\frac{x^{k-1}}{\theta}\exp\left(-\frac{x^k}{\theta}\right),\quad x>0. \]
The constants \(k\) and \(\theta\) are both positive parameters. We denote this \(X\sim\text{GF}(k,\,\theta)\).
- If we define a new random variable \(Y=\ln X\), what is its distribution?
- If we define a new random variable \(Z = X^k\), what is its distribution?
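Integrating the pdf gives the cdf \(F(x)=1-\exp(-x^k/\theta)\), which you can invert to sample from GF and test your answers empirically. A minimal sketch, assuming numpy, with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)
k, theta = 2.5, 1.8                          # arbitrary parameters for the check
# invert F(x) = 1 - exp(-x^k / theta): x = (theta * E)^(1/k) with E ~ Exponential(1)
x = (theta * rng.exponential(size=300_000)) ** (1 / k)
y, z = np.log(x), x ** k
print(y.mean(), y.var())                     # compare with the distribution you find for Y
print(z.mean(), z.var())                     # ... and for Z
```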
Problem 14
Let \(X\sim\text{Gamma}(\alpha,\,\beta)\), and define a new random variable \(Y=X^r\) for some constant \(r>0\).
- What is the density of \(Y\)?
- Derive a formula for \(E(Y^n)\) that works for any \(n\in\mathbb{N}\).
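Monte Carlo is a good referee for moment formulas. A minimal sketch, assuming numpy and the rate parameterization of the Gamma (density proportional to \(x^{\alpha-1}e^{-\beta x}\)); the closed form printed at the end is only a candidate to compare against your own derivation:

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(0)
alpha, beta, r, n = 3.0, 2.0, 0.7, 2         # arbitrary values for the check
x = rng.gamma(alpha, 1 / beta, size=500_000) # note: numpy wants the SCALE, 1/beta
y = x ** r
print((y ** n).mean())                       # empirical E(Y^n)
print(gamma(alpha + n * r) / (gamma(alpha) * beta ** (n * r)))  # candidate closed form
```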
Want more review?
8. Joint distributions
Problem 15
\(X\) and \(Y\) are jointly absolutely continuous with joint density
\[ f_{XY}(x,\,y)=\frac{1}{8}(y^2-x^2)e^{-y},\quad y>0,\; -y<x<y. \]
- Sketch \(\textrm{Range}(X,\, Y)\).
- Compute the marginal density of \(X\).
- Compute the marginal density of \(Y\).
- Compute the conditional density of \(X\) given \(Y = y\).
- Compute the conditional density of \(Y\) given \(X = x\).
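scipy can check your normalization and your marginals numerically before you trust your own integration. A minimal sketch:

```python
import numpy as np
from scipy.integrate import dblquad, quad

f = lambda x, y: (y**2 - x**2) * np.exp(-y) / 8
# total mass: x runs over (-y, y) inside y over (0, infinity); should print 1.0
print(dblquad(f, 0, np.inf, lambda y: -y, lambda y: y)[0])
# marginal of Y at a test point, to compare with your formula
fY = lambda y: quad(lambda x: f(x, y), -y, y)[0]
print(fY(2.0))
```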
Problem 16
Consider a random pair \((Q,\,Z)\) of continuous random variables whose joint distribution is given by this hierarchy:
\[ \begin{aligned} Q&\sim \text{Gamma}(a,\,b)\\ Z\mid Q=q&\sim\text{GF}(k, 1/q). \end{aligned} \]
So conditionally, \(Z\) has the distribution introduced in Problem 13 above, with \(1/q\) serving as the second parameter. \(k>0\) is just a constant throughout.
- What is the joint density of \((Q,\,Z)\)?
- What is the marginal distribution of \(Z\)?
- What is the conditional distribution of \(Q\) given \(Z=z\)?
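You can simulate straight from the hierarchy and compare against whatever marginal you derive. A minimal sketch, assuming numpy and that \(b\) is a rate parameter; the constants are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, k = 2.0, 3.0, 1.5                      # arbitrary constants for the check
q = rng.gamma(a, 1 / b, size=300_000)        # Gamma(a, b) with b a rate, so scale 1/b
z = (rng.exponential(size=q.size) / q) ** (1 / k)   # GF(k, 1/q) via the inverse cdf
print(np.mean(z <= 1.0))     # empirical P(Z <= 1); integrate your marginal to compare
```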
Want more review?
- Jointly discrete examples from lecture;
- Jointly continuous examples from lecture;
- Mixed Poisson-gamma example from Lab 9;
- Problem Set 6: problems 7 - 10;
- Midterm 2 Study Guide: problems 8 - 9;
- Midterm 2: problem 4;
- Odd-numbered exercises in DeGroot & Schervish Sections 3.4 - 3.6.
9. Maximum likelihood estimation
Problem 17
Consider these data:
\[ X_1,\,X_2,\,...,\,X_n\overset{\text{iid}}{\sim}\text{N}(\theta,\,1). \]
- What is the maximum likelihood estimator of \(\theta\in\mathbb{R}\)?
- What is the sampling distribution of the estimator?
- What is the MSE of the estimator?
- Based on the MSE, what are the statistical properties of this estimator?
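Once you have the estimator, simulate its sampling distribution. A minimal sketch, assuming numpy; it plugs in the sample mean, so derive the MLE yourself first and swap in whatever you get. The true \(\theta\), \(n\), and replication count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, n_reps = 1.3, 25, 100_000          # arbitrary truth and sample size
x = rng.normal(theta, 1.0, size=(n_reps, n))
est = x.mean(axis=1)                         # placeholder: your derived estimator here
print(est.mean(), est.var())                 # compare with your sampling distribution
print(np.mean((est - theta) ** 2))           # empirical MSE
```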
Problem 18
Consider these data:
\[ X_1,\,X_2,\,...,\,X_n\overset{\text{iid}}{\sim}\text{GF}(k,\,\theta). \]
So again, we are recycling the distribution family from Problem 13 above. Throughout, just treat \(k>0\) as fixed and known.
- What is the maximum likelihood estimator of \(\theta>0\)?
- What is the sampling distribution of the estimator?
- What is the MSE of the estimator?
- Based on the MSE, what are the statistical properties of this estimator?
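Same drill as Problem 17: sample from GF via the inverse cdf, apply your estimator, and inspect its empirical bias and MSE. A minimal sketch, assuming numpy; the estimator shown is a placeholder for whatever your derivation produces:

```python
import numpy as np

rng = np.random.default_rng(0)
k, theta, n, n_reps = 2.0, 1.5, 30, 50_000   # arbitrary values for the check
x = (theta * rng.exponential(size=(n_reps, n))) ** (1 / k)   # GF(k, theta) samples
est = (x ** k).mean(axis=1)                  # placeholder: your derived estimator here
print(est.mean(), np.mean((est - theta) ** 2))   # empirical bias check and MSE
```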
Want more review?
- Lecture examples;
- Lab 10;
- Problem Set 7: problems 3 - 4.
10. Bayesian inference
Problem 19
Consider this Bayesian model:
\[ \begin{aligned} \theta&\sim\text{N}(m_0,\,\tau_0^2)\\ X_{1:n}\mid\theta&\overset{\text{iid}}{\sim}\text{N}(\theta,\,1). \end{aligned} \]
- What is the posterior distribution for \(\theta\) conditional on the data?
- Derive the posterior mean and show that it is a convex combination of the prior mean \(m_0\) and the MLE.
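A brute-force grid posterior is a good referee for conjugate algebra. A minimal sketch, assuming numpy, with arbitrary prior hyperparameters and data:

```python
import numpy as np

rng = np.random.default_rng(0)
m0, tau0, n = 0.0, 2.0, 10                   # arbitrary prior and sample size
x = rng.normal(1.0, 1.0, size=n)             # data generated with theta = 1
grid = np.linspace(-5, 5, 20_001)            # grid approximation of the posterior
log_post = (-(grid - m0) ** 2 / (2 * tau0**2)
            - ((x[:, None] - grid) ** 2).sum(axis=0) / 2)
post = np.exp(log_post - log_post.max())
post /= post.sum() * (grid[1] - grid[0])
print((grid * post).sum() * (grid[1] - grid[0]))   # numerical posterior mean
print(x.mean())                                    # the MLE, for comparison
```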
Problem 20
Consider this Bayesian model:
\[ \begin{aligned} \theta&\sim\text{IG}(a_0,\,b_0)\\ X_{1:n}\mid\theta&\overset{\text{iid}}{\sim}\text{N}(0,\,\theta). \end{aligned} \]
- What is the posterior distribution for \(\theta\) conditional on the data?
- Derive the posterior mean and show that it is a convex combination of the prior mean and the MLE.
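The same grid trick referees this model. A minimal sketch, assuming numpy and scipy, with arbitrary constants:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a0, b0, n = 3.0, 4.0, 20                     # arbitrary prior and sample size
x = rng.normal(0.0, np.sqrt(1.5), size=n)    # data generated with theta = 1.5
grid = np.linspace(0.01, 10, 100_000)        # grid check on the posterior you derive
log_post = (stats.invgamma.logpdf(grid, a0, scale=b0)
            + stats.norm.logpdf(x[:, None], 0.0, np.sqrt(grid)).sum(axis=0))
post = np.exp(log_post - log_post.max())
post /= post.sum() * (grid[1] - grid[0])
print((grid * post).sum() * (grid[1] - grid[0]))   # numerical posterior mean
```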
Problem 21
Recall this infernal distribution family that you met on Problem Set 6, Midterm 2, and again in lecture:
\[ f(x\mid \theta)=\theta(x+1)^{-(\theta+1)},\quad x>0. \]
We've done MLE for this; now let's go Bayes. Consider this model:
\[ \begin{aligned} \theta&\sim\text{Gamma}(a_0,\,b_0)\\ X_{1:n}\mid\theta&\overset{\text{iid}}{\sim}f(x\mid \theta). \end{aligned} \]
- What is the posterior distribution for \(\theta\) conditional on the data?
- Derive the posterior mean and show that it is a convex combination of the prior mean and the MLE.
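One more grid check, assuming numpy; it samples data by inverting \(F(x\mid\theta)=1-(x+1)^{-\theta}\) and then brute-forces the posterior mean to compare against your closed form and the MLE:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.0, 50                           # arbitrary truth and sample size
a0, b0 = 3.0, 1.0                            # arbitrary prior hyperparameters
u = rng.uniform(size=n)
x = (1 - u) ** (-1 / theta) - 1              # inverse-cdf draws from f(x | theta)
grid = np.linspace(0.001, 10, 100_000)       # grid check on the posterior you derive
log_post = ((a0 - 1) * np.log(grid) - b0 * grid
            + n * np.log(grid) - (grid + 1) * np.log1p(x).sum())
post = np.exp(log_post - log_post.max())
post /= post.sum() * (grid[1] - grid[0])
print((grid * post).sum() * (grid[1] - grid[0]))   # numerical posterior mean
print(n / np.log1p(x).sum())                       # the MLE, for comparison
```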
Want more review?
- Lecture examples;
- Lab 11;
- Problem Set 7: problems 5 - 6.