Tag Archives: JCM_math230_HW5_S15
Indicator Functions and Expectations – II
Let \(A\) and \(B\) be two events and let \(\mathbf{1}_A\) and \(\mathbf{1}_B\) be the associated indicator functions. Answer the following questions in terms of \(\mathbf{P}(A)\), \(\mathbf{P}(B)\), \(\mathbf{P}(B \cup A)\) and \(\mathbf{P}(B \cap A)\).
- Describe the distribution of \( \mathbf{1}_A\).
- What is \(\mathbf{E} \mathbf{1}_A\) ?
- Describe the distribution of \(\mathbf{1}_A \mathbf{1}_B\).
- What is \(\mathbf{E}(\mathbf{1}_A \mathbf{1}_B)\) ?
The indicator function of an event \(A\) is the random variable which has range \(\{0,1\}\) such that
\[ \mathbf{1}_A(x) = \begin{cases} 1 & \text{if $x \in A$}\\ 0 & \text{if $x \not\in A$} \end{cases}\]
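One way to check answers to questions like these is exact enumeration of a small sample space. A minimal sketch (the two-coin sample space and the events \(A\), \(B\) are hypothetical, chosen only for illustration):

```python
from fractions import Fraction

# Hypothetical sample space: two fair coin tosses, all outcomes equally likely.
omega = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]
p = Fraction(1, len(omega))

A = {w for w in omega if w[0] == "H"}   # first toss is heads
B = {w for w in omega if w[1] == "H"}   # second toss is heads

def indicator(event, w):
    return 1 if w in event else 0

# E[1_A] is the probability-weighted sum of the indicator's values.
E_1A = sum(indicator(A, w) * p for w in omega)
# The product 1_A * 1_B is itself the indicator of the intersection.
E_1A1B = sum(indicator(A, w) * indicator(B, w) * p for w in omega)

print(E_1A)    # P(A) = 1/2
print(E_1A1B)  # P(A ∩ B) = 1/4
```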
Coin tosses: independence and sums
A fair coin is tossed three times. Let \(X\) be the number of heads on the first two tosses, \(Y\) the number of heads on the last two tosses.
- Make a table showing the joint distribution of \(X\) and \(Y\).
- Are \(X\) and \(Y\) independent ?
- Find the distribution of \(X+Y\).
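The joint table and the independence question can both be checked by enumerating the eight equally likely outcomes; a short Python sketch (variable names are mine, not part of the problem):

```python
from itertools import product
from fractions import Fraction
from collections import Counter

p = Fraction(1, 8)  # each of the 2^3 toss sequences is equally likely
joint = Counter()
for toss in product("HT", repeat=3):
    x = toss[:2].count("H")   # heads on the first two tosses
    y = toss[1:].count("H")   # heads on the last two tosses
    joint[(x, y)] += p

# Marginal distributions of X and Y.
px, py = Counter(), Counter()
for (x, y), pr in joint.items():
    px[x] += pr
    py[y] += pr

# Independence would require P(X=x, Y=y) = P(X=x) P(Y=y) in every cell.
independent = all(joint.get((x, y), 0) == px[x] * py[y]
                  for x in px for y in py)
print(independent)  # False: X and Y share the middle toss
```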
Expected Value and Mean Error
Let \(X\) be a random variable with \(\mu_1=\mathbf{E}(X)\) and \(\mu_2=\mathbf{E}(X^2)\). For any number \(a\) define the mean squared error
\[J(a)=\mathbf{E}\big[(X-a)^2\big] \]
and the absolute error
\[K(a)=\mathbf{E}\big[|X-a|\big] \]
- Write \(J(a)\) in terms of \(a\), \(\mu_1\), and \(\mu_2\).
- Use the above answer to calculate \(\frac{d J(a)}{d\, a}\) .
- Find the \(a\) which solves \(\frac{d J(a)}{d\, a}=0\). Comment on this answer in light of the name “Expected Value” and argue that it is actually a minimum.
- Assume that \(X\) only takes values \(\{x_1,x_2,\dots,x_n\}\). Use the fact that
\[ \frac{d\ }{d a} |x-a| = \begin{cases} -1 & \text{if \(a < x\)}\\
1 & \text{if \(a > x\)}\end{cases}
\]
to show that as long as \(a \not\in \{x_1,x_2,\dots,x_n\}\) one has
\[ \frac{d K(a)}{d\, a} = \mathbf{P}(X<a) - \mathbf{P}(X>a)\]
- Now show that if \( a \in (x_k,x_{k+1})\) then \(\mathbf{P}(X<a) - \mathbf{P}(X>a) = 2\mathbf{P}(X \leq x_k) - 1\).
- The median is any point \(a\) so that both \(\mathbf{P}(X\leq a) \geq \frac12 \) and \(\mathbf{P}(X\geq a) \geq\frac12\). Give an example where the median is not unique. (That is to say, there is more than one such \(a\).)
- Use the above calculations to show that if \(a\) is any median (not equal to one of the \(x_k\)), then it solves \(\frac{d K(a)}{d\, a} =0\) and that it is a minimizer.
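A small numeric check can build intuition for these parts. The sketch below uses a hypothetical distribution (uniform on \(\{1,2,3,4\}\), chosen so the median is not unique) and verifies that the mean minimizes \(J\) while \(K\) is flat across the whole median interval:

```python
from fractions import Fraction

# Hypothetical distribution for illustration: X uniform on {1, 2, 3, 4}.
values = [1, 2, 3, 4]
probs = [Fraction(1, 4)] * 4

def J(a):  # mean squared error E[(X - a)^2]
    return sum(p * (x - a) ** 2 for x, p in zip(values, probs))

def K(a):  # mean absolute error E[|X - a|]
    return sum(p * abs(x - a) for x, p in zip(values, probs))

mu1 = sum(p * x for x, p in zip(values, probs))  # E[X] = 5/2

# J is minimized exactly at the mean (searching a grid of candidates).
candidates = [Fraction(k, 10) for k in range(10, 41)]
a_J = min(candidates, key=J)
print(a_J == mu1)  # True

# K takes the same value on the whole median interval [2, 3].
print(K(2) == K(Fraction(5, 2)) == K(3))  # True
```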
Putting expectations together
Suppose \(\mathbf{E}(X^2)=3\), \(\mathbf{E}(Y^2)=4\) and \(\mathbf{E}(XY)=2\). What is \(\mathbf{E}[(X+Y)^2]\) ?
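As a hint, expanding the square and using linearity of expectation gives the identity
\[ \mathbf{E}\big[(X+Y)^2\big] = \mathbf{E}(X^2) + 2\,\mathbf{E}(XY) + \mathbf{E}(Y^2), \]
after which the given values can be substituted term by term.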
Expectation and dice rolls
A standard 6 sided die is rolled three times.
- What is the expected value of the first roll ?
- What is the expected value of the sum of the three rolls ?
- What is the expected number of twos appearing in the three rolls ?
- What is the expected number of sixes appearing in the three rolls ?
- What is the expected number of odd numbers ?
Based on [Pitman, p. 182 #3]
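All four expectations can be checked by exact enumeration of the \(6^3\) equally likely outcomes; a sketch (not required by the problem):

```python
from fractions import Fraction
from itertools import product

# Every triple of rolls is equally likely.
p = Fraction(1, 6 ** 3)
rolls = list(product(range(1, 7), repeat=3))

E_first = sum(r[0] * p for r in rolls)           # E[first roll]
E_sum = sum(sum(r) * p for r in rolls)           # E[sum of three rolls]
E_twos = sum(r.count(2) * p for r in rolls)      # E[# of twos]
E_odds = sum(sum(1 for x in r if x % 2) * p
             for r in rolls)                     # E[# of odd numbers]

print(E_first, E_sum, E_twos, E_odds)  # 7/2 21/2 1/2 3/2
```

Note how the answers line up with linearity of expectation: the sum is three times the single-roll mean, and the counts are sums of indicator expectations.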
Dice rolls: Explicit calculation of max/min
Let \(X_1\) and \(X_2\) be the number obtained on two rolls of a fair die. Let \(Y_1=\max(X_1,X_2)\) and \(Y_2=\min(X_1,X_2)\).
- Display the joint distribution table for \( (X_1,X_2)\).
- Display the joint distribution table for \( (Y_1,Y_2)\).
- Find the distribution of \(X_1X_2\).
Combination of [Pitman, p. 159 #4 and #5]
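The tables and the product distribution can be checked by enumerating all 36 equally likely rolls; a minimal sketch:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

p = Fraction(1, 36)  # each ordered pair of rolls is equally likely
joint = Counter()      # distribution of (Y1, Y2) = (max, min)
prod_dist = Counter()  # distribution of X1 * X2
for x1, x2 in product(range(1, 7), repeat=2):
    joint[(max(x1, x2), min(x1, x2))] += p
    prod_dist[x1 * x2] += p

# Off-diagonal cells (y1 > y2) pool two ordered outcomes; diagonal cells one.
print(joint[(5, 3)])  # 1/18, from (5,3) and (3,5)
print(joint[(4, 4)])  # 1/36, from (4,4) only
```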
Blocks of Bernoulli Trials
In \(n+m\) independent Bernoulli \((p)\) trials, let \(S_n\) be the number of successes in the first \(n\) trials and \(T_m\) the number of successes in the last \(m\) trials.
- What is the distribution of \(S_n\) ? Why ?
- What is the distribution of \(T_m\) ? Why ?
- What is the distribution of \(S_n+T_m\) ? Why ?
- Are \(S_n\) and \(T_m\) independent ? Why ?
- Are \(S_n\) and \(T_{m+1}\) independent ? Why ?
- Are \(S_{n+1}\) and \(T_{m}\) independent ? Why ?
Based on [Pitman, p. 159, #10]
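The independence questions can be probed by exact enumeration in a small hypothetical case (here \(n = m = 2\) and \(p = \tfrac12\), chosen only to keep the state space tiny):

```python
from fractions import Fraction
from itertools import product

# Hypothetical small case: n = 2, m = 2, p = 1/2, exact enumeration.
n, m = 2, 2
pr = Fraction(1, 2 ** (n + m))  # each 0/1 trial sequence is equally likely

def joint_of(counts):
    """Joint distribution of the pair returned by the count function."""
    d = {}
    for trial in product((0, 1), repeat=n + m):
        key = counts(trial)
        d[key] = d.get(key, 0) + pr
    return d

def is_independent(d):
    sx, sy = {}, {}
    for (a, b), q in d.items():
        sx[a] = sx.get(a, 0) + q
        sy[b] = sy.get(b, 0) + q
    return all(d.get((a, b), 0) == sx[a] * sy[b] for a in sx for b in sy)

# S_n counts the first n trials; T_m counts the last m: disjoint blocks.
disjoint = joint_of(lambda t: (sum(t[:n]), sum(t[-m:])))
# T_{m+1} counts the last m+1 trials, so it overlaps S_n at trial n.
overlap = joint_of(lambda t: (sum(t[:n]), sum(t[-(m + 1):])))

print(is_independent(disjoint))  # True: the two blocks share no trials
print(is_independent(overlap))   # False: the shared trial creates dependence
```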