Tag Archives: JCM_math230_HW5_S15

Indicator Functions and Expectations – II

Let \(A\) and \(B\) be two events and let \(\mathbf{1}_A\) and \(\mathbf{1}_B\) be the associated indicator functions. Answer the following questions in terms of \(\mathbf{P}(A)\), \(\mathbf{P}(B)\), \(\mathbf{P}(B \cup A)\) and \(\mathbf{P}(B \cap A)\).

  1. Describe the distribution of \( \mathbf{1}_A\).
  2. What is \(\mathbf{E} \mathbf{1}_A\) ?
  3. Describe the distribution of \(\mathbf{1}_A \mathbf{1}_B\).
  4. What is \(\mathbf{E}(\mathbf{1}_A \mathbf{1}_B)\) ?

The indicator function of an event \(A\) is the random variable which has range \(\{0,1\}\) such that

\[ \mathbf{1}_A(x) = \begin{cases} 1 & \text{if $x \in A$}\\ 0 & \text{if $x \not\in A$} \end{cases}\]
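Both expectations can be sanity-checked numerically. The sketch below is not part of the exercise: it simulates a fair die roll with two illustrative events (\(A\) = “even”, \(B\) = “at most 3”) and checks that the sample averages of \(\mathbf{1}_A\) and \(\mathbf{1}_A \mathbf{1}_B\) approach \(\mathbf{P}(A)\) and \(\mathbf{P}(A \cap B)\).

```python
import random

random.seed(0)
N = 100_000

# Illustrative events on a fair die roll: A = "even", B = "at most 3".
rolls = [random.randint(1, 6) for _ in range(N)]
ind_A = [1 if r % 2 == 0 else 0 for r in rolls]   # 1_A
ind_B = [1 if r <= 3 else 0 for r in rolls]       # 1_B
ind_AB = [a * b for a, b in zip(ind_A, ind_B)]    # 1_A 1_B = indicator of A ∩ B

# Sample mean of 1_A ≈ P(A) = 1/2; sample mean of 1_A 1_B ≈ P(A ∩ B) = 1/6.
print(sum(ind_A) / N, sum(ind_AB) / N)
```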

Coin tosses: independence and sums

A fair coin is tossed three times. Let \(X\) be the number of heads on the first two tosses, \(Y\) the number of heads on the last two tosses.

  1. Make a table showing the joint distribution of \(X\) and \(Y\).
  2. Are \(X\) and \(Y\) independent ?
  3. Find the distribution of \(X+Y\).
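If you want to check your table, the eight equally likely outcomes can be enumerated directly. This short sketch (an aid for checking, not part of the exercise) tabulates the joint distribution of \(X\) and \(Y\).

```python
from itertools import product
from collections import Counter

# Enumerate all 8 equally likely outcomes of three tosses (1 = heads).
joint = Counter()
for t in product([0, 1], repeat=3):
    X = t[0] + t[1]        # heads on the first two tosses
    Y = t[1] + t[2]        # heads on the last two tosses
    joint[(X, Y)] += 1     # each outcome carries probability 1/8

for (x, y), count in sorted(joint.items()):
    print(f"P(X={x}, Y={y}) = {count}/8")
```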

Expected Value and Mean Error

Let \(X\) be a random variable with \(\mu_1=\mathbf{E}(X)\) and \(\mu_2=\mathbf{E}(X^2)\). For any number \(a\) define the mean squared error

\[J(a)=\mathbf{E}\big[(X-a)^2\big] \]

and the absolute error

\[K(a)=\mathbf{E}\big[|X-a|\big] \]

  1. Write \(J(a)\) in terms of \(a\), \(\mu_1\), and \(\mu_2\).
  2. Use the above answer to calculate \(\frac{d J(a)}{d\, a}\).
  3. Find the \(a\) which solves \(\frac{d J(a)}{d\, a}=0\). Comment on this answer in light of the name “Expected Value” and argue that it is actually a minimum.
  4. Assume that \(X\) only takes values \(\{x_1,x_2,\dots,x_n\}\).  Use the fact that
    \[ \frac{d\ }{d a} |x-a| = \begin{cases} -1 & \text{if \(a < x\)}\\
    1 & \text{if \(a > x\)}\end{cases}
    \]
    to show that as long as \(a \not\in \{x_1,x_2,\dots,x_n\}\) one has
    \[ \frac{d K(a)}{d\, a} =\mathbf{P}(X<a) - \mathbf{P}(X>a)\]
  5. Now show that if \( a \in (x_k,x_{k+1})\) then \(\mathbf{P}(X<a) - \mathbf{P}(X>a) = 2\mathbf{P}(X \leq x_k) - 1\).
  6. The median is any point \(a\) such that both \(\mathbf{P}(X\leq a) \geq \frac12\) and \(\mathbf{P}(X\geq a) \geq \frac12\). Give an example where the median is not unique. (That is to say, there is more than one such \(a\).)
  7. Use the above calculations to show that if \(a\) is any median (not equal to one of the \(x_k\)), then it solves \(\frac{d K(a)}{d\, a} =0\) and that it is a minimizer.
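A quick numeric sketch can make the two minimization claims concrete. For an illustrative four-point distribution (an arbitrary choice, not from the problem), scanning a grid of \(a\) values locates the minimizer of \(J\) at the mean, while \(K\) is minimized on the whole interval between the two middle points, i.e. at any median.

```python
# Illustrative distribution: X uniform on {1, 2, 4, 8} (mean 3.75, medians [2, 4]).
values = [1, 2, 4, 8]
probs = [0.25, 0.25, 0.25, 0.25]

def J(a):
    """Mean squared error E[(X - a)^2]."""
    return sum(p * (x - a) ** 2 for x, p in zip(values, probs))

def K(a):
    """Mean absolute error E[|X - a|]."""
    return sum(p * abs(x - a) for x, p in zip(values, probs))

grid = [i / 100 for i in range(0, 1001)]   # a = 0.00, 0.01, ..., 10.00
a_J = min(grid, key=J)   # expect the mean, 3.75
a_K = min(grid, key=K)   # expect a median; K is flat on [2, 4]
print(a_J, a_K)
```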


Putting expectations together

Suppose \(\mathbf{E}(X^2)=3\), \(\mathbf{E}(Y^2)=4\) and \(\mathbf{E}(XY)=2\). What is  \(\mathbf{E}[(X+Y)^2]\) ?


Expectation and dice rolls

A standard 6 sided die is rolled three times.

  1. What is the expected value of the first roll ?
  2. What is the expected value of the sum of the three rolls ?
  3. What is the expected number of twos appearing in the three rolls ?
  4. What is the expected number of sixes appearing in the three rolls ?
  5. What is the expected number of odd numbers ?
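Each of these expectations can be verified by exact enumeration of the \(6^3\) equally likely outcomes. The sketch below (a checking aid, not a substitute for the linearity-of-expectation argument) computes the first three with exact fractions.

```python
from itertools import product
from fractions import Fraction

# Exact enumeration over all 6^3 equally likely triples of rolls.
outcomes = list(product(range(1, 7), repeat=3))
n = Fraction(len(outcomes))

e_first = sum(Fraction(t[0]) for t in outcomes) / n       # E[first roll]
e_sum = sum(Fraction(sum(t)) for t in outcomes) / n       # E[sum of the three rolls]
e_twos = sum(Fraction(t.count(2)) for t in outcomes) / n  # E[# of twos among the rolls]

print(e_first, e_sum, e_twos)
```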

Based on [Pitman, p. 182 #3]

Dice rolls: Explicit calculation of max/min

Let \(X_1\) and \(X_2\) be the numbers obtained on two rolls of a fair die. Let \(Y_1=\max(X_1,X_2)\) and \(Y_2=\min(X_1,X_2)\).

  1. Display the joint distribution table for \( (X_1,X_2)\).
  2. Display the joint distribution table for \( (Y_1,Y_2)\).
  3. Find the distribution of \(X_1X_2\).
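The 36 equally likely ordered pairs are few enough to enumerate by machine, which makes a convenient check on the hand-built tables. A minimal sketch (not part of the exercise):

```python
from itertools import product
from collections import Counter

# Enumerate all 36 equally likely ordered pairs of rolls.
joint = Counter()
for x1, x2 in product(range(1, 7), repeat=2):
    joint[(max(x1, x2), min(x1, x2))] += 1   # (Y1, Y2); each pair has probability 1/36

# Off-diagonal values of (Y1, Y2) arise from two ordered pairs, diagonal ones from one.
for (y1, y2), count in sorted(joint.items()):
    print(f"P(Y1={y1}, Y2={y2}) = {count}/36")
```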

Combination of [Pitman, p. 159 #4 and #5]

Blocks of Bernoulli Trials

In \(n+m\) independent Bernoulli \((p)\) trials, let \(S_n\) be the number of successes in the first \(n\) trials and \(T_m\) the number of successes in the last \(m\) trials.

  1. What is the distribution of \(S_n\) ? Why ?
  2. What is the distribution of \(T_m\) ? Why ?
  3. What is the distribution of \(S_n+T_m\) ? Why ?
  4. Are \(S_n\) and \(T_m\) independent ? Why ?
  5. Are \(S_n\) and \(T_{m+1}\) independent ? Why ?
  6. Are \(S_{n+1}\) and \(T_{m}\) independent ? Why ?
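For parts 4–6, the factorization \(\mathbf{P}(S_n=s,\,T_m=t)=\mathbf{P}(S_n=s)\,\mathbf{P}(T_m=t)\) can be checked exactly on a small case. The sketch below uses the illustrative choice \(n=m=2\), \(p=1/3\) (any small values would do) and exact rational arithmetic, so the equality test is not subject to rounding.

```python
from itertools import product
from fractions import Fraction

# Illustrative small case: n = 2, m = 2, p = 1/3 (exact arithmetic via Fraction).
n, m, p = 2, 2, Fraction(1, 3)

joint = {}
for trial in product([0, 1], repeat=n + m):
    w = Fraction(1)
    for t in trial:
        w *= p if t == 1 else 1 - p   # independent Bernoulli(p) trials
    S = sum(trial[:n])                # successes in the first n trials
    T = sum(trial[n:])                # successes in the last m trials
    joint[(S, T)] = joint.get((S, T), Fraction(0)) + w

# Marginal distributions of S_n and T_m.
pS = {s: sum(v for (a, _), v in joint.items() if a == s) for s in range(n + 1)}
pT = {t: sum(v for (_, b), v in joint.items() if b == t) for t in range(m + 1)}

# Disjoint blocks factorize: P(S=s, T=t) == P(S=s) P(T=t) for every (s, t).
print(all(joint[(s, t)] == pS[s] * pT[t] for s in pS for t in pT))
```

For parts 5 and 6, rerunning the same check with overlapping blocks (e.g. counting the last \(m+1\) trials) is an easy modification.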

Based on [Pitman, p. 159, #10]