
Tag Archives: JCM_math340_HW4_F13

Indicator Functions and Expectations – II

Let \(A\) and \(B\) be two events and let \(\mathbf{1}_A\) and \(\mathbf{1}_B\) be the associated indicator functions. Answer the following questions in terms of \(\mathbf{P}(A)\), \(\mathbf{P}(B)\), \(\mathbf{P}(B \cup A)\) and \(\mathbf{P}(B \cap A)\).

  1. Describe the distribution of \( \mathbf{1}_A\).
  2. What is \(\mathbf{E} \mathbf{1}_A\)?
  3. Describe the distribution of \(\mathbf{1}_A \mathbf{1}_B\).
  4. What is \(\mathbf{E}(\mathbf{1}_A \mathbf{1}_B)\)?

The indicator function of an event \(A\) is the random variable with range \(\{0,1\}\) defined by

\[ \mathbf{1}_A(x) = \begin{cases} 1 & \text{if $x \in A$}\\ 0 & \text{if $x \not \in A$} \end{cases}\]
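Not part of the original problem: a minimal Python simulation sketch of the quantities above, using made-up interval events \(A\) and \(B\) for a uniform draw from \([0,1)\), so that \(\mathbf{1}_A \mathbf{1}_B\) is visibly the indicator of \(B \cap A\).

```python
import random

# Hypothetical events for a uniform draw from [0, 1):
# A = [0, 0.6), B = [0.4, 1.0), so A ∩ B = [0.4, 0.6).
def indicator_A(x):
    return 1 if x < 0.6 else 0

def indicator_B(x):
    return 1 if x >= 0.4 else 0

n = 100_000
samples = [random.random() for _ in range(n)]

# E[1_A] should be close to P(A) = 0.6.
print(sum(indicator_A(x) for x in samples) / n)

# 1_A * 1_B is the indicator of A ∩ B, so its mean should be
# close to P(A ∩ B) = 0.2.
print(sum(indicator_A(x) * indicator_B(x) for x in samples) / n)
```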

Coin tosses: independence and sums

A fair coin is tossed three times. Let \(X\) be the number of heads on the first two tosses, \(Y\) the number of heads on the last two tosses.

  1. Make a table showing the joint distribution of \(X\) and \(Y\) (a brute-force enumeration sketch follows this list).
  2. Are \(X\) and \(Y\) independent?
  3. Find the distribution of \(X+Y\).
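A brute-force enumeration sketch (added; not part of the original problem) that tallies the eight equally likely outcomes:

```python
from itertools import product
from collections import Counter

# Enumerate all 8 equally likely outcomes of three fair coin tosses
# (1 = heads, 0 = tails); X counts heads in tosses 1-2, Y in tosses 2-3.
joint = Counter()
for t1, t2, t3 in product((0, 1), repeat=3):
    joint[(t1 + t2, t2 + t3)] += 1

for (x, y), count in sorted(joint.items()):
    print(f"P(X={x}, Y={y}) = {count}/8")
```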

A simple mean calculation

Suppose that \(X \in \{1,2,3\}\) with \(\mathbf{P}(X = 1) = 0.3\), \(\mathbf{P}(X = 2) = 0.5\), \(\mathbf{P}(X = 3) = 0.2\), and let \(Y = X + 1\).

(a) Find \(\mathbf{E}(X)\).

(b) Find \(\mathbf{E}(Y)\).

(c) Find \(\mathbf{E}(X + Y)\).

[Author Mark Huber. Licensed under Creative Commons.]
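To check the hand computation (a sketch added here, not part of Huber's problem), the expectations can be read straight off the pmf:

```python
# Expectations of X, of Y = X + 1, and of X + Y from the given pmf.
pmf = {1: 0.3, 2: 0.5, 3: 0.2}

EX = sum(x * p for x, p in pmf.items())        # E(X)
EY = sum((x + 1) * p for x, p in pmf.items())  # E(Y), since Y = X + 1
print(EX, EY, EX + EY)  # E(X + Y) = E(X) + E(Y) by linearity
```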

Only Pairwise Independence

Let \(X_1\) and \(X_2\) be two independent tosses of a fair coin. Let \(Y\) be the random variable equal to 1 if exactly one of those coin tosses resulted in heads, and 0 otherwise. For simplicity, let 1 denote heads and 0 tails.

  1. Write down the joint probability mass function for \((X_1,X_2,Y)\).
  2. Show that \(X_1\) and \(X_2\) are independent.
  3. Show that \(X_1\) and \(Y\) are independent.
  4. Show that \(X_2\) and \(Y\) are independent.
  5. The three variables are mutually independent if for any \(x_1 \in \mathrm{Range}(X_1)\), \(x_2\in \mathrm{Range}(X_2)\), \(y \in \mathrm{Range}(Y)\) one has
    \[ \mathbf{P}(X_1=x_1,X_2=x_2,Y=y) = \mathbf{P}(X_1=x_1) \mathbf{P}(X_2=x_2) \mathbf{P}(Y=y) \]
    Show that \(X_1,X_2,Y\) are not mutually independent; an enumeration sketch follows this list.
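A minimal enumeration sketch (added; it uses the identification \(Y = X_1 \oplus X_2\), i.e. exclusive or) that tabulates the joint pmf and checks the factorizations numerically:

```python
from itertools import product

# Joint pmf of (X1, X2, Y): two fair coin tosses plus Y = 1 iff
# exactly one toss is heads, i.e. Y = X1 XOR X2.
pmf = {}
for x1, x2 in product((0, 1), repeat=2):
    pmf[(x1, x2, x1 ^ x2)] = 0.25

def marginal(indices):
    """Marginal pmf over the given coordinate indices."""
    out = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in indices)
        out[key] = out.get(key, 0.0) + p
    return out

m = [marginal((i,)) for i in range(3)]

# Pairwise independence: every pair factors into its marginals.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    pair = marginal((i, j))
    assert all(abs(pair.get((a, b), 0.0) - m[i][(a,)] * m[j][(b,)]) < 1e-12
               for a in (0, 1) for b in (0, 1))

# Mutual independence fails: P(X1=1, X2=1, Y=1) = 0, but the product
# of the three marginals is 1/8.
print(pmf.get((1, 1, 1), 0.0), m[0][(1,)] * m[1][(1,)] * m[2][(1,)])
```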

Expected Value and Mean Error

Let \(X\) be a random variable with \(\mu_1=\mathbf{E}(X)\) and \(\mu_2=\mathbf{E}(X^2)\). For any number \(a\) define the mean squared error

\[J(a)=\mathbf{E}\big[(X-a)^2\big] \]

and the mean absolute error

\[K(a)=\mathbf{E}\big[|X-a|\big] \]

  1. Write \(J(a)\) in terms of \(a\), \(\mu_1\), and \(\mu_2\).
  2. Use the above answer to calculate \(\frac{d J(a)}{d\, a}\).
  3. Find the \(a\) which solves \(\frac{d J(a)}{d\, a}=0\). Comment on this answer in light of the name “Expected Value” and argue that it is actually a minimum.
  4. Assume that \(X\) takes only the values \(\{x_1,x_2,\dots,x_n\}\), ordered so that \(x_1 < x_2 < \cdots < x_n\). Use the fact that
    \[ \frac{d\ }{d a} |x-a| = \begin{cases} -1 & \text{if \(a < x\)}\\
    1 & \text{if \(a > x\)}\end{cases}
    \]
    to show that as long as \(a \not\in \{x_1,x_2,\dots,x_n\}\) one has
    \[ \frac{d K(a)}{d\, a} =\mathbf{P}(X<a) - \mathbf{P}(X>a)\]
  5. Now show that if \( a \in (x_k,x_{k+1})\) then \(\mathbf{P}(X<a) - \mathbf{P}(X>a) = 2\mathbf{P}(X \leq x_k) - 1\).
  6. The median is any point \(a\) such that both \(\mathbf{P}(X\leq a) \geq \frac12\) and \(\mathbf{P}(X\geq a) \geq \frac12\). Give an example where the median is not unique. (That is to say, there is more than one such \(a\).)
  7. Use the above calculations to show that if \(a\) is any median (not equal to one of the \(x_k\)), then it solves \(\frac{d K(a)}{d\, a} =0\) and is a minimizer. A small numerical sketch follows this list.
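A small numerical sketch (added, with an arbitrary made-up pmf): the grid minimizer of \(J\) lands on the mean, while \(K\) is flat on a whole interval of medians, so its minimizer is not unique.

```python
# Arbitrary example pmf; here P(X <= 2) = 0.5 and P(X >= 4) = 0.5,
# so every a in [2, 4] is a median and K is flat there.
values = [1.0, 2.0, 4.0, 10.0]
probs  = [0.2, 0.3, 0.3, 0.2]

def J(a):  # mean squared error E[(X - a)^2]
    return sum(p * (x - a) ** 2 for x, p in zip(values, probs))

def K(a):  # mean absolute error E[|X - a|]
    return sum(p * abs(x - a) for x, p in zip(values, probs))

grid = [i / 100 for i in range(1101)]  # a in [0, 11]
mean = sum(p * x for x, p in zip(values, probs))
print("mean:", mean, "argmin J:", min(grid, key=J))  # both 4.0
print("argmin K:", min(grid, key=K),                 # a median
      "K flat on [2, 4]:", abs(K(2) - K(4)) < 1e-12)
```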


Introduction to Geometric random variables

Consider flipping a coin that is either heads (H) or tails (T), each with probability 1/2.  The coin is flipped over and over (independently) until a head comes up.  The outcome space is
\[ \Omega = \{H,TH,TTH,TTTH,\ldots\}. \]

(a) What is \( \mathbf{P}(TTH)\)?

(b) What is the chance that the coin is flipped exactly \(i\) times?

(c) What is the chance that the coin is flipped more than twice?

(d) Repeat the previous three questions for an unfair coin which has probability \(p\) of coming up tails.

[Author Mark Huber. Licensed under Creative Commons.]
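A minimal simulation sketch for the fair-coin case (added, not part of Huber's problem); the empirical frequencies can be compared with the answers computed by hand:

```python
import random

def flips_until_heads(p_tails=0.5):
    """Flip until the first heads; return the number of flips."""
    flips = 1
    while random.random() < p_tails:  # this flip came up tails
        flips += 1
    return flips

n = 100_000
counts = [flips_until_heads() for _ in range(n)]

# Empirical P(the coin is flipped exactly i times), i = 1, 2, 3.
for i in (1, 2, 3):
    print(i, counts.count(i) / n)

# Empirical P(the coin is flipped more than twice).
print(sum(c > 2 for c in counts) / n)
```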

Putting expectations together

Suppose \(\mathbf{E}(X^2)=3\), \(\mathbf{E}(Y^2)=4\) and \(\mathbf{E}(XY)=2\). What is \(\mathbf{E}[(X+Y)^2]\)?
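One standard route, added here as a hint: expand the square and use linearity of expectation,

\[ \mathbf{E}\big[(X+Y)^2\big] = \mathbf{E}(X^2) + 2\,\mathbf{E}(XY) + \mathbf{E}(Y^2). \]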


Dice rolls: Explicit calculation of max/min

Let \(X_1\) and \(X_2\) be the numbers obtained on two rolls of a fair die. Let \(Y_1=\max(X_1,X_2)\) and \(Y_2=\min(X_1,X_2)\).

  1. Display the joint distribution table for \( (X_1,X_2)\).
  2. Display the joint distribution table for \( (Y_1,Y_2)\).
  3. Find the distribution of \(X_1X_2\). (An enumeration sketch follows below.)

Combination of [Pitman, p. 159, #4 and #5]
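A brute-force enumeration sketch (added; not part of the Pitman problems) over the 36 equally likely outcomes:

```python
from itertools import product
from collections import Counter

# Enumerate the 36 equally likely rolls (x1, x2) of two fair dice and
# tabulate (Y1, Y2) = (max, min) together with the product X1 * X2.
joint_max_min = Counter()
product_dist = Counter()
for x1, x2 in product(range(1, 7), repeat=2):
    joint_max_min[(max(x1, x2), min(x1, x2))] += 1
    product_dist[x1 * x2] += 1

print({k: f"{v}/36" for k, v in sorted(joint_max_min.items())})
print({k: f"{v}/36" for k, v in sorted(product_dist.items())})
```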

Blocks of Bernoulli Trials

In \(n+m\) independent Bernoulli \((p)\) trials, let \(S_n\) be the number of successes in the first \(n\) trials and \(T_m\) the number of successes in the last \(m\) trials.

  1. What is the distribution of \(S_n\)? Why?
  2. What is the distribution of \(T_m\)? Why?
  3. What is the distribution of \(S_n+T_m\)? Why?
  4. Are \(S_n\) and \(T_m\) independent? Why?
  5. Are \(S_n\) and \(T_{m+1}\) independent? Why? (A simulation sketch follows below.)
  6. Are \(S_{n+1}\) and \(T_{m}\) independent? Why?

Based on [Pitman, p. 159, #10]
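A simulation sketch (added; zero correlation is only a necessary condition for independence, so this is a sanity check rather than a proof). Blocks built on disjoint trials should show sample correlation near 0, while blocks sharing a trial should not.

```python
import random

n, m, p, reps = 5, 4, 0.3, 100_000  # made-up parameters

def correlation(pairs):
    """Sample correlation of a list of (x, y) pairs."""
    N = len(pairs)
    mx = sum(x for x, _ in pairs) / N
    my = sum(y for _, y in pairs) / N
    cov = sum((x - mx) * (y - my) for x, y in pairs) / N
    vx = sum((x - mx) ** 2 for x, _ in pairs) / N
    vy = sum((y - my) ** 2 for _, y in pairs) / N
    return cov / (vx * vy) ** 0.5

disjoint, overlapping = [], []
for _ in range(reps):
    trials = [random.random() < p for _ in range(n + m)]
    S_n = sum(trials[:n])                           # successes in first n
    disjoint.append((S_n, sum(trials[n:])))         # T_m: last m trials
    overlapping.append((S_n, sum(trials[n - 1:])))  # T_{m+1}: last m+1 trials

print(correlation(disjoint))     # near 0
print(correlation(overlapping))  # clearly positive
```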

