
# Tag Archives: JCM_math230_HW9_S13

## Conditional densities

Let \(X\) and \(Y\) have the following joint density:

\[ f(x,y)=\begin{cases}2x+2y -4xy & \text{for } 0 \leq x\leq 1 \ \text{and}\ 0 \leq y \leq 1\\ 0& \text{otherwise}\end{cases}\]

- Find the marginal densities of \(X\) and \(Y\).
- Find \(f_{Y|X}( y \,|\, X=\frac14)\).
- Find \( \mathbf{E}(Y \,|\, X=\frac14)\).

[Pitman p. 426, #2]
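The three quantities can be checked symbolically; here is a sketch using `sympy` (not part of the original problem):

```python
import sympy as sp

x, y = sp.symbols("x y")
f = 2*x + 2*y - 4*x*y                    # joint density on the unit square

f_X = sp.integrate(f, (y, 0, 1))         # marginal density of X
f_Y = sp.integrate(f, (x, 0, 1))         # marginal density of Y

# conditional density f_{Y|X}(y | X = 1/4) = f(1/4, y) / f_X(1/4)
f_cond = (f / f_X).subs(x, sp.Rational(1, 4))

E_cond = sp.integrate(y * f_cond, (y, 0, 1))   # E(Y | X = 1/4)
```

Integrating out either variable collapses the cross term, which is a useful sanity check on the hand computation.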

## Expected max/min given min/max

Let \(X_1\) and \(X_2\) be the numbers on two independent fair-die rolls. Let \(M\) be the maximum and \(N\) the minimum of \(X_1\) and \(X_2\). Calculate:

- \(\mathbf{E}( M| N=x) \)
- \(\mathbf{E}( N| M=x) \)
- \(\mathbf{E}( M| N) \)
- \(\mathbf{E}( N| M) \)

## Beta-binomial

You observe a sequence of coin flips \(X_1,\dots,X_n\) drawn i.i.d. from a Bernoulli distribution with unknown parameter \(p\), where \(n\) is known and fixed. Assume a priori that the coin's parameter \(p\) follows a Beta distribution with parameters \(\alpha,\beta\).

- Given the sequence \(X_1,\dots,X_n\), what is the posterior pdf of \(p\)?
- For what value of \(p\) is the maximum of the posterior pdf attained?

Hint: If \(X\) is distributed Bernoulli(\(p\)), then for \(x=0,1\) one has \(P(X=x)=p^x(1-p)^{1-x}\). Furthermore, if \(X_1,X_2\) are i.i.d. Bernoulli(\(p\)), then

\[P(X_1=x_1, X_2=x_2 )=P(X_1=x_1)P(X_2=x_2 )=p^{x_1}(1-p)^{(1-x_1)}p^{x_2}(1-p)^{(1-x_2)}\]
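The Beta–Bernoulli conjugacy behind this problem can be verified symbolically. A sketch with `sympy`, using hypothetical concrete values \(n=10\) flips and \(k=4\) heads (the problem itself keeps these general):

```python
import sympy as sp

p, a, b = sp.symbols("p alpha beta", positive=True)
n, k = 10, 4      # hypothetical: n flips, k of them heads

likelihood = p**k * (1 - p)**(n - k)       # product of the Bernoulli pmfs
prior = p**(a - 1) * (1 - p)**(b - 1)      # Beta(alpha, beta) kernel, unnormalized

# the posterior is proportional to likelihood * prior
posterior = likelihood * prior

# candidate MAP: stationary point of the log-posterior
# (valid when alpha + k > 1 and beta + n - k > 1)
map_candidates = sp.solve(sp.diff(sp.log(posterior), p), p)
```

Multiplying the kernels just adds exponents, which is the whole content of the conjugacy argument.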

## Conditioning and Polya’s urn

An urn contains 1 black and 2 white balls. One ball is drawn at random and its color noted. The ball is replaced in the urn, together with an additional ball of its color. There are now four balls in the urn. Again, one ball is drawn at random from the urn, then replaced along with an additional ball of its color. The process continues in this way.

- Let \(B_n\) be the number of black balls in the urn just before the \(n\)th ball is drawn. (Thus \(B_1= 1\).) For \(n \geq 1\), find \(\mathbf{E} (B_{n+1} | B_{n}) \).
- For \(n \geq 1\), find \(\mathbf{E} (B_{n}) \). [Hint: Use induction based on the previous answer and the fact that \(\mathbf{E}(B_1) =1\)]
- For \(n \geq 1\), what is the expected proportion of black balls in the urn just before the \(n\)th ball is drawn ?

[Pitman p. 408, #6]
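The urn dynamics are easy to simulate, which gives a numerical check on the induction. A sketch (not part of the original problem):

```python
import random

def avg_blacks_before_draw(n, trials=200_000, seed=1):
    """Monte Carlo estimate of E(B_n), the number of black balls
    in the urn just before the n-th draw (so B_1 = 1 always)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        black, white = 1, 2
        for _ in range(n - 1):            # draws 1, ..., n-1 happen first
            if rng.random() < black / (black + white):
                black += 1                # drew black: replace it, add a black
            else:
                white += 1                # drew white: replace it, add a white
        total += black
    return total / trials
```

Since the urn holds exactly \(n+2\) balls just before the \(n\)th draw, dividing the estimate by \(n+2\) gives the expected proportion asked for in the last part.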

## Expectation of hierarchical model

Consider the following hierarchical random variable:

- \(\lambda \sim \mbox{Geometric}(p)\)
- \(Y \mid \lambda \sim \mbox{Poisson}(\lambda)\)

What is \(\mathbf{E}(Y)\)?
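By the tower property, \(\mathbf{E}(Y)=\mathbf{E}(\mathbf{E}(Y \mid \lambda))=\mathbf{E}(\lambda)\), and a simulation can confirm the value. A sketch, assuming the Geometric is supported on \(\{1,2,\dots\}\) (Pitman's convention, so \(\mathbf{E}(\lambda)=1/p\)):

```python
import math
import random

def sample_geometric(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

def sample_poisson(lam, rng):
    """Knuth's product-of-uniforms Poisson(lam) sampler."""
    limit = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def estimate_EY(p, trials=100_000, seed=2):
    """Monte Carlo estimate of E(Y) for the Geometric-Poisson hierarchy."""
    rng = random.Random(seed)
    return sum(sample_poisson(sample_geometric(p, rng), rng)
               for _ in range(trials)) / trials
```

With \(p=0.5\) the estimate should hover near \(1/p = 2\).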

## Expectation of mixture distribution

Consider the following mixture distribution:

- Draw \(X \sim \mbox{Ber}(p=.3)\)
- If \(X=1\) then \(Y \sim \mbox{Geometric}(p_1)\)
- If \(X= 0\) then \(Y \sim \mbox{Bin}(n,p_2)\)

What is \(\mathbf{E}(Y)\)? (*) What is \(\mathbf{E}(Y \mid X)\)?
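A sketch checking the tower-property computation against a simulation, using hypothetical concrete values \(p_1=0.5\), \(n=10\), \(p_2=0.4\) (the problem leaves these symbolic) and the Geometric on \(\{1,2,\dots\}\):

```python
import random

P, P1, N, P2 = 0.3, 0.5, 10, 0.4     # P1, N, P2 are hypothetical choices

def sample_Y(rng):
    if rng.random() < P:                  # X = 1: Geometric(p1) on {1, 2, ...}
        y = 1
        while rng.random() >= P1:
            y += 1
        return y
    # X = 0: Binomial(n, p2) as a sum of n Bernoulli(p2) indicators
    return sum(rng.random() < P2 for _ in range(N))

# tower property: E(Y) = E(E(Y|X)) = P * (1/p1) + (1 - P) * n * p2
exact = P * (1 / P1) + (1 - P) * N * P2

rng = random.Random(3)
trials = 200_000
estimate = sum(sample_Y(rng) for _ in range(trials)) / trials
```

The two branches inside `sample_Y` are exactly the two conditional distributions of \(Y\) given \(X\), so the agreement of `estimate` with `exact` is the tower property in action.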