# Category Archives: Basic probability

## Mean and Mode of Binomial

Consider a binomial\((10,p)\) distribution. If \(p\) is chosen uniformly at random from the interval \((0,1)\), what is the likelihood that the mode (the most likely value) of the binomial distribution will be less than the mean of the binomial distribution?

Posted in Binomial, Mean and Variance, Uniform
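A quick way to check an answer here is simulation. The sketch below (Python; the helper names are ours) draws \(p\) uniformly, finds the mode by maximizing the pmf (ties, which occur with probability zero, break toward the smaller value), and estimates the probability:

```python
import math
import random

def binom_pmf(n, p, k):
    # P(X = k) for X ~ Binomial(n, p)
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def mode_lt_mean_prob(n=10, trials=50_000, seed=0):
    # Estimate P(mode < mean) when p ~ Uniform(0, 1)
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        p = rng.random()
        mode = max(range(n + 1), key=lambda k: binom_pmf(n, p, k))
        hits += mode < n * p
    return hits / trials
```

Comparing the estimate with the closed-form answer (via the formula mode \(=\lfloor (n+1)p \rfloor\)) is a good self-check.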

## Shaq Free Throws

Over his career, Shaquille O’Neal made about 53% of his free throws. Assume his probability of making a single free throw is 53%. Suppose Shaq shot a round of 20 free throws and you’re told he made 15 of them.

- What is the likelihood he made the first free throw, given that he made 15?
- What is the likelihood he made at least 1 out of his first 5 free throws, given that he made 15?
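Since the 20 shots are exchangeable, conditioning on the total lends itself to rejection sampling. A rough Python sketch (function name ours) for checking both answers:

```python
import random

def shaq_conditionals(trials=5_000, seed=1):
    # Rejection sampling: keep only rounds of 20 shots (each made
    # with probability 0.53) in which exactly 15 shots are made.
    rng = random.Random(seed)
    kept = first = any_of_first5 = 0
    while kept < trials:
        shots = [rng.random() < 0.53 for _ in range(20)]
        if sum(shots) == 15:
            kept += 1
            first += shots[0]                 # first free throw made?
            any_of_first5 += any(shots[:5])   # any of first 5 made?
    return first / kept, any_of_first5 / kept
```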

Posted in Binomial, Conditioning

## Loaded Dice

You have a pair of fair dice and a pair of loaded dice. But you forgot which pair is which. You do remember that when you bought the loaded dice, the company that makes them claimed the dice would land on a sum of 7 approximately 1/3 of the time.

- You choose one of the pairs at random and roll it once. You get a sum of 7. What is the likelihood that you picked the loaded dice?
- You choose one of the pairs at random and roll the pair three times. You get exactly one sum of 7. What is the likelihood that you picked the loaded dice?
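Both parts are Bayes' rule with binomial likelihoods. A small exact-arithmetic check in Python (names ours), assuming a 1/2 prior on each pair:

```python
from fractions import Fraction
from math import comb

def posterior_loaded(k, n):
    # Posterior that the chosen pair is loaded, given exactly k sums
    # of 7 in n rolls; the prior on each pair is 1/2.
    p_loaded = Fraction(1, 3)  # the manufacturer's claim
    p_fair = Fraction(1, 6)    # 6 of the 36 equally likely outcomes sum to 7
    like_loaded = comb(n, k) * p_loaded**k * (1 - p_loaded)**(n - k)
    like_fair = comb(n, k) * p_fair**k * (1 - p_fair)**(n - k)
    return like_loaded / (like_loaded + like_fair)
```

`posterior_loaded(1, 1)` checks the first bullet and `posterior_loaded(1, 3)` the second.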

Posted in Bayes Theorem, Binomial, Conditioning, Dice Rolls

## Outcome space

Let \(\Omega\) be an outcome space with 16 outcomes, and let \(A\) and \(B\) be events in \(\Omega\). Event \(A\) has 10 outcomes and event \(B\) has 10 outcomes.

- Determine all the possible values of \(\# (A\cap B).\)
- Determine all the possible values of \(\# (A\cup B).\)
- Determine all the possible values of \(\#(A^c\cup B^c).\)
- Determine all the possible values of \(\# (A^c\cap B^c).\)

Posted in Algebra of events, Counting

## Dice rolling addition rule

You roll a fair 6-sided die 3 times. What is the likelihood of getting exactly one 4, exactly one 5, or exactly one 6?
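With only \(6^3\) equally likely outcomes, a brute-force enumeration in Python can confirm whatever inclusion–exclusion gives:

```python
from itertools import product

def exactly_one(rolls, face):
    return rolls.count(face) == 1

# Enumerate all 6**3 equally likely outcomes of three rolls and count
# those with exactly one 4, exactly one 5, or exactly one 6
# (reading "or" inclusively).
favorable = sum(
    any(exactly_one(rolls, f) for f in (4, 5, 6))
    for rolls in product(range(1, 7), repeat=3)
)
probability = favorable / 6 ** 3
```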

Posted in Addition rule, Counting, Dice Rolls

## Classroom Surveys

A researcher is collecting data from 10 high school classrooms. Each classroom contains 30 people. The researcher asks each student to fill out a survey. Suppose each student has about a 40% chance of completing the survey (independent of other students). What is the probability that at least 4 classrooms have at least 15 students who complete the survey?
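The structure is a binomial built on top of a binomial: first the per-classroom success chance, then the tail over classrooms. A Python sketch of the exact computation (helper name ours):

```python
from math import comb

def binom_sf(n, p, k):
    # P(X >= k) for X ~ Binomial(n, p)
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# Step 1: chance a single classroom of 30 yields at least 15 surveys.
q = binom_sf(30, 0.40, 15)
# Step 2: classrooms are independent trials with success probability q,
# so apply the same tail formula once more.
answer = binom_sf(10, q, 4)
```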

Posted in Binomial

## Coin flipping game

Your friend challenges you to a game in which you flip a fair coin until you get heads. If you flip an even number of times, you win. Let \(A\) be the event that you win. Let \(B\) be the event that you flip the coin 3 or more times. Let \(C\) be the event that you flip the coin 4 or more times.

- Compute \(\mathbb{P}(A)\).
- Are \(A\) and \(B\) independent?
- Are \(A\) and \(C\) independent?
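The three probabilities are geometric series; summing them in exact arithmetic (truncating the negligible tail) lets you test both independence claims numerically. A Python sketch:

```python
from fractions import Fraction

# N = number of flips until the first heads, so P(N = n) = (1/2)^n.
# Exact fractions, truncating the series at n = 200 (the neglected
# tail has probability 2**-200).
half = Fraction(1, 2)
pmf = {n: half**n for n in range(1, 201)}

pA = sum(q for n, q in pmf.items() if n % 2 == 0)            # you win
pB = sum(q for n, q in pmf.items() if n >= 3)
pC = sum(q for n, q in pmf.items() if n >= 4)
pAB = sum(q for n, q in pmf.items() if n % 2 == 0 and n >= 3)
pAC = sum(q for n, q in pmf.items() if n % 2 == 0 and n >= 4)
```

Comparing `pAB` with `pA * pB`, and `pAC` with `pA * pC`, suggests which pair is independent before you prove it.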

Posted in Coin Flips, Geometric Distribution, Independence, Series

## Repeated Quiz Questions

Each week you get multiple attempts to take a two-question quiz. For each attempt, two questions are pulled at random from a bank of 100 questions. For a single attempt, the two questions are distinct.

- If you attempt the quiz 5 times, what is the probability that within those 5 attempts, you’ve seen at least one question two or more times?
- How many times do you need to attempt the quiz to have a greater than 50% chance of seeing at least one question two or more times?
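This is a birthday-problem variant: multiply the chances that each new attempt avoids every question seen so far. A Python sketch (function name ours):

```python
from math import comb

def p_repeat(attempts, bank=100, per_quiz=2):
    # P(some question appears in two or more attempts). Given that the
    # previous attempts produced `used` distinct questions, the next
    # attempt avoids them all with probability C(bank-used, 2) / C(bank, 2).
    p_all_new = 1.0
    used = 0
    for _ in range(attempts):
        p_all_new *= comb(bank - used, per_quiz) / comb(bank, per_quiz)
        used += per_quiz
    return 1 - p_all_new

# Smallest number of attempts with a better-than-even chance of a repeat.
attempts_needed = 1
while p_repeat(attempts_needed) <= 0.5:
    attempts_needed += 1
```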

Posted in Conditioning, Counting, Multiplication rule

## Dice Rolling Events

Consider rolling a fair 6-sided die twice. Let \(A\) be the event that the first roll is less than or equal to 3. Let \(B\) be the event that the second roll is less than or equal to 3. Find an event \(C\) in the same outcome space as \(A\) and \(B\) with \(0<\mathbb{P}(C)<1\) and such that \(A\), \(B\) and \(C\) are mutually independent, or show that no such event exists.

Posted in Dice Rolls, Independence

## Cognitive Dissonance Among Monkeys

Assume that each monkey has a strong preference ordering over red, green, and blue M&M's. Further, assume that the possible orderings of the preferences are equally distributed in the population; that is, each of the 6 possible orderings (R>G>B, R>B>G, B>R>G, B>G>R, G>B>R, G>R>B) is found with equal frequency in the population. Lastly, assume that when presented with two M&M's of different colors, a monkey always eats the M&M with the color it prefers.

In an experiment, a random monkey is chosen from the population and presented with a Red and a Green M&M. In the first round, the monkey eats the one based on their personal preference between the colors. The remaining M&M is left on the table and a Blue M&M is added so that there are again two M&M’s on the table. In the second round, the monkey again chooses to eat one of the M&M’s based on their color preference.

- What is the chance that the red M&M is not eaten in the first round?
- What is the chance that the green M&M is not eaten in the first round?
- What is the chance that the blue M&M is not eaten in the second round?

[Mattingly 2022]

Posted in Conditioning

Tagged JCM_math230_HW9_F22

## Which deck is rigged?

Two decks of cards are sitting on a table. One deck is a standard deck of 52 cards. The other deck (called the rigged deck) also has 52 cards but has had 4 of the 13 hearts replaced by diamonds. (Recall that a standard deck has 4 suits: diamonds, hearts, spades, and clubs. Normally there are 13 cards of each suit.)

- What is the probability one chooses 4 cards from the rigged deck and gets exactly 2 diamonds and no hearts?
- What is the probability one chooses 4 cards from the standard deck and gets exactly 2 diamonds and no hearts?
- You randomly choose one of the decks and draw 4 cards. You obtain exactly 2 diamonds and no hearts.
- What is the probability you chose the cards from the rigged deck?
- What is the probability you chose the cards from the standard deck?
- If you had to guess which deck was used, which would you guess: the standard or the rigged?

Posted in Basic probability, Bayes Theorem, Cards, Conditional Expectation, Counting, Drawing without replacement

Tagged JCM_math230_HW4_F22

## Match play golf problem

This problem is motivated by the new format for the PGA match play tournament. The 64 golfers are divided into 16 pools of four players. Over the first three days, each golfer plays one 18-hole match against each of the other three in his pool. If a match is tied after 18 holes, play continues until there is a winner. At the end of these three days the possible records of the golfers in a pool could be: (a) 3-0, 2-1, 1-2, 0-3; (b) 3-0, 1-2, 1-2, 1-2; (c) 2-1, 2-1, 1-2, 1-2; (d) 2-1, 2-1, 2-1, 0-3. What are the probabilities of each of these four possibilities?
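If one assumes every match is won by either player with probability 1/2, independently across the six pairings in a pool, the \(2^6\) outcomes can be enumerated directly. A Python sketch under that assumption:

```python
from collections import Counter
from itertools import combinations, product

players = range(4)
matches = list(combinations(players, 2))   # the 6 pairings in a pool

# Treat each match as an independent fair coin flip (a simplifying
# assumption: evenly matched golfers) and tabulate the sorted records.
tally = Counter()
for outcome in product((0, 1), repeat=len(matches)):
    wins = [0, 0, 0, 0]
    for (a, b), a_wins in zip(matches, outcome):
        wins[a if a_wins else b] += 1
    tally[tuple(sorted(wins, reverse=True))] += 1

probs = {record: count / 2 ** len(matches) for record, count in tally.items()}
```

The keys of `probs` are win totals sorted in decreasing order, e.g. `(3, 2, 1, 0)` for case (a).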

Posted in Counting

## Handing back tests

A professor randomly hands back tests in a class of \(n\) people, paying no attention to the names on the papers. Let \(N\) denote the number of people who got their own test back. Let \(D\) denote the number of pairs of people who got each other's tests. Let \(T\) denote the number of groups of three people, none of whom got their own test, but who among the three of them have each other's tests. Find:

- \(\mathbf{E} (N)\)
- \(\mathbf{E} (D)\)
- \(\mathbf{E} (T)\)
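The expectations can be previewed by simulating random permutations and averaging indicator counts; the Python sketch below (names and the choice \(n=7\) are ours — any \(n \geq 3\) gives the same averages) tallies fixed points, swapped pairs, and 3-cycles:

```python
import random
from itertools import combinations

def expected_counts(n=7, trials=30_000, seed=5):
    # Average the number of fixed points (N), swapped pairs (D), and
    # 3-cycles (T) over random ways of handing back n tests.
    rng = random.Random(seed)
    pairs = list(combinations(range(n), 2))
    triples = list(combinations(range(n), 3))
    sum_n = sum_d = sum_t = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)
        sum_n += sum(perm[i] == i for i in range(n))
        sum_d += sum(perm[i] == j and perm[j] == i for i, j in pairs)
        sum_t += sum((perm[i] == j and perm[j] == k and perm[k] == i) or
                     (perm[i] == k and perm[k] == j and perm[j] == i)
                     for i, j, k in triples)
    return sum_n / trials, sum_d / trials, sum_t / trials
```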

Posted in Expectations, Indicator functions

## Up by two

Suppose two teams play a series of games, each producing a winner and a loser, until one team has won two more games than the other. Let \(G\) be the number of games played until this happens. Assuming your favorite team wins each game with probability \(p\), independently of the results of all previous games, find:

- \(P(G=n) \) for \(n=2,3,\dots\)
- \(\mathbf{E}(G)\)
- \(\mathrm{Var}(G)\)

[Pitman p220, #18]

## Population

A population contains \(X_n\) individuals at time \(n=0,1,2,\dots\) . Suppose that \(X_0\) is distributed as \(\mathrm{Poisson}(\mu)\). Between time \(n\) and \(n+1\) each of the \(X_n\) individuals dies with probability \(p\) independent of the others. The population at time \(n+1\) is comprised of the survivors together with a random number of new immigrants who arrive independently in numbers distributed according to \(\mathrm{Poisson}(\mu)\).

- What is the distribution of \(X_n\)?
- What happens to this distribution as \(n \rightarrow \infty\)? Your answer should depend on \(p\) and \(\mu\). In particular, what is \( \mathbf{E} X_n\) as \(n \rightarrow \infty\)?

[Pitman p236, #18]
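Whatever the full distributional answer, the limiting mean can be checked from the recursion \( \mathbf{E}X_{n+1} = (1-p)\,\mathbf{E}X_n + \mu \) (survivors' mean plus immigrants' mean). A Python sketch with illustrative values of \(\mu\) and \(p\):

```python
mu, p = 5.0, 0.3   # illustrative parameter values, not from the problem
m = mu             # E[X_0] for X_0 ~ Poisson(mu)
for _ in range(200):
    # each individual survives a step with probability 1 - p,
    # and Poisson(mu) immigrants arrive, so the mean obeys:
    m = (1 - p) * m + mu
# after many steps m sits at the fixed point of the recursion
```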

Posted in Basic probability, Poisson

## Benford’s Law

Assume that the population in a city grows exponentially at rate \(r\). In other words, the number of people in the city, \(N(t)\), grows as \(N(t)=C e^{rt}\), where \(C<10^6\) is a constant.

1. Determine the time interval \(\Delta t_1\) during which \(N(t)\) will be between 1 and 2 million people.

2. For \(k=1,\dots,9\), determine the time interval \(\Delta t_k\) during which \(N(t)\) will be between \(k\) and \(k+1\) million people.

3. Calculate the total time \(T\) it takes for \(N(t)\) to grow from 1 to 10 million people.

4. Now pick a time \(\hat t \in [0,T]\) uniformly at random, and use the above results to derive the following formula (also known as Benford's law): $$p_k=\mathbb P\big(N(\hat t) \in [k, k+1]\ \text{million}\big)=\log_{10}(k+1)-\log_{10}(k).$$
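The algebra in parts 1–4 can be verified numerically; the sketch below (Python, with an arbitrary rate \(r\), which cancels in the end) compares the interval fractions with Benford's formula:

```python
import math

r = 0.02   # any positive growth rate works; it cancels in p_k

# Time spent with N(t) between k and k+1 million:
# solve C e^{r t} = k*10^6 and (k+1)*10^6 and subtract.
dt = [math.log((k + 1) / k) / r for k in range(1, 10)]
T = math.log(10) / r                      # total time from 1M to 10M
p = [d / T for d in dt]                   # chance a uniform t-hat lands in band k

benford = [math.log10(k + 1) - math.log10(k) for k in range(1, 10)]
```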

Posted in Basic probability

## Using the Cauchy–Schwarz inequality

Recall that the Cauchy–Schwarz inequality states that for any two random variables \(X\) and \(Y\) one has

\[ \mathbf E |XY| \leq \sqrt{\mathbf E [X^2]}\,\sqrt{ \mathbf E [Y^2]}\]

- Use it to show that

\[ \mathbf E |X| \leq \sqrt{\mathbf E [X^2]}\]

Posted in Basic probability, Expectations

## Conditioning a Poisson Arrival Process

Consider a Poisson process with parameter \(\lambda\). What is the conditional probability that \(N(1) = n\) given that \(N(3) = n\)? (Here \(N(t)\) is the number of calls that arrive between time 0 and time \(t\).) Do you understand why this probability does not depend on \(\lambda\)?

[Meester ex 7.5.4]

Posted in Poisson arrival process

Tagged JCM_math230_HW8_F22, JCM_math230_HW8_S15, JCM_math340_HW9_F13

## Poisson Thinning

Let \(N(t)\) be a Poisson process with intensity \(\lambda\). For each occurrence, we flip a coin: if heads comes up we label the occurrence green, if tails comes up we label it red. The coin flips are independent, and \(p\) is the probability of heads.

- Show that the green occurrences form a Poisson process with intensity \(\lambda p\).
- Connect this with Example 2.2.5 from Meester.
- We claim that the red occurrences on the one hand, and the green occurrences on the other hand, form independent Poisson processes. Can you formulate this formally, and prove it, using Example 2.2.5 from Meester once more?

[Meester ex. 7.5.7]

Posted in Poisson, Poisson arrival process

## Geometric Branching Process

Consider a branching process with a geometric offspring distribution \( P(X=k) = (1-p)p^k\), for \(k=0,1,2,\dots\). Show that ultimate extinction is certain if \(p \leq \frac12\), and that the probability of extinction is \((1-p)/p\) if \(p > \frac12\).

[Meester ex. 6.6.5]
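The extinction probability is the smallest fixed point in \([0,1]\) of the offspring generating function \(G(s) = (1-p)/(1-ps)\), and iterating \(s \mapsto G(s)\) from \(s=0\) converges to it. A Python sketch to check both regimes numerically:

```python
def extinction_prob(p, iters=5_000):
    # Iterate s -> G(s) starting from s = 0, where
    # G(s) = (1 - p) / (1 - p s) is the pgf of P(X=k) = (1-p) p^k.
    # The iteration increases to the smallest fixed point in [0, 1],
    # which is the extinction probability.
    s = 0.0
    for _ in range(iters):
        s = (1 - p) / (1 - p * s)
    return s
```

`extinction_prob(0.4)` probes the subcritical case and `extinction_prob(0.7)` the supercritical one.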

Posted in Basic Stochastic Processes, Branching Processes, Exponential Random Variables

Tagged JCM_math340_HW9_F13

## Joint, Marginal and Conditioning

Let \( (X,Y)\) have joint density \(f(x,y) = e^{-y}\), for \(0<x<y\), and \(f(x,y)=0\) elsewhere.

- Are \(X\) and \(Y\) independent?
- Compute the marginal density of \(Y\).
- Show that \(f_{X|Y}(x \mid y)=\frac1y \), for \(0<x<y\).
- Compute \(E(X|Y=y)\).
- Use the previous result to find \(E(X)\).

Posted in Conditional Expectation, Joint Distributions

Tagged JCM_math230_HW10_S15, JCM_math230_HW11_F22, JCM_math340_HW8_F13

## Three Random Variables

Let \(X\), \(Y\), and \(Z\) be independent uniform \( (0,1)\).

- Find the joint density of \(XY\) and \(Z^2\).
- Show that \(P(XY < Z^2) = \frac59\).

[Meester ex. 5.12.25]
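A Monte Carlo check of the second claim (Python; function name ours):

```python
import random

def p_xy_less_z2(trials=400_000, seed=3):
    # Monte Carlo estimate of P(XY < Z^2) for independent
    # X, Y, Z ~ Uniform(0, 1).
    rng = random.Random(seed)
    hits = sum(
        rng.random() * rng.random() < rng.random() ** 2
        for _ in range(trials)
    )
    return hits / trials
```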

## A joint density example I

Let \( (X,Y) \) have joint density \(f(x,y)=x e^{-x-y}\) when \(x,y>0\) and \(f(x,y)=0\) elsewhere. Are \(X\) and \(Y\) independent?

[Meester ex 5.12.30]

Posted in Independence, Joint Distributions

## Memory and the Exponential

Let \(X\) have an exponential distribution with parameter \(\lambda\). Show that

\[ P( X> t+ s \,|\, X>s) = P(X>t) \]

for all \(s,t >0\). Explain why one might call this property of the exponential “the lack of memory”.

Posted in Exponential Random Variables

Tagged JCM_math340_HW8_F13

## Hitting Zero

Consider a random walk making \(2n\) steps, and let \(T\) be the first return to its starting point, that is

\[T = \min\{1 \leq k \leq 2n : S_k = 0\},\]

and \(T = 0\) if the walk does not return to zero in the first \(2n\) steps.

Show that for all \( 1 \leq k \leq n\) we have

\[P(T = 2k) =\frac{1}{2k - 1} \binom{2k}{k} 2^{-2k}.\]

[Meester ex 3.3.2]

## Geometric probability

In each case, consider a point picked uniformly at random from the interior of the region. Find the probability density function of the \(x\)-coordinate.

- The square with corners: \( (-2,0), (0,2), (2,0), (0,-2) \)
- The triangle with corners: \( (-2,0), (1,0), (0,2) \)
- The polygon with corners: \( (0,2),(2,1), (1,-1), (-1,0)\)

[Pitman p277, #12]

Posted in probability density function

Tagged JCM_math340_HW6_F13

## Expected value of Random Walk

Consider a random walk \(S_i\), which begins at zero, and makes \(2n\) steps.

Let \(k < n\). Show that

- \(\mathbf{E}\Big( |S_{2k}| \ \Big|\ |S_{2k-1}| = r\Big) = r\);
- \( \mathbf{E}\Big(|S_{2k+1}| \ \Big|\ |S_{2k}| = r\Big) = 1 \text{ if } r = 0, \text{ and } \mathbf{E}\Big(|S_{2k+1}| \ \Big|\ |S_{2k}| = r\Big) = r \text{ otherwise}.\)

## A p.m.f. and expectation example

Let \(X\) be a random variable with probability mass function

\[p(n) = \frac{c}{n!}\quad \text{for } n=0,1,2,\dots\]

and \(p(x)=0\) otherwise.

- Find \(c\). Hint: use the Taylor series expansion of \(e^x\).
- Compute the probability that \(X\) is even.
- Compute the expected value of \(X\).

[Meester ex 2.7.14]
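Once \(c\) is identified from the Taylor series of \(e^x\), all three parts can be sanity-checked numerically by truncating the series. A Python sketch:

```python
import math

# Truncate the distribution at n = 60; the neglected tail of c / n!
# is far below floating-point precision.
c = 1 / math.e   # normalization: sum of 1/n! over n >= 0 is e
p = [c / math.factorial(n) for n in range(60)]

total = sum(p)                              # should be (essentially) 1
p_even = sum(p[n] for n in range(0, 60, 2)) # P(X is even)
mean = sum(n * p[n] for n in range(60))     # E(X)
```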

Posted in Expectations, probability mass function

Tagged JCM_math340_HW5_F13

## Prime Dice

Suppose that we have a very special die which has exactly \(k\) faces where \(k\) is prime. The faces are numbered \(1,\dots,k\). We throw the die once and see which number comes up.

- What would be an appropriate outcome space and probability measure for this random experiment?
- Suppose that the events \(A\) and \(B\) are independent. Show that \(\mathbf{P}(A)\) or \(\mathbf{P}(B)\) is always either 0 or 1. In other words, \(A\) or \(B\) is always either the full space or the empty set.

[ from Meester, ex 1.7.32]

## The chance a coin is fair

Suppose that I have two coins in my pocket. One ordinary, fair coin and one coin which has heads on both sides. I pick a random coin out of my pocket, throw it, and it comes up heads.

- What is the probability that I have thrown the fair coin?
- If I throw the same coin again, and heads comes up again, what is the probability that I have thrown the fair coin?
- If instead of throwing the same coin again, I reach into my pocket and throw the second coin, and it comes up heads, what is the chance that the first coin is the fair coin?

[ Modified version of Meester, ex 1.7.35]
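All three parts are Bayes' rule with a uniform prior on which coin was drawn. An exact-arithmetic check in Python (names ours):

```python
from fractions import Fraction

half = Fraction(1, 2)

def posterior_fair(like_fair, like_double):
    # Bayes' rule with a 1/2 prior on each coin: the likelihoods are
    # the chances of the observed heads under each hypothesis.
    return (half * like_fair) / (half * like_fair + half * like_double)

one_head = posterior_fair(half, Fraction(1))        # threw it once, saw H
two_heads = posterior_fair(half ** 2, Fraction(1))  # same coin twice, H then H
# First coin shows H; then the *other* coin is thrown and also shows H.
other_coin = posterior_fair(half * 1, Fraction(1) * half)
```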

Posted in Bayes Theorem, Coin Flips, Conditioning