COMP2804: Discrete Structures II
$\newcommand{\R}{\mathbb{R}}\DeclareMathOperator{\E}{\mathbf{E}}\DeclareMathOperator{\deg}{deg}\newcommand{\N}{\mathbb{N}}$

Assignment 4

Coin Tossing Games

The material from Lecture 15 (Infinite probability spaces) is highly relevant to these two questions.

In the game of Third-Head-Wins, two players $P_1$ and $P_2$ take turns tossing a coin, with $P_1$ tossing first. The person who tosses the third head is the winner. Let $A$ be the event "$P_1$ wins the game of Third-Head-Wins." What is $\Pr(A)$?

Click to toggle answer

In class, we saw that the probability that $P_1$ tosses the first head is $2/3$. If this happens, then the players continue to play a game of Second-Head-Wins with $P_2$ becoming the first player and $P_1$ becoming the second player.
On the other hand, with probability $1/3$, $P_2$ tosses the first head, after which the two players play a standard game of Second-Head-Wins.

These observations give the following probability of $P_1$ winning the game of Third-Head-Wins: \[ p_3 = \frac{2}{3}\cdot \frac{5}{9} + \frac{1}{3}\cdot \frac{4}{9} = \frac{14}{27} \]

In the game of $k$th-Head-Wins, two players $P_1$ and $P_2$ take turns tossing a coin, with $P_1$ tossing first. The person who tosses the $k$th head is the winner. Let $p_k$ be the probability that $P_1$ wins at the game of $k$th-Head-Wins. In class we found that $p_1=2/3$, $p_2=4/9$, and the previous question asks you to compute $p_3$. Give a recursive formula for $p_k$ that holds for all $k\ge 2$.

Click to toggle answer

In class, we saw that the probability that $P_1$ tosses the first head is $2/3$. If this happens, then the players continue to play a game of $(k-1)$th-Head-Wins with $P_2$ becoming the first player and $P_1$ becoming the second player.
On the other hand, with probability $1/3$, $P_2$ tosses the first head, after which the two players play a standard game of $(k-1)$th-Head-Wins.

These observations give the following recurrence equation for the probability $p_k$ of $P_1$ winning the game of $k$th-Head-Wins: \[ p_k = \frac{2}{3}\cdot (1-p_{k-1}) + \frac{1}{3}\cdot p_{k-1} \] If you evaluate this for $k=4$, you get $p_4=40/81$. With a good guess and a proof by induction, you can show that \[ p_k = \begin{cases} \frac{\lceil 3^k/2 \rceil}{3^k} & \text{when $k$ is odd} \\ \frac{\lfloor 3^k/2 \rfloor}{3^k} & \text{when $k$ is even} \\ \end{cases} \]
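(Not part of the assignment.) If you want to double-check the recurrence against the guessed closed form, here is a quick Python sketch using exact rational arithmetic; the function name `p` is just for illustration:

```python
from fractions import Fraction

def p(k):
    """Probability that P1 wins k-th-Head-Wins, computed via the recurrence."""
    pk = Fraction(2, 3)  # p_1 = 2/3, computed in class
    for _ in range(2, k + 1):
        pk = Fraction(2, 3) * (1 - pk) + Fraction(1, 3) * pk
    return pk

# Compare against the closed form: ceil(3^k/2)/3^k for odd k,
# floor(3^k/2)/3^k for even k.  (Note that 3^k is always odd.)
for k in range(1, 16):
    num = (3**k + 1) // 2 if k % 2 == 1 else (3**k - 1) // 2
    assert p(k) == Fraction(num, 3**k)

print(p(3), p(4))  # 14/27 40/81
```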

Friends having lunch

A group of $2n$ friends goes to lunch and they sit evenly spaced around a round table. Each friend tosses a coin and compares the result of their coin toss to that of the person sitting opposite them at the table.

What is the expected number of friends who get food?

Click to toggle answer

For each $i\in\{1,\ldots,2n\}$, define the indicator random variable $X_i$ that is equal to $1$ if and only if person $i$ gets to eat. Then $\E(X_i)=\Pr(X_i=1)=3/4$. Let $X$ be the number of friends who get to eat. Then $X=\sum_{i=1}^{2n} X_i$, so we can turn the crank to get \[ \E(X) = \E\left(\sum_{i=1}^{2n} X_i\right) = \sum_{i=1}^{2n} \E(X_i) = \sum_{i=1}^{2n} 3/4 = 3n/2 \enspace . \]

Gashapons, again

You have a gashapon machine that contains 1000 horses: 900 of these are red and 100 are brown. You spend \$3 and get three random horses from the machine.

What is the expected number of red horses you get?

Click to toggle answer

For each $i\in\{1,2,3\}$, define the indicator random variable $X_i$ that is equal to $1$ if and only if the $i$th horse you receive from the machine is red. Then $\E(X_i)=900/1000=9/10$. If you don't believe this, count the number of $3$-horse sequences in which horse $i$ is red—it's $900\cdot 999\cdot 998$. On the other hand, the total number of $3$-horse sequences is $1000\cdot999\cdot998$, so $\E(X_i)=\Pr(X_i=1)=(900\cdot 999\cdot 998)/(1000\cdot999\cdot998)=9/10$.

If $X$ is the number of red horses we get then we can turn the crank and deduce that \[ \E(X) = \E\left(\sum_{i=1}^{3} X_i\right) = \sum_{i=1}^{3} \E(X_i) = \sum_{i=1}^{3} 9/10 = 27/10 \enspace . \]
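(Not part of the assignment.) As a sanity check, the same answer drops out of the hypergeometric distribution by computing $\sum_k k\cdot\Pr(\text{exactly $k$ red})$ directly; a short Python sketch:

```python
from fractions import Fraction
from math import comb

# Pr(exactly k of your 3 horses are red) = C(900,k)*C(100,3-k)/C(1000,3)
expected_red = sum(
    Fraction(k * comb(900, k) * comb(100, 3 - k), comb(1000, 3))
    for k in range(4)
)
assert expected_red == Fraction(27, 10)
print(expected_red)  # 27/10
```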

The second record

Let $x_1,\ldots,x_n$ be a uniformly random permutation of $\{1,\ldots,n\}$. Let \[ X = \begin{cases} i & \text{if $x_i>x_1>\max\{x_2,\ldots,x_{i-1}\}$} \\ n+1 & \text{if $x_1 = n$} \end{cases} \] In words, $X$ is the index of the first element that is greater than $x_1$, or $n+1$ if $x_1=n$ and no such element exists.

What is $\E(X)$?

Click to toggle answer

Some people came to my office and I showed them the hard way to solve this problem. There's a sneaky easy way to solve it using indicator random variables that I will show here.

For each $i\in\{1,\ldots,n\}$, define the indicator random variable \[ X_i = \begin{cases} 1 & \text{if $x_1=\max\{x_1,\ldots,x_i\}$} \\ 0 & \text{otherwise} \end{cases} \] Now notice that $X = 1+\sum_{i=1}^n X_i$. So \begin{align*} \E(X) & = \E\left(1+\sum_{i=1}^n X_i\right) \\ & = 1+\sum_{i=1}^n \E(X_i) \\ & = 1+\sum_{i=1}^n \Pr(x_1=\max\{x_1,\ldots,x_i\}) \\ & = 1+\sum_{i=1}^n 1/i \\ & = 1+H_n \end{align*}
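(Not part of the assignment.) You can verify $\E(X)=1+H_n$ by brute force for small $n$, averaging $X$ over all $n!$ permutations; the function name `expected_X` is just for illustration:

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def expected_X(n):
    """Average of X over all n! permutations of {1,...,n}."""
    total = 0
    for perm in permutations(range(1, n + 1)):
        X = n + 1  # default: x_1 = n and no larger element exists
        for i in range(1, n):
            if perm[i] > perm[0]:
                X = i + 1  # convert to a 1-based index
                break
        total += X
    return Fraction(total, factorial(n))

for n in range(1, 7):
    H_n = sum(Fraction(1, i) for i in range(1, n + 1))
    assert expected_X(n) == 1 + H_n
```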

Remark: As $n$ goes to infinity, so does $\E(X)$. This has the following counterintuitive interpretation: You will meet a sequence $p_1,p_2,\ldots$ of people in your lifetime, each of whom could be your life partner. Suppose $x_i$ represents your level of compatibility with $p_i$ and suppose (big assumption) that you meet these people in uniformly random order. It might seem crazy to choose the first person $p_1$ you meet as a life partner. You're desperate, but not that desperate. Instead, you decide to wait for the first person who is more compatible with you than $p_1$. This result says that the expected number of partners you will meet before finding someone better than $p_1$ is infinite!

Remark on remark: Don't take this as life advice and marry the kid you met at the park when you were five days old. I suspect that, as life goes on, the people we meet are more and more like us and therefore more compatible. Look around: most of the people around you are university students. If you graduate and get a job, most of the people you meet will also be university graduates working in the same industry as you.

Fixed points in random permutations

The next few questions are about a random permutation $\pi_1,\ldots,\pi_n$ of the integers $1,\ldots,n$.

What is the expected number of indices $i\in\{1,\ldots,n\}$ such that $\pi_i=n+1-i$?

Hint: If this seems hard, first try to determine the expected number of indices $i$ such that $\pi_i=1$.

Click to toggle answer

For each $i\in\{1,\ldots,n\}$, $n+1-i$ is also a number in $\{1,\ldots,n\}$. If we let $X_i$ be the indicator random variable that is equal to $1$ if and only if $\pi_i=n+1-i$, then we see that $\E(X_i)=\Pr(X_i=1)=1/n$. If you don't believe this, count the number of permutations where $\pi_i=n+1-i$: it's $(n-1)!$, so $\Pr(X_i=1)=(n-1)!/n! = 1/n$.

Let $X$ be the number of $i\in\{1,\ldots,n\}$ such that $\pi_i=n+1-i$. Then $X=\sum_{i=1}^n X_i$, so
\[ \E(X)=\E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \E(X_i)=\sum_{i=1}^n 1/n = 1.
\]
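(Not part of the assignment.) A brute-force check of $\E(X)=1$ for small $n$; the function name `expected_matches` is just for illustration:

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def expected_matches(n):
    """Average number of indices i with pi_i = n+1-i, over all permutations."""
    total = sum(
        sum(1 for i, v in enumerate(perm, start=1) if v == n + 1 - i)
        for perm in permutations(range(1, n + 1))
    )
    return Fraction(total, factorial(n))

for n in range(1, 7):
    assert expected_matches(n) == 1
```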

Patterns in random permutations

Let \[ Q = \{(i,j,k): \text{$1\le i< j< k\le n$ and $\pi_i<\pi_j<\pi_k$}\} \enspace . \] In words, $Q$ contains all triples that appear in sorted order in the permutation $\pi$. Notice that $|Q|$ is a random variable that can take on a value as small as $0$ (when $\pi_1,\ldots,\pi_n=n,n-1,\ldots,2,1$) or as large as $\binom{n}{3}$ (when $\pi_1,\ldots,\pi_n=1,2,\ldots,n$).

What is $\E(|Q|)$?

Click to toggle answer

For each $1\le i < j < k\le n$, let \[ X_{i,j,k}(\pi_1,\ldots,\pi_n) = \begin{cases} 1 & \text{if $\pi_i < \pi_j < \pi_k$} \\ 0 & \text{otherwise} \end{cases} \] Then $|Q|=\sum_{i=1}^{n-2}\sum_{j=i+1}^{n-1}\sum_{k=j+1}^n X_{i,j,k}$. By now, we know where this proof is going and we know that we will need to figure out $\E(X_{i,j,k})=\Pr(X_{i,j,k}=1)=\Pr(\pi_i < \pi_j<\pi_k)$. Intuitively, this probability should be $1/3!$ since we're asking the three values $\pi_i$, $\pi_j$, and $\pi_k$ to appear in sorted order, but they can appear in any of $3!$ different orders.

To make this rigorous, we can count the permutations we're interested in using the Product Rule: To make a permutation $\pi_{1},\ldots,\pi_n$ for which $X_{i,j,k}(\pi_1,\ldots,\pi_n)=1$, we first choose integers $x,y,z$ with $1\le x< y<z\le n$ in one of $\binom{n}{3}$ ways. We then set $\pi_{i}=x$, $\pi_j=y$, and $\pi_k=z$ and complete the rest of the permutation using the $n-3$ values in $\{1,\ldots,n\}\setminus\{x,y,z\}$ in one of $(n-3)!$ ways. \begin{align*} |\{(\pi_1,\ldots,\pi_n) : X_{i,j,k}=1\}| & = \binom{n}{3}\cdot (n-3)! \\ & = \frac{n!}{3!(n-3)!}\cdot (n-3)! \\ & = \frac{n!}{3!} \enspace . \end{align*} Therefore, \[ \Pr(X_{i,j,k}=1) = \frac{n!/3!}{n!}=\frac{1}{3!} \enspace . \] Now we can finish with \begin{align*} \E(|Q|) & = \E\left(\sum_{i=1}^{n-2}\sum_{j=i+1}^{n-1}\sum_{k=j+1}^n X_{i,j,k}\right) \\ & = \sum_{i=1}^{n-2}\sum_{j=i+1}^{n-1}\sum_{k=j+1}^n \E(X_{i,j,k}) \\ & = \sum_{i=1}^{n-2}\sum_{j=i+1}^{n-1}\sum_{k=j+1}^n \Pr(X_{i,j,k}=1) \\ & = \sum_{i=1}^{n-2}\sum_{j=i+1}^{n-1}\sum_{k=j+1}^n 1/3! \\ & = \left.\binom{n}{3}\middle/ 3!\right. \enspace . \\ \end{align*}
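(Not part of the assignment.) A brute-force check of $\E(|Q|)=\binom{n}{3}/3!$ for small $n$; the function name `expected_Q` is just for illustration:

```python
from fractions import Fraction
from itertools import combinations, permutations
from math import comb, factorial

def expected_Q(n):
    """Average number of sorted triples over all permutations of {1,...,n}."""
    total = sum(
        sum(1 for i, j, k in combinations(range(n), 3)
            if perm[i] < perm[j] < perm[k])
        for perm in permutations(range(1, n + 1))
    )
    return Fraction(total, factorial(n))

for n in range(3, 7):
    assert expected_Q(n) == Fraction(comb(n, 3), 6)
```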

Let \[ R = \{i\in\{1,\ldots,n\}: \pi_i> i \} \enspace . \] Notice that $|R|$ is a random variable.

What is $\E(|R|)$?

Click to toggle answer

For each $i\in\{1,\ldots,n\}$, let \[ X_i = \begin{cases} 1 & \text{if $\pi_i > i$} \\ 0 & \text{otherwise} \\ \end{cases} \] Then $|R|=\sum_{i=1}^n X_i$. Again, we know where this is going, so let's understand $E(X_i)$: \begin{align*} \E(X_i) & = \Pr(X_i=1) \\ & = \Pr(\pi_i > i) \\ & = \frac{(n-i)(n-1)!}{n!} \\ & = \frac{(n-i)}{n} \\ & = 1-i/n\\ \end{align*} (The numerator in the third line counts the number of permutations in which $\pi_i>i$ using the Product Rule; we have to choose $\pi_i$ to be one of the $n-i$ integers in $\{i+1,\ldots,n\}$ then we can permute the remaining $n-1$ integers any way we want.)

Now we can turn the crank: \begin{align*} \E(|R|) & = \E\left(\sum_{i=1}^n X_i\right) \\ & = \sum_{i=1}^n \E(X_i) \\ & = \sum_{i=1}^n (1-i/n) \\ & = n - \frac{1}{n}\sum_{i=1}^n i \\ & = n - \frac{1}{n}\frac{n(n+1)}{2} \\ & = n - \frac{(n+1)}{2} \\ & = (n-1)/2
\end{align*}
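(Not part of the assignment.) A brute-force check of $\E(|R|)=(n-1)/2$ for small $n$; the function name `expected_R` is just for illustration:

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def expected_R(n):
    """Average size of R = {i : pi_i > i} over all permutations."""
    total = sum(
        sum(1 for i, v in enumerate(perm, start=1) if v > i)
        for perm in permutations(range(1, n + 1))
    )
    return Fraction(total, factorial(n))

for n in range(1, 7):
    assert expected_R(n) == Fraction(n - 1, 2)
```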

Let \[ T = \{i\in\{1,\ldots,n\}: \max\{\pi_1,\ldots,\pi_i\}> i \} \enspace . \] Notice that $|T|$ is a random variable.

What is $\E(|T|)$?

Click to toggle answer

For each $i\in\{1,\ldots,n\}$, define \[ X_i = \begin{cases} 1 & \text{if $\max\{\pi_1,\ldots,\pi_i\}>i$} \\ 0 & \text{otherwise} \\ \end{cases} \] Then $|T|=\sum_{i=1}^n X_i$. Again, we know where this is going, so let's understand $\E(X_i)$. For that we need to count the number of permutations where $\max\{\pi_1,\ldots,\pi_i\}>i$. This turns out to be fairly easy using the complement rule. If $\max\{\pi_1,\ldots,\pi_i\}\not>i$ then $\max\{\pi_1,\ldots,\pi_i\}\le i$. But $\{\pi_1,\ldots,\pi_i\}$ is an $i$-element subset of $\{1,\ldots,n\}$, so $\max\{\pi_1,\ldots,\pi_i\}\le i$ if and only if $\{\pi_1,\ldots,\pi_i\}=\{1,\ldots,i\}$. We're not saying that $1,\ldots,i$ have to appear in order, just that they must appear in the first $i$ positions of the permutation. To make such a permutation, we first choose a permutation $\pi_1,\ldots,\pi_i$ of $\{1,\ldots,i\}$ and then choose a permutation $\pi_{i+1},\ldots,\pi_{n}$ of $\{i+1,\ldots,n\}$. The number of ways to do this is $i!(n-i)!$. Therefore, \[ \Pr(\max\{\pi_1,\ldots,\pi_i\}\not>i) = \frac{i!(n-i)!}{n!} \] so, using the Complement Rule, \[ \E(X_i)=\Pr(\max\{\pi_1,\ldots,\pi_i\}>i) = 1-\frac{i!(n-i)!}{n!}=1-\left.1\middle/\binom{n}{i} \right. \] Now we can turn the crank. \begin{align*} \E(|T|) & = \E\left(\sum_{i=1}^n X_i\right) \\ & = \sum_{i=1}^n \E(X_i) \\ & = \sum_{i=1}^n \left(1-1\middle/\binom{n}{i} \right) \\ & = n-\sum_{i=1}^n \left(1\middle/\binom{n}{i} \right) \\ \end{align*} I don't know of any way to simplify this sum.
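(Not part of the assignment.) A brute-force check of the formula $n-\sum_{i=1}^n 1/\binom{n}{i}$ for small $n$; the function name `expected_T` is just for illustration:

```python
from fractions import Fraction
from itertools import permutations
from math import comb, factorial

def expected_T(n):
    """Average size of T = {i : max(pi_1,...,pi_i) > i} over all permutations."""
    total = sum(
        sum(1 for i in range(1, n + 1) if max(perm[:i]) > i)
        for perm in permutations(range(1, n + 1))
    )
    return Fraction(total, factorial(n))

for n in range(1, 7):
    formula = n - sum(Fraction(1, comb(n, i)) for i in range(1, n + 1))
    assert expected_T(n) == formula
```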


The questions from this point on are to help you prepare for the final exam. This material will not be covered on Quiz 4.

The maximum of two 20-sided dice

You roll two $20$-sided dice, $d_1$ and $d_2$.

What is the expected value of the random variable $Z = \max\{d_1,d_2\}$?

Hint: Remember, from our class on geometric random variables, since $\operatorname{Range}(Z)$ contains only non-negative integers, $\E(Z)=\sum_{i=1}^{\infty}\Pr(Z\ge i)$.

Hint: In case it's helpful, you can use this fact: $\sum_{i=0}^{19} i^2 = 2470$.

Click to toggle answer

This problem is about a uniform distribution over the sample space $S=\{(d_1,d_2):d_1,d_2\in\{1,\ldots,20\}\}$, which has size $|S|=20^2$.

The hint suggests that we should try to compute $\Pr(Z\ge i)$, and it is easy to do this using the Complement Rule: $\Pr(Z\ge i)=1-\Pr(Z<i)$. For any $i\in\{1,\ldots,20\}$, $Z< i$ if and only if $d_1,d_2\in\{1,\ldots,i-1\}$. Therefore, $\Pr(Z< i)=(i-1)^2/20^2$ and $\Pr(Z\ge i)=1-\Pr(Z<i) = (20^2-(i-1)^2)/20^2$. Now we use the hints: \begin{align*} \E(Z) & = \sum_{i=1}^{20}\Pr(Z\ge i) & \text{(suggested by hint)}\\ & = \sum_{i=1}^{20} \frac{20^2-(i-1)^2}{20^2}\\ & = \frac{20^3-\sum_{i=0}^{19} i^2}{20^2} \\ & = \frac{8000-\sum_{i=0}^{19} i^2}{400} & \text{(since $20^3=8000$ and $20^2=400$)}\\ & = \frac{8000-2470}{400} & \text{(second hint)}\\ & = \frac{5530}{400} = 13.825 \enspace . \end{align*} Note: As long as you can compute the sum $\sum_{i=0}^{k-1} i^c$, you can do a similar computation to compute the expected maximum of $c$ $k$-sided dice, for any $c,k\ge 1$.
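(Not part of the assignment.) The sum $\E(Z)=\sum_{i=1}^{20}\Pr(Z\ge i)$ is easy to evaluate by machine, and the same code handles $c$ $k$-sided dice; the function name `expected_max` is just for illustration:

```python
from fractions import Fraction

# E(Z) = sum_{i=1}^{20} Pr(Z >= i), with Pr(Z >= i) = (20^2 - (i-1)^2)/20^2.
EZ = sum(Fraction(20**2 - (i - 1)**2, 20**2) for i in range(1, 21))
assert EZ == Fraction(5530, 400)

# Same idea for the maximum of c fair k-sided dice:
# Pr(Z >= i) = (k^c - (i-1)^c)/k^c.
def expected_max(c, k):
    return sum(Fraction(k**c - (i - 1)**c, k**c) for i in range(1, k + 1))

assert expected_max(2, 20) == EZ
print(float(EZ))  # 13.825
```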

Fishing a uniform lake

You go fishing in a lake. There are four types of fish (trout, pike, walleye, bass) in the lake. Each time you cast your line, you catch a fish, which is equally likely to be one of these four types. You throw every fish you catch back in the water, so the result of each cast is independent of which types of fish you caught on previous casts. You stop fishing once you've caught at least one fish of each type.

What is the expected number of casts you make before you stop fishing?

Click to toggle answer

This is the Coupon Collector's Problem that we've discussed in class. The expected number of casts you have to make is $4 H_4 = 4\left(1+\tfrac{1}{2}+\tfrac{1}{3}+\tfrac{1}{4}\right) = 25/3$.
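(Not part of the assignment.) A one-liner reproducing this via the stage-by-stage argument: once you hold $j$ distinct types, the next new type arrives with probability $(4-j)/4$, so you wait an expected $4/(4-j)$ casts for it.

```python
from fractions import Fraction

# Sum the expected waits over the four stages j = 0, 1, 2, 3.
expected_casts = sum(Fraction(4, 4 - j) for j in range(4))
assert expected_casts == Fraction(25, 3)  # = 4 * H_4
```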

Fishing a non-uniform lake

You go fishing in a lake. There are two types of fish (trout and pike) in the lake. Each time you cast your line, you catch a trout with probability $4/5$ and you catch a pike with probability $1/5$. You throw every fish you catch back in the water, so the result of each cast is independent of which types of fish you caught on previous casts. You stop fishing once you've caught at least one trout and at least one pike.

What is the expected number of casts you make before you stop fishing?

Click to toggle answer

This is not a Coupon Collector's Problem since the probability of getting a trout is not the same as the probability of getting a pike.

A slightly non-rigorous way to get the answer is to branch depending on the first fish you catch: with probability $4/5$ the first fish is a trout, and the number of additional casts until your first pike is a geometric random variable with expected value $1/(1/5)=5$; with probability $1/5$ the first fish is a pike, and the expected number of additional casts until your first trout is $1/(4/5)=5/4$. This gives \[ \E(X) = 1 + \frac{4}{5}\cdot 5 + \frac{1}{5}\cdot\frac{5}{4} = \frac{21}{4} \enspace . \]

Here is a fully rigorous solution: In this problem, the sample space is \[ S=\{T^nP : n\ge 1\}\cup\{P^n T: n\ge 1\} \enspace , \] where the sequence of letters indicates the sequence of fish caught, $T$ for trout and $P$ for pike. (Note that $n\ge 1$: you only stop once you've caught at least one fish of each type, so every game takes at least two casts.) The probability function is given by \[ \Pr(T^nP)=(4/5)^n(1/5),\qquad \Pr(P^nT)=(1/5)^n(4/5) \enspace , \] and you can check that these probabilities sum to $1$. The number of casts is the random variable $X$ defined by $X(T^nP)=X(P^nT)=n+1$ for each $n\ge 1$. Then the expected value of $X$ is \begin{align*} \E(X) & = \sum_{\omega\in S}\Pr(\omega)\cdot X(\omega) \\ & = \sum_{n=1}^\infty\Pr(T^nP)\cdot X(T^nP) + \sum_{n=1}^\infty\Pr(P^nT)\cdot X(P^nT) \\ & = \sum_{n=1}^\infty(4/5)^n(1/5)(n+1) + \sum_{n=1}^\infty (1/5)^n(4/5)(n+1) \\ & = \frac{1}{5}\left(\sum_{n=0}^\infty(n+1)(4/5)^n - 1\right) + \frac{4}{5}\left(\sum_{n=0}^\infty (n+1)(1/5)^n - 1\right) \\ & = \frac{1}{5}\left(\frac{1}{(1-4/5)^2}-1\right) + \frac{4}{5}\left(\frac{1}{(1-1/5)^2}-1\right) & \text{(Lecture 19)} \\ & = \frac{1}{5}\cdot 24 + \frac{4}{5}\cdot\frac{9}{16} \\ & = \frac{24}{5} + \frac{9}{20} \\ & = \frac{105}{20} = \frac{21}{4} \enspace . \end{align*}
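(Not part of the assignment.) A quick machine check of $21/4$ two ways: the branching argument in exact arithmetic, and a truncated version of the series over the outcomes $T^nP$ and $P^nT$ with $n\ge 1$ (every game takes at least two casts):

```python
from fractions import Fraction

# Branch on the first cast: with prob 4/5 it's a trout and the wait for a
# pike is geometric with mean 5; with prob 1/5 it's a pike and the wait
# for a trout has mean 5/4.
E = 1 + Fraction(4, 5) * 5 + Fraction(1, 5) * Fraction(5, 4)
assert E == Fraction(21, 4)

# Truncated series over outcomes T^n P and P^n T (n >= 1); the tail is
# geometrically small, so 200 terms are far more than enough.
partial = sum(
    Fraction(4, 5)**n * Fraction(1, 5) * (n + 1)
    + Fraction(1, 5)**n * Fraction(4, 5) * (n + 1)
    for n in range(1, 201)
)
assert abs(partial - Fraction(21, 4)) < Fraction(1, 10**9)
```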