The Bank of Bath issues coins with an $H$ on one side and a $T$ on the other. Harry has $n$ of these coins arranged in a line from left to right. He repeatedly performs the following operation: if there are exactly $k>0$ coins showing $H$, then he turns over the $k$th coin from the left; otherwise, all coins show $T$ and he stops. For example, if $n=3$ the process starting with the configuration $THT$ would be $THT \to HHT \to HTT \to TTT$, which stops after three operations. (a) Show that, for each initial configuration, Harry stops after a finite number of operations. (b) For each initial configuration $C$, let $L(C)$ be the number of operations before Harry stops. For example, $L(THT) = 3$ and $L(TTT) = 0$. Determine the average value of $L(C)$ over all $2^n$ possible initial configurations $C$. Proposed by David Altizio, USA
Problem
Source: IMO 2019, problem 5
Tags: IMO 2019, combinatorics, IMO Shortlist, IMO Shortlist 2019, IMO
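The process in the statement is easy to sanity-check by direct simulation. Below is a minimal sketch (the helper names `step` and `L` are mine, chosen to mirror the statement's notation):

```python
from itertools import product

def step(coins):
    """One operation: with k > 0 coins showing H, flip the k-th coin from the left."""
    k = coins.count("H")
    return coins[:k - 1] + ("T" if coins[k - 1] == "H" else "H") + coins[k:]

def L(coins):
    """Number of operations performed before Harry stops."""
    ops = 0
    while "H" in coins:
        coins = step(coins)
        ops += 1
    return ops

print(L("THT"))  # the worked example THT -> HHT -> HTT -> TTT, so 3
avg = sum(L("".join(c)) for c in product("HT", repeat=4)) / 2**4
print(avg)       # 5.0, matching n(n+1)/4 for n = 4
```

Averaging over all $2^n$ strings for small $n$ reproduces the answer to part (b) derived in the posts below.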
17.07.2019 15:22
Oh my gosh, this is mine. EDIT: I've attached the original submission I sent to Evan. The problem is worded differently and the solution is sloppily written but I need to go on a plane now and AAAAA IM SO EXCITED
Attachments:
flipping_bits.pdf (137kb)
17.07.2019 15:23
I assume it is meant $2^n$ initial configurations? EDIT: It has now been edited to reflect this.
17.07.2019 15:27
Congrats dj! The answer is $E_n = \frac{1}{2} (1 + \dots + n) = \frac14 n(n+1)$ which is finite. We'll represent the operation by a directed graph $G_n$ on vertices $\{0,1\}^n$ (each string points to its successor) with $1$ corresponding to heads and $0$ corresponding to tails. For $b \in \{0,1\}$ we let $\overline b = 1-b$, and denote binary strings as a sequence of $n$ symbols. The main claim is that $G_n$ can be described explicitly in terms of $G_{n-1}$: We take two copies $X$ and $Y$ of $G_{n-1}$. In $X$, we take each string of length $n-1$ and just append a $0$ to it. In symbols, we replace $s_1 \dots s_{n-1} \mapsto s_1 \dots s_{n-1} 0$. In $Y$, we toggle every bit, then reverse the order, and then append a $1$ to it. In symbols, we replace $s_1 \dots s_{n-1} \mapsto \overline s_{n-1} \overline s_{n-2} \dots \overline s_{1} 1$. Finally, we add one new edge from $Y$ to $X$ by $11 \dots 1 \to 11\dots110$. An illustration of $G_4$ is given below. [asy][asy] unitsize(0.8cm); label("$0000$", (0, 0), red); label("$1000$", (0, 1), red); label("$1100$", (0, 2), red); label("$1110$", (0, 3), red); label("$0100$", (2, 2), red); label("$1010$", (2, 3), red); label("$0010$", (4, 3), red); label("$0110$", (6, 3), red); label("$1111$", (0, 4), blue); label("$1101$", (0, 5), blue); label("$1001$", (0, 6), blue); label("$0001$", (0, 7), blue); label("$1011$", (2, 6), blue); label("$0101$", (2, 7), blue); label("$0111$", (4, 7), blue); label("$0011$", (6, 7), blue); label(scale(1.5)*"$\downarrow$", (0, 0.5), red); label(scale(1.5)*"$\downarrow$", (0, 1.5), red); label(scale(1.5)*"$\downarrow$", (0, 2.5), red); label(scale(1.2)*"$\leftarrow$", (1, 2), red); label(scale(1.2)*"$\leftarrow$", (1, 3), red); label(scale(1.2)*"$\leftarrow$", (3, 3), red); label(scale(1.2)*"$\leftarrow$", (5, 3), red); label(scale(1.5)*"$\downarrow$", (0, 4.5), blue); label(scale(1.5)*"$\downarrow$", (0, 5.5), blue); label(scale(1.5)*"$\downarrow$", (0, 6.5), blue); label(scale(1.2)*"$\leftarrow$", 
(1, 6), blue); label(scale(1.2)*"$\leftarrow$", (1, 7), blue); label(scale(1.2)*"$\leftarrow$", (3, 7), blue); label(scale(1.2)*"$\leftarrow$", (5, 7), blue); label(scale(1.7)*"$\Downarrow$", (0, 3.5), deepgreen); [/asy][/asy] To prove this claim, we need only show the arrows of this directed graph remain valid. The graph $X$ is correct as a subgraph of $G_n$, since the extra $0$ makes no difference. As for $Y$, note that if $s = s_1 \dots s_{n-1}$ had $k$ ones, then the modified string has $(n-1-k)+1 = n-k$ ones, ergo $\overline s_{n-1} \dots \overline s_1 1 \mapsto \overline s_{n-1} \dots \overline s_{k+1} s_k \overline s_{k-1} \dots \overline s_1 1$ which is what we wanted. Finally, the one edge from $Y$ to $X$ is obviously correct. To finish, let $E_n$ denote the desired expected value. Since $1 \dots 1$ takes $n$ steps to finish we have \[ E_n = \frac{1}{2} \left[ E_{n-1} + (E_{n-1}+n) \right] \]based on whether the chosen string lies in $X$ or in $Y$. By induction, we have $E_n = \frac{1}{2} (1 + \dots + n) = \frac14 n(n+1)$, as desired.
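The recursive description of $G_n$ above can be checked mechanically against the definition for small $n$. A sketch (all function names are mine), assuming the construction exactly as stated:

```python
from itertools import product

def succ(s):
    """Successor of a 0/1 string (1 = heads): flip position k, k = number of 1s."""
    k = s.count("1")
    return s[:k - 1] + ("0" if s[k - 1] == "1" else "1") + s[k:]

def comp_rev(s):
    """Toggle every bit, then reverse the order."""
    return "".join("1" if c == "0" else "0" for c in reversed(s))

def edges_from_construction(n):
    """Edges of G_n built from two copies of G_{n-1}: X appends 0, Y applies
    comp_rev and appends 1, plus the single bridge edge 11...1 -> 11...10."""
    if n == 1:
        return {"1": "0"}
    prev = edges_from_construction(n - 1)
    e = {}
    for s, t in prev.items():
        e[s + "0"] = t + "0"                      # copy X
        e[comp_rev(s) + "1"] = comp_rev(t) + "1"  # copy Y
    e["1" * n] = "1" * (n - 1) + "0"              # bridge from Y to X
    return e

for n in range(1, 8):
    direct = {s: succ(s)
              for s in map("".join, product("01", repeat=n)) if s != "0" * n}
    assert direct == edges_from_construction(n)
print("construction matches the definition for n = 1..7")
```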
17.07.2019 16:22
This was my favourite problem of the contest. We will just show (b), which obviously implies (a). Notation overload incoming. Let $S_n$ denote the set of $2^n$ configurations of coins of length $n$. Let $S_n(H)$ and $S_n(T)$ be the sets of configurations that end with $H$ and $T$, respectively. By $S(C)$ we refer to the configuration obtained after a single operation is performed on $C$. Let $C(C)$ denote the complement of a configuration $C$, where every $H$ is flipped to $T$ and vice versa. Let $R(C)$ be the reverse of $C$, i.e. the first coin is switched with the last coin, and so on. Finally let $C_1 + C_2$ denote the configuration achieved by appending the line of coins $C_2$ to the right of $C_1$. We will let $L_n$ be the average we are looking for. The answer is $\frac{n^2+n}{4}$. It is easy to verify that $L_1 = \frac{1}{2}$, so it is enough to show that \[ L_n = L_{n-1} + \frac{n}{2} \]for all $n > 1$. There are two main ingredients. Firstly, note that \[ L(C + T) = L(C) \]for all $C$. If the final coin reads $T$ then it can never be flipped to an $H$, so it is safe to ignore it. The more interesting part comes when a string ends in $H$. Let \[ f(C) = R(C(C)) + H . \]The crucial claim is that \[ L(f(C)) = L(C) + n \]whenever $C \in S_{n-1}$. It is easy enough to check that $S(f(C)) = f(S(C))$. In order for $f(C)$ to reach $TT \dots T$, it must at some point reach $HH \dots H$. Comparing $C$ and $f(C)$ step by step, this happens exactly when $C$ reaches $TT \dots T$, i.e. after $L(C)$ moves. After $f(C)$ hits $HH \dots H$, it takes exactly $n$ more moves to reach the desired $TT \dots T$. This proves the claim. It is easy to see that $C \mapsto C+T$ is a bijection $S_{n-1} \to S_n(T)$. Moreover, since $R$, $C$, and $+H$ are all bijections, it follows that $f: S_{n-1} \to S_n(H)$ is also a bijection.
Now proving the recursion is just a direct computation: \begin{align*} L_n &= \frac{1}{2^n} \sum_{ C \in S_n} L(C) \\ &= \frac{1}{2^n} \left( \sum_{ C \in S_n(T)} L(C) + \sum_{C \in S_n(H)} L(C) \right) \\ &= \frac{1}{2^n} \left( \sum_{ C \in S_{n-1}} L(C) + \sum_{ C \in S_{n-1}} \bigl(L(C) + n\bigr) \right) \\ &= \frac{1}{2^n} \left( \sum_{ C \in S_{n-1}} 2L(C) + 2^{n-1} n \right) \\ &= L_{n-1} + \frac{n}{2}, \end{align*}as desired.
17.07.2019 17:01
This is a nice problem, but its difficulty is definitely lower than that of the combinatorics P5s from recent years (2017, 2014). Then again, the combinatorics P2s from recent years weren't as hard as the ones that went to position 5 either.
17.07.2019 17:06
For my proof of $(a)$, see a couple of comments above.
17.07.2019 18:28
My solution for part a: Note that the flip location moves back and forth, leftward and rightward. If it flips $H$ to $T$ then the next location will be exactly one spot to the left. If this next flip is $H$ to $T$ again, it keeps going left. As it flips, it leaves a string of $T$s in its wake. The next time it flips $T$ to $H$, it will have to go right, encountering the string of $T$s that it left in its wake last time. So it goes through that string of $T$s until the next time it encounters an $H$. Similarly, if it goes to the right, it leaves a string of $H$s in its wake, and on the next leftward segment it turns all of these $H$s into $T$s. So observation 1: every time the location changes direction (leftward to rightward and vice versa), the length of the uniform run it has just created strictly increases. Proof: suppose the last $m$ flips have been leftward, and now it is about to turn rightward. That means there are $m$ $T$s immediately to the right of the current position, and the current position is $T$ (about to be turned to $H$). So already we have $m+1$ $T$s in a row, possibly more depending on what's to the right of those $T$s. The proof for rightward to leftward is similar. Observation 2: suppose $L$ is the location of the rightmost $H$, and $k$ is the location of the current flip. Obviously $L \geq k$ at all times. But every time the flip location turns rightward to leftward, $L-k$ strictly decreases. Proof: consequence of observation 1. Observation 3: so eventually $L-k = 0$, which means we get a configuration of the form $H...HT...T$, and from there it is leftward until the end. For part b, I only know the following observations; I can try to continue later. The idea is to establish a recursion from $n$ to $n-1$ or $n-2$. Observation 1: if I add $T$ to the end, it does not affect the sequence of operations, so this case reduces to the $n-1$ case. Observation 2: if I add $H$ to the beginning, it affects the sequence of operations a little bit.
The starting position increases, but everything is shifted 1 to the right. So we end up with $HT...T$, so this case reduces to the $n-1$ case + 1 step. Observation 3: if I add $T$ to the beginning and $H$ to the end, the starting position increases by 1 but everything is also shifted, so we end up with $TT...TH$, and this case reduces to the $n-2$ case + a few steps. From there I feel like we can write the recursion formula. I don't have paper with me so I can't do it in my head.
17.07.2019 18:48
Interesting fact: If the positions of $H$ are $a_1,a_2,\ldots, a_k$, then the number of the operations is exactly $$ 2(a_1+a_2+\ldots+a_k)-k^2 $$
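This closed form is easy to confirm by brute force for small $n$ (an illustrative check; the simulation code is mine):

```python
from itertools import product

def steps(coins):
    """Count operations until all coins show T."""
    cnt = 0
    while "H" in coins:
        k = coins.count("H")
        coins = coins[:k - 1] + ("T" if coins[k - 1] == "H" else "H") + coins[k:]
        cnt += 1
    return cnt

for n in range(1, 9):
    for c in map("".join, product("HT", repeat=n)):
        pos = [i + 1 for i, x in enumerate(c) if x == "H"]  # 1-indexed H positions
        k = len(pos)
        assert steps(c) == 2 * sum(pos) - k * k
print("2(a_1 + ... + a_k) - k^2 verified for all configurations with n <= 8")
```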
17.07.2019 18:51
What I did for a) is just say that when the location goes right and then back left, the moment you've flipped over the first coin twice, the position is the same as the starting one except an $H$ is now a $T$; repeat this, and the number of $H$s just keeps decreasing. @double_above @above I used that in my b) solution, no induction and just lots of calculation.
17.07.2019 19:47
djmathman in July 2019: Combinatorics is my worst subject and the three best problems I've written are geometry. Also djmathman in July 2019: *proposes epic combinatorics problem for the IMO*
17.07.2019 20:54
I think many contestants who understood what this problem meant should probably have solved it.
17.07.2019 21:15
Does anyone have a solution using induction on number of heads for part a?
17.07.2019 22:36
@above An induction on the number of heads is equivalent to my solution. InkretBear wrote: What I did for a) is just say that when the location goes right and then back left, the moment you've flipped over the first coin twice the position is the same as the starting one except an H is now a T, repeat and the number of H just keeps decreasing. @double_above @above I used that in my b) solution, no induction and just lots of calculation.
18.07.2019 00:01
Very cool problem. The solution using the isomorphic trees is very cute, but I didn't find it. The main claim for my solution is the following: Claim. If an $HT$-string $S$ has $H$'s at positions $a_1, a_2, \dots, a_k$ and $T$'s at all other positions, then it takes exactly $$2(a_1 + a_2 + \dots + a_k) - k^2$$ operations to change $S$ into the all-$T$'s string. Proof. If $S$ is the all-$T$ string, this is clear. Assume then that $k \ge 1$, and WLOG let $a_1 < a_2 < \dots < a_k$. Let $\ell$ be minimal such that $a_{\ell} \ge k$. We claim that after exactly $2(a_{\ell} - k) + 1$ operations, the resulting string is exactly the same as $S$, but with the coin in position $a_{\ell}$ now showing $T$ instead of $H$. The proof of this essentially comes down to drawing the following diagram: $$\cdots\underset{k}{T}TT \cdots T\underset{a_\ell}{H}\cdots \to \cdots \underset{k}{H}TT \cdots T\underset{a_\ell}{H} \to \cdots \to \cdots \underset{k}{H}HH \cdots T\underset{a_\ell}{H} \cdots \to \cdots \underset{k}{H}HH \cdots H\underset{a_\ell}{H} \cdots$$ $$\cdots\underset{k}{H}H \cdots HH\underset{a_\ell}{H}\cdots \to \cdots \underset{k}{H}H \cdots HH\underset{a_\ell}{T} \to \cdots \to \cdots \underset{k}{H}T \cdots TT\underset{a_\ell}{T} \cdots \to \cdots \underset{k}{T}T \cdots T\underset{a_\ell}{T} \cdots$$ The first line has $a_\ell - k$ operations, while the second has $a_\ell - k + 1$. After this we can repeat the same step until all $H$'s have been eliminated. This proves part (a). It also follows that the number of steps taken will be equal to $2(b_1 - k) + 1 + 2(b_2 - (k - 1)) + 1 + \dots + 2(b_k - 1) + 1$, where $b_1, b_2, \dots, b_k$ is some permutation of $a_1, a_2, \dots, a_k$. Adding everything up we get $$2(b_1 + \dots + b_k) - 2(1 + \dots + k) + k = 2(a_1 + \dots + a_k) - k(k + 1) + k = 2(a_1 + a_2 + \dots + a_k) - k^2,$$ which proves the claim.
$\blacksquare$ All that remains is to compute the expected value of this expression if we choose a string uniformly at random. Letting $X_i$ be the indicator variable for having an $H$ in the $i$-th position initially, we are interested in computing $$\mathbb{E}\left[\sum_{i = 1}^n 2i X_i - \left(\sum_{i = 1}^n X_i \right)^2 \right]$$ By linearity of expectation this splits into two terms, and we get $$\sum_{i = 1}^n 2i \,\mathbb{E}[X_i] - \mathbb{E}\left[\left(\sum_{i = 1}^n X_i\right)^2\right]$$ As $\mathbb{E}[X_i] = \frac{1}{2}$ for all $i$, the left sum is just $1 + 2 + \dots + n = \frac{n(n + 1)}{2}$. For the second term we can either expand and use linearity of expectation again, or abuse the fact that $\sum_{i = 1}^n X_i$ follows a binomial distribution whose variance is known. Either way, after some calculation we get that the other term is $\frac{n(n + 1)}{4}$, so the final answer is $\frac{n(n + 1)}{2} - \frac{n(n + 1)}{4} = \frac{n(n + 1)}{4}$.
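The expectation above can be brute-forced over all $2^n$ strings for small $n$ as a sanity check (a sketch; `expected_value` is my own helper name):

```python
from fractions import Fraction
from itertools import product

def expected_value(n):
    """Average of 2*sum(i*X_i) - (sum X_i)^2 over all 0/1 strings of length n."""
    total = Fraction(0)
    for x in product((0, 1), repeat=n):  # x[i-1] = indicator of H at position i
        k = sum(x)
        total += 2 * sum(i * xi for i, xi in enumerate(x, start=1)) - k * k
    return total / 2**n

for n in range(1, 11):
    assert expected_value(n) == Fraction(n * (n + 1), 4)
print("E[2*sum(i X_i) - (sum X_i)^2] = n(n+1)/4 for n = 1..10")
```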
18.07.2019 00:01
I have a very ugly solution which is, by the way, easy to come up with. Let $a_n$ be the expected number of operations over all words of length $n$. If the word begins with $H$, the first letter won't change until all other letters are $T$, from which point we only need one more operation to finish, so we expect $a_{n-1}+1$ moves. If the word ends with $T$, the last letter won't change, so we expect $a_{n-1}$ moves. If the word starts with $H$ and ends with $T$, then the last letter won't change and the first letter won't change either until all the middle letters are $T$, so we expect $a_{n-2}+1$ moves. If the word starts with $T$ and ends with $H$, the last letter won't change until the first letter does, and the first letter won't change until all the middle letters are $T$, so we expect $a_{n-2}+2n-1$ moves. Summing up (by inclusion-exclusion over these overlapping cases) we have $a_{n}=a_{n-1}+\frac{n}{2}$. An easy induction then shows that $a_n=\frac{n(n+1)}{4}$.
18.07.2019 03:24
Here $1$ stands for $H$ and $0$ for $T$, and changing $1$ to $0$ and vice versa will be called flipping. If the ones are at positions $C_1, C_2, \dots, C_l$, then we will show that the number of moves is $2\sum_{i = 1}^{l} C_i - l^2$. We start at position $l$. We keep flipping $0$ to $1$ until we reach the $1$ closest to position $l$ (the $1$ may be at position $l$ itself, in which case we call that one the closest). Then we flip back all the ones we created, and position $l$ now holds a $0$. If $C_s$ is the position of that closest $1$, this took $2(C_s - l + 1) - 1$ moves. The process then repeats, so the total number of moves is $\sum_{i = 1}^{l} \bigl(2(C_s - i + 1) - 1\bigr)$, where in the round with $i$ ones $C_s$ is the position of the $1$ closest to position $i$ at that moment. But over the whole sum every $C_i$ appears exactly once (each $C_i$ is the closest to some position $j$ at some moment), so $$\sum_{i = 1}^{l} \bigl(2(C_i - i + 1) - 1\bigr) = 2\sum_{i = 1}^{l} C_i - l(l+1) + l = 2\sum_{i = 1}^{l} C_i - l^2,$$which is what we wanted to prove. Now we take the sum over all possible configurations. The sum of $2\sum C_i$ over all configurations equals twice the sum of all subset sums of $\{1,2,3,\dots,n\}$, which is $2^{n-1}n(n+1)$, and $-l^2$ shows up $\binom{n}{l}$ times, so we need to compute $\sum_{l = 1}^{n} l^2\binom{n}{l}$. This can be shown nicely: take a subset $X$ of $\{1,2,3,\dots,n\}$ with $l$ elements and an ordered pair $(a,b)$ of elements (possibly equal) from $X$. The total number of pairs $(X,(a,b))$ is $\sum_{l = 1}^{n} l^2\binom{n}{l}$. Count this the other way around: first pick two different elements of $\{1,2,3,\dots,n\}$ and a subset containing both, which can be done in $n(n-1)2^{n-2}$ ways; then pick a pair $(a,a)$ and a subset containing $a$, which can be done in $n2^{n-1}$ ways. Adding these we get $$\sum_{l = 1}^{n} l^2\binom{n}{l}=n(n-1)2^{n-2}+n2^{n-1}=n(n+1)2^{n-2}.$$So the total number of moves over all configurations is $$2 \sum\nolimits_{\text{all conf}} \sum_{i = 1}^l C_i- \sum_{l = 1}^{n} l^2\binom{n}{l}=2^{n-1}n(n+1)-n(n+1)2^{n-2}=n(n+1)2^{n-2},$$and dividing by $2^n$ gives the average $\frac{n(n+1)}{4}$, so we are done.
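The binomial identity used in the double count above can be spot-checked quickly (sketch, my code):

```python
from math import comb

for n in range(2, 16):
    lhs = sum(l * l * comb(n, l) for l in range(n + 1))
    # the two counts from the double-counting argument, and the combined closed form
    assert lhs == n * (n - 1) * 2**(n - 2) + n * 2**(n - 1) == n * (n + 1) * 2**(n - 2)
print("sum of l^2 * C(n, l) equals n(n+1) * 2^(n-2) for n = 2..15")
```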
18.03.2023 18:21
We can prove the process terminates by determining the average number of steps. We claim that this value is $\frac{n(n+1)}{4}$, and we prove it by induction. Let $a_n$ denote the average number of moves required for sequences of length $n$ that end with $1$ to terminate, let $b_n$ denote the same average for sequences of length $n$ that end with $0$, and let $c_n = \frac{1}{2}a_n+\frac{1}{2}b_n$ be the average number of steps needed to terminate the process in general. Notice that $b_{n+1}=\frac{1}{2}a_n+\frac{1}{2}b_n$, since when the last coin is $0$ no move ever touches it. Notice also that $a_{n+1}=\frac{1}{2}a_n+\frac{1}{2}b_n+(n+1)$. This is because, as long as not all coins show heads, the operation flips the $(k+1)$-th coin from the right, where $k>0$ is the number of tails; since the last coin shows heads throughout, this is the $k$-th coin from the right among the first $n$ coins, where $k$ is the number of tails among them. Reading those $n$ coins from right to left with heads and tails swapped, this is exactly the original process, so on average it takes $\frac{1}{2}a_n+\frac{1}{2}b_n$ moves until the whole sequence shows all heads. Since this only gets us to all heads, in order to terminate we need $n+1$ more flips to get to all tails. Therefore $c_{n+1}=\frac{1}{2}a_{n+1}+\frac{1}{2}b_{n+1}=c_n+\frac{n+1}{2}$, and since $c_1=\frac{1}{2}$, we may use this to complete the inductive step, and we are done.
24.03.2023 21:27
Very nice problem. Part $(a)$ is easy, so we assume it and focus on part $(b)$. We will use induction on $n$. Let $f(n)$ be the average value of $L(C)$. There are 2 cases: $(i)$ the last coin is $T$, and $(ii)$ the last coin is $H$. $(i)$ Claim 1: The average number of operations is $f(n-1)$. Proof: We work only on the first $n-1$ coins and never touch the last coin, so the average is $f(n-1)$. $(ii)$ Claim 2: The average number of operations is $f(n-1)+n$. Proof: We will prove that we simply add $n$ operations. Here we use one trick: take the first $n-1$ coins, flip each of them over, and reverse their order. This new configuration of length $n-1$ evolves under the original rule in lockstep with our configuration, and it reaches all $T$'s exactly when our configuration reaches all $H$'s. Since flipping-and-reversing is a bijection on configurations of length $n-1$, reaching all $H$'s takes $f(n-1)$ operations on average, and turning all $H$'s into $T$'s obviously requires $n$ more moves. So it is $f(n-1)+n$. As there are $2^{n-1}$ configurations in each case, $2^nf(n)$ must equal $2^{n-1}f(n-1)+2^{n-1}(f(n-1)+n)$. Since $f(1)=\frac{1}{2}$, the rest is basic math. The answer is $\frac{n(n+1)}{4}$.
17.06.2023 19:43
Let $f_{xyd}(n)$, where $x,y\in\{H,T\}$ and $d\in\{L=\text{Left},R=\text{Right}\}$, be defined as the sum of the number of operations over all initial configurations under the following operation: if there are exactly $k>0$ coins showing $x$, then he turns over the $k^\text{th}$ coin from the $d$, until all coins show $y$. Let $f^*_{xyd}(n)$ be the same as $f_{xyd}(n)$ except that the $(k+1)^\text{th}$ coin is turned over. Only $f_{HTL}$, $f_{THL}$, $f_{HTR}$, $f_{THR}$, $f^*_{HHL}$, $f^*_{TTL}$, $f^*_{HHR}$, $f^*_{TTR}$ make sense. The problem asks for $\alpha_n=\dfrac{f_{HTL}(n)}{2^n}$. Divide the $2^n$ initial configurations into two classes, those that end with an $H$ and those that end with a $T$. The sum of the number of operations for the $2^{n-1}$ initial configurations that end with a $T$ is clearly $f_{HTL}(n-1)$, as the last $T$ affects nothing. For the initial configurations that end with an $H$, to touch the last $H$, all coins should be showing $H$. After that point, it takes $n$ steps to make all of them $T$. To get to the point that all coins show $H$, we need a total of $f^*_{HHL}(n-1)$ operations, since the last $H$ contributes 1 to the total number of $H$s. $$f_{HTL}(n)=f_{HTL}(n-1)+f^*_{HHL}(n-1)+2^{n-1}\cdot n$$$$f_{HHL}^*(n-1)\overset{(1)}{=}f_{TTL}^*(n-1)\overset{(2)}{=}f_{TTR}^*(n-1)\overset{(3)}{=}f_{HTL}(n-1)$$ (1) Initial configurations are symmetric wrt $H$ and $T$. (2) Initial configurations are symmetric wrt $L$ and $R$ when $x=y$. (3) The $(l+1)^\text{th}$ coin from the right, where $l$ is the number of $T$s, is the same as the $k^\text{th}$ coin from the left, where $k$ is the number of $H$s. Therefore, $f_{HTL}(n)=2\cdot f_{HTL}(n-1)+2^{n-1}\cdot n$ and $\alpha_n=\alpha_{n-1}+\dfrac{n}{2}$. Clearly, $\alpha_n=\dfrac{n^2+n}{4}$.
04.08.2023 08:51
We first compute $L(1)=1/2, L(2)=3/2, L(3)=3, L(4)=5$ by going through all possible cases. This leads us to the following claim: Claim: $L(n)=\frac{n(n+1)}{4}.$ We will show this by induction. We have already seen that this is satisfied for $n=1,2,3,4.$ Case 1: The string starts with H. Then, the expected number of moves is $L(n-1)+1$ since it will take on average $L(n-1)$ moves to make everything after it T and 1 additional move to finish. Case 2: The string ends with T. Then, the expected number of moves is $L(n-1)$ since the final $T$ has no effect. Case 3: The string starts with H and ends with T. Then, the expected number of moves is $L(n-2)+1$, since it takes $L(n-2)$ moves to convert the middle part into T's and one more move to finish. Case 4: The string starts with T and ends with H. Then, it will take $L(n-2)$ moves to turn it into $TTT\cdots H$, and then another $n-1$ moves to turn it into $HHH\cdots H$, and $n$ moves to finish, so it will take expected $L(n-2)+2n-1$ moves. Note that by PIE, (Case 1 + Case 2 - Case 3) covers all cases where it either starts with H or ends with T. Therefore, (Case 1 + Case 2 - Case 3 + Case 4) exactly covers all possible cases. Therefore, we have $$L(n)=\frac{1}{2}(L(n-1)+1)+\frac{1}{2}(L(n-1))-\frac{1}{4}(L(n-2)+1)+\frac{1}{4}(L(n-2)+2n-1).$$By our inductive hypothesis, we plug in $L(n-2)=\frac{(n-2)(n-1)}{4}$ and $L(n-1)=\frac{(n-1)n}{4},$ we get $$L(n)=\frac{n(n+1)}{4},$$so we are done by induction. Note that this also solves part (a) since if any case is infinite, then the expected value is infinite, contradiction.
13.08.2023 03:07
ah... the Bank of Bath problem. not sure why it's named that, nor why i feel like it had to say Oslo in it somewhere, and also glad that i solved it because i thought it was notorious for being some 50 MOHS but apparently not We claim the expected value of the process, $p_n$, is $\frac{n(n+1)}{4}$. We will proceed by induction, with base case $p_1=\frac12$. Here are the subcases in the induction: (1) Ends in tails: This has average $p_{n-1}$. (2) Starts in heads: This has average $p_{n-1}+1$ (since we can ignore the first coin and the process is the same, plus 1 when you need to flip the first at the end). (3) Starts in tails, ends in heads: This is the same as starting in heads, ending in tails (call this situation (4)), because indices in the middle string are the same, except when you flip the first heads, instead of being all tails, it is HTT...TH, which takes another 2(n-1) flips (easily checked manually, by say induction on the number of Ts). Then $$2^np_n=(1)+(2)+(3)-(4)=2^{n-1}p_{n-1}+2^{n-1}(p_{n-1}+1)+2^{n-2}((3)-(4))=2^np_{n-1}+2^{n-1}+2^{n-1}(n-1)\implies p_n=p_{n-1}+\frac n2=\frac{n(n+1)}{4},$$as desired. $\blacksquare$
18.01.2024 10:05
solved and wrote up (a) before (b) but whatever We first prove that the procedure always terminates. For sake of convenience, denote $H=1$ and $T=0.$ We use induction to prove that for any starting position, the sequence eventually becomes a string of $1$'s followed by a string of $0$'s, which obviously terminates. For base case $n=1,$ this is obviously true. For inductive step, assume that for $k$ coins, our hypothesis holds. Then, for $k+1$ coins, consider cases depending on if the first coin is $1$ or $0.$ If the first coin is $1,$ the rest of the coins represent a binary string of length $k,$ which by the inductive assumption we know terminates. In this situation, all the indices of the binary string of length $k$ after the first coin are shifted up by $1,$ but since the first coin has value $1,$ the total number of $1$'s is also shifted up by $1.$ Therefore, the procedure in this case will be identical to the procedure for the binary string of length $k$ until the final step, at which we change the $1$ at the first index to a $0.$ Thus all strings starting with a $1$ terminate. If the first coin is $0,$ insert a coin in front of the first coin with label $1.$ Then, since we know any string starting with a $1$ terminates with the string after the initial $1$ terminating completely first, it follows the string of length $k$ after the $1$ we inserted must terminate, as desired. It remains to determine the average number of steps it takes for the process to terminate, over all configurations. Let $f(n)$ be the average number of steps it takes for the process to terminate for a string of length $n.$ We prove that $f(n)=\tfrac{n(n+1)}{4}.$ Testing the first few cases, we find $f(1)=\tfrac{1}{2},$ $f(2)=\tfrac{3}{2},$ $f(3)=3,$ $f(4)=5.$ Proceed by strong induction. The base cases are evident as described.
For inductive step, assume $f(k)=\tfrac{k(k+1)}{4}$ for a string of length $k.$ Then we prove that this holds for $k+1.$ Consider the following cases: If the sequence starts with a $1,$ then we have $f(k)+1.$ If the sequence ends with a $0,$ then we have $f(k).$ If the sequence starts with a $1$ and ends with a $0,$ observe the trailing zero once again does not do anything. The starting $1$ also does not change the operations performed on the $k$ trailing digits. Then in this case we have $f(k-1)+1,$ where the $+1$ accounts for the final flip of the initial $1.$ If the sequence starts with a $0$ and ends with a $1,$ observe that the final $1$ shifts the total number of $1$'s in the string up by $1,$ and the initial $0$ increases the index of each of the trailing $k$ numbers by $1.$ Thus it takes $f(k-1)$ operations to get to a string of $0$'s followed by one $1.$ At this point, it takes $2k+1$ operations to complete the process. Thus in this case we have $f(k-1)+2k+1.$ Now, by Principle of Inclusion and Exclusion, \begin{align*} f(k+1)&=\frac{1}{2}(f(k)+1)+\frac{1}{2}(f(k))-\frac{1}{4}(f(k-1)+1)+\frac{1}{4}(f(k-1)+2k+1) \\ &=\frac{(k+1)(k+2)}{4}. \end{align*}This completes the induction.
20.04.2024 19:44
I thought this one was really nice
01.05.2024 06:24
Let the desired answer be $a_n$, which we will prove is $\boxed{\frac{n^2+n}{4}}$ through recursion. Note the ending state contains all tails. It's easy to get the base cases $a_1 = \tfrac 12$ and $a_2 = \tfrac 32$. For $n \ge 3$, we use casework on the first and last coin: Starts with $T$, ends in $H$: Notice the ending $H$ will never be changed until all $n$ coins are heads, and this stage cannot be reached without changing the leading $T$, which only occurs after the middle $n-2$ coins are tails as well. Hence we expect $a_{n-2} + (n-1) + n$ moves in this case. Starts with $H$ or ends in $T$: We use PIE: Starts with $H$: Never gets changed until the final $n-1$ coins are tails, so we expect $a_{n-1}+1$. Ends with $T$: Never changed, so we expect $a_{n-1}$. Both: Using our previous analysis, we get $a_{n-2}+1$. Thus our recursion is \[a_n = \tfrac 14 (a_{n-2} + 2n-1) + \tfrac 12 (a_{n-1}+1) + \tfrac 12 (a_{n-1}) - \tfrac 14 (a_{n-2}+1) = a_{n-1} + \tfrac n2,\] from which we get the desired closed form. $\blacksquare$
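The four case averages used in this recursion can be confirmed by simulation for small $n$ (a sketch under the same case split; the helper names are mine):

```python
from fractions import Fraction
from itertools import product

def steps(c):
    """Count operations until all coins show T."""
    cnt = 0
    while "H" in c:
        k = c.count("H")
        c = c[:k - 1] + ("T" if c[k - 1] == "H" else "H") + c[k:]
        cnt += 1
    return cnt

def avg(strings):
    strings = list(strings)
    return Fraction(sum(steps(s) for s in strings), len(strings))

def a(m):
    # the claimed closed form a_m = m(m+1)/4
    return Fraction(m * (m + 1), 4)

for n in range(3, 10):
    alln = ["".join(s) for s in product("HT", repeat=n)]
    assert avg(s for s in alln if s[0] == "T" and s[-1] == "H") == a(n - 2) + 2 * n - 1
    assert avg(s for s in alln if s[0] == "H") == a(n - 1) + 1
    assert avg(s for s in alln if s[-1] == "T") == a(n - 1)
    assert avg(s for s in alln if s[0] == "H" and s[-1] == "T") == a(n - 2) + 1
print("all four case averages match for 3 <= n <= 9")
```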
19.09.2024 22:17
Not sure if anybody posted this sol yet Let $E(n)$ be the wanted expected value, we claim that $E(n) = \dfrac{n(n +1)}{4}$ which is indeed finite and thus all configurations terminate. To prove this we induct, the $n = 1$ case is obvious. Now assume this is true for some $n$, and consider an arbitrary sequence $S$ of $n + 1$ coins. First, we consider when the last coin is $H$. Define the inverse of a row of coins as the result when all the coins are flipped and the order reversed. For example, the inverse of $HHTTH$ is $THHTT$. Moreover, let $P$ be the sequence of the first $n$ coins in $S$, and let $Q$ be the inverse of $P$. If a row of coins has $h$ instances of $H$, define the operations: $f$ as flipping coin $h$. $g$ as flipping coin $h + 1$. Claim 1: $g^k(P)$ is the inverse of $f^k(Q)$ for all positive integers $k$. Proof. It is sufficient to show this for $k = 1$; iterating then yields the claim. Suppose $P$ has $h$ instances of $H$. Then $g(P)$ has coin $h + 1$ flipped, so its inverse is $Q$ with coin $(n + 1) - (h + 1) = n - h$ flipped. On the other hand, $Q$ has $n - h$ instances of $H$, so $f(Q)$ is $Q$ with coin $n - h$ flipped. So they are equivalent. Note that the inverse of the all-$H$ sequence is all-$T$. Now since applying $f$ onto $S$ applies $g$ onto $P$, the number of steps until it reaches the all-$H$ sequence is the same number of steps for $Q$ to reach the all-$T$ sequence. But since the inverse property is a bijection, the average number of such steps is simply $E_n$. There is an additional $n + 1$ steps to go from all-$H$ to all-$T$, so the cumulative expected value is $E_n + n + 1$. Now when the last coin is $T$, it is unaffected, and the average over all these cases is $E_n$. We have $E_{n + 1} = \dfrac{1}{2}(E_n + E_n + n + 1) = \dfrac{n(n + 1)}{4} + \dfrac{n + 1}{2} = \dfrac{(n + 1)(n + 2)}{4}$ and the induction is finished. It is done.
12.12.2024 00:43
We claim that the expected value of $L(C)$ over all possible $2^n$ configurations of length $n$ is $\frac{n(n+1)}{4}.$ This will prove part (a), since this value is finite, and finish part (b). Let $e_n$ be the answer for $n,$ let $a_n$ be the expected value for a sequence of length $n$ that ends with $H,$ and let $b_n$ be the expected value for a sequence of length $n$ that ends with $T.$ Observe that $e_n = \frac12 (a_n+b_n).$ Now, suppose that $n \geq 3.$ For a sequence counted in $a_n$ that starts with $T,$ we can temporarily ignore the front and end of the sequence, and on average the middle $n-2$ coins take $e_{n-2}$ operations to all become $T.$ Then, there will be $n-1$ operations to turn everything into $H,$ and finally $n$ operations to turn everything into $T.$ However, if the sequence counted in $a_n$ starts with $H,$ we are essentially in the same situation as $a_{n-1}$ by ignoring the first coin, and then one more operation is required to turn the first $H$ into a $T.$ Therefore, $$a_n=\frac12 (e_{n-2}+2n-1)+\frac12 (a_{n-1}+1) = n+\frac12 e_{n-2}+\frac12 a_{n-1}.$$ Meanwhile, it is easy to see that $$b_n=e_{n-1}=\frac12 (a_{n-1}+b_{n-1}).$$ Therefore, since $e_{n-2}=b_{n-1},$ $$a_n = n+\frac12(b_{n-1}+a_{n-1})=n+b_n.$$Substituting this into the second recurrence yields $$2b_n = n-1+2b_{n-1} \implies 2(b_n-b_{n-1})=n-1 \implies 2(b_n-b_2)=(n-1)+(n-2)+\cdots+3+2.$$Therefore, using $b_2=e_1=\tfrac12,$ we get $b_n=\frac{n(n-1)}{4}.$ From here, we have that $$e_n=\frac12(n+2b_n)=\frac{n(n+1)}{4}.$$QED
22.12.2024 13:01
First we compute the following. Claim : The number of steps it takes to reach $n$ $T$s from $n$ $H$s is exactly $n$. Proof : In each step the rightmost $H$ turns to $T$, since the number of $H$s starts at $n$ and drops by one with each move. Thus, we take \[HH\dots HH \longrightarrow HH\dots HT \longrightarrow \dots \longrightarrow HTT\dots T \longrightarrow TT\dots TT\]which takes exactly $n$ steps. Let $M_n$ be the total sum of steps taken over all $2^n$ configurations to terminate. Now, we make the following key observation. Claim : Let a certain pair of configurations of coins \[[a_1 a_2 \dots a_{n-1}] H - [b_1 b_2 \dots b_{n-1}] T \]be \textit{paired} if for each $1 \le i \le n-1$, exactly one of $a_i$ and $b_{n-i}$ is tails and the other is heads. We claim that a pair of configurations remains paired until they reach their end states (all $H$ and all $T$). Proof : Assume that before a certain move these are paired. If $[a_1 a_2 \dots a_{n-1}] H$ has $i$ Heads, then $[b_1 b_2 \dots b_{n-1}]T$ has $n-i$ Heads, since the $a$-part has exactly $i-1$ Heads besides the final $H$, and the $b$-part, being its reversed complement, has $(n-1)-(i-1) = n-i$ Heads. So the move flips Position $i$ of the first configuration and Position $n-i$ of the second, and these are exactly the positions matched by the pairing. Since both are flipped, one of them still shows Heads and the other Tails, so the condition still holds. Thus, if a pair of configurations is paired (each configuration has a unique partner obviously) then they are paired until the end. Thus, the sum of moves taken by the configurations ending with $H$ to reach the all $H$ state is equal to the total number of moves taken by their partners, the configurations ending with $T$, to reach the all $T$ state. But since a trailing $T$ is never touched, this is just $M_{n-1}$. Thus, the sum of steps taken by all configurations ending with $H$ to reach all $H$ is $M_{n-1}$, from which each configuration takes exactly $n$ steps to finish.
Thus, the total number of steps satisfies \[M_{n} = 2M_{n-1} + n\cdot 2^{n-1}.\]With $M_1 = 1$, an easy induction shows that this recursion gives \[M_n = 2^{n-2}n(n+1),\]which makes the average number of steps taken to reach the terminating state over all $2^n$ configurations just \[\frac{2^{n-2}n(n+1)}{2^n} = \boxed{\frac{n^2+n}{4}}\]
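Both the recursion and the closed form for $M_n$ can be confirmed by direct simulation for small $n$; here is a short Python sketch (the helper names `run` and `M` are mine).

```python
from itertools import product

def run(config):
    # Simulate the process: config is a tuple of 0/1 with 1 = H.
    c = list(config)
    steps = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        steps += 1
    return steps

def M(n):
    """Total number of steps, summed over all 2^n configurations of length n."""
    return sum(run(s) for s in product((0, 1), repeat=n))

for n in range(2, 10):
    assert M(n) == 2 * M(n - 1) + n * 2 ** (n - 1)  # the recursion above
    assert 4 * M(n) == 2 ** n * n * (n + 1)         # closed form M_n = 2^(n-2) n(n+1)
```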
24.12.2024 01:03
Phenomenal problem; I think my solution is different from most above and rather unusual. I claim the answer is $\boxed{\frac{n(n+1)}{4}}$. In particular, I will prove that the sum of the number of operations over all configurations is $2^n \cdot \frac{n(n+1)}{4}$, implying the result. \begin{tabular}{|c c c c c c c c|} \hline HHH & HTH & THH & TTH & HHT & HTT & THT & TTT \\ [0.5ex] \hline \hline HHT & HHH & TTH & HTH & HTT & TTT & HHT & . \\ \hline HTT & HHT & HTH & HHH & TTT & . & HTT & .\\ \hline TTT & HTT & HHH & HHT & . & . & TTT & .\\ \hline . & TTT & HHT & HTT & . & . & . & . \\ \hline . & . & HTT & TTT & . & . & . & . \\ \hline . & . & TTT & . & . & . &. & . \\ \hline \end{tabular}Above is the table for $n=3$: each column lists the successive states of its starting configuration. I prove part (a) by induction. The base case is trivial. If the statement is true for $k$, then it is immediate for $k+1$ in the cases where the string ends in a tail: the trailing tail never flips, so we can ignore it, reducing the string length. For the strings that end in heads, the table above suggests a crucial claim.

Claim: For a string ending in heads, the sequence of states until it reaches the all-heads position is identical to that of a string ending in tails, except that each state is reflected across the midpoint of the sequence (not including the last coin) and has its heads and tails inverted.

Proof: First let's understand the claim. Looking at $TTH$ for example, the sequence up to $HHH$ parallels that of $HHT$ up to $TTT$: at every step, the last coin differs and the rest are reflected and inverted. For instance, the leading $TT$ of $TTH$ corresponds to the leading $HH$ of $HHT$: reflecting $TT$ across its midline gives $TT$, and inverting heads and tails gives $HH$. As another example, notice that $TTHH$ gets to $HHHH$ in the same number of flips as $THHT$ gets to $TTTT$. This is because $TTH$ reflected is $HTT$, which when inverted becomes $THH$, as desired.
To prove this claim, suppose there are $x$ heads among the first $n-1$ coins of a sequence that ends in heads, so $x+1$ heads in total, and the move flips the $(x+1)$th coin. In our reflected and inverted sequence, which ends in tails, there are $n-1-x$ heads, so the move flips the $(n-1-x)$th coin. These two positions are mirror images across the midline of the first $n-1$ coins, because $$(x+1) + (n-1-x) = n = (1) + (n-1).$$Therefore, the two sequences remain analogous under our reflection-and-inversion, and thus they take the same number of flips to reach all $H$'s or all $T$'s respectively. $\blacksquare$ Now part (a) is done, because we have shown that every string ending in $H$ reaches all $H$'s in the same number of flips that its partner ending in $T$ takes to reach all $T$'s (and once we reach all $H$'s, the sequence is guaranteed to finish). To do part (b), proceed again by induction: $$\frac{n(n+1)}{4} \cdot 2^n \cdot 2 + 2^n (n+1) = 2^{n+1} \cdot \frac{(n+1)(n+2)}{4}$$is an identity, so we are done (the $n+1$ comes from the $n+1$ moves it takes to get from all $H$'s to the finish).
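The reflect-and-invert pairing can be checked exhaustively for small $n$. The sketch below (the helper names `run` and `steps_to_all_heads` are mine) verifies both that a string ending in $H$ reaches all $H$'s in exactly as many flips as its partner ending in $T$ takes to reach all $T$'s, and that the full process on the former takes $n$ additional flips.

```python
from itertools import product

def run(config):
    # Number of steps until all coins show T (0 = T, 1 = H).
    c = list(config)
    steps = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        steps += 1
    return steps

def steps_to_all_heads(config):
    # Number of steps until all coins show H (used on strings ending in H).
    c = list(config)
    steps = 0
    while (k := sum(c)) < len(c):
        c[k - 1] ^= 1
        steps += 1
    return steps

for n in range(2, 8):
    for s in product((0, 1), repeat=n - 1):
        mate = tuple(1 - x for x in reversed(s))  # reflect, then invert H <-> T
        assert steps_to_all_heads(s + (1,)) == run(mate + (0,))
        assert run(s + (1,)) == run(mate + (0,)) + n  # n extra steps from all-H down to all-T
```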
05.01.2025 23:36
I claim the answer is $\frac{n(n+1)}{4}$, which in particular implies the process terminates. We prove the statement by induction on $n$, with base case $n = 1$ trivial. Hence, assume $n \ge 2$, and let the answer for $n = m$ be $a_m$, so $a_1 = \frac 12$. Claim: If the last coin is $T$, then the process takes an average of $a_{n-1}$ steps. Proof. The last coin never changes: if it could, take the first instance it becomes heads. Then the previous state must have had $n$ heads, impossible while coin $n$ showed tails. Thus, by the inductive hypothesis applied to the first $n-1$ coins, the process takes $a_{n-1}$ steps, on average. $\square$ Claim: If the last coin is $H$, then the process takes an average of $a_{n-1} + n$ steps. Proof. Let the first $n-1$ coins be $s_1$, $s_2$, \dots, $s_{n-1}$. Define coins $t_1$, $t_2$, \dots, $t_{n-1}$ such that \[ s_i \text{ heads } \iff t_{n-i} \text{ tails, for all } i = 1, 2, \dots, n-1 \](in the natural binary string interpretation, $t$ is the reverse of the inverse of $s$). The key fact is that this condition on the $t_i$ is preserved by the operation. Suppose at some point $k$ of the $s_i$ are heads, so the full configuration $s_1 \dots s_{n-1} H$ has $k+1$ heads and we flip $s_{k+1}$ (valid until the $s_i$ are all heads); correspondingly, $n-1-k$ of the $t_i$ are heads, so in the configuration $t_1 \dots t_{n-1} T$ we flip $t_{n-1-k}$. Note that $t_{n-1-k} = t_{n-(k+1)}$ is exactly the coin paired with $s_{k+1}$, and: If $s_{k+1}$ was heads then $t_{n-1-k}$ was tails; after the flips, $s_{k+1}$ is tails and $t_{n-1-k}$ is heads. If $s_{k+1}$ was tails then $t_{n-1-k}$ was heads; after the flips, $s_{k+1}$ is heads and $t_{n-1-k}$ is tails. Either way, the condition still holds. Yet we know the $t_i$ must become all tails in an average of $a_{n-1}$ steps, by the first claim and the inductive hypothesis. Thus, the $s_i$ become all heads in an average of $a_{n-1}$ steps. So after an average of $a_{n-1}$ steps, all $n$ coins are heads. Clearly after $n$ more steps we arrive at all tails. $\square$ We finish by computing \[ a_n = \frac{2^{n-1}}{2^n}\cdot a_{n-1} + \frac{2^{n-1}}{2^n}\cdot (a_{n-1} + n) = a_{n-1} + \frac n2 = \frac{n(n+1)}{4} \]as desired. $\blacksquare$
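Both claims can be confirmed numerically for small $n$: conditioning on the last coin should give averages of exactly $a_{n-1}$ and $a_{n-1} + n$. A brief sketch (the helper names `run` and `avg` are mine):

```python
from fractions import Fraction
from itertools import product

def run(config):
    # Simulate the process on a tuple of 0/1 (1 = H); return the step count.
    c = list(config)
    steps = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        steps += 1
    return steps

def avg(configs):
    """Exact average step count over an iterable of configurations."""
    configs = list(configs)
    return Fraction(sum(run(c) for c in configs), len(configs))

for n in range(2, 9):
    a_prev = avg(product((0, 1), repeat=n - 1))                                # a_{n-1}
    assert avg(s + (0,) for s in product((0, 1), repeat=n - 1)) == a_prev      # last coin T
    assert avg(s + (1,) for s in product((0, 1), repeat=n - 1)) == a_prev + n  # last coin H
```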
15.01.2025 05:43
Beautiful problem! We will first prove that, no matter what, the sequence terminates. Define a \textit{reduction} as a sequence of moves that changes exactly one coin from $H$ facing up to $T$ facing up and leaves all other coins unchanged. We claim that in any situation with $k > 0$ coins showing $H$, we can obtain a reduction. Define the coin at index $i$ to be the $i^{th}$ coin from the left. Assume that the coins with $H$ facing upwards have indices $a_1, a_2, \dots, a_k$, with \[a_1 < a_2 < \dots < a_k\]Obviously, $a_k \geq k$. Now, let $a_j$ be the smallest head index with $a_j \geq k$ (such a $j$ exists since $a_k \geq k$). If $a_j = k$, the single move flips coin $k$ from $H$ to $T$, and we immediately obtain a reduction. Otherwise $a_j > k$, and by minimality of $a_j$ the coins at indices $k, k+1, \dots, a_j - 1$ all show $T$. Consider the following sequence of moves (the subscript represents the index of the coin): \[\underset{k}TT\dots T \underset{a_j}{H} \rightarrow \underset{k}HT\dots T \underset{a_j}{H}\]Now, there are $k + 1$ heads, so we move from \[\underset{k}HT\dots T \underset{a_j}{H} \rightarrow \underset{k}{H}H\dots T \underset{a_j}H\]We keep making moves until we reach \[\underset{k}HH\dots H \underset{a_j}{H}\]which takes $a_j - k$ steps. There are now $k + (a_j - k) = a_j$ heads, so the next move gets us to \[\underset{k}HH\dots H \underset{a_j}{H} \rightarrow \underset{k}HH\dots H\underset{a_j}T\]Once again, we keep making moves, flipping coins $a_j - 1$ down through $k$, until we eventually reach \[\underset{k}TT\dots T \underset{a_j}{T}\]which takes $a_j - k + 1$ moves. Therefore, this sequence of moves is indeed a reduction, since the coin at index $a_j$ is now a $T$ and everything else is unchanged. Therefore, we can keep performing reductions until no heads remain, proving termination. Notice that each reduction takes exactly $2(a_j - k) + 1$ moves when there are $k$ heads. Since the head counts run through $k, k-1, \dots, 1$ and each head is removed exactly once, going from $k$ heads to $0$ heads takes \[2\sum_{i=1}^k a_ i - 2\sum_{i=1}^ki + k = 2\sum_{i=1}^k a_ i - k^2\]steps.
Now, if we sum this over all $\binom nk$ configurations with exactly $k$ heads (each index $i$ appears as a head in $\binom{n-1}{k-1}$ of them), we get \[-k^2 \cdot {n \choose k} + 2\sum_{i=1}^ni\cdot{n - 1 \choose k - 1} = n(n+1){n -1 \choose k-1} - nk{n-1 \choose k-1},\]using the identity $k\binom nk = n\binom{n-1}{k-1}$. Now, summing this over all possible values of $k$, \begin{align*} S &= \sum_{k=0}^n\left[n(n+1){n -1 \choose k-1} - nk{n-1 \choose k-1}\right] \\ &= n(n+1)\sum_{k=0}^n{n-1 \choose k -1} - n \sum_{k=0}^n k {n-1 \choose k - 1} \\ &= n(n+1)\cdot 2^{n-1} - n \cdot (n+1)2^{n-2} \\ &= n(n+1) \cdot 2^{n-2} \end{align*}Therefore, the average number of steps over all $2^n$ sequences is \[\frac{n(n+1) \cdot 2^{n-2}}{2^n} = \boxed{\frac{n(n+1)}{4}}\]and we are done.
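The per-configuration count $2\sum a_i - k^2$ from the reduction argument, as well as the total $S = n(n+1)\cdot 2^{n-2}$, can be verified by simulation for small $n$ (a sketch; the helper name `run` is mine).

```python
from itertools import product

def run(config):
    # Simulate the process on a tuple of 0/1 (1 = H); return the step count.
    c = list(config)
    steps = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        steps += 1
    return steps

for n in range(1, 9):
    total = 0
    for c in product((0, 1), repeat=n):
        heads = [i + 1 for i, x in enumerate(c) if x]  # 1-based indices of the H's
        k = len(heads)
        steps = run(c)
        assert steps == 2 * sum(heads) - k * k  # per-configuration formula
        total += steps
    assert 4 * total == 2 ** n * n * (n + 1)    # S = n(n+1) * 2^(n-2)
```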
15.01.2025 21:23
Solution: $\frac{n(n+1)}{4}$

For the first part, suppose we keep making moves until we must turn a $T$ into an $H$; if this never happens, every move turns an $H$ into a $T$, so the process clearly terminates. After that first move, we create a chain of $H$'s until we reach an $H$, which is turned into a $T$ (say the chain has $k$ $H$'s); we then get a chain of at least $k+1$ $T$'s before reaching a $T$ that is turned into an $H$, after which we get a chain of at least $k+2$ $H$'s before we have to turn again, and so on. The length of the chain is strictly increasing (unless the process terminates), but it must be $\leq n$, so the process must terminate. For the second part, we prove by induction that the average is $av(n)=\frac{n(n+1)}{4}$. We have to prove some claims first, though.

Claim 1: There are $n2^{n-1}$ $T$'s over all $2^n$ configurations. Proof: There are $n2^n$ total letters, and by symmetry half are $T$'s and half are $H$'s, hence the total number of $T$'s is $T_t=n2^{n-1}$.

Claim 2: Suppose we append a final $H$ to a configuration $c$ that takes $k$ steps; then this new configuration takes $k+2t+1$ steps, where $t$ is the number of $T$'s in $c$. Moreover, if we instead append a final $T$, the number of steps doesn't change. Proof: The statement about appending a $T$ is clear, since a trailing $T$ never flips. For the rest we prove, by strong induction on $n$, a stronger result: appending $l$ $H$'s gives $k+2tl+l$ steps. The case $l=1$ is the claim. The base case $n=2$ is trivial. Suppose the result holds for $1,2, \dots ,n$ and consider a configuration of $n+1$ coins taking $k$ steps. If it ends with an $H$, then by the inductive case $l=1$ it takes $2t+1$ more steps than the $n$-coin configuration obtained by deleting that final $H$, which therefore takes $k-2t-1$ steps; appending $l$ $H$'s to our configuration is the same as appending $l+1$ $H$'s to that shorter one, giving $(k-2t-1)+2t(l+1)+(l+1)=k+2tl+l$ steps, as desired. If it ends with a $T$, remove that final $T$ to obtain a configuration that also takes $k$ steps but has only $t-1$ $T$'s. Appending $l$ $H$'s to it gives $k+l+2l(t-1)$ steps by induction. Then reinserting the removed $T$ increases the number of steps by $2$ for each appended $H$, summing to a total of $k+2tl+l$.
The adding of $2$ for each appended $H$ is due to the following. Consider first the configuration without that $T$: whenever the process gets to the position where the $T$ would go, it goes down $a_1$ steps, then up $a_1+1$ steps, then down $a_2$ and up $a_2 +1$, \dots, for all $l$ appended $H$'s, summing in that part to $2\sum_{i=1}^la_i+l$. In the other configuration (with the $T$), the following happens. As the $T$ doesn't affect the number of $H$'s, everything proceeds the same way until the process reaches that $T$, which turns into an $H$; after that, the process goes down $a_1+1$ steps and up $(a_1+1)+1=a_1+2$, again down $a_2+1$ and up $a_2+2$, summing in that part to $2\sum_{i=1}^la_i+3l$, which is $2l$ more, as desired.

Now, we finish by induction. Suppose $av(n)=\frac{n(n+1)}{4}$. Appending a final $H$ and a final $T$ to each of the $2^n$ configurations produces all $2^{n+1}$ configurations of length $n+1$, so by Claims 1 and 2, $av(n+1)=av(n)+\frac{2T_t+2^n}{2^{n+1}}=\frac{n(n+1)}{4}+\frac{n+1}{2}=\frac{(n+1)(n+2)}{4}$, as desired.
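Claim 2, and the stronger statement about appending $l$ $H$'s, can be checked numerically for small configurations. A minimal Python sketch (the helper name `run` is mine):

```python
from itertools import product

def run(config):
    # Simulate the process on a tuple of 0/1 (1 = H); return the step count.
    c = list(config)
    steps = 0
    while (k := sum(c)) > 0:
        c[k - 1] ^= 1
        steps += 1
    return steps

for n in range(1, 8):
    for c in product((0, 1), repeat=n):
        t = n - sum(c)   # number of T's in c
        k = run(c)
        assert run(c + (0,)) == k              # appending a T changes nothing
        assert run(c + (1,)) == k + 2 * t + 1  # appending an H adds 2t + 1 steps
        for l in (2, 3):                       # the stronger claim for l appended H's
            assert run(c + (1,) * l) == k + 2 * t * l + l
```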