There are 94 safes and 94 keys. Each key can open only one safe, and each safe can be opened by only one key. One key is placed at random into each safe. Then 92 of the safes are chosen at random and locked. What is the probability that we can open all the safes, starting from the two keys inside the two remaining (unlocked) safes? (Once a safe is opened, the key inside it can be used to open another safe.)
Problem
Source: Taiwan NMO 2006
Tags: probability, expected value, combinatorics proposed, combinatorics, Random walk, random walks
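For a quick numerical sanity check before reading the replies, here is a minimal Monte Carlo sketch of the process just described (Python; the helper names are my own). The thread below arrives at the exact value $\frac{2}{94} \approx 0.0213$.

```python
import random

def can_open_all(key_in_safe, open_safes):
    """key_in_safe[s] is the key stored in safe s; key i opens safe i.
    Return True if every safe can eventually be opened."""
    opened, keys = set(open_safes), [key_in_safe[s] for s in open_safes]
    while keys:
        key = keys.pop()
        if key not in opened:
            opened.add(key)
            keys.append(key_in_safe[key])
    return len(opened) == len(key_in_safe)

def estimate(n=94, k=2, trials=100_000):
    wins = 0
    for _ in range(trials):
        placement = list(range(n))
        random.shuffle(placement)                     # random key placement
        if can_open_all(placement, random.sample(range(n), k)):
            wins += 1
    return wins / trials

print(estimate())   # should hover around 2/94 ≈ 0.0213
```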
23.03.2006 22:16
It will be the number of permutations of $\{1,2,\dots,n\}$ without a fixed point (taking the general case), $D_n=n!\left(1-\frac{1}{1!}+\frac{1}{2!}-\dots+(-1)^n\frac{1}{n!}\right)$, plus $2\cdot D_{n-1}$, the number of permutations whose only fixed point is one of the two safes whose keys we already have. Sum these and divide by $n!$.
24.03.2006 03:10
Consider the case when the keys inside the two safes are for each other. In this case there are no fixed points, yet no other safes can be opened. The only solution I've come up with uses a recurrence...
24.03.2006 18:44
Yes, what I wrote above is wrong. Another idea: we need the permutation to have no more than two cycles, and we must hold one key from one of them and one from the other. If we have $n$ numbers to form a full cycle from, there are $\frac{(n-1)!}{2}$ possibilities (I hope this count is correct). If there is only one cycle, the number of possibilities is $\frac{(n-1)!}{2}$, and if there are two, then $\displaystyle \sum_{i=0}^{n-2}\binom{n-2}{i}\cdot \frac{i!}{2}\cdot \frac{(n-2-i)!}{2} =\frac{n!}{4}\sum_{i=0}^{n-2}\frac{1}{(n-i)(n-i-1)}=\frac{n!}{4}\left(1-\frac{1}{n}\right)=\frac{(n-1)!}{4}$. The total number is $\frac{(n-1)!}{2}+\frac{(n-1)!}{4}=\frac{3(n-1)!}{4}$.
25.03.2006 11:33
I agree with the general principle, but there are $(n-1)!$ cycles, because they are directed (key 1 in safe 2 is a different event from key 2 in safe 1). Also there's a computation error: \[ \sum_{i=0}^{n-2} \binom{n-2}i \cdot i!\cdot (n-2-i)! = (n-1) \cdot (n-2)!=(n-1)!. \] In particular, the probability of success comes out as \[ \frac{2(n-1)!}{n!}=\frac{2}{n}, \] which calls for a much simpler explanation.
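The corrected sum is easy to verify numerically for small $n$ (a minimal sketch using Python's math module):

```python
from math import comb, factorial

# Check sum_{i=0}^{n-2} C(n-2, i) * i! * (n-2-i)! == (n-1)! for small n.
for n in range(2, 12):
    lhs = sum(comb(n - 2, i) * factorial(i) * factorial(n - 2 - i) for i in range(n - 1))
    assert lhs == factorial(n - 1), (n, lhs)
print("identity verified for n = 2..11")
```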
09.05.2006 13:31
I think the lengths of the cycles don't matter, because if one key belongs to an already-open safe and the other can open all the remaining safes, we are done. A GOOD configuration is one in which the permutation has only one cycle, or has exactly two cycles with the two chosen numbers in different cycles. It is easy to see that in this case we can open all the safes. Conversely, if there are more than two cycles, there is a cycle from which we cannot open any safe. The same thing holds if there are two cycles and both chosen numbers are in the same cycle. The total number of cases is $94!\binom{94}{92}$. The number of GOOD cases is $94 \binom{94}{2}$ (when there is only one cycle: 94 ways to define the cycle and $\binom{94}{2}$ ways to choose the keys) plus $\sum_{i=1}^{94} i \binom{i}{1}\binom{94-i}{1} \binom{94}{i}$ ($\binom{94}{i}$ ways to choose the elements of the cycle, $i$ ways to define the cycle, and $\binom{i}{1}\binom{94-i}{1}$ ways to choose the keys). The calculations are simple now. Hope I didn't make any counting mistakes.
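The cycle characterization of GOOD configurations can be brute-force checked for small $n$ (a sketch, with my own helper names; the opening process is re-simulated inline):

```python
from itertools import permutations, combinations

def can_open_all(key_in_safe, open_safes):
    """key_in_safe[s] is the key stored in safe s; key i opens safe i."""
    opened, keys = set(open_safes), [key_in_safe[s] for s in open_safes]
    while keys:
        key = keys.pop()
        if key not in opened:
            opened.add(key)
            keys.append(key_in_safe[key])
    return len(opened) == len(key_in_safe)

def cycle_of(perm, start):
    """The cycle of the permutation s -> perm[s] that contains `start`."""
    cyc, x = {start}, perm[start]
    while x != start:
        cyc.add(x)
        x = perm[x]
    return frozenset(cyc)

n = 6
for perm in permutations(range(n)):
    cycles = {cycle_of(perm, s) for s in range(n)}
    for a, b in combinations(range(n), 2):
        good = len(cycles) == 1 or (len(cycles) == 2 and cycle_of(perm, a) != cycle_of(perm, b))
        assert good == can_open_all(perm, (a, b))
print("characterization verified for n =", n)
```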
30.11.2006 12:47
If there are $n$ keys and at the beginning $k$ safes are opened at random, then the answer is $\frac{k}{n}$. I have a hypothesis that if one replaces the $n$ keys by bundles of keys (say, 3 bundles placed at random, with the other $n-3$ safes empty), then the answer is again $\frac{k}{n}$. Any ideas for a proof? Or a counterexample?
01.12.2006 06:26
kunterbunt wrote: In particular, the probability of success comes out as \[\frac{2(n-1)!}{n!}=\frac{2}{n}, \] which calls for a much simpler explanation.

You are right, there is a simple explanation, even for the general case of $k$ open safes. Construct a graph with the safes as nodes and a directed edge for each key. Each node has exactly one in-edge and one out-edge. Merge the nodes corresponding to the $k$ open safes into a single node $0$. Each solution maps to exactly one complete traversal of the graph starting from node $0$. Furthermore, exactly $k!$ solutions map to each traversal: each $0$ in the traversal except for the starting node corresponds to a distinct original node (since each node can be reached only once in the original graph), giving exactly $k!$ ways to assign the original nodes to the $k$ occurrences of $0$ in the traversal. Now consider the number of such traversals: it is simply $\binom{n-1}{k-1}(n-k)!=\frac{(n-1)!}{(k-1)!}$. So the original number of solutions is $\frac{(n-1)!}{(k-1)!}\times k!=k(n-1)!$, giving probability $\frac{k}{n}$.
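For what it's worth, the count $k\,(n-1)!$ (and hence the probability $\frac{k}{n}$) can be confirmed by exhaustive enumeration for small $n$ (a sketch; by symmetry the set of open safes is fixed to $\{0,\dots,k-1\}$):

```python
from itertools import permutations
from math import factorial

def can_open_all(key_in_safe, open_safes):
    """key_in_safe[s] is the key stored in safe s; key i opens safe i."""
    opened, keys = set(open_safes), [key_in_safe[s] for s in open_safes]
    while keys:
        key = keys.pop()
        if key not in opened:
            opened.add(key)
            keys.append(key_in_safe[key])
    return len(opened) == len(key_in_safe)

for n in range(2, 8):
    for k in range(1, n + 1):
        open_safes = tuple(range(k))   # which k safes are open does not matter, by symmetry
        wins = sum(can_open_all(perm, open_safes) for perm in permutations(range(n)))
        assert wins == k * factorial(n - 1)   # probability = wins / n! = k / n
print("count k*(n-1)! verified for all 2 <= n <= 7, 1 <= k <= n")
```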
01.12.2006 14:28
Additional question: What is the expected proportion of open safes? Maybe something like $\frac{k(n+1)}{n(k+1)}$, but I'm not sure.
01.12.2006 16:39
roah wrote: Additional question: What is the expected proportion of open safes? Maybe something like $\frac{k(n+1)}{n(k+1)}$, but I'm not sure. Can you post your proof, or any ideas on how to go about it? It would help us a lot...
02.12.2006 10:28
I don't like my proof, but here it is. Let $e(k,n)$ be the expected number of open safes. The first safe we open may, with probability $\frac{k}{n}$, give no useful key, and with probability $\frac{n-k}{n}$, give one new useful key. So we have \[ e(k,n)=1+\frac{k}{n}e(k-1,n-1)+\frac{n-k}{n}e(k,n-1), \] with boundary conditions $e(n,n)=n$, $e(0,n)=0$. Just a guess (based on experiments with small $k$, $n$): $e(k,n)=\frac{k(n+1)}{k+1}$; the solution of the recurrence is unique. By the way, for the probability the equation is \[ p(k,n)=\frac{k}{n}p(k-1,n-1)+\frac{n-k}{n}p(k,n-1), \] with $p(n,n)=1$, $p(0,n)=0$, and one may guess $p(k,n)=\frac{k}{n}$.
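Both recurrences are easy to tabulate with exact arithmetic and to compare against the guessed closed forms (a minimal sketch):

```python
from fractions import Fraction

N = 12
p, e = {}, {}   # p[k, n]: probability of opening all safes; e[k, n]: expected number opened
for n in range(1, N + 1):
    p[0, n], e[0, n] = Fraction(0), Fraction(0)
    p[n, n], e[n, n] = Fraction(1), Fraction(n)
    for k in range(1, n):
        p[k, n] = Fraction(k, n) * p[k - 1, n - 1] + Fraction(n - k, n) * p[k, n - 1]
        e[k, n] = 1 + Fraction(k, n) * e[k - 1, n - 1] + Fraction(n - k, n) * e[k, n - 1]

for n in range(1, N + 1):
    for k in range(1, n + 1):
        assert p[k, n] == Fraction(k, n)                  # guess p(k,n) = k/n
        assert e[k, n] == Fraction(k * (n + 1), k + 1)    # guess e(k,n) = k(n+1)/(k+1)
print("both guesses satisfy the recurrences up to n =", N)
```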
08.11.2008 12:17
I'd like to reopen this thread because now I know the solution "from the Book". There are $n$ safes and $k$ of them are left open. We use these $k$ keys to open new safes, to get new keys, and so on... When does a key not give a new key? When it opens the safe in which there is a key from the first $k$ safes. Let's call such a key "useless". The probability of opening all the safes is the probability that the key used to open the last safe was "useless". This is exactly the probability that there is a key from the first $k$ safes in the last safe, so it is $\frac{k}{n}$. No graphs, no recursion... and almost two years to find.
14.11.2008 03:41
Erm... Sounds great, but what exactly is "the last safe"? It is not a fixed safe, as far as I understand, so how can we be sure that the probabilities are equal? Am I missing something, as usual?
14.11.2008 18:00
The probabilities are exactly the same as if, instead of having the keys in safes, we drew $k$ keys out of a bag containing all the keys and got to draw a new key out of the bag every time we open a safe. This in turn is the same as if we randomly ordered all $n$ keys at the start and always just took the first remaining key in our ordering whenever we open a new safe. In these terms, we win if the safe corresponding to the last key in our ordering is one of the $k$ safes that are already open.
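In this "ordered keys" formulation, the win condition can even be checked ordering by ordering for small $n$: simulating the draw-a-key-per-opened-safe process and testing whether the last key in the ordering belongs to one of the $k$ open safes give identical answers (a sketch; the names are mine).

```python
from itertools import permutations

def bag_process_opens_all(order, open_safes):
    """Keys are revealed in the fixed order `order`, one new key per opened safe;
    key s opens safe s.  Return True if every safe ends up open."""
    opened = set(open_safes)
    draws, processed = len(open_safes), 0    # the k open safes each yield one key
    while processed < draws:
        key = order[processed]
        processed += 1
        if key not in opened:                # useful key: opens a new safe, yields a new draw
            opened.add(key)
            draws += 1
    return len(opened) == len(order)

n, k = 7, 3
open_safes = set(range(k))
for order in permutations(range(n)):
    assert bag_process_opens_all(order, open_safes) == (order[-1] in open_safes)
print("per-ordering equivalence verified for n, k =", n, k)
```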
15.11.2008 12:32
kevinatcausa's solution is awesome! But mine is a little bit different: I am indeed considering a random "last safe". The exact definition: let $\tau$ be a stopping time, the time when we cannot open any more safes (all safes are open or we have no more keys). Then $T = \tau\wedge (n - 1)$ is also a stopping time, and the "last safe" is the safe not opened at time $T$ with the minimal possible number. Let $M_{t}$ be the proportion of unopened safes at time $t$ containing keys from the first $k$ safes. Then $M_{t}$ is a martingale and $T$ is a bounded stopping time, so $E(M_{T}) = E(M_{0}) = \frac{k}{n}$ by the optional stopping theorem. And $E(M_{T})$ is exactly the probability we have been looking for (the probability of opening all safes is exactly the probability that the last safe contains a key from the first $k$ safes). This fixes the gaps in the intuitive post about the "last safe".
24.12.2021 13:52
There are several nice ways to think of this result in terms of probabilities. Here is a brilliant, simple, purely combinatorial solution to the problem by my friend Drago.