We are given the finite sets $ X$, $ A_1$, $ A_2$, $ \dots$, $ A_{n - 1}$ and the functions $ f_i: \ X\rightarrow A_i$. A vector $ (x_1,x_2,\dots,x_n)\in X^n$ is called nice, if $ f_i(x_i) = f_i(x_{i + 1})$, for each $ i = 1,2,\dots,n - 1$. Prove that the number of nice vectors is at least \[ \frac {|X|^n}{\prod\limits_{i = 1}^{n - 1} |A_i|}. \]
Problem
Tags: function, vector, probability, inequalities, linear algebra, matrix, induction
15.04.2008 15:12
First of all, there are $ |X|^n$ ways to choose $ n$ values (not necessarily distinct) from the set $ X$. Since $ f_i: X\rightarrow A_i$, the value of $ f_i(x_i)$ can be chosen in $ |A_i|$ ways for each $ i=1,2,\dots,n-1$, and similarly the value of $ f_i(x_{i+1})$ can be chosen in $ |A_i|$ ways, giving $ |A_i|^2$ possibilities for the pair of values $ f_i(x_i)$ and $ f_i(x_{i+1})$. But since a nice vector requires $ f_i(x_i)=f_i(x_{i+1})$ for each $ i=1,2,\dots,n-1$, only $ |A_i|$ of these possibilities remain for each $ i$. So the number of nice vectors is at least $ |X|^n\cdot\frac{|A_1|}{|A_1|^2}\cdot\frac{|A_2|}{|A_2|^2}\cdots\frac{|A_{n-1}|}{|A_{n-1}|^2}=\frac {|X|^n}{\prod\limits_{i = 1}^{n - 1} |A_i|}.$
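As a quick sanity check of the bound (not part of the argument above), here is a brute-force count on a small made-up instance; the sets $ X$, $ A_1$, $ A_2$ and the functions below are illustrative assumptions, not data from the problem.

```python
# Brute-force check: count nice vectors and compare with |X|^n / (|A_1||A_2|).
# The instance below is a small arbitrary example chosen for illustration.
from itertools import product

X = [0, 1, 2, 3]                          # |X| = 4
f = [
    {0: 'a', 1: 'a', 2: 'b', 3: 'b'},     # f_1 : X -> A_1 = {a, b},  |A_1| = 2
    {0: 'p', 1: 'q', 2: 'q', 3: 'r'},     # f_2 : X -> A_2 = {p, q, r}, |A_2| = 3
]
n = 3                                     # vectors (x_1, x_2, x_3)

def is_nice(v):
    # nice: f_i(x_i) = f_i(x_{i+1}) for each i = 1, ..., n-1
    return all(f[i][v[i]] == f[i][v[i + 1]] for i in range(n - 1))

nice = sum(1 for v in product(X, repeat=n) if is_nice(v))
bound = len(X) ** n / (2 * 3)             # |X|^n / (|A_1| * |A_2|)
print(nice, bound, nice >= bound)
```

On this instance the brute-force count is 12, comfortably above the bound $ 4^3/6\approx 10.67$.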
18.04.2008 02:49
We use a probabilistic method, so $ P(e(x): x\in S)$ denotes the probability that an event $ e$ dependent on some variable $ x$ happens as $ x$ varies uniformly at random (which I'll call 'uniformly') over a set $ S$ (though if the dependencies are obvious we omit them). As is conventional, $ P(e|d)$ denotes the probability that $ e$ happens given that $ d$ happens. We aim to show that \[ P(\textbf{x}\text{ is nice}: \textbf{x} \in X^n ) \geq \frac {1}{\prod_{i = 1}^{n - 1} |A_i|}. \]

First, we adjust our functions a little. We replace $ A_i$ with $ f_i(X)$ to force our functions to be surjective. We can then define a new set of functions $ g_i : A_i \rightarrow \mathbb{N}$ and sets $ G_i(a)$ by \[ G_i(a) = \{x \in X \mid f_i(x) = a\}, \qquad g_i(a) = |G_i(a)|. \] Every $ x\in X$ is contained in precisely one $ G$-set, so $ \sum_a g_i(a) = |X|$, a fact we shall need later.

Lemma: for $ i \neq j$, the events $ f_i(x_i) = f_i(x_{i + 1})$ and $ f_j(x_j) = f_j(x_{j + 1})$ (with $ \textbf{x} \in X^n$) are independent.

If $ |i - j| > 1$ the events are literally independent, in that the variables each depends on are selected entirely independently of one another; it follows that we need only consider the case $ j = i + 1$. We consider the probability of the event $ e_1: f_i(x_i) = f_i(x_{i + 1})$. Suppose we know that the event $ e_2: f_{i + 1}(x_{i + 1}) = f_{i + 1}(x_{i + 2})$ happens (and it does with positive probability). Consider $ P(x_{i + 1} = y\mid e_2 : x_{i + 2} \in X)$ for some fixed $ y \in X$. If this probability is constant over all $ y$, we are done: then $ x_{i + 1}$ continues to vary uniformly, and so $ P(e_1) = P(e_1|e_2)$, because the only common value $ e_1$ and $ e_2$ depend on is $ x_{i + 1}$, which would vary uniformly if we did not know the outcome of $ e_2$.
Now, $ P(x_{i + 1} = y\mid e_2: x_{i + 2} \in X) = \frac {P(f_{i + 1}(y) = f_{i + 1}(x_{i + 2}): x_{i + 2} \in X)}{g_{i + 1}(f_{i + 1}(y))}$: if $ e_2$ is satisfied then this event must certainly happen, but the probability that it happens with $ x_{i + 1} = y$ is only $ 1/g_{i + 1}(f_{i + 1}(y))$ times the probability that it happens overall, because $ x_{i + 1}$ is chosen uniformly at random and is just as likely to be any of the $ g_{i + 1}(f_{i + 1}(y)) - 1$ alternatives. This becomes \[ \frac {P(y \in G_{i + 1}(f_{i + 1}(x_{i + 2})): x_{i + 2} \in X)}{g_{i + 1}(f_{i + 1}(y))} = \frac {g_{i + 1}(f_{i + 1}(y))}{|X|\,g_{i + 1}(f_{i + 1}(y))} = \frac {1}{|X|}, \] because as $ x_{i + 2}$ varies uniformly it lands in the same $ G$-set as $ y$ precisely $ g_{i + 1}(f_{i + 1}(y))$ times, by definition of $ g$. So we happily see that $ x_{i + 1}$ still varies uniformly over $ X$ even if we know $ e_2$ happens, so the events are independent and the lemma is proven.

We next turn to directly estimating the probabilities $ P(f_i(x_i) = f_i(x_{i + 1}) : x_i, x_{i + 1} \in X)$, now knowing that we can simply take their product to obtain the probability that such an event holds for all $ i$. This proves much less fiddly than the lemma. We have $ P(f_i(x_i) = f_i(x_{i + 1})) = P(\exists a\in A_i \text{ s.t. } x_i, x_{i + 1} \in G_i(a))$. Now if we fix $ a \in A_i$, then as $ x$ varies over all of $ X$ it is found in $ G_i(a)$ precisely $ g_i(a)$ times; as $ x_i, x_{i + 1}$ vary, they are both found in $ G_i(a)$ for precisely $ (g_i(a))^2$ of the $ |X|^2$ pairs. The probability therefore evaluates as shown, which we then estimate using Chebyshev's (non-probabilistic) sum inequality.
\[ P(f_i(x_i) = f_i(x_{i + 1})) = \frac {\sum_{a \in A_i}(g_i(a))^2}{|X|^2} \geq \frac {(\sum_{a\in A_i} g_i(a))^2}{|A_i||X|^2} = \frac {|X|^2}{|A_i||X|^2} = \frac {1}{|A_i|} \]

We seek the probability $ P(\textbf{x}\text{ is nice}: \textbf{x} \in X^n )$, i.e. that $ f_i(x_i) = f_i(x_{i + 1})$ for all $ i$. Since these events are independent by the lemma, we may simply take their product to obtain the final probability: \[ P(\textbf{x}\text{ is nice}: \textbf{x} \in X^n ) = \prod_{i = 1}^{n - 1} P(f_i(x_i) = f_i(x_{i + 1})) \geq \prod_{i = 1}^{n - 1} \frac {1}{|A_i|}. \] This is the required result; multiplying by the $ |X|^n$ equally likely choices of vector gives the bound stated in the problem.
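The single-step probability formula and the estimate above can be checked numerically; the sketch below uses a small made-up function $ f_i$ (an illustrative assumption) and exact rational arithmetic.

```python
# Check that P(f_i(x) = f_i(y)) for uniform x, y equals sum_a g_i(a)^2 / |X|^2
# and is at least 1/|A_i|.  The function f_i below is a made-up example.
from itertools import product
from fractions import Fraction

X = [0, 1, 2, 3, 4]
f_i = {0: 'a', 1: 'a', 2: 'a', 3: 'b', 4: 'c'}   # fibers of sizes 3, 1, 1
A_i = set(f_i.values())

# exact probability by enumeration over all ordered pairs (x, y)
hits = sum(1 for x, y in product(X, repeat=2) if f_i[x] == f_i[y])
p = Fraction(hits, len(X) ** 2)

# the formula: sum of squared fiber sizes g_i(a) over |X|^2
g = {a: sum(1 for x in X if f_i[x] == a) for a in A_i}
formula = Fraction(sum(v * v for v in g.values()), len(X) ** 2)

print(p, formula, p >= Fraction(1, len(A_i)))
```

Here both the enumeration and the formula give $ 11/25$, which exceeds $ 1/|A_i| = 1/3$.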
22.04.2008 15:59
Draft. Let's analyze the case $ n = 2$. Let $ A_1 = \{b_1,\ldots,b_t\}$ for some $ t$ and let $ |X| = k$. Denote by $ a_1,\ldots,a_t$ the number of $ x\in X$ for which $ f_1(x) = b_1,\ldots,b_t$ respectively; clearly $ \sum_{i = 1}^t a_i = k$. It is straightforward to see that the number of nice vectors is $ \sum_{i = 1}^t a_i^2$, hence the inequality to prove becomes $ \sum_{i = 1}^ta_i^2\ge\frac {\left(\sum_{i = 1}^ta_i\right)^2}t = \frac {k^2}t$, which follows directly from the Cauchy–Schwarz inequality.

Call a vector $ (x_1,\ldots,x_m)\in X^m$, where $ 1\le m\le n - 1$, a nice short vector of length $ m$ if $ f_i(x_i) = f_i(x_{i + 1})$ for all $ i = 1,2,\ldots,m - 1$. Let $ X = \{y_1,\ldots,y_k\}$, and assume there are $ t_1,\ldots,t_k$ short nice vectors of length $ m < n - 1$ ending in $ y_1,\ldots,y_k$ respectively; more generally, for $ y\in X$, let $ t_y$ denote the number of short nice vectors of length $ m$ ending in $ y$. Assume $ |A_m| = s$ and let $ X = X_1\cup X_2\cup\ldots\cup X_s$, with $ X_i\cap X_j = \emptyset$ for $ i\neq j$, be the partition of $ X$ satisfying: for $ a,b\in X$, $ f_m(a) = f_m(b)$ if and only if $ a,b\in X_i$ for some $ i\in\{1,2,\ldots,s\}$. Then a short nice vector of length $ m$ ending in $ x\in X_i$ is the root of exactly $ |X_i|$ short nice vectors of length $ m + 1$. Denote $ b_i = |X_i|$.

Let $ S_m = \begin{bmatrix}t_1 & t_2 & \ldots & t_k \\ t_1 & t_2 & \ldots & t_k \\ \ldots \\ t_1 & t_2 & \ldots & t_k \end{bmatrix}$, and let $ U_m = (u_{ij})$ be the $ k\times k$ matrix with entries $ u_{ij} = 1$ if $ y_i$ and $ y_j$ belong to the same $ X_l$ for some $ 1\le l\le s$, and $ u_{ij} = 0$ otherwise. In words, $ u_{ij} = 1$ if a short nice vector ending in $ y_i$ can be extended to one of length $ m + 1$ by appending $ y_j$.
It is easy to see that $ S_mU_m$ is a matrix $ S_{m + 1}$ of the same form as $ S_m$: $ S_{m + 1} = \begin{bmatrix}T_1 & T_2 & \ldots & T_k \\ T_1 & T_2 & \ldots & T_k \\ \ldots \\ T_1 & T_2 & \ldots & T_k\end{bmatrix}$. Moreover, by the arguments above, the number of nice vectors of length $ m + 1$ is $ \mathrm{tr}(S_{m+1})=T_1 + \ldots + T_k$: indeed $ T_i=\sum_{j=1}^k t_ju_{ij}$, and $ u_{ij}=1$ if and only if a short nice vector of length $ m$ ending in $ y_j$ can be extended by appending $ y_i$. Hence $ S_{m + 1} = S_mU_m$ for each $ m = 1,2,\ldots,n - 1$. We easily see that $ S_1 = \begin{bmatrix}1 & 1 & \ldots & 1 \\ 1 & 1 & \ldots & 1 \\ \ldots \\ 1 & 1 & \ldots & 1\end{bmatrix}$ (there is exactly one short nice vector of length 1 ending in each $ y_i$). Hence $ S_n = S_1U_1\ldots U_{n - 1}$, matrix multiplication being associative; all matrices involved in the product are square of dimension $ k = |X|$. Note now that the number of nice vectors is $ \mathrm{tr}(S_n)=g_1+g_2+\ldots+g_k$, where $ S_n=\begin{bmatrix}g_1 & g_2 & \ldots & g_k \\ g_1 & g_2 & \ldots & g_k \\ \ldots \\ g_1 & g_2 & \ldots & g_k\end{bmatrix}$. Now, $ \mathrm{tr}$ is invariant under cyclic permutations of a product, hence $ \mathrm{tr}(S_n)=\mathrm{tr}(S_1U_1\ldots U_{n-1})=\mathrm{tr}(U_jU_{j+1}\ldots U_{n-1}S_1U_1\ldots U_{j-1})$ for any $ j$. Our task is to show that $ \mathrm{tr}(S_1U_1U_2\ldots U_{n-1})\ge\displaystyle\frac{k^n}{|A_1|\cdot\ldots\cdot |A_{n-1}|}$. Now some induction with a suitable arrangement of the $ U_i$'s should work.
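The matrix recursion is easy to test numerically. The sketch below builds $ S_1$ and the $ U_m$ for a small made-up instance (the sets and functions are illustrative assumptions) and checks that $ \mathrm{tr}(S_1U_1\cdots U_{n-1})$ agrees with a brute-force count of the nice vectors.

```python
# Check tr(S_1 U_1 ... U_{n-1}) against a brute-force count of nice vectors.
# The instance (X, f_1, f_2) is a made-up example for illustration.
from itertools import product

X = [0, 1, 2]                      # k = |X| = 3
fs = [
    {0: 'a', 1: 'a', 2: 'b'},      # f_1
    {0: 'p', 1: 'q', 2: 'q'},      # f_2
]
n = 3
k = len(X)

def matmul(A, B):
    return [[sum(A[i][l] * B[l][j] for l in range(k)) for j in range(k)]
            for i in range(k)]

# U_m[i][j] = 1  iff  f_m(y_i) = f_m(y_j), i.e. appending y_j extends a
# short nice vector ending in y_i
Us = [[[1 if f[X[i]] == f[X[j]] else 0 for j in range(k)] for i in range(k)]
      for f in fs]

S = [[1] * k for _ in range(k)]    # S_1: one length-1 vector ends in each y_i
for U in Us:
    S = matmul(S, U)               # S_{m+1} = S_m U_m

trace = sum(S[i][i] for i in range(k))

# brute force for comparison
nice = sum(1 for v in product(X, repeat=n)
           if all(fs[i][v[i]] == fs[i][v[i + 1]] for i in range(n - 1)))
print(trace, nice)
```

On this instance both the trace and the brute-force count come out to 8.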
25.04.2008 09:30
Let $ X = \{y_i : i = 1,\ldots,|X|\}$ and, for $ j = 1$ to $ n - 1$, let $ f_j: X\to A_j$ with $ A_j = \{a_{j,i} : i = 1,\ldots,|A_j|\}$. "Good $ m$-vector" means a good (nice) vector with $ m$ coordinates. We define $ e_{j,i}$ = the number of elements $ x$ of $ X$ that satisfy $ f_j(x) = a_{j,i}$. We also define an analogous function $ g_j$ from the good $ j$-vectors to $ A_j$ by $ g_j(x_1,x_2,\ldots,x_{j-1},x_j) = f_j(x_j)$, and let $ m_{j,i}$ = the number of good $ j$-vectors $ w$ that satisfy $ g_j(w) = a_{j,i}$, for all $ j$.

_________________________________________________________________________________

For $ n = 2$: among the vectors $ (x_1,x_2)\in X^2$, the number with $ f_1(x_1) = f_1(x_2) = a_{1,i}$ is equal to $ e_{1,i}^2$, therefore the number of vectors with $ f_1(x_1) = f_1(x_2)$ is $ \sum_{i = 1}^{|A_1|}e_{1,i}^2$. By Chebyshev's inequality this is at least $ \frac{(\sum_{i = 1}^{|A_1|}e_{1,i})^2}{|A_1|}$, and since $ \sum_{i = 1}^{|A_1|}e_{1,i}$ is the number of elements in the domain, i.e. $ |X|$, the number of such vectors is $ \ge\frac{|X|^2}{|A_1|}$. Therefore the conjecture is true for $ n = 2$.

________________________________________________________________________________

Assume that the conjecture is true for $ n = k$, i.e.:

the number of vectors $ (x_1,\ldots,x_k)\in X^k$ satisfying $ f_i(x_i) = f_i(x_{i+1})$ for $ i = 1$ to $ k - 1$ is at least $ \frac{|X|^k}{\prod_{i=1}^{k-1}|A_i|}$.

________________________________________________________________________________

To prove the conjecture for $ n = k + 1$: a good $ (k+1)$-vector is a pair $ ($good $ k$-vector$ ,x_{k+1})$ with $ g_k($good $ k$-vector$ ) = f_k(x_{k+1}) = a_{k,i}$ for some $ i$, and for each $ i = 1$ to $ |A_k|$ the number of such pairs is $ m_{k,i}\cdot e_{k,i}$. Therefore the number of good $ (k+1)$-vectors is $ \sum_{i = 1}^{|A_k|}m_{k,i}\cdot e_{k,i}$. Observe that $ e_{k,i}$ and $ m_{k,i}$ are similarly ordered sequences, therefore $ \sum_{i = 1}^{|A_k|}m_{k,i}\cdot e_{k,i}\ge\frac{(\sum_{i = 1}^{|A_k|}m_{k,i})(\sum_{i = 1}^{|A_k|}e_{k,i})}{|A_k|}$ by Chebyshev's inequality. Now observe that $ \sum_{i = 1}^{|A_k|}m_{k,i}$ is the number of good $ k$-vectors, which is $ \ge\frac{|X|^k}{\prod_{i=1}^{k-1}|A_i|}$ by our assumption, and that $ \sum_{i = 1}^{|A_k|}e_{k,i}$ = the number of elements in $ X$ = $ |X|$. Hence \[ \sum_{i = 1}^{|A_k|}m_{k,i}\cdot e_{k,i}\ \ge\ \frac{\left(\sum_{i = 1}^{|A_k|}m_{k,i}\right)\left(\sum_{i = 1}^{|A_k|}e_{k,i}\right)}{|A_k|}\ \ge\ \frac{|X|\cdot|X|^k}{|A_k|\cdot\prod_{i=1}^{k-1}|A_i|} = \frac{|X|^{k+1}}{\prod_{i=1}^{k}|A_i|}. \]

______________________________________________________________________________________

Hence by induction the number of good vectors is at least $ \frac{|X|^n}{\prod_{i=1}^{n-1}|A_i|}$ for all integers $ n\ge 2$.
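The key counting identity of the induction step, that the good vectors one step longer number exactly $ \sum_i m_i e_i$ over the fibers of the last function involved, can be verified by brute force; the instance below (sets and functions) is a made-up example for illustration.

```python
# Check: #(good (m+1)-vectors) = sum over a in A_m of
#   (#good m-vectors whose last entry lies in the f_m-fiber of a) * (fiber size).
# The instance (X, f_1, f_2) is a made-up example.
from itertools import product

X = [0, 1, 2, 3]
fs = [
    {0: 'a', 1: 'b', 2: 'b', 3: 'b'},   # f_1
    {0: 'p', 1: 'p', 2: 'q', 3: 'q'},   # f_2
]

def good(m):
    """All good m-vectors: f_i(x_i) = f_i(x_{i+1}) for i = 1, ..., m-1."""
    return [v for v in product(X, repeat=m)
            if all(fs[i][v[i]] == fs[i][v[i + 1]] for i in range(m - 1))]

m = 2                                    # extend good 2-vectors by x_3 via f_2
f = fs[m - 1]
A = set(f.values())
e = {a: sum(1 for x in X if f[x] == a) for a in A}              # fiber sizes
mm = {a: sum(1 for v in good(m) if f[v[-1]] == a) for a in A}   # m_{2,i}

lhs = len(good(m + 1))                   # direct count of good 3-vectors
rhs = sum(mm[a] * e[a] for a in A)       # fiber-by-fiber count
print(lhs, rhs)
```

On this instance both sides equal 20, confirming the identity before any inequality is applied.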
27.04.2008 15:19
I noticed that Mathlinks contest round 1 started today (4/27). What is the submission deadline? I think I'm too late... I tried a specific case. When $ n=2$: let $ \#f_{1}[X] = k$; we may assume that $ f_{1}[X]=\{1,2,\cdots , k\}$, and say there are $ a_{i}$ elements of $ X$ whose image is $ i$. Then there are $ a_{1}^{2}+\cdots +a_{k}^{2}$ $ \left(\ge \frac{(a_{1}+\cdots +a_{k})^{2}}{k} \ge \frac{|X|^{2}}{|A_{1}|}\right)$ nice vectors, so the case $ n=2$ is proved.
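A tiny numeric check of the $ n = 2$ count $ a_1^2+\cdots+a_k^2$ (the function below is a made-up example, not from the problem):

```python
# Check: for n = 2 the number of nice pairs equals the sum of squared fiber
# sizes a_1^2 + ... + a_k^2.  The function f1 below is a made-up example.
from itertools import product
from collections import Counter

X = [0, 1, 2, 3, 4]
f1 = {0: 1, 1: 1, 2: 2, 3: 2, 4: 2}     # image {1, 2}; fiber sizes 2 and 3
a = Counter(f1.values())                 # a_i = fiber sizes

nice = sum(1 for x, y in product(X, repeat=2) if f1[x] == f1[y])
print(nice, sum(v * v for v in a.values()))
```

Here the count is $ 2^2 + 3^2 = 13$, above the bound $ |X|^2/k = 25/2$.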