Let $ n$ be a positive integer, and let $ A_{1},A_{2},\cdots,A_{n + 1}$ be non-empty subsets of the set $ \{1,2,\cdots,n\}.$ Prove that there exist two disjoint non-empty subsets of the set $ \{1,2,\cdots,n + 1\}$, $ \{i_{1},i_{2},\cdots,i_{k}\}$ and $ \{j_{1},j_{2},\cdots,j_{m}\}$, such that $ A_{i_{1}}\cup A_{i_{2}}\cup\cdots\cup A_{i_{k}} = A_{j_{1}}\cup A_{j_{2}}\cup\cdots\cup A_{j_{m}}$.
Problem
Source: China Western Mathematical Olympiad 2002 P4
Tags: vector, linear algebra, matrix, combinatorics proposed, combinatorics
27.12.2008 17:47
Cute. To each of the $ A_l$ associate a column vector $ a_l = \begin{pmatrix} a_{l,1} \\ \vdots \\ a_{l,n} \end{pmatrix} \in \mathcal{M}_{n,1}(\mathbb{R}),$ where $ a_{l,i}$ is one if $ i \in A_l$ and zero otherwise. This way, we have $ n + 1$ vectors living in $ \mathbb{R}^n,$ hence they must be linearly dependent, meaning that we may find $ (\lambda_1, \ldots, \lambda_{n + 1}) \in \mathbb{R}^{n + 1} \setminus \{(0, \ldots, 0) \}$ such that $ \sum_{l = 1}^{n + 1} \lambda_l a_l = 0.$ Now, thanks to the nonnegativity of the $ a_{l,i},$ we may find two sets of indices $ P = \{i_1, \ldots, i_k \}$ and $ N = \{j_1, \ldots, j_m \}$ corresponding to the positive and negative $ \lambda_l,$ so that the above equality can be rewritten as $ \sum_{l \in P} \lambda_l a_l = \sum_{l \in N} ( - \lambda_l) a_l.$ Both sets are non-empty: each $ A_l$ is non-empty, so each $ a_l$ is a non-zero vector with non-negative entries, and the positive and negative coefficients must balance each other. Those sets of indices are disjoint by construction, and since both sides are sums of positive multiples of $0/1$ vectors, the $ i$-th coordinate of either side is positive exactly when $ i$ lies in the corresponding union; hence $ \bigcup_{l \in P} A_l = \bigcup_{l \in N} A_l,$ qed. To have an idea of why the use of linear algebra is somewhat natural here, and more, I suggest reading Dimension arguments in combinatorics.
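For readers who want to see the construction concretely, here is a minimal computational sketch (my own addition, not part of the post above), assuming sympy is available; the sets are an arbitrary example with $n = 4$.

```python
# A sketch of the linear-dependence argument on an arbitrary example (n = 4).
from sympy import Matrix

n = 4
A = [{1}, {2, 3}, {1, 2}, {3, 4}, {1, 4}]   # A_1, ..., A_{n+1}

# Column l is the 0/1 incidence vector a_l of A_{l+1}.
M = Matrix(n, n + 1, lambda i, l: 1 if (i + 1) in A[l] else 0)

# n + 1 vectors in R^n are linearly dependent, so the null space is nontrivial.
lam = M.nullspace()[0]                      # one dependence (lambda_1, ..., lambda_{n+1})

P = [l for l in range(n + 1) if lam[l] > 0] # 0-based indices with positive coefficient
N = [l for l in range(n + 1) if lam[l] < 0] # 0-based indices with negative coefficient

union_P = set().union(*(A[l] for l in P))
union_N = set().union(*(A[l] for l in N))
print(P, N, union_P == union_N)             # disjoint index sets with equal unions
```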
27.12.2008 19:55
This is a nice solution! Thanks.
24.10.2018 19:47
Here's an elementary solution which just uses induction. The idea is pretty simple (act greedily), but explaining it takes a lot of space.

Standard stuff: We proceed by induction on $n$; the base case is trivial. Assume the statement for $n-1$; we show it for $n$.

Main part: For a set $U$ of indices, define $$S_U:=\bigcup_{x \in U} A_x, \qquad V_U:=\{A_x \mid x \in U\}.$$Note that $S_U$ is the union of the members of $V_U$. Call a set bad if it contains $n$ and good otherwise. If there is at most one bad set, then we are done by the induction hypothesis. So assume there are at least $2$ bad sets. Assume that there do not exist disjoint sets $X, Y$ of indices such that $A_x, A_y$ are good for all $x \in X, y \in Y$ and $S_X=S_Y$; otherwise we are trivially done. Choose two families of sets $A, B$, each consisting of $n$ of the sets $A_1, \ldots, A_{n+1}$, such that each of them contains at least one bad set, and delete the element $n$ from all the bad sets. Then by the induction hypothesis, we find two disjoint sets of indices $X_1, Y_1$ inside $A$ and $X_2, Y_2$ inside $B$ satisfying: \begin{align} S_{X_{1}}=S_{Y_{1}}, \quad V_{X_{1}} \cap V_{Y_{1}}=\emptyset \\ S_{X_{2}}=S_{Y_{2}}, \quad V_{X_{2}} \cap V_{Y_{2}}=\emptyset \end{align}By our assumption, at least one of $V_{X_{1}}, V_{Y_{1}}$ must contain a bad set, say $V_{X_{1}}$; similarly, assume that $V_{Y_{2}}$ contains a bad set. Hence, setting $M := X_1 \cup X_2$ and $N := Y_1 \cup Y_2$, we find $$S_{M}=S_{X_{1}} \cup S_{X_{2}} \cup \{n\}=S_{Y_{1}} \cup S_{Y_{2}} \cup \{n\}=S_{N}$$once we add the element $n$ back to all the bad sets: a bad set exists on both the LHS and the RHS, so both unions gain $n$, and so we are done.

Actually... it seems like we are done, but not quite! What if there's a set $A_k$ on both sides? Then $M, N$ are not disjoint. Fortunately, we can fix this in the most natural way!

Claim: We can remove $A_k$ from both sides and we would still have $S_{M} \setminus A_{k}=S_{N} \setminus A_{k}$ (here again, we do not consider the element $n$ in $A_k$).

Proof of Claim: If $A_k \in V_{X_{1}}$ (note that we are talking about the family of sets $V_{X_{1}}$ and not its union $S_{X_{1}}$), then we can't have $A_k \in V_{Y_{1}}$ by (1). Hence we must have $A_k \in V_{Y_{2}}$, so $A_k$ is not in $V_{X_{2}}$, by (2). Now we know that all the elements of $A_k$ are in $S_{Y_{1}}$ and in $S_{X_{2}}$. Hence, even if we delete the set $A_k$ from both sides, its elements still remain on both sides, and so we still have $S_{M} \setminus A_{k}=S_{N} \setminus A_{k}$, as claimed. $\square$

Hence we are done! $\blacksquare$

P.S. This problem was communicated to me by biomathematics on a different thread, and I recently got to know the source, so I posted my solution here.
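As a sanity check of the statement itself (my addition, not the inductive construction above), here is a small brute-force search in Python; the input sets are an arbitrary example, and the search is only feasible for small $n$.

```python
# Brute force: given n + 1 non-empty subsets of {1, ..., n}, find two disjoint
# non-empty index sets whose unions of the corresponding A_i agree.
from itertools import combinations

def union_of(A, indices):
    return set().union(*(A[i] for i in indices))

def find_equal_unions(A):
    idx = sorted(A)
    for r in range(1, len(idx) + 1):
        for I in combinations(idx, r):
            rest = [i for i in idx if i not in I]
            for s in range(1, len(rest) + 1):
                for J in combinations(rest, s):
                    if union_of(A, I) == union_of(A, J):
                        return I, J
    return None

# Arbitrary example with n = 3: A_1, ..., A_4 are subsets of {1, 2, 3}.
A = {1: {1}, 2: {2}, 3: {1, 2}, 4: {2, 3}}
print(find_equal_unions(A))   # ((3,), (1, 2)), since A_3 = A_1 ∪ A_2
```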
01.12.2019 18:16
Fang-jh wrote: Let $ n$ be a positive integer, and let $ A_{1},A_{2},\cdots,A_{n + 1}$ be non-empty subsets of the set $ \{1,2,\cdots,n\}.$ Prove that there exist two disjoint non-empty subsets of the set $ \{1,2,\cdots,n + 1\}$, $ \{i_{1},i_{2},\cdots,i_{k}\}$ and $ \{j_{1},j_{2},\cdots,j_{m}\}$, such that $ A_{i_{1}}\cup A_{i_{2}}\cup\cdots\cup A_{i_{k}} = A_{j_{1}}\cup A_{j_{2}}\cup\cdots\cup A_{j_{m}}$. Define the vectors $\vec{v}(x)$ for all $x \in \{1,2, \dots , n+1\}$ by letting the $i$-th component be $v_i(x) = 1$ if $i \in A_{x}$ and $0$ otherwise. Notice that we have $n+1$ vectors in an $n$-dimensional vector space, and hence they must be linearly dependent (the proof of this is simple indeed). Thus there exist scalars $\epsilon_i \in \mathbb{R}$ for $i \in \{1,2,\dots , n+1\}$, not all zero, such that $$\sum_{j=1}^{n+1} \epsilon_j \cdot \vec{v}(j) = 0.$$ As the components of each of these vectors are in $\{0,1\}$ and the vectors are non-zero, while the total is $0$, some of the $\epsilon_j$ must be positive and some negative. Pulling the negative ones to the other side of the equality, we obtain two disjoint non-empty subsets of $\{1,2, \dots , n+1\}$, $I = \{i_1,i_2, \dots, i_m\}$ and $J=\{j_1,j_2, \dots , j_l\}$ (the indices of the positive and the negative $\epsilon$'s, so $m+l \le n+1$), such that $$ \sum_{i \in I} \epsilon_i \cdot \vec{v}(i) = \sum_{j \in J} (-\epsilon_j) \cdot \vec{v}(j).$$ Since a coordinate of either side is positive exactly when the corresponding element lies in the respective union, we get $\bigcup_{i \in I} A_i = \bigcup_{j \in J} A_j$, and hence the desired result.
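To illustrate the argument on a tiny case (an example of my own, not from the post above): take $n=2$ with $A_1=\{1\}$, $A_2=\{2\}$, $A_3=\{1,2\}$, so that $\vec{v}(1)=(1,0)$, $\vec{v}(2)=(0,1)$, $\vec{v}(3)=(1,1)$. The dependence $$\vec{v}(1)+\vec{v}(2)-\vec{v}(3)=0$$ gives $I=\{1,2\}$, $J=\{3\}$, and indeed $A_1\cup A_2=\{1,2\}=A_3$.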
25.06.2022 15:43
Wizard_32 wrote: Claim: We can remove $A_k$ from both sides and we would still have $S_{M} \setminus A_{k}=S_{N} \setminus A_{k}$ (here again, we do not consider the element $n$ in $A_k$). Proof of Claim: If $A_k \in V_{X_{1}}$ (note that we are talking about the family of sets $V_{X_{1}}$ and not its union $S_{X_{1}}$), then we can't have $A_k \in V_{Y_{1}}$ by (1). Hence we must have $A_k \in V_{Y_{2}}$, so $A_k$ is not in $V_{X_{2}}$, by (2). Now we know that all the elements of $A_k$ are in $S_{Y_{1}}$ and in $S_{X_{2}}$. Hence, even if we delete the set $A_k$ from both sides, its elements still remain on both sides, and so we still have $S_{M} \setminus A_{k}=S_{N} \setminus A_{k}$, as claimed. $\square$

If I'm not mistaken, this is only valid for the first removal. Consider $n=7$: \[(X_1)\quad\{1,2,3,4,5\}\cup\{6,7\} \leftrightarrow (Y_1)\quad\{2,3,4,5,6\} \cup \{1,3\}\]\[(X_2)\quad\{1\}\cup \{2,3,4,5,6\} \leftrightarrow (Y_2)\quad\{1,2,3,4,5\}\cup \{4,6,7\}\]If both pairs of repeated sets $\{1,2,3,4,5\}$ and $\{2,3,4,5,6\}$ were removed from both sides, the remaining sets \[\{1\}\cup \{6,7\} \leftrightarrow \{1,3\}\cup\{4,6,7\}\]would be irreconcilable.

One (counterintuitive) fix is to remove only one copy of each repeated set. If only the repeated sets in $X_1$ and $Y_1$ are removed (while preserving their counterparts in $X_2$ and $Y_2$), we end up with disjoint index sets on both sides, but with the same union: any element of a set removed from $X_1$ or $Y_1$ still appears in both $S_{X_2}$ and $S_{Y_2}$. In the above example, this corresponds to the final configuration \[\{1\}\cup \{2,3,4,5,6\}\cup \{6,7\} \leftrightarrow \{1,3\}\cup \{1,2,3,4,5\}\cup \{4,6,7\}.\] The only caveat would be the removal of all the sets in $X_1$ that originally contained the element $n$; this can be avoided by selecting $X_2, Y_2$ among all the sets while excluding some set in $X_1$ containing the element $n$.
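For what it's worth, here is a quick Python check (my addition, not part of the post above) of the $n=7$ example: the naive double removal breaks the equality of unions, while the final configuration preserves it.

```python
# Verify the unions in the n = 7 example above.
def U(*sets):
    return set().union(*sets)

# After removing both copies of each repeated set from both sides: unions differ.
print(U({1}, {6, 7}) == U({1, 3}, {4, 6, 7}))                                    # False

# Final configuration (one copy of each repeated set kept): unions agree.
print(U({1}, {2, 3, 4, 5, 6}, {6, 7}) == U({1, 3}, {1, 2, 3, 4, 5}, {4, 6, 7}))  # True
```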