Problem
Source: IMO ShortList 2002, combinatorics problem 5
Tags: combinatorics, IMO Shortlist, Set systems
28.09.2004 16:43
Please post your solutions. This is just a solution template to write up your solutions in a nice way, formatted in LaTeX. But maybe your solution is so well written that this is not required after all. For more information and instructions regarding the ISL/ILL problems please look here: introduction for the IMO ShortList/LongList project and regarding solutions
30.09.2004 14:04
Just an idea. Let A1,A2,…,An∈F. There is an X∈F which is not a subset of ⋃_{i=1}^n A_i (since F is infinite while the union is finite and so has only finitely many subsets). Then X′ = X ∩ ⋃_{i=1}^n A_i has at most r−1 elements, and it has nonempty intersection with every Ai (because X does, and X∩Ai ⊆ X′). So I've proved that for every finite subfamily F′⊂F, there is a set with at most r−1 elements which meets all elements of F′. Can you see a nice way to finish from this point? I think I found one, but it's quite boring, so I won't post it now.
27.04.2005 21:19
Consider the union of all sets in the family; call it X. Suppose for contradiction the conclusion fails: then for any set B with at most r−1 elements, we can find a set A in the family disjoint from B. Now take any set in the family; there is an element a1 of that set that is contained in infinitely many sets of the family. Call this infinite subfamily I1. Take a set A1 in the family not containing a1; since A1 meets every set of I1 and A1 is finite, there is an element a2 of A1 (so a2≠a1) contained in infinitely many sets of I1 — call that subfamily I2. Inductively, outside the constructed set {a1, a2, …, a_{r−1}} there is a set in the family, with an element a_r that lies in infinitely many sets of I_{r−1}; call that subfamily I_r. Thus we have infinitely many sets (the family I_r), each containing a1, …, a_r — a contradiction, since the sets in the family are distinct sets of size r. I think this does it; it's a lovely problem indeed.
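Not part of the proof, but the repeated pigeonhole step above is easy to play with on a computer. In the sketch below, "an element contained in infinitely many sets of the subfamily" is replaced by "a most frequent element", and the family F is made-up finite data (every set happens to contain 0):

```python
from collections import Counter

# A finite stand-in for the infinite family F: 3-element sets, pairwise
# intersecting (every set contains 0). The family is hypothetical data.
r = 3
F = [{0, 1, i} for i in range(2, 50)] + [{0, i, i + 1} for i in range(50, 100)]

picked = []      # the elements a1, a2, ... from the argument above
current = F      # the nested subfamilies I1 ⊇ I2 ⊇ ...
while len(picked) < r and current:
    # Pigeonhole step: the most frequent element stands in for
    # "an element contained in infinitely many sets of the subfamily".
    counts = Counter(x for S in current for x in S if x not in picked)
    if not counts:
        break
    a, _ = counts.most_common(1)[0]
    picked.append(a)
    current = [S for S in current if a in S]

# The element 0 lies in every set, so it is picked first, and the
# collected set therefore meets every member of F.
print(picked[0], all(set(picked) & S for S in F))
```

Of course the finite truncation proves nothing about the infinite case; it only illustrates how the nested subfamilies shrink.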
22.09.2010 20:40
What if we put F1={r1,r2}, F2={r2,r3} and F_a={r3,r1} for a≥3? It contradicts the problem.
27.10.2011 01:53
Raúl wrote: What if we put F1={r1,r2}, F2={r2,r3} and F_a={r3,r1} for a≥3? It contradicts the problem. The problem says "let F be an infinite family of sets": a family is a set of sets and cannot contain the same set twice, so all the F_a with a≥3 are one and the same set! I think I solved it, but I'm not sure. Let K∈F. Every other element of F intersects K in a subset of K, and K has only finitely many subsets, so there are infinitely many sets of F whose intersection with K is the same subset A1 of K; write K = A1 ∪ X1. We can thus list them as {A1∪Y1, A1∪Y2, A1∪Y3, …} with X1∩Yi=∅, and |A1| ≤ r−1 (because the elements of F are distinct sets). Now repeat this step r+1 times: among Y2, Y3, Y4, …, infinitely many intersect Y1 in the same subset A2 of Y1 (possibly empty), with Y1 = A2 ∪ X2, so we can write those sets as A1∪A2∪X2, A1∪A2∪Z1, A1∪A2∪Z2, … with X2∩Zi=∅ and |A1∪A2| ≤ r−1. Repeating for Z1, Z2, Z3, …, we arrive at sets A1∪X1, A1∪A2∪X2, A1∪A2∪A3∪X3, …, A1∪A2∪⋯∪A_{r+1}∪X_{r+1} in F with the Xi pairwise disjoint. Let P = A1∪⋯∪A_{r+1}; by the last step and the distinctness of the elements of F, |P| ≤ r−1, and hence each Xi is nonempty (each set of F has exactly r elements). Now if T∈F with T∩P=∅, then T∩Xi is nonempty for every i, and because (1) the Xi are disjoint and (2) there are r+1 of them, we get |T| ≥ r+1 — a contradiction!
20.10.2013 04:56
Lemma: Suppose all members of an infinite family F of (finite) sets meet all members of an infinite family G of distinct r-element sets (any f∈F and any g∈G intersect). Then some (r−1)-element set meets every member of F. This lemma kills the problem, because we just take G=F and we are done. We prove it by induction. First we do r=2. Note that some element appears infinitely many times among the sets of G, else we could find infinitely many pairwise disjoint sets in G, and every member of F would have to meet all of them and have infinite size — contradiction. Call this element 1: the sets of G containing 1 are {1,x} for infinitely many distinct x, so any f∈F not containing 1 would have to contain infinitely many of these x, which is impossible; hence {1} meets every member of F. Say the lemma is true for r=k; we look at r=k+1. All members of the infinite family F meet each member of an infinite family G of (k+1)-element sets. Some element must show up infinitely many times in G, by the exact same argument as in the r=2 case; call this element 1. Let H be the infinite subfamily of sets of G containing 1, and let I be the infinite family of k-element sets formed by removing 1 from the sets of H. Now, every member of F that lacks 1 must meet every member of I. Hence, by induction, some set of k−1 elements meets everything in F without a 1. Then take the union of {1} and this set, and we have a set of k elements that meets everything in F. The lemma is proved by induction, and the statement follows.
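The r=2 base case has a cute finite shadow that can be checked by brute force: any four distinct pairwise-intersecting 2-element sets must be a "star", i.e. share a common element (three sets can also form a triangle, but a fourth set cannot meet all three edges). A small Python sanity check, not part of the proof:

```python
from itertools import combinations

# Exhaustively check: among all 4-element families of distinct 2-sets
# over a 6-point universe, every pairwise-intersecting family is a star.
universe = range(6)
for quad in combinations(combinations(universe, 2), 4):
    sets = [set(p) for p in quad]
    if all(a & b for a, b in combinations(sets, 2)):
        # Star: one element common to all four sets.
        assert set.intersection(*sets), f"no common element in {sets}"
print("verified: four pairwise-intersecting 2-sets always form a star")
```

A universe of 6 points is enough here because four 2-sets touch at most 8 points and the property is about intersections only.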
20.06.2015 05:17
Let a number k≤r be "good" if, for any k elements, only finitely many of the sets of F contain all of them. Clearly r is good, since the sets are distinct. Now assume k is good, with k≥2. Take any k−1 elements, and assume an infinite number of sets of F contain all of them. Call these sets "good" sets, and let X be the set of the k−1 elements. Consider one of these good sets, X∪{t1,…,t_{r−k+1}}. By the goodness of k, only finitely many good sets contain one of the ti (such a set contains the k elements X∪{ti}). So take one good set without any ti, say X∪{t′1,…,t′_{r−k+1}}. By the goodness of k again, only finitely many good sets contain one of the ti or t′i, so take one good set without any of those. Continue until we have r+1 good sets that only share elements of X; let them be G1∪X, G2∪X, …, G_{r+1}∪X. If every set in F meets X, the problem is proven. Otherwise, there exists a set in F that doesn't meet X. Then it must meet each of G1,…,G_{r+1}. But the Gi are pairwise disjoint, so this set has cardinality at least r+1 — contradiction! Thus k−1 is good. By induction, 1 is good. Take any set {a1,…,ar}∈F. Only finitely many sets contain ai for each i, and thus only finitely many sets meet {a1,…,ar}. But all sets of F meet {a1,…,ar} — contradiction!
26.05.2016 21:42
For a set A∈F and a subset B of A, let S(A,B) denote the set of sets T∈F with A∩T=B. Suppose first that S(A,B) is empty for some A∈F and some B⊆A with |B|=1. Then we take all elements of A except the element of B as our set of size r−1: every T∈F meets A, but not in B alone, so T meets A∖B. Hence, we may suppose that S(A,B) is nonempty for all such A and B. Now, we will prove that S(A,B) is finite for all B⊆A∈F, by strong (downward) induction on |B|. Clearly, the statement is true for |B|=r, since then S(A,B)={A}. Now, suppose that the statement is true for |B|=k,k+1,…,r. We will prove it for |B|=k−1. Consider a set C∈F such that |A∩C|=1 and (A∩C)∩B=∅. Such a C exists by the previous paragraph (take any singleton of A∖B), and let C′ be the subset of C consisting of all elements of C except the element it shares with A. Now, suppose that infinitely many sets in F intersect A exactly at B, and let T be the family of those sets. Every T∈T meets C, but not in the element A∩C (that element lies in A∖B while T∩A=B), so every T∈T meets C′. Since T is infinite and C′ is finite, some nonempty subset X of C′ satisfies T∩C′=X for infinitely many T∈T. Now, take some D∈T with D∩C′=X, and note that each of those infinitely many T intersects D in a set containing B∪X (both contain B, since both intersect A exactly at B, and both contain X). Since D has only finitely many subsets, infinitely many of these T intersect D in the same set, of size at least |B∪X|=|B|+|X|≥k (B and X are disjoint, as B⊆A and C′∩A=∅) — a contradiction by strong induction. So S(A,B) is finite for every B⊆A, and since every set of F meets A, F is the union of the finitely many finite families S(A,B) — so F is finite, a contradiction. Hence, the only possibility is the one described in the first paragraph, and we have already found the set of size r−1, so we're done.
27.03.2017 03:54
I'll prove the generalized HMMT formulation: HMMT 2016 Team Round #9 wrote: Fix positive integers r>s, and let F be an infinite family of sets, each of size r, no two of which share fewer than s elements. Prove that there exists a set of size r−1 that shares at least s elements with each set in F. We consider the following algorithm. Initially, let X=∅ and G=F. Then, perform the following: if there exists x∉X which lies in infinitely many elements of G, then add x to X and remove all sets in G not containing x. Otherwise, stop. The end result is an infinite sub-family G, such that each S∈G contains every element of X, while |G|=∞ still. We claim that this X works. Evidently, |X|≤r−1. We can take sets G1,…,G_{r+1} in G such that Gi∩Gj=X for i≠j, since each element of Gi∖X is in only finitely many members of G. In particular, |X|≥s. Crude picture: [asy][asy] size(6cm); draw( (0,0)--(3,0)--(3,1)--(0,1)--cycle ); label("X", (3,1), dir(45)); dot( (0.5,0.5) ); dot( (1.5,0.5) ); dot( (2.5,0.5) ); dot( (-0.5,1.2), blue ); dot( (-1.2,1.2), blue ); dot( (-0.5,0.6), red ); dot( (-1.2,0.6), red ); dot( (-0.5,-0.2), heavygreen ); dot( (-1.2,-0.2), heavygreen ); [/asy][/asy] (G1 is blue plus X, et cetera) Then any S∈F must meet each Gi in at least s places. It thus has to meet X in at least s places, since otherwise it would have to meet each Gi in at least one element outside of X; but the sets Gi∖X are pairwise disjoint, forcing |S|≥r+1.
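The algorithm above runs almost verbatim on a finite stand-in for F, with "lies in infinitely many sets of G" replaced by "lies in at least THRESHOLD sets of G". The family (a sunflower with core {0, 1}) and the threshold below are hypothetical choices for illustration:

```python
# Sketch of the X-building algorithm on a finite stand-in for F,
# for r = 3, s = 2. Hypothetical example data.
r, s = 3, 2
F = [frozenset({0, 1, i}) for i in range(2, 200)]
THRESHOLD = 10   # stands in for "infinitely many"

X, G = set(), list(F)
while True:
    candidates = {x for S in G for x in S if x not in X}
    x = next((c for c in candidates
              if sum(c in S for S in G) >= THRESHOLD), None)
    if x is None:
        break                      # no frequent element left: stop
    X.add(x)                       # add x to X ...
    G = [S for S in G if x in S]   # ... and drop sets not containing x

print(sorted(X))                          # [0, 1] -- the sunflower core
print(all(len(X & S) >= s for S in F))    # True: X shares >= s elements
```

Only the two core elements pass the threshold (each petal element appears once), so the loop recovers X = {0, 1}, which indeed shares s = 2 elements with every member of F.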
19.10.2017 01:54
Suppose a set X∈F has elements {1,2,…,r}. For each other set in F, its intersection with X is a subset of X, and X has only finitely many subsets. There are infinitely many sets in F, so there exists some Y⊆X such that the family S of all sets in F with precisely Y as their intersection with X is infinite. First suppose the size of Y is less than r−1. If every set in F meets Y, then we are done. Otherwise, take a set Z in F that is disjoint from Y. Each set in S must then meet Z at an element not in Y. Let Y1 be Y together with such an element; since S is infinite and Z is finite, there exists an infinite subfamily S1⊆S all of whose members contain Y1. Now do the same thing again to get an infinite family S2 such that all sets in S2 contain some Y2 with |Y2|>|Y1|. We can continue until we get an infinite family Si such that all sets in Si contain some Yi with |Yi|=r−1. (For the case |Y|=r−1, we can jump straight to this step and let Yi=Y.) But now there can't exist a set in F∖Si which doesn't meet Yi: otherwise, repeating the argument once more, infinitely many sets of Si would contain the same r elements — impossible, since the sets are distinct. So Yi meets every set in F and we're done.
30.06.2019 05:51
Here is a solution with StoryScene that only works when F is countable. yugrey says this should be generalizable to all F with transfinite induction, but I'm not knowledgeable enough on the subject to verify this. Let F={S1,S2,…}. For any t and any 1≤i≤r, let Xi(t) be the set of all sets of size i which meet each of S1,S2,…,St. Note that Xr(t) is always infinite, since every member of F lies in it. Let f(t) be the smallest i so that Xi(t)≠∅. Since Xi(t+1)⊆Xi(t) for all t, it's clear that f(t) is nondecreasing, but f(t) only takes values in {1,2,…,r}, so it eventually becomes some constant k. Now I claim that if f(t)=k, then |Xk(t)| must be finite. Suppose otherwise, so Xk(t) is infinite. Since S1∪S2∪⋯∪St is finite, it has only finitely many k-element subsets, so there exists some S∈Xk(t) not contained in S1∪S2∪⋯∪St; that is, S contains some element e not in any of the Si for 1≤i≤t. Now consider S′=S∖{e}, which has size k−1 and still meets S1,S2,…,St. This implies Xk−1(t)≠∅, contradiction. Now since |Xk(t)| is finite but nonzero for all large t, and Xk(t+1)⊆Xk(t), it follows that Xk(t) eventually becomes constant for t≥T. Now take any S∈Xk(T); it follows that S meets all sets in F, and since Xr(t) is infinite while Xk(t) is finite, we have k≤r−1, so |S|≤r−1, as desired.
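The quantities Xi(t) and f(t) above can be computed for a concrete countable family. Below is a small sketch with a hypothetical family S1, S2, … (each set keeps two of the elements {1,2,3}, so any two members intersect, but no single element lies in all of them); it shows f(t) stabilizing at k = 2:

```python
from itertools import combinations

r = 3

def S(n):
    # Hypothetical countable family S1, S2, ...: each set keeps two of
    # {1, 2, 3} plus one fresh element, so any two members intersect.
    core = [(1, 2), (2, 3), (1, 3)][n % 3]
    return frozenset(core) | {n + 3}

def f(t):
    """Smallest i such that some i-element set meets each of S(1..t).
    It suffices to draw candidate sets from the elements of the S(j):
    any minimal hitting set uses only those elements."""
    sets = [S(j) for j in range(1, t + 1)]
    universe = sorted(set().union(*sets))
    for i in range(1, r + 1):
        if any(all(set(c) & s for s in sets)
               for c in combinations(universe, i)):
            return i
    return r

print([f(t) for t in range(1, 8)])  # -> [1, 1, 2, 2, 2, 2, 2]
```

Here {1,2} meets every S(n), so once f reaches 2 it never moves again, matching the monotone-and-bounded argument in the post above.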
06.10.2019 08:36
The transfinite induction that tastymath references is rather straightforward. In any case, there is something else very interesting about this problem. Amusingly, in my research I am reading a paper by the mathematician (and IMO 1971 gold medalist) Peter Frankl (https://www.renyi.hu/~pfrankl/2017-4.pdf) that asks what the right bound on F is (it is known that we need F to be bigger than r^r/2^r and it cannot be bigger than r^r/(1+o(1))^r). My solution in this thread is reminiscent of some of the lemmas in this paper (in particular the ones about A and G).
07.10.2019 17:50
orl wrote: Let r≥2 be a fixed positive integer, and let F be an infinite family of sets, each of size r, no two of which are disjoint. Prove that there exists a set of size r−1 that meets each set in F. We denote the family of sets by F. Let X:=⋃_{F∈F}F be the underlying set. X is an infinite set, since F is infinite. For some x1∈X we choose a set F1 of r−1 elements such that {x1}∪F1∈F. Then we choose x2∈X, x2∉{x1}∪F1, and a set F2 with {x2}∪F2∈F. Proceeding this way (always taking the next point outside everything constructed so far), we obtain disjoint sets Ui:={xi}, i≥1, and sets of r−1 elements Fi, i≥1, such that Ui∪Fi∈F, i=1,2,…, and (⋃iUi)∩(⋃iFi)=∅. Consider X′:=⋃iFi. If X′ is infinite, we run the same process again and obtain mutually disjoint sets (we reuse the same notation) Ui, i≥1, each consisting of two elements, and sets Fi, i≥1, each of r−2 elements, such that Ui∪Fi∈F and (⋃iUi)∩(⋃iFi)=∅. In case X′:=⋃iFi is again infinite, we proceed further with this process, until we come to a moment when we have disjoint sets Ui, i∈N, each with k (k<r) elements, and sets Fi, each with r−k elements, such that Ui∪Fi∈F, (⋃iUi)∩(⋃iFi)=∅, and X′:=⋃iFi is finite. (Such a moment will certainly come, since otherwise we would finally get Fi=∅ and pairwise disjoint sets Ui∈F of r elements, which contradicts the fact that any two sets in F meet.) Now, X′ being finite means there exist only finitely many subsets of X′ with r−k elements, implying that infinitely many sets among the Fi, i≥1, coincide with some F0⊆X′, |F0|=r−k≤r−1. We claim F0 satisfies the requirement. Indeed, suppose there exists F∈F with F∩F0=∅. But Ui∪F0∈F for infinitely many i, and since the Ui are disjoint and F is finite, some Ui does not meet F; thus Ui∪F0 does not meet F, contradiction. Remark. The fact exploited is that the family F is too large. There is no need for F to be infinite. One can go through the argument and estimate how large |F| should be, if finite, in order to apply the same arguments.
08.10.2019 20:02
tastymath75025 wrote: Here is a solution with StoryScene that only works when F is countable. […] Nice! No need for transfinite induction in case the family is uncountable. But, if you use it, it is usually done by indexing the sets in F with ordinals, like {Sα:α<β}, where β is some ordinal. In case β is countable, one can arrange the Sα, α<β, in a sequence and apply the same argument. But if β is uncountable, for example the first uncountable ordinal ω1 (when you take the limit, and ω1+1, etc.), one should use some other argument, like the one used below. Not that it is of any difficulty, but it makes the induction pointless. Suppose F is an uncountable family satisfying the condition. As you proved, for any countable subfamily F′⊂F, F′={S1,S2,…}, there exists a finite nonempty collection Xk(T), k≤r−1, of sets with k elements (possibly depending on F′), as described in post #12.
We simply denote Xk(T) as Xk, or rather Xk(F′) (to mark that it may depend on F′). Let ℓ be the maximal possible k in Xk(F′) as F′ varies. Note that for any two countable subfamilies F1,F2⊂F with Xℓ(F1) and Xℓ(F2), we have Xℓ(F1)∩Xℓ(F2)≠∅. Indeed, F1∪F2 is also countable, and arranging those sets in a row it easily follows that Xℓ(F1∪F2)=Xℓ(F1)∩Xℓ(F2). So, we take a minimal Xℓ(F′) as F′ varies (restricting to those F′ for which the index k in Xk(F′) equals ℓ). Denote it by X(F0). I claim all sets in X(F0) (which consist of exactly ℓ<r elements) do the job. Indeed, let S0∈X(F0) and take any set S∈F. Consider F′:={S}∪F0. Since Xℓ(F′)⊆Xℓ(F0) and Xℓ(F0) is the minimal one, we get Xℓ(F′)=Xℓ(F0); and since every set in Xℓ(F′) meets S, this gives S∩S0≠∅. ◼ Remark. The idea in post #12 may be applied in a slightly different way, working only with finite sets. For any collection F of finite sets S1,S2,…,Sn, let X(F) denote the collection of all sets of size at most r−1 that meet every set in F, with no set of X(F) a subset of another. In the same way as you proved it in #12, there exists a finite collection F for which X(F) has finite size. So, you take a minimal X(F), under inclusion, and further apply the same method as above.
18.11.2020 03:04
See https://arxiv.org/abs/2010.02541 for new bounds on this problem.
10.04.2021 15:38
Suppose on the contrary that such a set does not exist. Claim. For each 1≤n≤r−1 there exists some {a1,…,an} which is contained in infinitely many sets in F. Proof. Induct on n. For n=1, take any S0∈F. Every S∈F meets S0, so by the pigeonhole principle infinitely many sets contain the same element of S0, and we are done. For general n, we can find some {a1,…,a_{n−1}} contained in infinitely many sets S1,S2,… of F. If this set meets every set in F, then enlarging it arbitrarily to r−1 elements contradicts our assumption. Otherwise, consider a set S0∈F disjoint from it. Each of S1,S2,… meets S0, so by the pigeonhole principle infinitely many of them contain the same element an of S0; note that an∉{a1,…,a_{n−1}}, since S0 is disjoint from that set. Taking {a1,…,an}, we are done. ◼ Now take T={a1,…,a_{r−1}} satisfying the condition in the Claim, contained in the sets S1=T∪{b1}, S2=T∪{b2}, …, where the bi are all distinct. Suppose on the contrary that some set S0 in F does not meet {a1,…,a_{r−1}}. Then it must contain all of b1,b2,…, which is impossible since S0 is finite.
10.05.2021 14:16
FTSOC, assume the problem statement is false. Let S be a set of largest size which is a subset of infinitely many elements of F; if there are several such S, pick any of them. Now we claim |S|=r−1. Proof: Assume not; clearly |S|<r, since the elements of F are distinct r-element sets. As the problem statement is false, we can find f∈F such that f∩S=∅ (else S, enlarged arbitrarily to r−1 elements, would witness the statement). Infinitely many elements of F contain S, and each of them meets the finite set f, so by the pigeonhole principle some element t∈f lies in infinitely many of the sets that contain S; note t∉S since f∩S=∅. Thus S∪{t} would be a larger set satisfying the defining condition of S — contradiction. So we have found an (r−1)-element set S which is a subset of infinitely many sets in F. But if the problem statement were false, the same argument would apply to this S and enlarge it; and we cannot have |S|=r. Thus, we are done.
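The greedy choice of S above can be mimicked on a finite stand-in for F, with "subset of infinitely many elements" replaced by a frequency threshold. The family below (a sunflower with core {0, 1} plus one extra set) is hypothetical example data:

```python
from itertools import combinations

# Finite stand-in for F: a sunflower with core {0, 1}, plus one extra
# set that still meets the core. Hypothetical data for r = 3.
r = 3
F = [frozenset({0, 1, i}) for i in range(2, 60)] + [frozenset({1, 2, 3})]
THRESHOLD = 5   # stands in for "contained in infinitely many sets"

# Find a largest set contained in at least THRESHOLD members of F,
# mirroring the maximal choice of S in the argument above.
best = frozenset()
for k in range(1, r):
    for cand in map(frozenset, combinations(sorted(set().union(*F)), k)):
        if sum(cand <= T for T in F) >= THRESHOLD:
            best = cand   # larger qualifying candidates overwrite smaller

print(sorted(best))                  # [0, 1]: an (r-1)-element core
print(all(best & T for T in F))      # True: it meets every member of F
```

The search stops at size r−1 because, just as in the proof, no r-element set can be contained in more than one distinct r-element member of F.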
10.06.2023 03:12
Clearly, there must exist at least one element x that appears in infinitely many sets of F. Otherwise, consider any S∈F: for each x∈S, only finitely many sets T∈F satisfy x∈T, so only finitely many sets can meet S, and infinitely many sets are disjoint from it — contradiction. Now, let U be the maximal set, obtained by a greedy algorithm, such that U is a subset of infinitely many S∈F. Clearly, |U|≤r−1. Let F0 be the maximal subfamily of F such that U⊆S for all S∈F0. Since U is maximal, for any V1,V2,…,Vn∈F0 there are infinitely many W∈F0 such that W∩Vi=U for all i. Otherwise, there would be infinitely many W∈F0 such that Vi∖U and W∖U are not disjoint for some i; since the union of the Vi∖U is finite, some particular element outside U would then appear in infinitely many W∈F0, contradicting the maximality of U. Thus, let V1,V2,…,V_{r+1}∈F0 be pairwise disjoint outside U (that is, Vi∩Vj=U for i≠j). Every S∈F meets every Vi. If S is disjoint from U, then it must contain one element from each Vi∖U — but that's r+1 distinct elements, absurd.