For integer $n\geq2$, let $x_1, x_2, \ldots, x_n$ be real numbers satisfying \[x_1+x_2+\ldots+x_n=0, \qquad \text{and}\qquad x_1^2+x_2^2+\ldots+x_n^2=1.\]For each subset $A\subseteq\{1, 2, \ldots, n\}$, define\[S_A=\sum_{i\in A}x_i.\](If $A$ is the empty set, then $S_A=0$.) Prove that for any positive number $\lambda$, the number of sets $A$ satisfying $S_A\geq\lambda$ is at most $2^{n-3}/\lambda^2$. For which choices of $x_1, x_2, \ldots, x_n, \lambda$ does equality hold?
Problem
Source: 2012 USAMO problem #6
Tags: probability, inequalities, real analysis, 2012 USAMO, Probabilistic Method, Hi, integration
26.04.2012 01:45
Three words: Second Moment Method. (And you're instantly done... what a silly problem.)
26.04.2012 01:46
pythag011 wrote: Three words: Second Moment Method. (And you're instantly done... what a silly problem.) Can you explain the second moment method? I've never heard of it.
26.04.2012 01:48
http://en.wikipedia.org/wiki/Second_moment_method Let $v_i$ be the random variable that is $x_i$ with probability $1/2$ and $-x_i$ with probability $1/2$. Then $\operatorname{Var}(v_i) = x_i^2$. We can make the random variables independent, so $\operatorname{Var}(v_1+v_2+\cdots+v_n) = 1$, and then the problem follows directly from Markov (http://en.wikipedia.org/wiki/Markov's_inequality). I didn't want to cite that on the test, so I just did the roundabout method of looking at the sum of all the $S_A^2$ (which is quite motivated even if you haven't seen the second moment method: the denominator is the thing you want squared, so it's natural to consider a sum of squares).
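For anyone who wants to sanity-check this outline numerically, here is a small brute-force sketch (illustrative only; the random instance and variable names are my own, not part of the argument above). It checks that the mean of the squared signed sum is $\sum x_i^2 = 1$ and that the count of subsets with $S_A \ge \lambda$ stays below $2^{n-3}/\lambda^2$.

```python
import itertools
import random

# Illustrative sanity check of the second-moment outline (not a proof).
# Build a random admissible instance: sum(x) = 0 and sum(x^2) = 1.
random.seed(0)
n = 10
x = [random.gauss(0, 1) for _ in range(n)]
mean = sum(x) / n
x = [xi - mean for xi in x]                       # enforce sum = 0
norm = sum(xi * xi for xi in x) ** 0.5
x = [xi / norm for xi in x]                       # enforce sum of squares = 1

# Mean of (v_1 + ... + v_n)^2 over all sign choices v_i = +/- x_i is sum(x_i^2) = 1.
squared_sums = [sum(s * xi for s, xi in zip(signs, x)) ** 2
                for signs in itertools.product((1, -1), repeat=n)]
print(sum(squared_sums) / 2 ** n)                 # ~1.0 up to rounding

# Count subsets with S_A >= lambda and compare against the claimed bound 2^(n-3)/lambda^2.
for lam in (0.3, 0.5, 0.8):
    count = sum(1 for mask in itertools.product((0, 1), repeat=n)
                if sum(e * xi for e, xi in zip(mask, x)) >= lam)
    print(lam, count, 2 ** (n - 3) / lam ** 2)    # count never exceeds the bound
```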
26.04.2012 01:59
I'll show you my proof. I looked at it and saw Chebyshev's Inequality: maybe that's what Pythag meant.
The thought patterns here belong to modern (Lebesgue-style) analysis, or to probability, which is pretty much the same thing. Or combinatorics, which can use the same style. The indicator function is a very useful piece of notation.
26.04.2012 02:06
That's pretty much my solution, but a lot of it is encapsulated in the following theorem in combinatorics: if $x$ is a random variable, define $E[x]$ to be the expected value of $x$. The variance of $x$, which we will denote $\operatorname{Var}[x]$, is $E[(x-E[x])^2]$. A special case of what is called "the Second Moment Method" in combinatorics is the following theorem: if $x_1, x_2,\ldots, x_n$ are independent random variables, then $\operatorname{Var}[x_1+x_2+\cdots+x_n] = \operatorname{Var}[x_1]+\operatorname{Var}[x_2]+\cdots+\operatorname{Var}[x_n]$. So using the construction in our previous post, the conditions are equivalent to (if we set $x$ = the random variable corresponding to the sum of the $v_i$) $E[x] = 0$ and $\operatorname{Var}[x] = 1$, or, rewriting with the definition of variance, $E[x] = 0$ and $E[x^2] = 1$. So by Markov's inequality, the probability that $x^2 \ge a$ is at most $1/a$. Note that $x = 2S_A$, where $A$ is the set of $i$ for which the sign chosen for $v_i$ is $+$. Then, noting that for every positive value of $x$ we get the corresponding negative value equally often, plugging in $a = 4\lambda^2$ finishes the problem.
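Spelling out that last plug-in explicitly: since $x = 2S_A$ and taking complements swaps the sign of $S_A$, \[ \Pr(S_A \ge \lambda) = \tfrac{1}{2}\Pr\left(x^2 \ge 4\lambda^2\right) \le \tfrac{1}{2}\cdot\frac{E[x^2]}{4\lambda^2} = \frac{1}{8\lambda^2}, \] so at most $2^n/(8\lambda^2) = 2^{n-3}/\lambda^2$ of the $2^n$ subsets can have $S_A \ge \lambda$.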
26.04.2012 02:31
26.04.2012 02:49
pi37 wrote:
Nope, that's right.
26.04.2012 03:20
Thanks to trigfan, I realized that there actually is an easy way to end the problem without Markov's (so I actually could have gotten it on the test).
I remember learning this technique from pythag011 in one of those random additive combinatorics WOOT classroom lectures... If only I had spent more than 10 minutes on problem 6 on the test. Also, trigfan would like to know if this is correct.
26.04.2012 03:32
pi37 wrote: Thanks to trigfan, I realized that there actually is an easy way to end the problem without Markov's (so I actually could have gotten it on the test).
I remember learning this technique from pythag011 in one of those random additive combinatorics WOOT classroom lectures... If only I had spent more than 10 minutes on problem 6 on the test. Also, trigfan would like to know if this is correct. Something like that works. Interestingly, being a "combinatorialist" (more like a combinatorics student, but oh well D:) gave me three free problems on this USAMO: #2, because it was standard probabilistic method; #3, because it was basically a much easier version of EDP; and #6, because it's one application of the second moment method.
26.04.2012 08:07
Being a combinatorialist can help you solve basically any problem on the USAMO if you are a good combinatorial student.
27.04.2013 03:35
I managed to solve the inequality, but not the equality case. Lemma: (Chebyshev) For any positive $\epsilon$ and random variable $X$, \[\text{Pr}\left(|X-\mu| \ge \epsilon \sigma\right) \le \frac{1}{\epsilon^2}\] where $\mu$ is the expected value of $X$ and $\sigma^2$ is its variance. Proof: \[\sigma^2 = \mathbb{E}((X-\mu)^2) \ge \epsilon^2 \sigma^2 \text{Pr}\left(|X-\mu| \ge \epsilon \sigma\right),\] from which the result follows. We first compute $\mathbb{E}(S_A)$. Let $B=\{1, 2, \cdots, n\} \backslash A$ (that is, the set containing all integers from $1$ to $n$ not in $A$). Then it is not difficult to see that $S_A + S_{B} = \sum_{i=1}^n x_i = 0$, and since $A \mapsto B$ is a bijection on the subsets of $\{1, 2, \cdots, n\}$, we have that $\mathbb{E}(S_A) = 0$. Note that \begin{align*} \sum_{A \subseteq \{1, 2, \cdots, n\}} S_A^2= \sum_{A \subseteq \{ 1, 2, \cdots, n\}} \left(\sum_{i \in A} x_i\right)^2 = 2^{n-1} \sum_{i=1}^n x_i^2 + 2^{n-2} \sum_{1 \le i < j \le n} 2x_ix_j = 2^{n-1} - 2^{n-2} = 2^{n-2}\end{align*} so it follows that $\mathbb{E}(S_A^2) = \frac{2^{n-2}}{2^n} = \frac{1}{4}$. Therefore $\sigma^2 = \mathbb{E}((S_A-\mathbb{E}(S_A))^2) = \frac{1}{4}$. Now we have $\text{Pr}(|S_A-\mathbb{E}(S_A)| \ge \epsilon\sigma ) = \text{Pr}\left(|S_A| \ge \frac{1}{2}\epsilon\right)$. Notice that if we multiply every $x_i$, $1 \le i \le n$, by $-1$, we obtain $-S_A$ instead of $S_A$. By symmetry, $\text{Pr}\left(S_A \ge \frac{1}{2}\epsilon\right)= \frac{1}{2}\cdot \text{Pr}\left(|S_A| \ge \frac{1}{2}\epsilon\right)$. From the lemma, we have \begin{align*}\text{Pr}(|S_A - \mathbb{E}(S_A)| \ge \sigma \epsilon) \le \frac{1}{\epsilon^2} \implies \text{Pr}\left(S_A \ge \frac{1}{2}\epsilon\right) \le \frac{1}{2\epsilon^2} \implies \text{Pr}(S_A \ge \lambda) \le \frac{1}{8\lambda^2}\end{align*} where we have made the substitution $\epsilon = 2\lambda$. Now the result immediately follows, since the number of subsets $A$ is $2^n$.
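The identity $\sum_A S_A^2 = 2^{n-2}$ used above is easy to confirm by brute force for small $n$; a minimal sketch (assuming a randomly generated admissible instance):

```python
import itertools
import random

# Brute-force check (illustrative only) that the sum of S_A^2 over all subsets A
# equals 2^(n-2) whenever sum(x) = 0 and sum(x^2) = 1.
random.seed(1)
n = 9
x = [random.uniform(-1, 1) for _ in range(n)]
mean = sum(x) / n
x = [xi - mean for xi in x]            # sum(x) = 0
norm = sum(xi * xi for xi in x) ** 0.5
x = [xi / norm for xi in x]            # sum(x^2) = 1

total = sum(sum(e * xi for e, xi in zip(mask, x)) ** 2
            for mask in itertools.product((0, 1), repeat=n))
print(total, 2 ** (n - 2))             # the two values agree up to floating-point error
```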
26.09.2015 09:53
It is instant from Markov's Inequality: just observe that $E[S_A^2]=\frac{1}{4}$, so the number of sets with $S_A^2 \ge \lambda^2$ is at most $\frac{2^n}{4\lambda^2}$, which gives the conclusion.
20.12.2018 13:30
@anantmudgal09, you have to take only the $S_A$ which is positive, so there should be $2^n/2$ in the last expression, which establishes the correct bound! For the equality case, take from each pair $\{S_A, S_B\}$ with $S_A + S_B = 0$ the positive one.
13.04.2020 16:33
We compute $\mathbb{E} (S_A^2)$ over all sets $A$. Note that each $S_A$ can be expressed as $\varepsilon_1 x_1 + \varepsilon_2 x_2 + \cdots + \varepsilon_n x_n$ such that $\varepsilon_i \in \{ 0,1 \}$ for each $i$. Then we have $$\mathbb{E} (S_A^2) = \mathbb{E} \left( \displaystyle\sum\limits_{i=1}^n \varepsilon_i^2x_i^2 + 2 \displaystyle\sum\limits_{1 \leq i < j \leq n} \varepsilon_i \varepsilon_j x_i x_j \right) = \frac{1}{2} \displaystyle\sum\limits_{i=1}^n x_i^2 +\frac{1}{2} \displaystyle\sum\limits_{1 \leq i < j \leq n} x_ix_j = \frac{1}{2}-\frac{1}{4} = \frac{1}{4}.$$ Assume for contradiction that there were more than $\frac{2^{n-3}}{\lambda^2}$ sets $A$ with $S_A \geq \lambda$. Note that if $B$ is the set of all elements of $\{ 1, 2, \dots , n \}$ that are not in $A$, we have that $S_B \leq - \lambda$. Thus there are more than $\frac{2^{n-2}}{\lambda^2}$ sets $C$ such that $S_C^2 \geq \lambda^2$, so we must have $\displaystyle\sum\limits_A S_A^2 > 2^{n-2}$, where the sum extends over all $2^n$ possible sets $A$. But by definition, we have $\frac{1}{2^n} \displaystyle\sum\limits_{A} S_A^2 = \mathbb{E} (S_A^2) = \frac{1}{4} \Rightarrow \displaystyle\sum\limits_{A} S_A^2 = 2^{n-2}$, which is a contradiction. In order for equality to occur, we need every set $A$ to have $S_A$ as one of $0, \lambda, -\lambda$. It's easy to see that this can only happen if $\lambda = \frac{1}{\sqrt{2}}$ and $\{ x_1, x_2, \dots , x_n \} = \{ \lambda , -\lambda , 0, 0, \dots , 0 \}$.
14.06.2020 02:53
Consider the average value of $S_A^2$. Since each $x_i$ is included with probability $\frac12$, the average is \[ \overline{S_A^2} = \frac12 \left(\sum x_i^2 \right) + \frac14 \left(2 \sum_{i<j} x_i x_j\right) = \frac14 \]Then, the sum of $S_A^2$ over all subsets is $2^{n-2}$, so at most $2^{n-2}/\lambda^2$ of them satisfy $S_A^2 \ge \lambda^2$. Note that $S_A = -S_{A^C}$, so at most $2^{n-3}/\lambda^2$ satisfy $S_A \ge \lambda$.
19.06.2020 00:20
The problem is just a corollary of Chebyshev's inequality: Theorem: [Chebyshev] Let \(X\) be an integrable random variable with finite expected value and finite non-zero variance. Then for any real number \(r>0\), \[\operatorname{Pr}\left\{(X-\mathbb E[X])^2\ge r\right\}\le\frac{\operatorname{Var}(X)}r.\]Equality holds if and only if \((X-\mathbb E[X])^2\in\{0,r\}\). Let \(\varepsilon_1,\ldots,\varepsilon_n\) be independent random variables chosen uniformly from \(\{0,1\}\), and let \(S_A=\sum_i\varepsilon_ix_i\). Since the \(\varepsilon_ix_i\) are all independent, it is not hard to compute \begin{align*} \mathbb E[S_A]&=\sum_i\mathbb E[\varepsilon_ix_i]=\sum_i\frac{x_i}2=0\\ \operatorname{Var}[S_A]&=\sum_i\operatorname{Var}(\varepsilon_ix_i)=\sum_i\frac{x_i^2}4=\frac14. \end{align*} By Chebyshev's inequality, \[\operatorname{Pr}\left\{S_A^2\ge\lambda^2\right\}\le\frac1{4\lambda^2}.\]Since we can always pair \(A\) with its complement to negate \(S_A\), \[\operatorname{Pr}\left\{S_A\ge\lambda\right\}=\frac{\operatorname{Pr}\left\{S_A^2\ge\lambda^2\right\}}2\le\frac1{8\lambda^2},\]and thus the number of \(A\) is at most \(2^{n-3}/\lambda^2\), as needed. Equality holds when \(S_A\in\{0,\lambda,-\lambda\}\) for every choice of \(A\); that is, \[(x_1,\ldots,x_n)=\left(\frac1{\sqrt2},-\frac1{\sqrt2},0,0,\ldots,0\right)\]and permutations.
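If it helps, the bound $\Pr(S_A\ge\lambda)\le 1/(8\lambda^2)$ can also be eyeballed by simulation; a rough Monte Carlo sketch (my own, with an arbitrary random instance and an arbitrary $\lambda$):

```python
import random

# Monte Carlo sketch: compare the empirical frequency of S_A >= lambda
# with the Chebyshev-derived bound 1/(8*lambda^2). Illustrative only.
random.seed(2)
n = 12
x = [random.gauss(0, 1) for _ in range(n)]
mean = sum(x) / n
x = [xi - mean for xi in x]            # sum(x) = 0
norm = sum(xi * xi for xi in x) ** 0.5
x = [xi / norm for xi in x]            # sum(x^2) = 1

lam = 0.6
trials = 200_000
hits = 0
for _ in range(trials):
    # Include each index independently with probability 1/2, as in the solution above.
    s = sum(xi for xi in x if random.random() < 0.5)
    if s >= lam:
        hits += 1

print(hits / trials, 1 / (8 * lam ** 2))   # empirical frequency vs. the bound
```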
22.06.2020 04:42
The idea is to consider $S_A^2$ and note that for each positive $S_A$ there is an equal and opposite negative one (from the complement). Remark that \[\mathbb{E}[S_A^2] = \mathbb{E}_{A\subseteq \{1,2,\cdots , n\}}\left[\sum_{i\in A}x_i^2 + \sum_{\substack{i,j\in A \\ i< j}}2x_ix_j\right] = \frac{2^{n-1}\displaystyle\sum_{i=1}^nx_i^2 + 2^{n-2}\sum_{1\le i<j\le n}2x_ix_j}{2^n}.\]This follows from double-counting. Then, remark that $\displaystyle\sum_{i=1}^n x_i^2=1$ and $\displaystyle\sum_{1\le i<j\le n}2x_ix_j = \left(\displaystyle\sum_{i=1}^nx_i\right)^2-\displaystyle\sum_{i=1}^n x_i^2 = -1$. Substituting this in yields \[\mathbb{E}[S_A^2] = \frac{2^{n-1}-2^{n-2}}{2^n} = \frac{1}{4}.\]Thus, by Markov's Inequality, \[\mathbb{P}[S_A^2 \ge \lambda^2] \le \frac{1}{4\lambda^2}.\]Because each positive $S_A$ comes with a negative $S_A$, this implies \[\mathbb{P}[S_A\ge \lambda] \le \frac{1}{8\lambda^2}.\]Noting that there are $2^n$ values of $A$ yields the desired inequality. Note that equality in Markov's Inequality can hold iff $S_A^2 \in \{0,\lambda^2\}$ for all $A$. Remark that if there are some four distinct values $a<b<c<d$ of $x_i$, then equality cannot hold because \[a+b<a+c<b+c<b+d<c+d\]yields at least $3$ distinct values of $S_A^2$. As it is clear that not all $x_i$ are the same, we get either $2$ or $3$ distinct values of $x_i$. Moreover, if a particular nonzero value $x_i$ occurred more than once, then both $x_i$ and $2x_i$ would occur as subset sums, contradicting $S_A^2 \in \{0,\lambda^2\}$; so each nonzero value occurs exactly once. We then throw away all zero $x_i$; we can add them back later. We now get the following cases. Case 1: We have two nonzero $x_i$. Let those $x_i$ be $a<b$. We get $a+b=0,a^2+b^2=1$. Thus, $a=-1/\sqrt{2},b=1/\sqrt{2}$. The possible values of $S_A^2$ are $0,1/2$, thus we get equality for \[\{x_1,\dots , x_n\} = \left\{-\frac{1}{\sqrt{2}},\frac{1}{\sqrt{2}},0,\cdots , 0\right\}\]and $\lambda = \frac{1}{\sqrt{2}}$ with some number of $0$s. Indeed, there are $2^{n-2} = 2^{n-3}/\lambda^2$ sets $A$ with $S_A \ge \lambda$: those containing $\frac{1}{\sqrt{2}}$ but not $-\frac{1}{\sqrt{2}}$, together with any choice of the $n-2$ zeros. Case 2: We have three nonzero $x_i$. Let those $x_i$ be $a<b<c$. As $a+b<a+c<b+c$, we get $a+c=0$ and $a+b=-(b+c)$ by the information about $S_A^2$. This yields $0=(a+b)+(b+c) = (a+b+c)+b = b$, absurd. Hence, we have the one equality case described earlier.
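The equality case found above can also be confirmed by direct enumeration; a short sketch assuming that configuration:

```python
import itertools
import math

# Check that x = (1/sqrt(2), -1/sqrt(2), 0, ..., 0) with lambda = 1/sqrt(2)
# attains the bound: the count equals 2^(n-3)/lambda^2 = 2^(n-2). Illustrative only.
n = 8
lam = 1 / math.sqrt(2)
x = [lam, -lam] + [0.0] * (n - 2)

count = sum(1 for mask in itertools.product((0, 1), repeat=n)
            if sum(e * xi for e, xi in zip(mask, x)) >= lam - 1e-12)

print(count, 2 ** (n - 3) / lam ** 2)   # both are 2^(n-2) = 64 here
```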
17.08.2020 06:40
Claim: $\sum _{A\subseteq\{1, 2, \ldots, n\}} S_A^2= 2^{n-2}.$ Proof: Remark that $$\mathbb{E}[S_A^2] = \frac{1}{2} \sum_{1 \leq i \leq n} x_i^2 + \frac{1}{4} \sum_{1 \leq i \neq j \leq n} x_ix_j = \frac{1}{2} - \frac{1}{4} = \frac{1}{4}.$$Thus, multiplying by the $2^n$ subsets (linearity of expectation), we have proven our desired claim. Now remark that $S_A$ and $S_{\{1, 2, \ldots, n\} \setminus A}$ have opposite signs, so exactly one of them is positive (of course, unless both are $0$). Now, we are home free: let $k$ be the number of subsets $A$ such that $S_A \geq 0$ and $S_A^2 \geq \lambda^2$; each such $A$ and its complement together contribute at least $2\lambda^2$ to the sum in the claim, so $2k\lambda^2 \le 2^{n-2}$, i.e. $k \leq \frac{2^{n-3}}{\lambda^2}$, as desired. Equality holds when $\lambda = \frac{1}{\sqrt{2}}$ and $\{ x_1,x_2, \ldots, x_n\} = \{ \frac{1}{\sqrt{2}} , -\frac{1}{\sqrt{2}}, 0, \ldots, 0 \}$ and permutations.
15.09.2020 07:59
horribly written sketch: We translate the problem into asking about $S_A^2 \geq \lambda^2$. To avoid negative $S_A$, we simply need to slap on a factor of $2$ at the end since all the numbers are symmetric about $0$. Now, we proceed by calculating $\mathrm{E} [S_A^2]$. I'm too lazy to type, so we get $\frac{1}{4}$. Thus, the sum of $S_A^2$ over all subsets is $2^{n - 2}$. Thus if the number of positive $S_A$ with $S_A^2 \geq \lambda^2$ were more than $\frac{2^{n - 3}}{\lambda^2}$, then (pairing each such set with its complement, which satisfies the same inequality) the total number of $S_A$ satisfying this would be more than $\frac{2^{n - 2}}{\lambda^2}$. Multiplying by $\lambda^2$ would yield a contradiction, as we would get a sum greater than $2^{n - 2}$.
09.02.2021 23:02
Since the sum of the elements of each set is the negation of that of its complement, we are done if we can show $P(S_A^2\ge \lambda^2)\le \frac{1}{4\lambda^2}$. Since each $x_i$ is included in $2^{n-1}$ subsets and each pair $x_i,x_j$ is included in $2^{n-2}$ subsets, the expected value of $S_A^2$ is equal to $$\frac{(2^{n-1})(\sum\limits_{i=1}^n x_i^2+\sum\limits_{1\le i<j\le n} x_i x_j)}{2^n}=\frac{1+\sum\limits_{1\le i<j\le n} x_i x_j}{2}=\frac{1}{4}$$as $\sum\limits_{1\le i<j\le n} x_i x_j=\frac{(\sum\limits_{i=1}^n x_i)^2-\sum\limits_{i=1}^n x_i^2}{2}=-\frac{1}{2}$, and we can finish with Markov's inequality. The equality case is when $S_A$ can take on only $0$, $\lambda$ or $-\lambda$, which forces there to be at most two distinct nonzero values among the $x_i$; these must be $\frac{\sqrt{2}}{2}$ and $-\frac{\sqrt{2}}{2}$ (each appearing once), with the remaining $x_i$ equal to $0$, and $\lambda=\frac{\sqrt{2}}{2}$. Indeed, we can see that this selection of values satisfies the equality.
12.08.2021 18:23
30.07.2023 02:55
Consider choosing the subset $A$ randomly (e.g. including each element with probability $1/2$ independently). We wish to show that the probability of the sum being at least $\lambda$ is at most $\frac{1}{8\lambda^2}.$ Hungry for cancellation, we look at the expected value of the square of the sum. We have $$E(Sum^2)=E(Sum)^2+Var(Sum)$$$$=Var(Sum).$$Consider choosing $A$ as $n$ independent events of including or excluding a specific element. The event of picking $x_i$ has variance $\frac{1}{4}x_i^2,$ and since variance is additive over independent variables, the variance of the sum is $$\frac{1}{4}(x_1^2+\cdots +x_n^2)=\frac{1}{4}.$$ Suppose FTSOC that the probability of the sum being at least $\lambda$ is more than $\frac{1}{8\lambda^2}.$ Then, by symmetry, the probability of it being at most $-\lambda$ is also more than $\frac{1}{8\lambda^2}.$ Hence, looking at the expected value of the square, these two events alone already contribute more than $2\cdot\frac{1}{8\lambda^2}\cdot\lambda^2=\frac{1}{4}$, contradiction. In order for the equality case to occur, the probability of a sum of $\lambda$ must be exactly $\frac{1}{8\lambda^2}$, and likewise for $-\lambda.$ Other than this, all other possible sums must be zero, since anything else would push it over. Thus, each subset has $$Sum\in\{0,\lambda,-\lambda\},$$so clearly each element is one of those too. Furthermore, if there are at least two elements that are $\lambda$, there will be a $2\lambda$ sum, and similarly for $-\lambda$. Thus, the equality case is where one element is $\lambda$, one element is $-\lambda$, and all other elements are $0$, in which case clearly $\lambda=\frac{\sqrt{2}}{2}$.
16.08.2023 18:06
Solved with GoodMorning. Claim: $\sum_{A\subseteq \{1,2,\ldots, n\} }S_A^2 = 2^{n-2}$ Proof: We have \begin{align*} \sum_{A\subseteq \{1,2,\ldots, n\} }S_A^2 &= \sum_{A\subseteq \{1,2,\ldots, n\} } \sum_{\substack{i<j \\ i,j\in A}} 2x_i x_j + \sum_{A\subseteq \{1,2,\ldots, n\}} \sum_{i\in A } x_i^2 \\ &= 2^{n-1} \left( \sum_{1\le i<j\le n } x_i x_j + \sum_{i=1}^n x_i^2 \right)\\ &= 2^{n-2} \end{align*}because $\sum_{1\le i < j \le n}x_i x_j = \frac{0^2 - 1}{2} = -\frac{1}{2} $. $\square$ Suppose FTSOC we had more than $\frac{2^{n-3}}{\lambda^2}$ sets $A$ with $S_A \ge \lambda$. Then the sum of $S_A^2 $ over all these sets is more than $\frac{2^{n-3}}{\lambda^2} \cdot \lambda^2 = 2^{n-3} $. But notice that $S_A^2$ equals $S_B^2$, where $B$ is the complement of $A$ (we have $S_A = - S_B$). This implies that \[ \sum S_A^2 = 2\sum_{S_A > 0} S_A^2 > 2 \cdot 2^{n-3} = 2^{n-2}, \]contradiction. Now suppose equality held. Since $\sum S_A^2 \ge 2\sum_{S_A > 0} S_A^2,$ we must have $\sum_{S_A>0}S_A^2 \le 2^{n-3}$. Notice that $\sum_{S_A \ge \lambda} S_A^2$ is at least $\frac{2^{n-3}}{\lambda^2} \cdot \lambda^2 = 2^{n-3}$. However, we also have $\sum_{S_A \ge \lambda} S_A^2 \le \sum_{S_A > 0} S_A^2\le 2^{n-3}$. This implies that $\sum_{S_A \ge \lambda} S_A^2 = \sum_{S_A > 0 } S_A^2 = 2^{n-3}$. Therefore, there are no sets $A$ with $0<S_A < \lambda$. If there was some set $A$ with $S_A > \lambda$, then we would have $\sum_{S_A \ge \lambda} S_A^2 > 2^{n-3}$, absurd. Thus, the only nonnegative possible values of $S_A$ are $0$ and $\lambda$. By taking the complement of $A$, the only negative value for $S_A$ is $-\lambda$. This implies that all $x_i$ are equal to $0, \lambda, $ or $-\lambda$. Notice there must be the same number of $\lambda$s as $-\lambda$s. If there were two or more of each of these, then there exists an $A$ with $S_A = 2\lambda$, but this means $2\lambda \in \{0,\lambda, -\lambda\}$, so $\lambda = 0$, absurd. Clearly we cannot have all $x_i$ equal zero. Therefore $(x_i)$ consists of one $\lambda$, one $-\lambda$, and $n-2$ zeroes. So $2\lambda^2 = 1\implies \lambda = \frac{\sqrt{2}}{2}$. So the only equality cases are \[ \boxed{ \left( -\frac{\sqrt{2}}{2}, \frac{\sqrt{2}}{2}, 0, 0, \ldots, 0 \right)}\]and permutations, which clearly work (with $\lambda = \frac{\sqrt{2}}{2}$).
30.08.2023 21:21
Since subsets come in complement pairs, it suffices to show that there are at most $2^{n-2}/\lambda^2$ sets $A$ with $|S_A| \geq \lambda$. Let $Y_1,\ldots,Y_n$ be independent random variables such that $Y_i$ is equal to $0$ or $x_i$, each with probability $\frac{1}{2}$. Then the random variable $Z:=Y_1+\cdots+Y_n$ represents the sum of a randomly chosen subset. Observe that $$\mathbb{E}[Z]=\sum_i \mathbb{E}[Y_i]=\sum_i \frac{x_i}{2}=0,$$and since the $Y_i$ are all independent, $$\mathrm{Var}(Z)=\sum_i \mathrm{Var}(Y_i)=\sum_i \frac{x_i^2}{4}=\frac{1}{4}.$$Then by Chebyshev's inequality, $$P(|Z-0| \geq (2\lambda)(1/2)) \leq \frac{1}{4\lambda^2},$$hence there are at most $2^n/(4\lambda^2)=2^{n-2}/\lambda^2$ sets $A$ with $|S_A| \geq \lambda$, as desired. Equality occurs in the application of Chebyshev only when the only possible values $Z$ can assume are precisely $\pm \lambda$ or $0$. This clearly implies that there is exactly one positive and one negative element among the $x_i$, and these clearly must have equal absolute value. This common absolute value is then clearly $1/\sqrt{2}$, so equality occurs if and only if $\{x_1,\ldots,x_n\}=\{1/\sqrt{2},-1/\sqrt{2},0,\ldots,0\}$ and $\lambda=1/\sqrt{2}$. $\blacksquare$
06.09.2023 07:31
used 30% hint Note that in $\sum S_A^2$, an arbitrary $x_i^2$ appears $2^{n-1}$ times, and a $2x_ix_j$, $i>j$, appears $2^{n-2}$ times, whence $\sum S_A^2=2^{n-1}+2^{n-2}\sum_{i>j}2x_ix_j=2^{n-1}-2^{n-2}=2^{n-2}$, since $\sum_{i>j}2x_ix_j=0-\sum x_i^2=-1$. By Markov's inequality we have $P(S_A^2\ge\lambda^2)\le\frac1{4\lambda^2}$, but this overcounts since it also includes $S_A\le-\lambda$, and for every such set the complement has the negative sum, so the real probability is at most $\frac{1}{8\lambda^2}$, i.e. at most $\frac{2^n}{8\lambda^2}=\frac{2^{n-3}}{\lambda^2}$ sets work, as desired. Equality holds when $S_A \in \{0,\lambda,-\lambda\}$ for every $A$, meaning there is exactly one positive and one (opposite) negative number among the $x_i$, which gives $(x_1,x_2, \dots, x_n)=(\frac{\sqrt{2}}{2},-\frac{\sqrt{2}}{2},0, \dots, 0)$, $\lambda=\frac{\sqrt{2}}{2}$, up to permutation. $\blacksquare$
21.12.2023 21:56
Markov's Inequality: If $X$ is a given non-negative random variable and $a > 0$, then the probability that $X \geq a$ is at most \begin{align*} P(X \geq a) \leq \frac{\mathbb{E}[X]}{a} \end{align*}Proof. This was presented by a friend. Write $f$ for the density of $X$ (the discrete case is analogous, with sums in place of integrals). From the definition of expectation we have, \begin{align*} \mathbb{E}[X] = \int_{-\infty}^{\infty} x \cdot f(x) \text{ } dx \end{align*}Now as $X$ is non-negative we have, \begin{align*} \mathbb{E}[X] = \int_0^{\infty} x \cdot f(x) \text{ } dx \end{align*}Then clearly we must have, \begin{align*} \mathbb{E}[X] &= \int_0^{a} x \cdot f(x) \text{ } dx + \int_a^{\infty} x \cdot f(x) \text{ } dx\\ &\geq \int_a^{\infty} x \cdot f(x) \text{ } dx\\ &\geq \int_a^{\infty} a \cdot f(x) \text{ } dx\\ &= a \int_a^{\infty} f(x) \text{ } dx\\ &= a \cdot P(X \geq a) \end{align*}whence we are finished. From this we can also find the equality case of Markov's. Namely we must have \begin{align*} \int_0^a x \cdot f(x) \text{ } dx = 0 \end{align*}as well as, \begin{align*} \int_a^{\infty} x \cdot f(x) \text{ } dx = \int_a^{\infty} a \cdot f(x) \text{ } dx \end{align*}In simpler terms, the first holds precisely when $X$ takes no values strictly between $0$ and $a$, and the second holds when $P(X > a) = 0$. Thus we must always have $X = 0$ or $X = a$. $\blacksquare$ Denote the number of valid subsets by $V$. The idea now is to instead consider $S_A^2$, which is motivated by the $\lambda^2$ in our answer extraction. This also allows us to use Markov's inequality, as we are then guaranteed non-negative variables. Note that, \begin{align*} \mathbb{E}[S_A^2] &= \frac{1}{2} \cdot \sum_{i} x_i^2 + \frac{1}{2} \cdot \sum_{i<j} x_ix_j\\ &= \frac{1}{2} - \frac{1}{4} = \frac{1}{4} \end{align*}Then by Markov's we find, \begin{align*} P(S_A^2 \geq \lambda^2) &\leq \frac{\mathbb{E}[S_A^2]}{\lambda^2}\\ &= \frac{1}{4\lambda^2} \end{align*}Finally, the number of good subsets satisfies \begin{align*} V &= P(S_A^2 \geq \lambda^2) \cdot 2^{n-1}\\ &\leq \frac{1}{4\lambda^2} \cdot 2^{n-1}\\ &= \frac{2^{n-3}}{\lambda^2} \end{align*}as we are only counting the subsets with positive sum. More completely, for any subset $S$ of the $x_i$, the complement of $S$ has an equal but opposite sum; both have the same square, but only one of them is valid in our original problem. From the equality case of Markov's we must always have $S_A^2 = \lambda^2$ or $S_A^2 = 0$. Then equality holds if and only if we have, \begin{align*} (x_1, \dots, x_n) = \left(\frac{1}{\sqrt{2}}, - \frac{1}{\sqrt{2}}, 0, \dots, 0 \right) \end{align*}and permutations, as we must have the $x_i$ sum to $0$ and the $x_i^2$ sum to $1$.
27.12.2023 10:18
Observe that $E[S_A] = 0$ because for each $A \subseteq [n]$, note that $S_A + S_{[n] \backslash A} = 0$, hence complementation gives a bijection and the expected sum $S_A$ is $0$. Now we seek to compute $E[S_A^2].$ Observe that, \begin{align*} S_A^2 &= \left(\sum_{i \in A} x_i\right)^2 \\ &= \left(\sum_{i \in A} x_i^2\right) + \left(\sum_{\substack{(i, j) \in A^2 \\ i \neq j}} x_ix_j\right) \\ &= \left(\sum_{i \in [n]} (1_{i \in A})x_i^2\right) + \sum_{\substack{(i, j) \in [n]^2 \\ i \neq j}} (1_{i \in A}1_{j \in A})x_ix_j \\ E[S_A^2] &= \sum_{i \in [n]} x_i^2 E[1_{i \in A}] + \sum_{\substack{(i, j) \in [n]^2 \\ i \neq j}} x_ix_j E[1_{i \in A}1_{j \in A}] \\ &= \frac{1}{2} \left(\sum_{i \in [n]} x_i^2\right) + \frac{1}{4} \left[\left(\sum_{i \in [n]} x_i\right)^2 - \left(\sum_{i \in [n]} x_i^2\right)\right] \\ &= \frac{1}{4}, \end{align*}where we used linearity of expectation and indicator functions. Utilizing the bijection and Chebyshev's inequality, \begin{align*} P(|S_A - E[S_A]| \ge \lambda) &\le \frac{\text{Var}(S_A)}{\lambda^2} \\ 2P(S_A \ge \lambda ) &\le \frac{E[S_A^2] - E[S_A]^2}{\lambda^2} \\ P(S_A \ge \lambda) &\le \frac{1}{8 \lambda^2}. \end{align*}So the number of $A$ such that $S_A \ge \lambda$ is at most $|\mathcal{P}([n])| \cdot P(S_A \ge \lambda) \le \frac{2^{n-3}}{\lambda^2}.$
29.12.2023 21:56
Lemma 1. Let $k\le n$ be a nonnegative integer. Then summing over all $\binom nk$ (sorted) combinations of $a_1,a_2,\dots,a_k\in\{1,2,\dots,n\}$, \[ \sum\left(x_{a_1}+x_{a_2}+\dots+x_{a_k}\right)^2 = \binom{n-2}{k-1}. \]Proof. It is easy to see that this should be a linear combination of the two degree $2$ symmetric polynomials in terms of $x_1,x_2,\dots,x_n$. Since those are both constant (by the given conditions), this sum must be constant. Therefore, it suffices to plug in $x_1=\frac{1}{\sqrt2}$, $x_2=-\frac{1}{\sqrt2}$, and $x_3=x_4=\dots=x_n=0$. The rest is simple computation. $\blacksquare$ Now fix a $k$. Then the number of (sorted) combinations $a_1,a_2,\dots,a_k\in\{1,2,\dots,n\}$ such that \[ \left|x_{a_1}+x_{a_2}+\dots+x_{a_k}\right|\ge\lambda \]would be at most $\dfrac{\binom{n-2}{k-1}}{\lambda^2}$. Thus, noting that $k=0$ and $k=n$ don't work, the number of subsets $A\subseteq\{1,2,\dots,n\}$ such that \[ \left|\sum_{a\in A} x_a\right|\ge\lambda \]is at most $\dfrac{2^{n-2}}{\lambda^2}$. But note that for every subset $A$ such that the sum is $\ge\lambda$, the complement of this subset $A^{C}$ would have its sum be $\le -\lambda$. Therefore, the sum is positive exactly half the time over all $A$ that work, so the number of subsets $A\subseteq\{1,2,\dots,n\}$ such that \[ \sum_{a\in A} x_a\ge\lambda \]is indeed at most $\dfrac{2^{n-3}}{\lambda^2}$. Now we will find the equality cases. Note that the $\dfrac{\binom{n-2}{k-1}}{\lambda^2}$ needed to all be integers, so specifically $\dfrac{1}{\lambda^2}$ needed to be an integer. Also, each of the \[ \left(x_{a_1}+x_{a_2}+\dots+x_{a_k}\right)^2 \]had to be $0$ or $\lambda^2$. Specifically, when $k=1$, we needed $x_i\in\{0,\lambda,-\lambda\}$ for every $i$. Then for $k=2$, to avoid encountering the case where we have $(\lambda+\lambda)^2$, we must have at most one $\lambda$. Similarly, we must have at most one $-\lambda$. Now by the constraints in the problem (sum is $0$ and sum of squares is $1$), this means we must have one $\lambda$, one $-\lambda$, and the rest are $0$. Thus $\lambda=\frac{1}{\sqrt2}$. This always works, so we are done. $\blacksquare$ To reiterate the equality case, $\lambda=\dfrac{1}{\sqrt2}$ and the following multisets are equal. \[ \{x_1,x_2,\dots,x_n\} = \{\lambda,-\lambda,0,0,\dots,0\}. \]
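Lemma 1 is also easy to spot-check by brute force for small $n$; a sketch with a randomly generated admissible $x$:

```python
import itertools
import math
import random

# Brute-force check of Lemma 1: summing (x_{a_1} + ... + x_{a_k})^2 over all
# k-element subsets should give binomial(n-2, k-1). Illustrative only.
random.seed(3)
n = 9
x = [random.gauss(0, 1) for _ in range(n)]
mean = sum(x) / n
x = [xi - mean for xi in x]            # sum(x) = 0
norm = sum(xi * xi for xi in x) ** 0.5
x = [xi / norm for xi in x]            # sum(x^2) = 1

for k in range(1, n + 1):
    total = sum(sum(x[i] for i in combo) ** 2
                for combo in itertools.combinations(range(n), k))
    print(k, round(total, 6), math.comb(n - 2, k - 1))   # the two columns agree
```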
12.06.2024 06:10
Let $E[X]$ denote the expected value of $X$. Claim: $E\left[S_A^2\right] = \frac{1}{4}.$ Proof: We see that \[S_A^2 = \sum_{a \in A} x_a^2 + \sum_{a \neq b \in A} x_ax_b\]By summing over all subsets $A$, we can see \[E\left[\sum_{a \in A} x_a^2\right] = \frac{1}{2}\]We also see that \[\sum_{a \neq b \in \{1, 2, \ldots, n\}} x_ax_b = -1\]and by similar logic (summing over all $A$), this implies \[E\left[\sum_{a \neq b \in A} x_ax_b\right] = -\frac{1}{4}\]Combining these two gives the desired result. $\square$ FTSoC, assume there exist more than $\frac{2^{n-3}}{\lambda^2}$ sets $A$ with $S_A \ge \lambda$. Take any set $A$ that has sum $\ge \lambda$ and take the complement set, the subset $C$ that has all the numbers in $\{1,2,\ldots,n\}$ that are not in $A$. Notice this set $C$ satisfies $S_C^2 \ge \lambda^2$. Thus there are actually more than $\frac{2^{n-2}}{\lambda^2}$ sets that satisfy the sum squared $\ge \lambda^2$. Thus summing over all these sets gives \[\sum_{\text{all sets}} S_A^2 > 2^{n-2} \implies E[S_A^2] > \frac{1}{4}\]a contradiction.
13.07.2024 00:43
Let $a_i$ be equal to $0$ or $1$ with equal probability, independently. Then let $y_i = a_ix_i$, so that $S_A = \sum_i y_i$. Note that $\mathbb{E}(y_i) = \frac{1}{2}x_i$, so $\mathbb{E}(S_A) = \frac{1}{2}\sum_i x_i = 0$, and since the $y_i$ are independent, $\mathbb{E}(S_A^2) = \operatorname{Var}(S_A) = \sum_i \operatorname{Var}(y_i) = \frac{1}{4}\sum_i x_i^2 = \frac{1}{4}$. Applying Markov's inequality gives $\mathbb{P}(S_A^2 \geq \lambda^2) \leq \frac{\mathbb{E} [S_A^2]}{\lambda^2} = \frac{1}{4\lambda^2}$. So there are at most $\frac {2^{n-2}}{\lambda^2}$ subsets so that $|S_A| \geq \lambda$. However, taking the opposite choice of $a_i$ over all $i$ for each $A$ with $|S_A| \geq \lambda$ gives a subset with the opposite sum, so only half of these can have $S_A \geq \lambda$; dividing $\frac{2^{n-2}}{\lambda^2}$ by two gives our desired bound. Note that equality in Markov's occurs only if $S_A^2 \in \{0, \lambda^2\}$ for every $A$, so there must be exactly two nonzero values among the $x_i$. Since the sum of these two values must be $0$, they have equal absolute value, so they are $\pm\sqrt{\tfrac 12}$, done.
09.01.2025 10:16