Let $n\in\mathbb N_+.$ For $1\leq i,j,k\leq n,$ let $a_{ijk}\in\{ -1,1\} .$ Prove that there exist $x_1,x_2,\ldots ,x_n,y_1,y_2,\ldots ,y_n,z_1,z_2,\ldots ,z_n\in \{-1,1\}$ satisfying $$\left| \sum\limits_{i=1}^n\sum\limits_{j=1}^n\sum\limits_{k=1}^na_{ijk}x_iy_jz_k\right| >\frac {n^2}3.$$Created by Yu Deng
Problem
Source: 2023 China TST Problem 11
Tags: inequalities, China TST, Probabilistic Method
18.03.2023 17:02
This is the stuff you see in nightmares
18.03.2023 18:31
What the...
18.03.2023 19:02
@EthanWYX2009, can you post P10, please?
18.03.2023 21:31
Looks interesting!
19.03.2023 06:07
Let $$S_l=\sum_{1\le j,k\le n} a_{j,k,l}x_jy_k.$$By picking the $z_l$ appropriately (namely $z_l=\operatorname{sgn}(S_l)$), the triple sum becomes $\sum_{l=1}^n |S_l|$. Note $$E[S_l^2]=\sum_{1\le j_1,k_1,j_2,k_2\le n} a_{j_1,k_1,l}a_{j_2,k_2,l} E[x_{j_1}y_{k_1}x_{j_2}y_{k_2}] =\sum_{1\le j,k\le n} a_{j,k,l}^2x_j^2y_k^2=n^2,$$ since the expectation vanishes unless $j_1=j_2$ and $k_1=k_2$. We compute $E[S_l^4]$ next. The key intuition is that $S_l$ tends to a normal distribution as $n\to\infty$, so if we can upper bound $E[S_l^4]$ we can lower bound $E[|S_l|]$ with the inequality $$E[|S_l|]^2\, E[S_l^4] \ge E[S_l^2]^3,$$ which is true by AM-GM or Hölder. Note $$E[S_l^4]\le \#\{(j_1,j_2,j_3,j_4) \text{ containing each number an even number of times}\}\cdot\#\{(k_1,k_2,k_3,k_4) \text{ containing each number an even number of times}\}.$$ There are $n+\binom 42 \binom n2=n+3n(n-1)$ choices for $(j_1,j_2,j_3,j_4)$ and the same for $(k_1,k_2,k_3,k_4)$. Thus $E[S_l^4]<9n^4$, so we do get $E[ |S_l|]> n/3$ for all $l$. This readily implies that some choice of the $x_i,y_j$ attains $\sum_{j=1}^n |S_j| \ge E[ \sum_{j=1}^n |S_j|] = \sum_{j=1}^n E[|S_j|] > n^2/3.$
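The moment bounds above can be sanity-checked by brute force for small $n$. The Python sketch below (with an illustrative $n=3$ and a seeded random $\pm 1$ matrix, both my choices and not from the solution) computes $E[|S|]$, $E[S^2]$, $E[S^4]$ exactly over all $4^n$ sign choices and confirms $E[S^2]=n^2$, $E[S^4]<9n^4$, the Hölder inequality, and $E[|S|]>n/3$.

```python
# Exhaustive check of the moment bounds for one slice S = sum a_{jk} x_j y_k,
# with x, y uniform on {-1,1}^n. Illustrative sketch; n and the random a are
# arbitrary choices, not part of the original argument.
import itertools, random

def moments(a, n):
    """Exact E[|S|], E[S^2], E[S^4] over all 4^n sign choices of (x, y)."""
    tot1 = tot2 = tot4 = 0
    for x in itertools.product((-1, 1), repeat=n):
        for y in itertools.product((-1, 1), repeat=n):
            s = sum(a[j][k] * x[j] * y[k] for j in range(n) for k in range(n))
            tot1 += abs(s); tot2 += s * s; tot4 += s ** 4
    m = 4 ** n
    return tot1 / m, tot2 / m, tot4 / m

random.seed(0)
n = 3
a = [[random.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
e1, e2, e4 = moments(a, n)
assert e2 == n * n                      # second moment is exactly n^2
assert e4 < 9 * n ** 4                  # fourth-moment bound
assert e1 ** 2 * e4 >= e2 ** 3 - 1e-9   # Holder: E[|S|]^2 E[S^4] >= E[S^2]^3
assert e1 > n / 3                       # hence E[|S|] > n/3
print(e1, e2, e4)
```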
19.03.2023 06:55
Splendid! But rather unfriendly to those who haven't seen this technique, I guess?
19.03.2023 07:36
I guess. This is a rather standard problem. My first instinct was the case where the constant $1/3$ is replaced by $\sqrt{2/\pi}$, where I have $n^2$ independent random variables. Unfortunately these $n^2$ random variables are only pairwise independent, so I had to look for another method, which is either bounding moments or using a Chernoff bound. Since I want this bound to hold even when $n$ is small, I tried fourth-moment bounding.
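To illustrate the pairwise-independence remark for readers who haven't seen it: already for $n=2$, the four products $x_iy_j$ are pairwise independent, yet $x_1y_1\cdot x_1y_2\cdot x_2y_1\cdot x_2y_2=1$ identically, so they are not mutually independent. A small exhaustive check (illustrative only, not part of any solution):

```python
# The n^2 products x_i y_j are pairwise independent but not mutually independent.
# Exhaustive verification for n = 2 over all 16 sign assignments.
import itertools

prods = []
for x1, x2, y1, y2 in itertools.product((-1, 1), repeat=4):
    prods.append({(1, 1): x1 * y1, (1, 2): x1 * y2,
                  (2, 1): x2 * y1, (2, 2): x2 * y2})

keys = [(1, 1), (1, 2), (2, 1), (2, 2)]
for p, q in itertools.combinations(keys, 2):
    # each pair of products is uniform on {-1,1}^2, i.e. pairwise independent
    counts = {}
    for d in prods:
        counts[(d[p], d[q])] = counts.get((d[p], d[q]), 0) + 1
    assert all(c == 4 for c in counts.values())

# but the product of all four is identically 1: no 4-wise independence
assert all(d[(1, 1)] * d[(1, 2)] * d[(2, 1)] * d[(2, 2)] == 1 for d in prods)
print("pairwise independent, not mutually independent")
```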
24.12.2023 23:47
R8kt wrote: This is the stuff you see in nightmares Not really, it's the classic "unbalancing lights" from e.g. Alon-Spencer. See for instance Page 19 of https://web.stanford.edu/~lindrew/18.218.pdf . Maybe it's indeed hard for high schoolers though.
24.12.2023 23:53
i love probability theory....
25.12.2023 00:29
Nice and interesting
04.02.2024 06:05
Here is a stronger bound $\left(\frac 2{\pi}-\varepsilon\right)n^2$ for all $\varepsilon>0$ when $n$ is large. We use a famous lemma: Quote: Lemma. For real numbers $a_1,a_2,\ldots ,a_n,$ choose $\varepsilon_1,\varepsilon_2,\ldots ,\varepsilon_n=\pm 1$ randomly, independently and uniformly. Then $$\mathbb E\left[\left|\sum_{k=1}^n\varepsilon_ka_k\right|\right]>\sqrt{\frac{2/\pi-\varepsilon}{n}}\sum_{k=1}^n|a_k|.$$ Granting the lemma, the original problem is easy: \begin{align*}\mathbb E\left[\sum_{k=1}^n\left|\sum_{i=1}^n\sum_{j=1}^na_{ijk}x_iy_j\right|\right]&=\frac 1{2^n}\sum_{x_1,\ldots ,x_n=\pm 1}\sum_{k=1}^n\mathbb E\left[\left|\sum_{j=1}^ny_j\sum_{i=1}^na_{ijk}x_i\right|\right]\\&>\frac 1{2^n}\sum_{x_1,\ldots ,x_n=\pm 1}\sum_{k=1}^n\sqrt{\frac{2/\pi-\varepsilon}{n}}\sum_{j=1}^n\left|\sum_{i=1}^na_{ijk}x_i\right|\\&=\sqrt{\frac{2/\pi-\varepsilon}{n}}\sum_{k=1}^n\sum_{j=1}^n\mathbb E\left[\left|\sum_{i=1}^na_{ijk}x_i\right|\right]\\&>\left(\sqrt{\frac{2/\pi-\varepsilon}{n}}\right)^2\sum_{k=1}^n\sum_{j=1}^n\sum_{i=1}^n|a_{ijk}|\\&=\left(\frac 2{\pi}-\varepsilon\right)n^2.\end{align*}Therefore, select $z_k=\operatorname{sgn}\sum_{i=1}^n\sum_{j=1}^na_{ijk}x_iy_j$ and take an $(x,y)$ attaining at least this expectation, and we are done.$\Box$
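The sign-choice step can be sanity-checked exhaustively for small $n$. The sketch below (an illustrative check, not part of the proof; $n=3$ and the seed are arbitrary) confirms that choosing $z_k=\operatorname{sgn}(S_k)$ and optimizing over $(x,y)$ already beats $n^2/3$ for a random tensor:

```python
# Check of the strategy z_k = sgn(sum_{i,j} a_{ijk} x_i y_j): with that choice the
# triple sum equals sum_k |S_k|, and the best (x, y) exceeds n^2/3.
# Illustrative sketch; n and the random tensor a are arbitrary choices.
import itertools, random

random.seed(1)
n = 3
a = [[[random.choice((-1, 1)) for _ in range(n)]
      for _ in range(n)] for _ in range(n)]

best = 0
for x in itertools.product((-1, 1), repeat=n):
    for y in itertools.product((-1, 1), repeat=n):
        # with z_k = sgn(S_k), the triple sum becomes sum_k |S_k|
        val = sum(abs(sum(a[i][j][k] * x[i] * y[j]
                          for i in range(n) for j in range(n)))
                  for k in range(n))
        best = max(best, val)

print(best, n * n / 3)
assert best > n * n / 3  # guaranteed by the theorem for every tensor a
```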
04.02.2024 17:15
EthanWYX2009 wrote: Quote: Lemma. For real numbers $a_1,a_2,\ldots ,a_n,$ choose $\varepsilon_1,\varepsilon_2,\ldots ,\varepsilon_n=\pm 1$ randomly, independently and uniformly. Then $$\mathbb E\left[\left|\sum_{k=1}^n\varepsilon_ka_k\right|\right]\geqq{\color{red}\sqrt{\frac{2/\pi-\varepsilon}{n}}}\sum_{k=1}^n|a_k|.$$ If this is correct, Is the sharp(est) coefficient $\frac{{n-1\choose{\left\lfloor\frac{n-1}2\right\rfloor}}}{2^{n-1}}$? (It is a corollary of Hlawka's inequality when \(n=3\).)
06.02.2024 05:49
Can someone provide a reference of a proof of the famous lemma used by @EthanWYX2009
06.02.2024 05:59
Quote: Lemma. For real numbers $a_1,a_2,\ldots ,a_n,$ choose $\varepsilon_1,\varepsilon_2,\ldots ,\varepsilon_n=\pm 1$ randomly, independently and uniformly. Then $$\mathbb E\left[\left|\sum_{k=1}^n\varepsilon_ka_k\right|\right]>\sqrt{\frac{2/\pi-\varepsilon}{n}}\sum_{k=1}^n|a_k|.$$ Here is the proof: WLOG let $a_1,a_2,\ldots ,a_n\ge 0.$ Then \begin{align*} \mathbb E\left[\left|\sum_{k=1}^n\varepsilon_ka_k\right|\right] &=\frac 1{2^n}\sum_{t=0}^n\sum_{\#\{\varepsilon_i=1\}=t}\left|\sum_{k=1}^n\varepsilon_ka_k\right|\\ &=\frac 1{2^n}\sum_{0\le t\le n/2}\sum_{\#\{\varepsilon_i=1\}=t}\left|\sum_{k=1}^n\varepsilon_ka_k\right|+\frac 1{2^n}\sum_{n/2< t\le n}\sum_{\#\{\varepsilon_i=1\}=t}\left|\sum_{k=1}^n\varepsilon_ka_k\right|\\ &\ge -\frac 1{2^n}\sum_{0\le t\le n/2}\sum_{\#\{\varepsilon_i=1\}=t}\sum_{k=1}^n\varepsilon_ka_k+\frac 1{2^n}\sum_{n/2< t\le n}\sum_{\#\{\varepsilon_i=1\}=t}\sum_{k=1}^n\varepsilon_ka_k\\ &=\left(\frac 1{2^n}\sum_{t=0}^n\frac{|n-2t|}n\binom nt\right)\sum_{k=1}^na_k\\ &=\frac 1{2^{n-1}}\binom{n-1}{\left\lfloor\frac{n-1}2\right\rfloor}\sum_{k=1}^na_k,\end{align*}where the second-to-last equality uses the symmetry $\sum_{\#\{\varepsilon_i=1\}=t}\varepsilon_k=\frac{2t-n}n\binom nt$ for each fixed $k$, and the last uses the identity $\sum_{t=0}^n|n-2t|\binom nt=2n\binom{n-1}{\left\lfloor\frac{n-1}2\right\rfloor}.$ By Stirling's approximation (written for odd $n$, so that $n-1$ is even), $$\frac 1{2^{n-1}}\binom{n-1}{\frac{n-1}2}=\frac{(n-1)!}{2^{n-1}\left(\left(\frac{n-1}2\right)!\right)^2}\sim\dfrac{\sqrt{2\pi n}\cdot (n/e)^{n-1}}{2^{n-1}\cdot \left(\sqrt{\pi n}\cdot \left(\frac n{2e}\right) ^{\frac{n-1}2}\right)^2}=\sqrt{\frac 2{\pi n}},$$so we are done.$\Box$
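The two closing steps of this proof, the exact binomial identity and the Stirling asymptotic, can be checked numerically; a small Python sketch (the ranges and tolerance are my arbitrary choices):

```python
# Check the identity (1/2^n) sum_t (|n-2t|/n) C(n,t) = C(n-1, floor((n-1)/2)) / 2^{n-1}
# exactly for small n, and the asymptotic coefficient sqrt(2/(pi n)) for larger n.
import math

for n in range(1, 16):
    lhs = sum(abs(n - 2 * t) * math.comb(n, t) for t in range(n + 1)) / (n * 2 ** n)
    rhs = math.comb(n - 1, (n - 1) // 2) / 2 ** (n - 1)
    assert math.isclose(lhs, rhs)

# asymptotics: the exact coefficient approaches sqrt(2/(pi n))
n = 200
coeff = math.comb(n - 1, (n - 1) // 2) / 2 ** (n - 1)
print(coeff, math.sqrt(2 / (math.pi * n)))
assert abs(coeff / math.sqrt(2 / (math.pi * n)) - 1) < 0.01
```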
07.02.2024 20:23
Amazing!
17.06.2024 22:22
CANBANKAN wrote: Note $E[S_l^4]=\# ((j_1,j_2,j_3,j_4) \text{ contain each number an even number of times and } (k_1,k_2,k_3,k_4) \text{ contain each number an even number of times})$ This should be $\le$, since $\prod_{m=1}^4 a_{j_m,k_m,l}$ can be $-1$ instead of $1$. Oops