Show that the inequality \[\sum_{i=1}^n \sum_{j=1}^n \sqrt{|x_i-x_j|}\leqslant \sum_{i=1}^n \sum_{j=1}^n \sqrt{|x_i+x_j|}\]holds for all real numbers $x_1,\ldots,x_n.$
Problem
Source: IMO 2021 P2
Tags: IMO 2021, IMO, algebra, n-variable inequality, inequalities, Convexity, concavity
20.07.2021 23:45
Uhm, isn't it a bit too early to post the problems?
20.07.2021 23:51
Related problem
21.07.2021 00:27
Why this problem made the IMO, what’s wrong with PSC???
21.07.2021 00:33
Does anyone have a solution? It must have a clean one right?
21.07.2021 00:35
Very nice by integral
21.07.2021 00:36
Use induction on $n$. Base cases are easy. Imagine shifting all the $x_i$ by a variable real number $t$. The LHS stays constant, while the RHS consists of a bunch of concave functions joined together at "cusps". Therefore, the minimum must be achieved at one of the cusps (the RHS goes to infinity as $\lvert t \rvert \to \infty$). This happens when $x_i = 0$ for some $i$ or $x_i = -x_j$ for some $i, j$. In this case, delete $x_i$ or ($x_i$ and $x_j$) and use the inductive hypothesis.
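The cusp argument above can be illustrated numerically. Here is a small sketch (the function name `shifted_rhs` is mine, chosen for illustration): it evaluates the shifted RHS on a fine grid and confirms that the minimum over shifts is, numerically, attained at one of the cusps $2t=-(x_i+x_j)$.

```python
# Illustration of the cusp argument (not a proof): the shifted RHS
#   g(t) = sum_{i,j} sqrt(|x_i + x_j + 2t|)
# is piecewise concave in t, so its minimum over t lies at a cusp
# 2t = -(x_i + x_j), i.e. where some pair of shifted variables sums to 0.
import math

def shifted_rhs(xs, t):
    """RHS of the inequality after shifting every variable by t."""
    return sum(math.sqrt(abs(a + b + 2 * t)) for a in xs for b in xs)

xs = [0.7, -1.3, 2.4]
cusps = [-(a + b) / 2 for a in xs for b in xs]
cusp_min = min(shifted_rhs(xs, t) for t in cusps)

# Scan a fine grid of shifts: no grid point should beat the best cusp value.
grid_min = min(shifted_rhs(xs, -5 + 10 * k / 20000) for k in range(20001))
assert cusp_min <= grid_min + 1e-6
print("minimum over shifts is attained at a cusp (numerically)")
```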
21.07.2021 00:39
nprime06 wrote: Show that the inequality \[\sum_{i=1}^n \sum_{j=1}^n \sqrt{|x_i-x_j|}\leqslant\sum_{i=1}^n \sum_{j=1}^n \sqrt{|x_i+x_j|} \]holds for all real numbers $x_1,\ldots x_n.$ Let $f(x)=\sum_{i=1}^n \sum_{j=1}^n \sqrt{|(x_i+x)+(x_j+x)|}$. We have $\lim_{x\to\infty}f(x)=\lim_{x\to-\infty}f(x)=\infty$ and $f$ is continuous, so there is an $x$ at which $f(x)$ is minimal. Since $\sqrt{\cdot}$ is concave, $f$ is piecewise concave and attains its minimum at a cusp, i.e. at some $x$ with $2x=-(x_i+x_j)$. For $i=j$ we can reduce the inequality to $n-1$ variables; for $i\neq j$ we can reduce it to $n-2$ variables.
21.07.2021 00:42
I'm told this problem was proposed by dnkywin. According to the official solutions it is also true that \[\sum_{i=1}^n\sum_{j=1}^n|x_i-x_j|^\alpha\le\sum_{i=1}^n\sum_{j=1}^n|x_i+x_j|^\alpha\]for all $\alpha\in(0,2]$. The above solutions work for all $\alpha\le1$ but concavity breaks for $\alpha>1$. That case seems to require more non-elementary machinery.
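The generalized claim for exponents $\alpha\in(0,2]$ can at least be spot-checked numerically. A quick sketch (the function name `sides` is mine), which of course is evidence rather than a proof:

```python
# Numeric spot-check of the generalized inequality mentioned above:
#   sum_{i,j} |x_i - x_j|^a <= sum_{i,j} |x_i + x_j|^a  for a in (0, 2].
import random

def sides(xs, a):
    """Return (LHS, RHS) of the alpha-generalized inequality."""
    lhs = sum(abs(p - q) ** a for p in xs for q in xs)
    rhs = sum(abs(p + q) ** a for p in xs for q in xs)
    return lhs, rhs

random.seed(1)
for a in (0.3, 0.5, 1.0, 1.5, 2.0):
    for _ in range(500):
        xs = [random.uniform(-5, 5) for _ in range(random.randint(1, 5))]
        lhs, rhs = sides(xs, a)
        assert lhs <= rhs + 1e-9
print("ok for sampled exponents in (0, 2]")
```

For $\alpha=2$ the check also has a one-line proof: $\sum_{i,j}(x_i+x_j)^2-\sum_{i,j}(x_i-x_j)^2=4\left(\sum_i x_i\right)^2\ge0$.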
21.07.2021 00:42
Again, this is a very good problem, but it is not suitable for this contest. It would be suitable for an undergraduate olympiad.
21.07.2021 00:45
First we note that adding a constant $\frac{y}{2}$ to all the variables $x_i$ doesn't change the LHS. So we might try to do this in a way that makes the RHS smaller. If we can prove the resulting (stronger) inequality, we will be done; according to the problem statement it should still hold, so this seems like a promising plan.

Each term on the RHS now has the form $\sqrt{|x_i + x_j + y|}$, so its derivative with respect to $y$ is $$\frac{\operatorname{sgn}(x_i + x_j + y)}{2\sqrt{|x_i + x_j + y|}}$$for any $y$ such that $x_i + x_j + y \neq 0$. Note that this is strictly decreasing both for $y < -x_i - x_j$ and for $y > -x_i - x_j$. (We use calculus here, but one can avoid it using only the concavity of the square root.) The derivative of the entire sum on the RHS is the sum of these derivatives, so it is also strictly decreasing on any interval where $x_i + x_j + y \neq 0$ for all $i,j$.

If the total derivative is positive, the RHS is currently increasing in $y$, so picking a small negative $y$ will decrease the RHS. In fact, since the derivative is strictly decreasing, we can keep making $y$ smaller until we hit a point where $y = -x_i-x_j$ for some $i,j$, and the RHS will only decrease faster and faster since the derivative becomes larger as $y$ decreases. Conversely, if the total derivative is negative, we should pick a small positive $y$ and then keep increasing it until $y = -x_i - x_j$ for some $i,j$. In the special case where the derivative is zero we are at a local maximum and can move in either direction, again because the derivative is strictly decreasing.

The above argument shows that it's enough to consider the case where $x_i = -x_j$ for some $i,j$. But then, plugging these values into the inequality, every term involving $x_i$ or $x_j$ cancels (this is easy to check). Hence we have reduced the problem to the inequality for the $n-2$ numbers $x_1, \ldots, x_{n-2}$, and we can finish by induction. It only remains to check the base cases $n = 0$ and $n = 1$, which are both trivial.
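As a sanity check of the statement being induced on (not part of the proof), one can brute-force random instances; a small sketch with function names of my own choosing:

```python
# Random sanity check of the inequality
#   sum_{i,j} sqrt(|x_i - x_j|) <= sum_{i,j} sqrt(|x_i + x_j|).
import math
import random

def both_sides(xs):
    """Return (LHS, RHS) of the inequality for the tuple xs."""
    lhs = sum(math.sqrt(abs(a - b)) for a in xs for b in xs)
    rhs = sum(math.sqrt(abs(a + b)) for a in xs for b in xs)
    return lhs, rhs

random.seed(0)
for _ in range(1000):
    n = random.randint(0, 6)  # includes the trivial base cases
    xs = [random.uniform(-10, 10) for _ in range(n)]
    lhs, rhs = both_sides(xs)
    assert lhs <= rhs + 1e-9
print("LHS <= RHS on all random trials")
```

Symmetric tuples such as $(1,-1)$ give equality, matching the equality case noted elsewhere in this thread.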
21.07.2021 01:03
Can this be solved with integrals?
21.07.2021 01:40
Doru2718 wrote: Can this be solved with integrals? yes
21.07.2021 01:41
@above But can someone post it please? I am very curious about a solution with integrals.
21.07.2021 01:44
rafaello wrote: Why this problem made the IMO, what’s wrong with PSC??? There's "A" problem with part of the shortlist.
21.07.2021 02:20
Doru2718 wrote: @above But can someone post it please? I am very curious about a solution with integrals. Deleted post
21.07.2021 02:22
@above I don't think that solution works here because the square root is an issue.
21.07.2021 03:40
I remember the problem very well, because I couldn't solve it. I searched and arrived at The 2016 Mathematical Olympiad Summer Program. Probably this is 8. Sequences, series of real numbers and inequalities.
21.07.2021 05:42
21.07.2021 06:09
Solved with David Dong, Elliott Liu, Ryan Yang. The idea is to force \(x_i+x_j=0\) for some \(i\), \(j\), then induct down.

Claim: If \(x_i+x_j\ne0\) for all \(i\), \(j\), then for some \(k\) the sequence \(x_1+k,\ldots,x_n+k\) produces the same left-hand side but a right-hand side that is no larger.

Proof. Evidently shifting the \(x_i\) by a constant does not affect the left-hand side. Take small $\varepsilon>0$, and observe by concavity of the square root (Jensen) that \[ \sum_{i,j}\left(\sqrt{|x_i+x_j+2\varepsilon|}+\sqrt{|x_i+x_j-2\varepsilon|}\right) \le\sum_{i,j}2\sqrt{|x_i+x_j|}, \]so either \[\sum_{i,j}\sqrt{|x_i+x_j+2\varepsilon|}\le\sum_{i,j}\sqrt{|x_i+x_j|}\quad\text{or}\quad\sum_{i,j}\sqrt{|x_i+x_j-2\varepsilon|}\le\sum_{i,j}\sqrt{|x_i+x_j|}.\]Thus either \(k=\varepsilon\) or \(k=-\varepsilon\) works. \(\blacksquare\)

Therefore we reduce the problem to the cases where \(x_i+x_j=0\) for some \(i\), \(j\). Finally, deleting \(x_i=+x\) and \(x_j=-x\) decreases both sides by \[2\sqrt{2|x|}+\sum_{k\ne i,j}\left(2\sqrt{|x_k-x|}+2\sqrt{|x_k+x|}\right),\]so we may induct down. The remaining base cases of \(n=0,1\) are trivial.

Remark: The above proof shows that the equality cases of this inequality are exactly the tuples \(x_i\) that are symmetric across 0.
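The deletion step (removing a pair $+x,-x$ decreases both sides by the same amount) is easy to spot-check numerically. A quick sketch, with helper names of my own choosing:

```python
# Spot-check of the deletion step: removing a pair x, -x from the tuple
# decreases LHS and RHS by the same amount, namely
#   2*sqrt(2|x|) + sum_k (2*sqrt(|x_k - x|) + 2*sqrt(|x_k + x|)).
import math
import random

def lhs(xs):
    return sum(math.sqrt(abs(a - b)) for a in xs for b in xs)

def rhs(xs):
    return sum(math.sqrt(abs(a + b)) for a in xs for b in xs)

random.seed(2)
for _ in range(200):
    rest = [random.uniform(-3, 3) for _ in range(4)]
    x = random.uniform(-3, 3)
    full = rest + [x, -x]
    drop_l = lhs(full) - lhs(rest)
    drop_r = rhs(full) - rhs(rest)
    assert abs(drop_l - drop_r) < 1e-9  # both sides drop equally
print("deleting a +x/-x pair changes both sides identically")
```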
25.07.2022 20:05
dangerousliri wrote: The best inequality on IMO is P3 IMO2006. I don't agree with you. The $uvw$ technique kills it immediately (it becomes a quadratic inequality in $w^3$). It's a very smooth problem.
24.10.2022 09:39
I know concavity and I'm not even AMO
24.10.2022 09:40
xpoes wrote: I know concavity and I'm not even AMO i know concavity and i'm only amo
26.10.2022 01:45
Consider shifting $(x_1,x_2,\ldots,x_n)\to (x_1+\epsilon,x_2+\epsilon,\ldots,x_n+\epsilon).$ The LHS stays constant, while the RHS is a sum of concave functions of $\epsilon,$ which goes to infinity as $|\epsilon|\to \infty.$ Hence its minimum is attained at a cusp, where either $x_i+\epsilon=0$ (and we may remove the variable $x_i$) or $x_i+x_j+2\epsilon=0$ (and we may remove $x_i$ and $x_j$), so the problem successively reduces to the trivial cases $n\in \left \{ 0,1\right \}.$
30.11.2022 17:24
Just posting for storage, trying to keep this account alive and practice my LaTeX skills.
24.02.2023 15:12
Here is my solution: https://calimath.org/pdf/IMO2021-2.pdf And I uploaded the solution with motivation to: https://youtu.be/qehg7waNdSQ
15.06.2023 18:01
Claim 1. For $d \in \mathbb{R}_{\neq 0}$ and positive $t \leq |d|$, we have $\sqrt{|d-t|} + \sqrt{|d+t|} \leq 2\sqrt{|d|}$.
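Claim 1 is just the concavity of the square root; a quick numerical check over a grid (evidence, not a proof):

```python
# Numeric check of Claim 1: for d != 0 and 0 < t <= |d|,
#   sqrt(|d - t|) + sqrt(|d + t|) <= 2 * sqrt(|d|),
# which follows from the concavity of sqrt on [0, infinity).
import math

for d in [x / 10 for x in range(-50, 51) if x != 0]:
    for step in range(1, 101):
        t = abs(d) * step / 100  # samples 0 < t <= |d|
        left = math.sqrt(abs(d - t)) + math.sqrt(abs(d + t))
        assert left <= 2 * math.sqrt(abs(d)) + 1e-12
print("Claim 1 holds on the sampled grid")
```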
Let $f(x_1, \ldots, x_n) = \sum_{i=1}^n \sum_{j=1}^n \sqrt{|x_i+x_j|}$.

Claim 2. All global minima of $f(x_1+t, \ldots, x_n+t)$ over $t \in \mathbb{R}$ occur when $(x_i + t) + (x_j + t) = 0$ for some $i, j$.
Now we are ready to prove the problem by induction. Let $P(n)$ denote the given statement. Base case: $n = 0, 1$
Induction: $P(n) \Leftarrow P(n-2) \lor P(n-1)$ for all $n \geq 2$
08.08.2023 15:54
We can, instead, use the identity $$\sqrt{t}=\frac{1}{2\sqrt \pi}\int_0^\infty \frac{1-e^{-tx}}{x^{3/2}} \, dx \qquad (t \ge 0).$$Although it's basically the same idea. Anyway, I think this trick of converting sums via definite integrals is common at some level (?) Nice though.
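Since several posters above asked for the integral solution, here is a sketch of the cosine-kernel variant (my own reconstruction and notation, with $C(t)$, $S(t)$ as defined below; the official write-up may differ in details):

```latex
% Kernel identity, valid for every real $a$ (substitute $u=|a|t$):
\[
  \sqrt{|a|} \;=\; \frac{1}{c}\int_0^\infty \frac{1-\cos(at)}{t^{3/2}}\,dt,
  \qquad c = \int_0^\infty \frac{1-\cos u}{u^{3/2}}\,du \;>\; 0.
\]
% With $C(t)=\sum_k \cos(x_k t)$ and $S(t)=\sum_k \sin(x_k t)$, the
% addition formulas give
\[
  \sum_{i,j}\cos\bigl((x_i-x_j)t\bigr) = C(t)^2+S(t)^2,
  \qquad
  \sum_{i,j}\cos\bigl((x_i+x_j)t\bigr) = C(t)^2-S(t)^2,
\]
% so summing the kernel identity over all pairs and subtracting yields
\[
  \sum_{i,j}\sqrt{|x_i+x_j|} \;-\; \sum_{i,j}\sqrt{|x_i-x_j|}
  \;=\; \frac{1}{c}\int_0^\infty \frac{2\,S(t)^2}{t^{3/2}}\,dt \;\ge\; 0.
\]
```

The cosine kernel is what handles the absolute values for negative arguments, which is the obstruction to using the $e^{-tx}$ identity directly.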
08.08.2023 16:39
nprime06 wrote: Show that the inequality \[\sum_{i=1}^n \sum_{j=1}^n \sqrt{|x_i-x_j|}\leqslant \sum_{i=1}^n \sum_{j=1}^n \sqrt{|x_i+x_j|}\]holds for all real numbers $x_1,\ldots x_n.$ Posted by user Kilua, in 2015
12.08.2023 12:09
mihaig wrote: Posted by user Kilua, in 2015 Can you give a link?
26.02.2024 21:33
It's #3 in the thread arqady. Furthermore, I have a question. Can we split up the terms as $x_1\geq x_2\geq \cdots \geq x_p\geq 0\geq x_{p+1}\geq x_{p+2} \geq \cdots \geq x_n$ and play around with the LHS and RHS? IMO Shortlist 2019 Problem A.2 can be solved using this split-up too.
04.03.2024 08:10
Prove or disprove: for all reals $x_1,x_2,\cdots,x_{n}$ and every non-negative integer $k$, $$\sum_{i_1=1}^{n}\sum_{i_2=1}^{n}\cdots\sum_{i_{2k}=1}^{n}\sqrt{\left|x_{i_1}+x_{i_2}+\cdots+x_{i_k}-\left(x_{i_{k+1}}+x_{i_{k+2}}+\cdots+x_{i_{2k}}\right)\right|}\leq \sum_{i_1=1}^{n}\sum_{i_2=1}^{n}\cdots\sum_{i_{2k}=1}^{n}\sqrt{\left|x_{i_1}+x_{i_2}+\cdots+x_{i_{2k}}\right|}$$
18.06.2024 19:25
Does smoothing on $x_n$ work for this problem? I am convinced that the function $f(x)=\sqrt{\left|x+x_{1}\right|}+\sqrt{\left|x+x_{2}\right|}+\sqrt{\left|x+x_{3}\right|}+\sqrt{\left|x+x_{4}\right|}-\left(\sqrt{\left|x-x_{1}\right|}+\sqrt{\left|x-x_{2}\right|}+\sqrt{\left|x-x_{3}\right|}+\sqrt{\left|x-x_{4}\right|}\right)+\sqrt{2\left|x\right|}$ can be smoothed to show that $x_n = -\max_i |x_i|$, but I am not sure how to work out the details WITHOUT taking the derivative.
12.01.2025 15:02
The basic idea is to shift all the $x_i$ by some $x$ such that the LHS stays the same while the RHS is minimised; when this happens, either two terms will be equal or a term will be zero, and in either case the inequality is implied by the cases for lower $n$, thus allowing an induction argument. (I have spent way too much time on this problem and will NOT bother writing this up, thank you very much.) [Also, I have no idea how to prove the claim without differentiating, although the claim's proof is handwaved in the shortlist using some weird concavity argument?]
03.02.2025 09:59
learned this integral identity today, seems to be quite useful