Let $a_1, b_1, a_2, b_2, \dots , a_n, b_n$ be nonnegative real numbers. Prove that \[ \sum_{i, j = 1}^{n} \min\{a_ia_j, b_ib_j\} \le \sum_{i, j = 1}^{n} \min\{a_ib_j, a_jb_i\}. \]
Problem
Source: USAMO 2000/6
Tags: USAMO, inequalities, quadratics, vector, calculus, induction
12.08.2004 17:38
For me, this is the hardest and most beautiful inequality ever given in a contest. Absolutely incredible!
12.08.2004 21:51
A very hard solution is published on the Kalva site. I have studied it and found it very interesting, but too hard for an ordinary solver like me.
06.10.2004 22:57
wow: i love inequalities, but this one almost killed me - amazing
16.08.2005 21:24
I think I have found a simple proof of this famous inequality. First, we will prove a handy lemma which shows that a certain quadratic form is nonnegative.

Lemma 1: Let $r_1$, ..., $r_n$ be nonnegative numbers. Let $x_1$, ..., $x_n$ be real numbers. Then \[ \sum_{i, j} \min(r_i, r_j) x_i x_j \ge 0. \]

Proof: We may assume that $r_1 \le r_2 \le \ldots \le r_n$. For convenience, define $r_0 = 0$. The left-hand side of the inequality becomes \[ \sum_i r_i x_i^2 + 2 \sum_{i < j} r_i x_i x_j \, . \] But that expression is equivalent to the sum \[ \sum_i (r_i - r_{i-1}) \biggl( \sum_{j=i}^n x_j \biggr)^2 \, . \] Each term of that sum is nonnegative, so the whole sum is nonnegative. $\blacksquare$

To apply Lemma 1, we will choose the $r$ and $x$ vectors carefully. Define \[ r_i = \frac{\max(a_i, b_i)}{\min(a_i, b_i)} - 1. \] (If the denominator is 0, set $r_i$ to be any nonnegative number.) Define $x_i = \textrm{sgn}(a_i - b_i) \min(a_i, b_i)$. We will next prove a lemma that connects our original problem to the quadratic form.

Lemma 2: $\min(a_i b_j, a_j b_i) - \min(a_i a_j, b_i b_j) = \min(r_i, r_j) x_i x_j$.

Proof: If we switch the values of $a_i$ and $b_i$, then both sides negate. Thus we may assume $a_i \ge b_i$. Similarly we may assume $a_j \ge b_j$. If either $b_i$ or $b_j$ is $0$, then both sides are $0$. Thus we may assume that $b_i$ and $b_j$ are positive. By the definitions of $r$ and $x$, we have \begin{eqnarray*} r_i & = & \frac{a_i}{b_i} - 1 \\ r_j & = & \frac{a_j}{b_j} - 1 \\ x_i & = & b_i \\ x_j & = & b_j \, . \end{eqnarray*} Plugging those values in, we get \begin{eqnarray*} \min(r_i, r_j) x_i x_j & = & \min\bigl(\frac{a_i}{b_i} - 1, \frac{a_j}{b_j} - 1\bigr) b_i b_j \\ & = & \min(a_i b_j, a_j b_i) - b_i b_j \\ & = & \min(a_i b_j, a_j b_i) - \min(a_i a_j, b_i b_j) \, . \end{eqnarray*} That completes the proof of Lemma 2.
$\blacksquare$ Finally, we can apply Lemma 2 and then Lemma 1 to get \begin{eqnarray*} \sum_{i,j} \min(a_i b_j, a_j b_i) - \sum_{i, j} \min(a_i a_j, b_i b_j) & = & \sum_{i, j} \left[ \min(a_i b_j, a_j b_i) - \min(a_i a_j, b_i b_j) \right] \\ & = & \sum_{i, j} \min(r_i, r_j) x_i x_j \\ & \ge & 0 \, . \end{eqnarray*} Bringing one sum over to the right-hand side gives the desired inequality.
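Both lemmas lend themselves to a quick numerical sanity check. Below is a small Python sketch (the function names are my own) that tests the Lemma 2 identity on random positive inputs, and then the full inequality on random nonnegative tuples:

```python
import random

def sgn(x):
    # sign function: -1, 0, or 1
    return (x > 0) - (x < 0)

def lemma2_gap(ai, bi, aj, bj):
    # left side of Lemma 2: min(a_i b_j, a_j b_i) - min(a_i a_j, b_i b_j)
    return min(ai * bj, aj * bi) - min(ai * aj, bi * bj)

def lemma2_form(ai, bi, aj, bj):
    # right side of Lemma 2: min(r_i, r_j) x_i x_j, with Ravi's r and x
    ri = max(ai, bi) / min(ai, bi) - 1
    rj = max(aj, bj) / min(aj, bj) - 1
    xi = sgn(ai - bi) * min(ai, bi)
    xj = sgn(aj - bj) * min(aj, bj)
    return min(ri, rj) * xi * xj

def check_lemma2(trials=2000):
    for _ in range(trials):
        ai, bi, aj, bj = (random.uniform(0.1, 10.0) for _ in range(4))
        assert abs(lemma2_gap(ai, bi, aj, bj) - lemma2_form(ai, bi, aj, bj)) < 1e-9
    return True

def check_inequality(trials=200, n=6):
    for _ in range(trials):
        a = [random.uniform(0.0, 10.0) for _ in range(n)]
        b = [random.uniform(0.0, 10.0) for _ in range(n)]
        lhs = sum(min(a[i] * a[j], b[i] * b[j]) for i in range(n) for j in range(n))
        rhs = sum(min(a[i] * b[j], a[j] * b[i]) for i in range(n) for j in range(n))
        assert lhs <= rhs + 1e-9
    return True
```

This is no substitute for the proof, of course; it only guards against a misstated identity.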
16.08.2005 21:31
Could we use Lagrange?
17.08.2005 09:21
That's just great. I knew Lemma 1 (for which I had a nice integral proof) and I also tried to use it, but I just could not find the right $r_i$, $x_i$. It is a fantastic solution, truly nice and beautiful, Ravi.
17.08.2005 12:55
Ravi B, I'm curious: how did you find that solution?
17.08.2005 14:59
Harazi, thanks for your kind words. I think I see the integral solution of Lemma 1 (replacing the min with an integral), and it is cute. Rafal, after many false starts, here's what I did. I think the key idea was, instead of thinking of the pair $a_i$ and $b_i$ as two separate numbers, to think of them as consisting of three independent parts: their minimum, the ratio of their maximum to their minimum, and a bit indicating which is larger. I gave a name to the first two parts, $m_i$ and $r_i$. To handle the bit about which is larger, I split each of the original sums into four cases, depending on whether $a_i \ge b_i$ and/or $a_j \ge b_j$. Each sum could then be expressed solely in terms of $m_i$ and $r_i$. I subtracted the two sides, and then I realized I could collapse the four sums into one by using the sign function. At that point, I had reduced the inequality to Lemma 1, and I knew I was home. In the proof of Lemma 2, I should have pointed out that if either $a_i = b_i$ or $a_j = b_j$, then both sides are zero, so we can ignore those cases.
31.10.2005 00:16
harazi wrote: I knew lemma 1 (to which I had a nice integral solution) How did you prove it with integrals?
31.10.2005 00:33
Use the identity \[ \min(r_i, r_j) = \int_0^\infty [t \le r_i] [t \le r_j] \, dt \, . \] Here $[S]$ means $1$ if $S$ is true, and $0$ otherwise.
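The identity is easy to confirm numerically with a midpoint-rule approximation of the integral; here is a quick Python sketch (the step count is an arbitrary choice of mine):

```python
def min_via_indicators(ri, rj, steps=100_000):
    # midpoint-rule approximation of the integral of [t <= ri][t <= rj] over t >= 0;
    # the integrand vanishes for t > max(ri, rj), so we truncate there
    top = max(ri, rj)
    if top == 0:
        return 0.0
    dt = top / steps
    t = dt / 2
    total = 0.0
    for _ in range(steps):
        if t <= ri and t <= rj:
            total += dt
        t += dt
    return total
```

For example, `min_via_indicators(2.0, 3.0)` agrees with `min(2.0, 3.0)` to within one grid step.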
01.11.2005 00:36
There's one thing I don't understand about this inequality. Since the $a_i$ and $b_j$ are only assumed to be nonnegative, why does the inequality hold? I mean, without looking at the solution, I just see the inequality and I don't see why it holds... Can anybody explain it to me? Oops, I'm beginning to understand it. Perhaps someone could write out the integral proof of Lemma 1 for me, please.
27.12.2006 03:37
Well basically, you have the intuition that $a_{i}a_{j}$ and $b_{i}b_{j}$ are less "average" than $a_{i}b_{j}$ and $a_{j}b_{i}$, sort of like how you have $a^{2}+b^{2}\ge 2ab$. I think I have another solution--nowhere near as awesome as Ravi's, but I'll post it anyway for those who think the above one is too tricky to find. We proceed by induction. The base case is trivial, as $\min(a_{1}^{2}, b_{1}^{2}) \le a_{1}b_{1}$, since $a_{1}b_{1}$ is the geometric mean of $a_{1}^{2}$ and $b_{1}^{2}$. Now we will prove the inequality for $n$ variables assuming that it holds for $n-1$ variables. Consider $\sum \min(a_{i}b_{j}, b_{i}a_{j})-\sum \min(a_{i}a_{j}, b_{i}b_{j})$ as a function of $a_{i}$. It would suffice to prove the inequality when $a_{i}$ is set to a value such that this function is minimal (it is clear that it has a minimum, since it plateaus out when $a_{i}$ is sufficiently large, and it is continuous). This function is piecewise linear, so the minimum must occur at a "critical" point, i.e. when either $a_{i}b_{j}= b_{i}a_{j}$ or $a_{i}a_{j}= b_{i}b_{j}$ creates a cusp in the graph. Note that if the minimum occurs at $a_{i}= 0$, we immediately reduce the problem to $n-1$ variables and are done. We also need the cusp to open upward. The cusp created when $a_{i}b_{j}= b_{i}a_{j}$ opens downward, which is not what we want, so our desired cusp must have been created when $a_{i}a_{j}= b_{i}b_{j}$ for some $j$. This applies to all $i, j$, so with the appropriate smoothing, we can assume that for each $i$, there exists $j$ with $a_{i}a_{j}= b_{i}b_{j}$. WLOG, we will assume $\max \left( \frac{a_{1}}{b_{1}}, \frac{b_{1}}{a_{1}}\right) \ge \max \left( \frac{a_{i}}{b_{i}}, \frac{b_{i}}{a_{i}}\right)$ for all $i$, and that $a_{1}a_{n}= b_{1}b_{n}$. Also suppose $\frac{a_{1}}{b_{1}}= \frac{b_{n}}{a_{n}}= t \ge 1$ (we could always swap $(a_{1}, b_{1})$ with $(a_{n}, b_{n})$ to achieve this). 
We again look to minimizing $\sum \min(a_{i}b_{j}, b_{i}a_{j})-\sum \min(a_{i}a_{j}, b_{i}b_{j})$. Note that for every $j$, $\min(a_{1}b_{j}, b_{1}a_{j}) = b_{1}a_{j}$, since $\frac{a_{1}}{b_{1}}\ge \frac{a_{j}}{b_{j}}$. Similarly, $\min(a_{1}a_{j}, b_{1}b_{j}) = b_{1}b_{j}$. These terms are the only ones involving either $a_{1}$ or $b_{1}$ in the sums, and they add up to $b_{1}(a_{1}+2a_{2}+2a_{3}+\ldots+2a_{n}-b_{1}-2b_{2}-2b_{3}-\ldots-2b_{n})$. In an analogous way, the terms involving $a_{n}$ or $b_{n}$ add up to $a_{n}(b_{n}+2b_{n-1}+\ldots+2b_{1}-a_{n}-2a_{n-1}-\ldots-2a_{1})$. Notice that $a_{1}$ and $b_{n}$ appear only in $a_{1}b_{1}+a_{n}b_{n}$, so if we decrease $a_{1}$ and $b_{n}$ by the same ratio in such a way that $\frac{a_{1}}{b_{1}}$ is still at least $\frac{a_{i}}{b_{i}}$ for any $i$, the entire sum is decreased. Then we can decrease them enough so that $\frac{a_{1}}{b_{1}}= \frac{a_{i}}{b_{i}}$ for some $i$, and WLOG $i = 2$. However, we can replace $a_{1}$ and $a_{2}$ with $a_{1}+a_{2}$, and we can replace $b_{1}$ and $b_{2}$ with $b_{1}+b_{2}$. It is not difficult to verify that this does not change the sum in any way, but it reduces the number of variables to $n-1$, and so the inequality holds, and the induction works. You might notice from this and the first solution that the ratio $\frac{a_{i}}{b_{i}}$ is important in this problem.
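The merging step at the end of the induction, replacing two pairs with equal ratios by their coordinatewise sums, can be checked numerically. A small Python sketch (the helper `F` is my own name for the difference of the two sums):

```python
import random

def F(a, b):
    # difference of the two sums in the problem statement
    n = len(a)
    rhs = sum(min(a[i] * b[j], a[j] * b[i]) for i in range(n) for j in range(n))
    lhs = sum(min(a[i] * a[j], b[i] * b[j]) for i in range(n) for j in range(n))
    return rhs - lhs

def check_merge(trials=200):
    # if a1/b1 == a2/b2, merging the two pairs into (a1+a2, b1+b2) leaves F unchanged
    for _ in range(trials):
        b1, b2 = random.uniform(0.1, 5.0), random.uniform(0.1, 5.0)
        s = random.uniform(0.1, 5.0)
        a1, a2 = s * b1, s * b2
        a3, b3 = random.uniform(0.1, 5.0), random.uniform(0.1, 5.0)
        before = F([a1, a2, a3], [b1, b2, b3])
        after = F([a1 + a2, a3], [b1 + b2, b3])
        assert abs(before - after) < 1e-6
    return True
```

The invariance is exact (each block of terms collapses because the merged pair has the same ratio), so the tolerance only absorbs floating-point rounding.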
10.08.2015 01:01
What is $sgn(x)$ defined to be?
10.08.2015 01:09
MathPanda1 wrote: What is $sgn(x)$ defined to be? See: https://en.wikipedia.org/wiki/Sign_function Basically, $\text{sgn} (x) = -1 \text{ if } x<0$, $\text{sgn} (x)=0 \text{ if } x=0$, and $\text{sgn} (x) =1 \text{ if } x>0$.
12.06.2017 05:44
I find the identity in post 5's solution impossible to motivate, so here's my solution, which shows how I derive a similar identity: We will rewrite $\min (a_ib_j,a_jb_i) - \min (a_ia_j,b_ib_j)$ as $0.5(a_ib_j + a_jb_i - |a_ib_j-a_jb_i|) - 0.5( a_ia_j +b_ib_j - | a_ia_j-b_ib_j|)$, using the identity $\min (x,y) = 0.5(x+y-|x-y|)$. This last expression becomes $0.5( (b_j-a_j)(a_i-b_i) - |a_ib_j - a_jb_i| + |a_ia_j-b_ib_j| )$. We show the sum of all such terms is $\ge 0$.

Next, we notice that $|x| - |y|$ has magnitude $\min ( |x-y|, |x+y|)$, so $|x|-|y| = \pm \min (|x-y|, |x+y|)$. Of course, this does not help us find the value of $X=|a_ia_j-b_ib_j| - |a_ib_j-a_jb_i|$ unless we know the proper choice of sign, so we must determine when this quantity is nonnegative and when it is negative. Notice $|a_ia_j-b_ib_j| \ge |a_ib_j-a_jb_i| \iff a_i^2a_j^2 + b_i^2b_j^2 \ge a_i^2b_j^2 + a_j^2b_i^2 \iff (a_i^2 - b_i^2)(a_j^2-b_j^2) \ge 0 \iff (a_i-b_i)(a_j-b_j)\ge 0$. Let's define $c_i = \text{sgn} (a_i-b_i)$. Then the sign of $X$ is just $c_ic_j$, so each of the original terms becomes equivalent to $0.5( (b_j-a_j)(a_i-b_i) +c_ic_j\min ( |a_ib_j -a_jb_i + a_ia_j-b_ib_j|, |a_ib_j -a_jb_i - a_ia_j +b_ib_j|) )$.

We now define $d_i = a_i-b_i$ for convenience, noting $d_i = |a_i-b_i|c_i$, so this expression becomes $0.5\left( - d_id_j +c_ic_j \min ( |(a_j+b_j)d_i|, |(a_i+b_i)d_j|)\right) = 0.5d_id_j \left(-1 + \min \left( \left|\dfrac{a_j+b_j}{d_j}\right|, \left|\dfrac{a_i+b_i}{d_i}\right|\right)\right)$, which equals $0.5d_id_j \min \left( \dfrac{a_j+b_j}{|a_j-b_j|} -1, \dfrac{ a_i+b_i}{|a_i-b_i|}-1 \right) = 0.5d_id_j \min (D_i,D_j)$ where $D_i = \dfrac{a_i+b_i}{|a_i-b_i|}-1 \ge 0$. So now the problem is reduced to the following:

Lemma: If $d_i\in \mathbb R$ and $D_i\in \mathbb R_{\ge 0}$ for $1\le i\le n$, then $\sum_{i,j} d_id_j \min (D_i,D_j) \ge 0$.

Proof: Define $\lambda_i (x)$ to be $0$ if $x \not \in [0,D_i]$ and $1$ otherwise.
Then the sum becomes $\sum_{i,j} d_id_j \int_0^{\infty} \lambda_i (t) \lambda_j (t) \, dt = \int_0^{\infty} \sum_{i,j} d_i\lambda_i (t) \, d_j\lambda_j (t) \, dt = \int_0^{\infty} \left( \sum_i d_i \lambda_i (t) \right)^2 dt \ge 0$, so we're done.
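Before trusting the integral argument, the lemma itself can be spot-checked on random data; a short Python sketch (names are mine):

```python
import random

def check_quadratic_lemma(trials=500, n=8):
    # Lemma: for real d_i and nonnegative D_i, sum_{i,j} d_i d_j min(D_i, D_j) >= 0
    for _ in range(trials):
        d = [random.uniform(-5.0, 5.0) for _ in range(n)]
        D = [random.uniform(0.0, 5.0) for _ in range(n)]
        q = sum(d[i] * d[j] * min(D[i], D[j]) for i in range(n) for j in range(n))
        assert q >= -1e-9  # small slack for floating-point cancellation
    return True
```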
24.06.2019 23:28
fuzzylogic wrote: Let $a_1, b_1, a_2, b_2, \dots , a_n, b_n$ be nonnegative real numbers. Prove that \[ \sum_{i, j = 1}^{n} \min\{a_ia_j, b_ib_j\} \le \sum_{i, j = 1}^{n} \min\{a_ib_j, a_jb_i\}. \] Maybe I have misunderstood something, but it seems clearly false. Pick $\epsilon > 0$, but small. Let each of the $a_i$ be $\epsilon$ and each of the $b_i$ be $1$. Then, independent of $i,\,j$ we have $\text{min}\{a_ia_j, b_ib_j\} = \text{min}\{\epsilon^2,1\} = \epsilon^2$ while independent of $i,\,j$ we have $\text{min}\{a_ib_j,b_ia_j\} = \epsilon$. So the left hand sum is $n^2 \epsilon^2$, the right hand sum is $n^2 \epsilon$, and the inequality fails.
25.06.2019 00:32
GetOffTheInternet wrote: fuzzylogic wrote: Let $a_1, b_1, a_2, b_2, \dots , a_n, b_n$ be nonnegative real numbers. Prove that \[ \sum_{i, j = 1}^{n} \min\{a_ia_j, b_ib_j\} \le \sum_{i, j = 1}^{n} \min\{a_ib_j, a_jb_i\}. \] Maybe I have misunderstood something, but it seems clearly false. Pick $\epsilon > 0$, but small. Let each of the $a_i$ be $\epsilon$ and each of the $b_i$ be $1$. Then, independent of $i,\,j$ we have $\text{min}\{a_ia_j, b_ib_j\} = \text{min}\{\epsilon^2,1\} = \epsilon^2$ while independent of $i,\,j$ we have $\text{min}\{a_ib_j,b_ia_j\} = \epsilon$. So the left hand sum is $n^2 \epsilon^2$, the right hand sum is $n^2 \epsilon$, and the inequality fails. Why would it fail? For $0 < \epsilon < 1$ we have $\epsilon^2 < \epsilon$, so the left-hand sum $n^2\epsilon^2$ is less than the right-hand sum $n^2\epsilon$, and the inequality holds in this example.
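A quick Python check of the $a_i = \epsilon$, $b_i = 1$ example confirms the two closed forms and that the inequality holds, since $\epsilon^2 \le \epsilon$ for $0 \le \epsilon \le 1$ (helper names are mine):

```python
def both_sums(a, b):
    # (left-hand sum, right-hand sum) of the USAMO inequality
    n = len(a)
    lhs = sum(min(a[i] * a[j], b[i] * b[j]) for i in range(n) for j in range(n))
    rhs = sum(min(a[i] * b[j], a[j] * b[i]) for i in range(n) for j in range(n))
    return lhs, rhs

def check_eps_example(n=5, eps=0.01):
    lhs, rhs = both_sums([eps] * n, [1.0] * n)
    assert abs(lhs - n * n * eps * eps) < 1e-9  # left-hand sum is n^2 * eps^2
    assert abs(rhs - n * n * eps) < 1e-9        # right-hand sum is n^2 * eps
    assert lhs <= rhs                           # the inequality holds here
    return True
```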
09.06.2022 17:18
In the following, $\int$ means integration from $0$ to $+\infty$. Define $f(t,x)=1$ if $0 \le x \le t$ and $f(t,x)=0$ otherwise. It is easy to see that $\int f(a,x)f(b,x)=\int cf(a/c,x)f(b/c,x)$ for $c>0$ and $\min\{a,b\}=\int f(a,x)f(b,x)$. Notice that $$\begin{aligned}&\sum\limits_{1 \le i,j \le n}\min\{a_ib_j,b_ia_j\}\\=&\sum\limits_{1 \le i,j \le n}\int f(a_ib_j,x)f(b_ia_j,x)\\=&\sum\limits_{1 \le i,j \le n}\int b_ib_jf(a_i/b_i,x)f(a_j/b_j,x)\\=&\int (\sum\limits_{i=1}^n b_i f(a_i/b_i,x))^2\\=&\int \frac{1}{2}(\sum\limits_{i=1}^n b_i f(a_i/b_i,x))^2+\frac{1}{2}(\sum\limits_{i=1}^n a_i f(b_i/a_i,x))^2\\ \ge& \int (\sum\limits_{i=1}^n b_i f(a_i/b_i,x))(\sum\limits_{i=1}^n a_i f(b_i/a_i,x))\\=&\sum\limits_{1 \le i,j \le n}\int a_ib_jf(a_j/b_j,x)f(b_i/a_i,x)\\=&\sum\limits_{1 \le i,j \le n}\int f(a_ia_j,x)f(b_ib_j,x)\\=&\sum\limits_{1 \le i,j \le n}\min\{a_ia_j,b_ib_j\}\end{aligned}$$ Q.E.D.
23.06.2023 10:43
My first impression is that this inequality relates to completing the square via the Cauchy inequality; let me try that for an hour, and I will reply with my solution (if I find one).
03.08.2023 11:20
The splendid integral solutions are impossible to think of, and maybe the solution below is more natural. We only need $\sum_{i, j = 1}^{n} \frac{a_ib_j + a_jb_i - |a_ib_j-a_jb_i|}{2} - \sum_{i, j = 1}^{n} \frac{a_ia_j +b_ib_j - | a_ia_j-b_ib_j|}{2} \ge 0$, which is $\sum_{i, j = 1}^{n} |a_ia_j-b_ib_j| -\sum_{i, j = 1}^{n} |a_ib_j-a_jb_i| \ge (\sum_{i = 1}^{n} (a_i-b_i))^2$. We will prove $\sum_{i, j = 1}^{n} |a_ia_j-b_ib_j| -\sum_{i, j = 1}^{n} |a_ib_j-a_jb_i| \ge (\sum_{i = 1}^{n} |a_i-b_i|)^2$. For $1 \le i \le n$, mark points $C_i(a_i, b_i)$ and $C_i'(b_i, a_i)$ in the $xy$-plane. Let $l: y=x$. Draw the $2n$ vectors $\overrightarrow{OC_i}$ and $\overrightarrow{OC_i'}$. Relabel all the vectors as $v_1, v_2, \dots, v_{2n}$ so that the lines they lie on rotate clockwise from ${l}$ (if any two lines coincide, label them in any order). Then each $v_i$ is in the first quadrant and $v_i, v_{2n+1-i}$ are symmetric wrt ${l}$. Let $v_1+v_2+\dots+v_n=\overrightarrow{OX}$, and let $\blacksquare_{WXYZ}$ be the square with $W$ and $Y$ on ${l}$. Let ${P}$ be the polygon with side vectors $v_1, v_2, \cdots, v_{2n}, -v_1, -v_2, \cdots, -v_{2n}$. ${P}$ is not self-intersecting, its sides run counterclockwise, and ${P}$ is both axially (wrt ${l}$) and centrally symmetric. $X$ and $Z$ lie on ${P}$. Since $|a_ia_j-b_ib_j|=v_i \times v_{2n+1-j}=v_j \times v_{2n+1-i}$, $-|a_ib_j-a_jb_i|=\begin{cases} v_i \times v_j & i<j \\ 0 & i=j \\ v_{2n+1-i} \times v_{2n+1-j} & i>j \end{cases}$ and $WX=\sum_{i = 1}^{n} |a_i-b_i|$, we just need $\sum_{1 \le i<j \le 2n} v_i \times v_j \ge XW^2$, which is $S_P \ge S_{WXYZ}$. Since all the vectors are in the first quadrant and have positive slope, $P$ covers $\blacksquare_{WXYZ}$. Done.
26.06.2024 20:12
Storage/Propaganda
13.11.2024 00:17
It suffices to show \[ 0 \le \sum_{i, j = 1}^{n} \min\{a_ib_j, a_jb_i\} - \min\{a_ia_j, b_ib_j\}. \] Some casework shows \[ \sum_{i, j = 1}^{n} \min\{a_ib_j, a_jb_i\} - \min\{a_ia_j, b_ib_j\} \]\[= \sum_{i = 1}^{n}\sum_{j = 1}^{n} \text{sgn}(a_i - b_i) \min\{a_i, b_i\} \, \text{sgn}(a_j - b_j) \min\{a_j, b_j\} \min\left\{ \frac{\max\{a_i, b_i\}}{\min\{a_i, b_i\}} - 1, \frac{\max\{a_j, b_j\}}{\min\{a_j, b_j\}} - 1 \right\} .\]The rest is not so difficult. \[ \sum_{i = 1}^{n}\sum_{j = 1}^{n} \text{sgn}(a_i - b_i) \min\{a_i, b_i\} \, \text{sgn}(a_j - b_j) \min\{a_j, b_j\} \int_{0}^{\infty} \mathbf{1}_{\left\{ x \le \frac{\max\{a_i, b_i\}}{\min\{a_i, b_i\}} - 1 \right\}} \mathbf{1}_{\left\{ x \le \frac{\max\{a_j, b_j\}}{\min\{a_j, b_j\}} - 1 \right\}} \, \mathrm{d}x \]\[= \int_{0}^{\infty} \sum_{i = 1}^{n}\sum_{j = 1}^{n} \text{sgn}(a_i - b_i) \min\{a_i, b_i\} \, \text{sgn}(a_j - b_j) \min\{a_j, b_j\}\mathbf{1}_{\left\{ x \le \frac{\max\{a_i, b_i\}}{\min\{a_i, b_i\}} - 1 \right\}} \mathbf{1}_{\left\{ x \le \frac{\max\{a_j, b_j\}}{\min\{a_j, b_j\}} - 1 \right\}} \, \mathrm{d}x\]\[= \int_{0}^{\infty} \left( \sum_{i=1}^{n}\text{sgn}(a_i - b_i) \min\{a_i, b_i\} \mathbf{1}_{\left\{ x \le \frac{\max\{a_i, b_i\}}{\min\{a_i, b_i\}} - 1 \right\}} \right)^2 \, \mathrm{d}x \ge 0\]as desired. Remark. This works because $\frac{\text{max}\{a_j, b_j\}}{\min\{a_j, b_j\}}, \frac{\text{max}\{a_i, b_i\}}{\min\{a_i, b_i\}}\ge 1$. Oops. In retrospect, this is more or less the same as kingu's solution, except the lemma is proved directly in the solution. Edit #2: 100th post!!!