Let $m$ be the least and $M$ the greatest of the numbers $ a_1 ,a_2 ,\ldots,a_n$, which satisfy $ a_1 +a_2 +\cdots+a_n =0$. Prove that \[ a_1^2 +\cdots +a_n^2 \le -nmM.\]
Problem
Source: New Ineq
Tags: Inequality, optimization, maximization, IMO Shortlist, IMO Shortlist 1972
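As a quick sanity check of the claimed bound, take $n=3$ and $(a_1,a_2,a_3)=(3,-1,-2)$, so $M=3$ and $m=-2$: \[ a_1^2+a_2^2+a_3^2 = 9+1+4 = 14 \le 18 = -3\cdot(-2)\cdot 3 = -nmM. \]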
27.08.2008 00:11
brian_gold wrote: The least number is $m$ and the greatest number is $M$ among $ a_1 ,a_2 ,\ldots,a_n$ satisfying $ a_1 + a_2 + \cdots + a_n = 0$. Prove $ a_1^2 + \cdots + a_n^2 \le - nmM$.

$ \sum (x_i-m)(x_i-M)+\sum (y_i-m)(y_i-M)\le 0$
28.08.2008 04:57
What are $ x_i$ and $ y_i$ supposed to be?
28.08.2008 14:32
Marius Mainea wrote: $ \sum (x_i - m)(x_i - M) + \sum (y_i - m)(y_i - M)\le 0$

$ \sum (a_i - m)(a_i - M)\le 0$
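Each summand in the last line is nonpositive because $m \le a_i \le M$ forces $a_i - m \ge 0$ and $a_i - M \le 0$; expanding the sum and using $a_1 + \cdots + a_n = 0$ (written out in full in a later post) turns it into exactly \[ a_1^2 + \cdots + a_n^2 + nmM \le 0. \]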
29.08.2008 09:25
The $ a_i$ part is right, but I don't see where the $ x_i$ and $ y_i$ come from.
15.10.2013 23:05
Let $a_i:=M-d_i$ for all $i$, so $d_i\ge 0$, and assume WLOG that $a_n=m$, so that $d_n=M-m$ is the largest of the $d_i$. From $\sum_{i=1}^{n}a_i=0$ we get $\sum_{i=1}^{n}d_i=nM$. Then \[\sum_{i=1}^{n}a_i^2=\sum_{i=1}^{n}(M^2-2Md_i+d_i^2)=nM^2-2M\sum_{i=1}^{n}d_i+\sum_{i=1}^{n}d_i^2=-nM^2+\sum_{i=1}^{n}d_i^2,\] and since $0\le d_i\le d_n$ gives $\sum_{i=1}^{n}d_i^2\le d_n\sum_{i=1}^{n}d_i$, \[\sum_{i=1}^{n}a_i^2\leq -nM^2+d_n(nM)=-nM^2+nM(M-m)=-nmM.\]
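For a concrete run of the single estimate used here: with $(a_1,a_2,a_3)=(2,0,-2)$ one has $M=2$, $m=-2$, $(d_1,d_2,d_3)=(0,2,4)$, so $\sum d_i = 6 = nM$ and $\sum d_i^2 = 20 \le 24 = d_3\sum d_i$, and indeed \[ \sum a_i^2 = -nM^2+\sum d_i^2 = -12+20 = 8 \le 12 = -nmM. \]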
26.10.2021 05:07
Notice that $(m-a_i)(M-a_i) \leq 0$ for every $i$, since $m \leq a_i \leq M$. Summing over $i$ gives $(m-a_1)(M-a_1)+(m-a_2)(M-a_2)+\cdots+(m-a_n)(M-a_n) \leq 0$. Expanding, the left-hand side equals $(a_1^2+a_2^2+ \cdots +a_n^2)-(m+M)(a_1+a_2+ \cdots +a_n)+nmM$, so using $a_1+a_2+\cdots+a_n=0$ we get $a_1^2+a_2^2+ \cdots +a_n^2+nmM \leq 0$, i.e. $a_1^2+a_2^2+ \cdots +a_n^2 \leq -nmM$.
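This expansion also pins down the equality case: equality forces every term $(m-a_i)(M-a_i)$ to vanish, i.e. every $a_i$ equals $m$ or $M$. For example, with $n=4$ and $(a_1,a_2,a_3,a_4)=(1,1,-1,-1)$, \[ a_1^2+a_2^2+a_3^2+a_4^2 = 4 = -4\cdot(-1)\cdot 1 = -nmM. \]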
10.01.2025 18:01
First, suppose WLOG that $a_1=M$ and $a_n=m$ (if $a_1=a_n$ then all $a_i=0$ and the claim is trivial), and let $X$ be a discrete random variable which takes the values $a_1, a_2, \ldots, a_n$ uniformly at random. Since $\sum_i a_i=0$ we get $$\mathbb{E}[X]=\sum_i \frac{a_i}{n}=0.$$ Now let $\mathbb{S}=\sum_i a_i^2$. Then $$\sigma^2_X=\mathbb{E}[X^2]-\mathbb{E}[X]^2=\sum_i \frac{a_i^2}{n}-0=\frac{\mathbb{S}}{n}.$$ Next, let $X_1$ be the random variable on $\{a_1, a_2, \ldots, a_n\}$ obtained from $X$ by setting $\mathbb{P}(X_1=a_2)=0$ and redistributing that probability between $a_1$ and $a_n$ so that $\mathbb{E}[X_1]=0$ still holds. Since the mass of $X_1$ is more concentrated at the extreme values $a_1$ and $a_n$ (a mean-preserving spread; alternatively, apply Jensen's inequality to $x\mapsto x^2$), we get $$\sigma^2_{X_1} \geq \sigma^2_X.$$ Construct $X_2, X_3,\ldots, X_{n-2}$ in the same way, successively removing the mass at $a_3, a_4, \ldots, a_{n-1}$. Hence $X_{n-2}$ takes values only in $\{a_1, a_n\}$, with $\mathbb{E}[X_{n-2}]=0$ and $$\sigma^2_{X_{n-2}} \geq \sigma^2_{X_{n-3}} \geq \cdots \geq \sigma^2_{X_1} \geq \sigma^2_X=\frac{\mathbb{S}}{n}.$$ But we can explicitly compute $\sigma^2_{X_{n-2}}$: let $\mathbb{P}(X_{n-2}=a_1)=p$ and $\mathbb{P}(X_{n-2}=a_n)=1-p$, so $$\mathbb{E}[X_{n-2}]=0=pa_1+(1-p)a_n \Rightarrow p=\frac{-a_n}{a_1-a_n},\quad 1-p=\frac{a_1}{a_1-a_n}.$$ $$\sigma^2_{X_{n-2}}=\mathbb{E}[X^2_{n-2}]=pa_1^2+(1-p)a_n^2=-a_1 a_n \Rightarrow -a_1 a_n \geq \sigma^2_X=\frac{\mathbb{S}}{n} \Rightarrow -n\cdot a_1 \cdot a_n \geq \mathbb{S}. \ \blacksquare$$
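The mass-moving step can be checked on a small instance. Take $n=3$ and $(a_1,a_2,a_3)=(4,-1,-3)$, so $X$ is uniform with $\mathbb{E}[X]=0$ and $\mathbb{E}[X^2]=\frac{26}{3}$. Shifting the probability $\frac{1}{3}$ sitting at $a_2=-1$ onto the two extremes while keeping the mean at $0$ gives $\mathbb{P}(X_1=4)=\frac{3}{7}$ and $\mathbb{P}(X_1=-3)=\frac{4}{7}$, hence $$\mathbb{E}[X_1^2]=\frac{3}{7}\cdot 16+\frac{4}{7}\cdot 9 = 12 = -a_1 a_3 \ \geq\ \frac{26}{3}=\mathbb{E}[X^2],$$ in line with the variance comparison used above.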