Let $x_1,x_2,\cdots,x_n$ ($n\geq2$) be a non-decreasing sequence of positive numbers such that $x_1,\frac{x_2}{2},\cdots,\frac{x_n}{n}$ is a non-increasing sequence. Prove that \[ \frac{\sum_{i=1}^{n} x_i }{n\left (\prod_{i=1}^{n}x_i \right )^{\frac{1}{n}}}\le \frac{n+1}{2\sqrt[n]{n!}}\]
Problem
Source: China Hangzhou
Tags: inequalities, inequalities proposed, China TST
24.03.2015 23:37
Induction in the obvious manner does work; however, the calculations are quite brutal. The key idea is to reduce to a case where the objective function being studied always has a positive second derivative. Using this, one can simply check the endpoints, and the result follows.
03.04.2015 09:39
03.04.2015 14:14
How is that clear? Can you please elaborate?
04.04.2015 01:32
Consider the substitution (and abuse of notation) $x_k \rightarrow kx_k$ for all $k \in \{1, 2, \dots, n\}$; my further use of these variables will assume that the substitution has happened. We are going to attack the problem with continuous smoothing. Since the inequality is homogeneous, we can WLOG assume that $\prod_{i = 1}^{n}x_i = 1$. The conditions outlined in the problem become $x_1 \ge x_2 \ge x_3 \ge \dots \ge x_n$ and $x_1 \le 2x_2 \le \dots \le nx_n$, and the inequality we want to prove is $x_1 + 2x_2 + \dots + nx_n \le \frac{n(n + 1)}{2}$.

I claim we can smooth $x_k$ and $x_{k + 1}$ to $\sqrt{x_kx_{k + 1}}$ and $\sqrt{x_kx_{k + 1}}$ without decreasing the value of $x_1 + 2x_2 + \dots + nx_n$. It suffices to show that $kx_k + (k + 1)x_{k + 1} \le (2k + 1)\sqrt{x_kx_{k + 1}}$. By squaring and factoring we find that the claim is equivalent to $(k^2x_k - (k + 1)^2x_{k + 1})(x_k - x_{k + 1}) \le 0$, which follows from the facts that $x_k \ge x_{k + 1}$ and $kx_k \le (k + 1)x_{k + 1}$.

Now I claim that if we smooth $x_1$ and $x_2$ in this fashion, then smooth $x_2$ and $x_3$, then $x_3$ and $x_4$, and so on until $x_{n - 1}$ and $x_n$, and then re-smooth $x_1$ and $x_2$, and so on infinitely many times, then each element in our sequence approaches $\sqrt[n]{\prod_{i = 1}^{n}x_i} = 1$. This will prove the inequality, since this smoothing never breaks any of the three given conditions on the $x_i$. After a bunch of smoothing, our sequence will look like $x_1^{a_{11}}x_2^{a_{12}}{\dots}x_n^{a_{1n}}, x_1^{a_{21}}x_2^{a_{22}}{\dots}x_n^{a_{2n}}, \dots, x_1^{a_{n1}}x_2^{a_{n2}}{\dots}x_n^{a_{nn}}$ for some $a_{ij} \in \mathbb{Q}$ for all $i, j \in \{1, 2, \dots, n\}$. Associate to this sequence the matrix $\begin{pmatrix} a_{11} & a_{12} & \ldots & a_{1n} \\ a_{21} & a_{22} & \ldots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \ldots & a_{nn}\end{pmatrix}$. The matrix associated with the initial sequence is clearly the $n \times n$ identity matrix $I_n$. Every time we smooth $x_1$ and $x_2$, then $x_2$ and $x_3$, and so on until $x_{n - 1}$ and $x_n$, we take the matrix associated with our sequence before the process begins and multiply it by $A = \begin{pmatrix} \frac{1}{2} & \frac{1}{2} & \ldots & 0 \\ \frac{1}{2} & \frac{1}{2} & \ldots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \ldots & 1\end{pmatrix} \cdot \begin{pmatrix} 1 & 0 & 0 & \ldots & 0 \\ 0 & \frac{1}{2} & \frac{1}{2} & \ldots & 0 \\ 0 & \frac{1}{2} & \frac{1}{2} & \ldots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \ldots & 1\end{pmatrix} \cdot \dots \cdot \begin{pmatrix} 1 & 0 & \ldots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \ldots & \frac{1}{2} & \frac{1}{2} \\ 0 & 0 & \ldots & \frac{1}{2} & \frac{1}{2}\end{pmatrix}$. This is essentially a Markov chain, and it is clear that $\lim_{N \to \infty}A^N = \begin{pmatrix} \frac{1}{n} & \frac{1}{n} & \ldots & \frac{1}{n} \\ \frac{1}{n} & \frac{1}{n} & \ldots & \frac{1}{n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{1}{n} & \frac{1}{n} & \ldots & \frac{1}{n}\end{pmatrix}$, which implies the desired result.
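Not part of the original post, but for readers who want to experiment: a minimal Python sketch (helper names such as `check_pair_lemma` and `sweep` are my own) that tests the single-pair smoothing lemma on random admissible pairs and illustrates the Markov-chain limit, i.e. that iterating the sweep drives every entry of a product-one sequence towards $1$.

```python
import math
import random

def check_pair_lemma(trials=10_000):
    """Check k*x_k + (k+1)*x_{k+1} <= (2k+1)*sqrt(x_k * x_{k+1})
    whenever x_k >= x_{k+1} and k*x_k <= (k+1)*x_{k+1}."""
    for _ in range(trials):
        k = random.randint(1, 50)
        x_next = random.uniform(0.1, 10.0)               # x_{k+1}
        x_k = x_next * random.uniform(1.0, (k + 1) / k)  # x_{k+1} <= x_k <= (k+1)/k * x_{k+1}
        lhs = k * x_k + (k + 1) * x_next
        rhs = (2 * k + 1) * math.sqrt(x_k * x_next)
        assert lhs <= rhs + 1e-9

def sweep(x):
    """One pass: smooth (x_1, x_2), then (x_2, x_3), ..., then (x_{n-1}, x_n)."""
    x = x[:]
    for k in range(len(x) - 1):
        x[k] = x[k + 1] = math.sqrt(x[k] * x[k + 1])
    return x

check_pair_lemma()

# Iterating the sweep drives every entry to the common geometric mean
# (normalised to 1 here), as the Markov-chain limit in the post predicts.
n = 6
x = [n / (i + 1) for i in range(n)]            # x_i = n/i meets the substituted conditions
gm = math.prod(x) ** (1 / n)
x = [v / gm for v in x]                        # normalise the product to 1
for _ in range(100):
    x = sweep(x)
print(x)                                       # every entry is essentially 1.0
```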
05.04.2015 06:50
Isn't there an elementary proof? I don't know Linear algebra.
05.04.2015 10:11
Me neither. Is there something involving Abel summation?
06.04.2015 04:14
EDIT: Miscalculated a derivative
06.04.2015 08:45
As long as $y_{n + 1}^n \le y_1y_2 \cdots y_n$, this partial derivative will always be negative. Sorry, this is wrong!
29.02.2016 05:56
$ \frac{\sum\limits_{i=1}^{n} x_i }{n\left (\prod\limits_{i=1}^{n}x_i \right )^{\frac{1}{n}}}\le \frac{n+1}{2\sqrt[n]{n!}} \Leftrightarrow \frac{2\sum\limits_{i=1}^{n} x_i}{n(n+1)} \le \frac{\left (\prod\limits_{i=1}^{n}x_i \right )^{\frac{1}{n} }}{ \sqrt[n]{n!}} = \left(\prod\limits_{i=1}^{n}\frac{x_i}{i}\right)^{\frac{1}{n}}. $

Since $ \left(\prod\limits_{i=1}^{n}\frac{x_i}{i}\right)^{\frac{1}{n}} \geq \frac{n}{\sum\limits_{i=1}^{n} \frac{i}{x_i}}$ by the GM-HM inequality, we only need to prove $\frac{n}{\sum\limits_{i=1}^{n} \frac{i}{x_i}} \geq \frac{2\sum\limits_{i=1}^{n} x_i}{n(n+1)}$, which is equivalent to $ \frac{n+1}{2} \geq \frac{1}{n^2}\left(\sum\limits_{i=1}^{n} \frac{i}{x_i}\right)\left(\sum\limits_{i=1}^{n} x_i\right)$.

Notice that $x_1, x_2, \ldots, x_n$ is non-decreasing and $\frac{1}{x_1}, \frac{2}{x_2}, \ldots, \frac{n}{x_n}$ is also non-decreasing, so by Chebyshev's inequality $\frac{1}{n^2}\left(\sum\limits_{i=1}^{n} \frac{i}{x_i}\right)\left(\sum\limits_{i=1}^{n} x_i\right) \le \frac{1}{n}\sum\limits_{i=1}^{n} i = \frac{n+1}{2}$.
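As a quick sanity check (mine, not part of the post above): the Python sketch below generates random sequences satisfying the hypotheses and numerically verifies the GM-HM step, the Chebyshev step, and the resulting inequality. The helper `random_admissible` is an assumption of this sketch, not anything from the thread.

```python
import math
import random

def random_admissible(n):
    """x_1 <= x_2 <= ... <= x_n with x_1 >= x_2/2 >= ... >= x_n/n."""
    x = [random.uniform(0.5, 5.0)]
    for i in range(2, n + 1):
        x.append(x[-1] * random.uniform(1.0, i / (i - 1)))  # x_{i-1} <= x_i <= i/(i-1) * x_{i-1}
    return x

for _ in range(10_000):
    n = random.randint(2, 12)
    x = random_admissible(n)
    # GM >= HM applied to the numbers x_i / i
    gm = math.prod(xi / (i + 1) for i, xi in enumerate(x)) ** (1 / n)
    hm = n / sum((i + 1) / xi for i, xi in enumerate(x))
    assert gm >= hm - 1e-9
    # Chebyshev step: (1/n^2) * (sum i/x_i) * (sum x_i) <= (n+1)/2
    cheb = sum((i + 1) / xi for i, xi in enumerate(x)) * sum(x) / n**2
    assert cheb <= (n + 1) / 2 + 1e-9
    # and hence the original inequality
    lhs = sum(x) / (n * math.prod(x) ** (1 / n))
    rhs = (n + 1) / (2 * math.factorial(n) ** (1 / n))
    assert lhs <= rhs + 1e-9
```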
21.05.2016 17:28
AM-GM and Chebyshev kill this problem.
21.05.2016 18:09
mssmath wrote: Induction in the obvious manner does work; however, the calculations are quite brutal. The key idea is to reduce to a case where the objective function being studied always has a positive second derivative. Using this, one can simply check the endpoints, and the result follows. I attempted an inductive proof, but midway the calculations went wrong somewhere, which was rather sad. For those who want a solution without linear algebra: use induction and be patient and careful with the calculations.
21.05.2016 18:10
Stranger8 wrote: AM-GM and Chebyshev kill this problem. Stranger8, please elaborate a little and, if possible, post the solution!
09.08.2016 01:03
The idea is quite nice: Let $r_i = \frac{x_{i+1}}{x_i}$, so $1 \le r_i \le 1 + \frac{1}{i}$. In particular, observe that the value of $x_1$ is completely irrelevant here (the LHS is homogeneous), so once we know all the $r_i$ we know the value of the LHS. Hence the LHS is a function of the $r_i$. I will show that the LHS is nondecreasing in each of the $r_i$, so its maximal value is attained when every $r_i$ is maximal. This corresponds to the sequence $1, 2, 3, \cdots, n$, for which the LHS equals $\frac{n+1}{2\sqrt[n]{n!}}$, so we will conclude.

Let \[ g = \log{\left(\sum_{i=1}^{n} x_i\right)} - \frac{1}{n} \log{\left(\prod_{i = 1}^n x_i \right)}. \] It is clear that if $g$ is nondecreasing in each $r_i$, then we will conclude; it suffices to do the calculation. Fix $j$ and let $S_1 = \sum_{i = j+1}^{n} x_i$, $S_2 = \sum_{i = 1}^{j} x_i$, and $S = S_1 + S_2$. I claim that $\dfrac{\partial g}{\partial r_j} \ge 0$. Observe that \[ \frac{\partial g}{\partial r_j} = \frac{S_1}{Sr_j} - \frac{n-j}{nr_j}, \] since increasing $r_j$ rescales exactly $x_{j+1}, \dots, x_n$. Showing that this is nonnegative is just a consequence of $\frac{S_1}{S} \ge \frac{n - j}{n}$, which is clear: it becomes $\frac{S_1}{S_2} \ge \frac{n-j}{j}$, which is of course a direct consequence of \[ jS_1 \ge (n-j)(j)x_{j+1} \ge (n-j)(j)x_j \ge (n-j)S_2. \]
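Here is a finite-difference sanity check (my own illustrative sketch, with `g_of_ratios` a made-up helper name) of the two claims above: that $\frac{\partial g}{\partial r_j} = \frac{S_1}{Sr_j} - \frac{n-j}{nr_j}$ and that this quantity is nonnegative whenever all $r_i \ge 1$.

```python
import math
import random

def g_of_ratios(r):
    """g = log(sum x_i) - (1/n) log(prod x_i), with x_1 = 1 and x_{i+1} = r_i * x_i."""
    x = [1.0]
    for ri in r:
        x.append(x[-1] * ri)
    return math.log(sum(x)) - sum(math.log(v) for v in x) / len(x)

for _ in range(2_000):
    n = random.randint(2, 10)
    r = [random.uniform(1.0, 1.0 + 1.0 / j) for j in range(1, n)]  # 1 <= r_j <= 1 + 1/j
    j = random.randint(1, n - 1)                                   # which ratio to perturb
    h = 1e-6
    r_up = r[:]
    r_up[j - 1] += h
    fd = (g_of_ratios(r_up) - g_of_ratios(r)) / h                  # finite-difference dg/dr_j
    assert fd >= -1e-6                                             # g is nondecreasing in r_j
    # compare with the closed form S_1/(S*r_j) - (n-j)/(n*r_j)
    x = [1.0]
    for ri in r:
        x.append(x[-1] * ri)
    S, S1 = sum(x), sum(x[j:])                                     # S_1 = x_{j+1} + ... + x_n
    assert abs(fd - (S1 / (S * r[j - 1]) - (n - j) / (n * r[j - 1]))) < 1e-4
```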
17.05.2023 07:10
adityaguharoy wrote: Stranger8 wrote: AM-GM and Chebyshev kill this problem. Stranger8, please elaborate a little and, if possible, post the solution!

Here is a proof using AM-GM and Chebyshev's inequality. We have $\frac1{x_1}\leq\frac2{x_2}\leq\cdots\leq\frac n{x_n}$, i.e. $\frac j{x_j}$ is non-decreasing along with $x_j$. Then, using Chebyshev's inequality and AM-GM, $$\begin{aligned}\frac{n(n+1)}2 &=\sum_{j=1}^nx_j\cdot\frac j{x_j}\\ &\geqslant\frac 1n\sum_{j=1}^nx_j\sum_{j=1}^n\frac j{x_j}\\ &\geqslant\frac 1n\sum_{j=1}^nx_j\cdot n\sqrt[n]{\frac{n!}{x_1x_2\cdots x_n}}\\ &=\frac{x_1+x_2+\cdots +x_n}{\sqrt[n]{x_1x_2\cdots x_n}}\cdot\sqrt[n]{n!}.\end{aligned}$$ Therefore $\frac{x_1+x_2+\cdots +x_n}{n\sqrt[n]{x_1x_2\cdots x_n}}\leqslant\frac{n+1}{2\sqrt[n]{n!}}.\blacksquare$
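A closing numerical note (mine, not from the thread): with $x_i = i$ the two sides of the original inequality coincide, so the constant $\frac{n+1}{2\sqrt[n]{n!}}$ is sharp. A few lines of Python confirm this.

```python
import math

for n in range(2, 10):
    x = list(range(1, n + 1))                             # the extremal sequence x_i = i
    lhs = sum(x) / (n * math.prod(x) ** (1 / n))
    rhs = (n + 1) / (2 * math.factorial(n) ** (1 / n))
    print(n, lhs, rhs)                                    # the two columns agree up to rounding
```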