Problem
Source: Romanian TST 2 2008, Problem 3
Tags: inequalities, geometry, vector, trigonometry, analytic geometry, geometry proposed

Show that each convex pentagon has a vertex from which the distance to the opposite side of the pentagon is strictly less than the sum of the distances from the two adjacent vertices to the same side.
Note. If the pentagon is labeled $ABCDE$, the adjacent vertices of $A$ are $B$ and $E$, those of $B$ are $A$ and $C$, etc.
11.06.2008 21:56
pohoatza wrote: Note. If the pentagon is labeled $ABCDE$, the adjacent vertices of $A$ are $B$ and $E$, those of $B$ are $A$ and $C$, etc.

Got it. It seemed that a combinatorial solution to this question was out of reach, although some students mentioned (successful) approaches by repeated projections of sides onto diagonals and careful area estimations - maybe with convex combinations - and even some term smashing (a strict inequality being required...). Here follows my (polished) solution from the contest.

Denote by $[\mathcal{P}]$ the area of the polygon (polygonal surface) $\mathcal{P}$, set $S := [ABCDE]$, and denote by lowercase letters the area of the triangle spanned by the two sides meeting at the corresponding vertex (thus $a = [EAB]$, $b = [ABC]$, and so on). From now on, $\sum$ denotes a cyclic sum.

Assume that $a = \min\{a, b, c, d, e\}$. We will prove that the vertex $A$ itself satisfies the requirement, i.e., $d(A, CD) < d(B, CD) + d(E, CD)$. Since it is difficult to handle distances, multiply by $CD/2$ to get an inequality between areas: $[ACD] < [BCD] + [ECD]$, or equivalently, $S < b + c + d + e$ (indeed, $[ACD] = S - b - e$, $[BCD] = c$ and $[ECD] = d$). Assume, by contradiction, that this inequality fails. Clearly $S > a$, hence $(S - a)\left(S - (b + c + d + e)\right) \ge 0$, or, by expanding everything, $S^2 - S \sum a + a(b + c + d + e) \ge 0$. However, by the minimality of $a$, $a(b + c + d + e) \le ab + ea + bc + ed < \sum ab$. This yields $0 \le S^2 - S \sum a + a(b + c + d + e) < S^2 - S \sum a + \sum ab = 0$, a contradiction in view of the lemma below (which, in fact, renders these computations straightforward). $\blacksquare$

Lemma. With the notations above, the following identity - not so well known, it seems - holds: $\boxed{S^2 - S \sum a + \sum ab = 0}$.

What follows is a condensed exposition of the solution in Prasolov's book. There are, however, other approaches available, such as... analytic geometry (the way I proceeded in the contest). In what follows, $\mathbf{boldface}$ letters denote vectors. (Those familiar with the "Vectors" chapter of the aforementioned book may skip to the next paragraph without missing anything.)

Introduction. Denote by $\prec \cdot, \cdot \succ$ the pseudoinner product of two nonzero vectors, namely $\prec \mathrm{u}, \mathrm{v} \succ := |\mathrm{u}| \cdot |\mathrm{v}| \cdot \sin\left(\angle(\mathrm{u}, \mathrm{v})\right)$ (the signed area of the parallelogram spanned by $\mathrm{u}, \mathrm{v}$), and extend it by setting it to zero whenever one of the vectors is the zero vector. If we denote by $< \mathrm{u}, \mathrm{v} >$ the inner product of $\mathrm{u}, \mathrm{v}$, we get the relation $< \mathrm{u}, \mathrm{v} >^2 + \prec \mathrm{u}, \mathrm{v} \succ^2 = (|\mathrm{u}| \cdot |\mathrm{v}|)^2$; in coordinates, if $\mathrm{u} = (a_1, a_2)$ and $\mathrm{v} = (b_1, b_2)$, then $\prec \mathrm{u}, \mathrm{v} \succ = a_1 b_2 - a_2 b_1$. It is now easy to check that this map is anticommutative and "well-behaved", namely: for any scalar $\lambda \in \mathbb{R}$, $\prec \lambda\mathrm{u}, \mathrm{v} \succ = \lambda \prec \mathrm{u}, \mathrm{v} \succ$, and $\prec \mathrm{u}, \mathrm{v} + \mathrm{w} \succ = \prec \mathrm{u}, \mathrm{v} \succ + \prec \mathrm{u}, \mathrm{w} \succ$. Finally, assign to any triple of points $(A, B, C)$ its oriented area, namely $S(A, B, C) := \frac{1}{2} \prec \overrightarrow{AB}, \overrightarrow{AC} \succ$. Clearly, $[ABC] = |S(A, B, C)|$ and $S(A, B, C) = S(B, C, A) = -S(B, A, C)$.
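Before the formal proof of the lemma, here is a quick numerical sanity check - not part of the argument - of both the boxed identity and the main claim. It is only a sketch in Python: the convex pentagon below (listed counter-clockwise) and the helper names cross and area are hypothetical choices of mine, not anything from the official solution.

```python
def cross(o, p, q):
    # Twice the signed area S(o, p, q); positive when o, p, q are counter-clockwise.
    return (p[0] - o[0]) * (q[1] - o[1]) - (p[1] - o[1]) * (q[0] - o[0])

def area(*pts):
    # Unsigned area of a convex polygon given by its vertices in order (shoelace fan).
    return abs(sum(cross(pts[0], pts[i], pts[i + 1]) for i in range(1, len(pts) - 1))) / 2

P = [(0.0, 0.0), (4.0, 0.0), (5.0, 3.0), (2.0, 5.0), (-1.0, 2.0)]   # A, B, C, D, E
S = area(*P)
tri = [area(P[k - 1], P[k], P[(k + 1) % 5]) for k in range(5)]       # a = [EAB], b = [ABC], ...

# Lemma: the cyclic combination S^2 - S*sum(a) + sum(ab) should vanish.
identity = S**2 - S * sum(tri) + sum(tri[k] * tri[(k + 1) % 5] for k in range(5))
print(abs(identity) < 1e-9)                                          # True

# Main claim at the vertex of minimal triangle area (relabelled as A):
# d(A, CD) < d(B, CD) + d(E, CD), i.e. [ACD] < [BCD] + [ECD] after scaling by CD/2.
m = min(range(5), key=lambda k: tri[k])
A, B, C, D, E = (P[(m + k) % 5] for k in range(5))
print(area(A, C, D) < area(B, C, D) + area(E, C, D))                 # True
```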
Solution (for the Lemma). Assume that $\{\mathrm{e_1}, \mathrm{e_2}\}$ is a basis of the plane, and write an arbitrary vector as $\mathrm{x} = x_1\mathrm{e_1} + x_2\mathrm{e_2}$ with $x_1, x_2 \in \mathbb{R}$. Taking the pseudoinner product on the right with $\mathrm{e_1}$, respectively $\mathrm{e_2}$, we get $\prec \mathrm{x}, \mathrm{e_1} \succ = x_2 \prec \mathrm{e_2}, \mathrm{e_1} \succ$ and $\prec \mathrm{x}, \mathrm{e_2} \succ = x_1 \prec \mathrm{e_1}, \mathrm{e_2} \succ$; solving for $x_1, x_2$ and substituting back, we obtain $\prec \mathrm{e_1}, \mathrm{e_2} \succ \mathrm{x} = \prec \mathrm{x}, \mathrm{e_2} \succ \mathrm{e_1} + \prec \mathrm{e_1}, \mathrm{x} \succ \mathrm{e_2}$. Finally, taking the pseudoinner product on the right with some vector $\mathrm{y}$ yields $\prec \mathrm{e_1}, \mathrm{e_2} \succ \prec \mathrm{x}, \mathrm{y} \succ + \prec \mathrm{x}, \mathrm{e_2} \succ \prec \mathrm{y}, \mathrm{e_1} \succ + \prec \mathrm{e_1}, \mathrm{x} \succ \prec \mathrm{y}, \mathrm{e_2} \succ = 0$.

Now the problem is over. Assume that $A, B, C, D, E$ is the counter-clockwise order of the points (i.e., $S(A, B, C) > 0$). Take $\mathrm{e_1} = \overrightarrow{AB}$, $\mathrm{e_2} = \overrightarrow{AE}$, $\mathrm{x} = \overrightarrow{AC}$, $\mathrm{y} = \overrightarrow{AD}$. The relation just obtained reduces to $a(S - b - e) - (S - b - d)(S - c - e) + be = 0$ (the individual substitutions are recorded after the proof), which, once expanded, is exactly $-\left(S^2 - S \sum a + \sum ab\right) = 0$, and this ends the proof. $\square$
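For the record, here is the dictionary behind the last reduction; each pseudoinner product is twice a signed area, and all the triangles below are positively oriented except $(A, D, B)$, thanks to the counter-clockwise labelling:
$$\prec \mathrm{e_1}, \mathrm{e_2} \succ = 2S(A, B, E) = 2a, \qquad \prec \mathrm{x}, \mathrm{y} \succ = 2S(A, C, D) = 2(S - b - e),$$
$$\prec \mathrm{x}, \mathrm{e_2} \succ = 2S(A, C, E) = 2(S - b - d), \qquad \prec \mathrm{y}, \mathrm{e_1} \succ = 2S(A, D, B) = -2(S - c - e),$$
$$\prec \mathrm{e_1}, \mathrm{x} \succ = 2S(A, B, C) = 2b, \qquad \prec \mathrm{y}, \mathrm{e_2} \succ = 2S(A, D, E) = 2e.$$
Substituting these into the three-term identity and dividing by $4$ gives $a(S - b - e) - (S - b - d)(S - c - e) + be = 0$. The final expansion is routine; for anyone wanting a machine check of it, a minimal sketch (assuming sympy is available) is:

```python
import sympy as sp

S, a, b, c, d, e = sp.symbols('S a b c d e')
reduced = a*(S - b - e) - (S - b - d)*(S - c - e) + b*e
lemma = S**2 - S*(a + b + c + d + e) + (a*b + b*c + c*d + d*e + e*a)
print(sp.expand(reduced + lemma))   # 0, so the reduced relation is exactly the lemma identity
```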