We say that a function $f:\mathbb{R}^k \rightarrow \mathbb{R}$ is a metapolynomial if, for some positive integers $m$ and $n$, it can be represented in the form \[f(x_1,\cdots , x_k )=\max_{i=1,\cdots , m} \min_{j=1,\cdots , n}P_{i,j}(x_1,\cdots , x_k),\] where $P_{i,j}$ are multivariate polynomials. Prove that the product of two metapolynomials is also a metapolynomial.
Source: IMO Shortlist 2012, Algebra 7
Tags: function, linear algebra, algebra, polynomial, IMO Shortlist
31.07.2013 05:07
My solution proves that if $f, g$ are metapolynomials, then:
1. $f+g$ is a metapolynomial.
2. $-f$ is a metapolynomial.
3. $fg$ is a metapolynomial, when $f$ is always positive.
4. $fg$ is a metapolynomial.
03.08.2013 12:46
None of the claims is clear to me. Could you give the main ideas for each claim? As this is an A7, it isn't easy for everybody to fill in the details.
03.08.2013 21:02
I will just write the full solution with "not so much" detail.

For visualization, arrange the polynomials of $f$ in a matrix $P=\{P_{ij}\}$. The value of $f$ at a point is obtained by first picking the minimum from each row, then picking the maximum among those minimums. Denote the matrix for $g$ by $Q$, and assume that $P$ has $a$ rows and $b$ columns, while $Q$ has $c$ rows and $d$ columns.

For step 1, merge every pair of rows, one from $P$ and one from $Q$, to make a new set of rows; you get a matrix with $ac$ rows as a result. How to merge: make a new row of $bd$ elements by writing the sum of every pair of entries picked from the two original rows. The reason why this matrix works for $f+g$ is pretty obvious if you think about it.

For step 2, think about how to get the value of $f$ a little differently. Pick an arbitrary element from each row, so $a$ values in total, and take the maximum; do this for all $b^a$ tuples, and then take the minimum over the tuples. This gives the value of $f$, because to get the minimum it is best to pick the minimum from each row. Now multiply all elements of the matrix by $(-1)$ and run the same process to get the value of $-f$: maximum and minimum are switched, so you again get something of the form "maximum of the minimums" and can build an appropriate matrix.

I guess the third claim should have been "every $P_{ij}$ and $Q_{ij}$ is always positive" (call such a function a positive metapolynomial). In that case, make a new matrix by merging as in step 1, but using products instead of sums. Because all elements are positive, the order doesn't change, and the same argument as in claim 1 goes through.

For the final claim, since any polynomial can be written as a difference of two positive polynomials, do this for each element of $P$.
Now, to get the value of $f$: for the positive polynomials with positive sign, just proceed with the normal process; for the polynomials with negative sign, you must pick the minimum among the maximums, but this is the same as the maximum among the minimums if you rearrange the matrix a little, as explained in step 2. The polynomials with positive sign and those with negative sign can be handled independently, so a metapolynomial is a difference of two positive metapolynomials. The previous claims now finish the problem! Edit: wrote an incomplete solution at first, but edited right after.
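As a quick sanity check (not part of the post above): at any fixed point each $P_{ij}$ is just a number, so the two matrix constructions can be tested on numeric matrices. The helper names `meta`, `merge_sum`, and `negate` are mine, chosen for illustration.

```python
import itertools
import random

def meta(M):
    """max over rows of (min over the entries of that row)."""
    return max(min(row) for row in M)

def merge_sum(P, Q):
    """Step 1: one new row per pair of rows (one from P, one from Q);
    the new row lists all pairwise sums, so the result has a*c rows
    of b*d entries each, and meta(merge_sum(P, Q)) == meta(P) + meta(Q)."""
    return [[p + q for p in rp for q in rq] for rp in P for rq in Q]

def negate(P):
    """Step 2: rows of the new matrix are all tuples taking one entry
    from each row of -P, so meta(negate(P)) == -meta(P)."""
    return [list(t) for t in itertools.product(*[[-x for x in row] for row in P])]

random.seed(0)
P = [[random.randint(-9, 9) for _ in range(3)] for _ in range(2)]  # a=2, b=3
Q = [[random.randint(-9, 9) for _ in range(4)] for _ in range(3)]  # c=3, d=4
assert meta(merge_sum(P, Q)) == meta(P) + meta(Q)
assert meta(negate(P)) == -meta(P)
assert len(merge_sum(P, Q)) == 2 * 3 and len(merge_sum(P, Q)[0]) == 3 * 4
```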
16.10.2014 18:58
jaydoubleuel wrote: My solution proves that if $f, g$ are metapolynomials, 1. $f+g$ is a metapolynomial. 2. $-f$ is a metapolynomial. 3. $fg$ is a metapolynomial, when $f$ is always positive. 4. $fg$ is a metapolynomial. How do you come up with this procedure?
30.06.2018 17:00
Once you have proved that if $f,g$ are metapolynomials then so are $f+g$ and $-f$, the problem can be solved as follows. Show that $f^3$ is a metapolynomial (trivial: just replace each entry $P_{ij}$ of the matrix with $P_{ij}^3$, since $x\mapsto x^3$ is increasing). Then $(f+1)^3=f^3+3f^2+3f+1$ is a metapolynomial; subtract $f^3$, $3f$, and $1$, which are metapolynomials, to get that $3f^2$, and hence $f^2$, is a metapolynomial (scaling by a positive constant is immediate: multiply every entry by it). Finally, $(f+g)^2=f^2+g^2+2fg$ is a metapolynomial; subtract $f^2+g^2$ to get that $fg$ is a metapolynomial.
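A numeric check (mine, for illustration) of the two facts this argument rests on: cubing every entry cubes the metapolynomial because $x\mapsto x^3$ is order-preserving, and the polynomial identities extract $f^2$ and then $fg$.

```python
def meta(M):
    # max over rows of the row minimum
    return max(min(row) for row in M)

# Cubing entries preserves order, so it cubes the metapolynomial.
P = [[-1, -4], [-3, -2]]
assert meta([[x**3 for x in row] for row in P]) == meta(P)**3

# The identities used to extract f^2 and then fg, checked at sample values.
for f in range(-5, 6):
    assert (f + 1)**3 - f**3 - 3*f - 1 == 3 * f**2
    for g in range(-5, 6):
        assert (f + g)**2 - f**2 - g**2 == 2 * f * g
```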
17.06.2020 00:28
Solved with goodbear, Th3Numb3rThr33.

In what follows, we say \[ \operatorname{\textbf{meta}}P:=\max_{i=1,\ldots,m}\min_{j=1,\ldots,n}P_{ij}(x_1,\ldots,x_k), \quad\text{where}\quad P=\begin{bmatrix} P_{11}&\cdots&P_{1n}\\ \vdots&\ddots&\vdots\\ P_{m1}&\cdots&P_{mn} \end{bmatrix}. \]

Remark: The general shape of this proof is to repeat the proof that \(f\), \(g\) integrable implies \(fg\) integrable, in which we prove that \(f+g\) is integrable, \(cf\) is integrable for any \(c\in\mathbb R\), \(f^2\) is integrable, and conclude that \(fg=\frac12\left[(f+g)^2-f^2-g^2\right]\) is integrable.

Lemma: If \(f\), \(g\) are metapolynomials, then so is \(f+g\).

Proof. Say \(f=\operatorname{\textbf{meta}}P\), \(g=\operatorname{\textbf{meta}}Q\), where \(P\) has dimensions \(a\times b\) and \(Q\) has dimensions \(c\times d\). Then \(f+g=\operatorname{\textbf{meta}}R\), where \(R\) is the \(ac\times bd\) matrix whose rows are indexed by pairs of rows, one from \(P\) and one from \(Q\), and whose entries are the corresponding pairwise sums. \(\blacksquare\)

Lemma: If \(f\) is a metapolynomial, then so is \(-f\). (Hence, \(cf\) is a metapolynomial for any \(c\in\mathbb R\).)

Proof. If \(f=\operatorname{\textbf{meta}}P\), where \(P\) has dimensions \(m\times n\), then \(-f=\operatorname{\textbf{meta}}Q\), where \(Q\) is the \(n^m\times m\) matrix constructed as follows: take all \(m\)-tuples where the \(i\)th term comes from the \(i\)th row of \(-P\), and let each such tuple be one row vector of \(Q\). For example, \[\operatorname{\textbf{meta}}\begin{bmatrix}-1&-4\\-3&-2\end{bmatrix}=-\operatorname{\textbf{meta}}\begin{bmatrix}1&3\\1&2\\4&3\\4&2\end{bmatrix}.\] \(\blacksquare\)

Lemma: If \(f\) is a metapolynomial, then so is \(f^2\).

Proof. Observe that \((f+1)^3\) and \(f^3+3f+1\) are metapolynomials, so their difference \(3f^2\) is one too; now scale by \(\tfrac13\). \(\blacksquare\)

Finally \(fg=\frac12\left[(f+g)^2-f^2-g^2\right]\) is a metapolynomial, so we are done.
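The worked example in the negation lemma can be verified directly; a quick script (mine, not part of the solution) evaluating both sides:

```python
def meta(M):
    """max over rows of (min over the entries of that row)."""
    return max(min(row) for row in M)

lhs = meta([[-1, -4], [-3, -2]])               # row mins -4, -3; max is -3
rhs = -meta([[1, 3], [1, 2], [4, 3], [4, 2]])  # row mins 1, 1, 3, 2; max is 3
assert lhs == rhs == -3
```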
07.07.2022 22:40
Claim 1: If $f,g$ are metapolynomials then $f+g$ is a metapolynomial.

Proof: Let \[f(x_1,\cdots , x_k )=\max_{i=1,\cdots , m} \min_{j=1,\cdots , n}P_{i,j}(x_1,\cdots , x_k),\] \[g(x_1,\cdots , x_k )=\max_{a=1,\cdots , c} \min_{b=1,\cdots , d}Q_{a,b}(x_1,\cdots , x_k).\] Write $P_{i,j}=P_{i,j}(x_1,\cdots,x_k)$ for brevity. Then $$f(x_1,\cdots,x_k)+g(x_1,\cdots,x_k) = \max_{1\le i\le m} \min_{1\le j\le n} P_{i,j} + \max_{1\le a\le c} \min_{1\le b\le d} Q_{a,b}$$ $$=\max_{1\le i\le m,\, 1\le a\le c} \left( \min_{1\le j\le n} P_{i,j} + \min_{1\le b\le d} Q_{a,b} \right)$$ $$=\max_{(i,a) \in [m]\times [c]} \min_{(j,b)\in [n]\times[d]} \left(P_{i,j}+Q_{a,b}\right).$$

Claim 2: If $f$ is a metapolynomial, then so is $f^3$.

Proof: Note $a<b \iff a^3< b^3$, so replacing each $P_{i,j}$ with $P_{i,j}^3$ works.

Claim 3: If $f$ is a metapolynomial, then so is $-f$.

Proof: $$f = \max_{1\le i\le m} \min_{1\le j\le n} P_{i,j}$$ $$-f = \min_{1\le i\le m} \left(-\min_{1\le j\le n} P_{i,j}\right)=\min_{1\le i\le m} \left(\max_{1\le j\le n} (-P_{i,j})\right)$$ $$ = \max_{(c_1,\cdots, c_m) \in [n]^m}\ \min_{1\le i\le m} (-P_{i,c_i}).$$

Claim 4: $f^2$ is a metapolynomial.

Proof: $f^2 = \frac{(f+1)^3-f^3-3f-1}{3}$.

Claim 5: $fg$ is a metapolynomial.

Proof: $fg=\frac{(f+g)^2-f^2-g^2}{2}$.
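The min-max exchange behind Claim 3, $\min_i \max_j A_{ij} = \max_{c\in[n]^m} \min_i A_{i,c_i}$, can be sanity-checked on random integer matrices (my own check, for illustration):

```python
import itertools
import random

# Check min_i max_j A[i][j] == max over tuples c in [n]^m of min_i A[i][c_i]
# on random 4x3 integer matrices (m = 4 rows, n = 3 columns).
random.seed(1)
for _ in range(100):
    A = [[random.randint(-9, 9) for _ in range(3)] for _ in range(4)]
    lhs = min(max(row) for row in A)
    rhs = max(min(A[i][c[i]] for i in range(4))
              for c in itertools.product(range(3), repeat=4))
    assert lhs == rhs
```

The identity holds because any tuple $c$ gives $\min_i A_{i,c_i} \le \min_i \max_j A_{ij}$, with equality when $c_i$ picks each row's maximum.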
22.08.2023 14:19
It's quite clear that for $k=1$, a metapolynomial is just a continuous function which is piecewise polynomial (with finitely many pieces). I'm curious if the property holds for $k>1$ also. (If this is right, then the conclusion follows immediately.)
23.08.2024 02:28
Let \[f(x_1,\cdots , x_k )=\max_{i=1,\cdots , m_1} \min_{j=1,\cdots , n_1}P_{i,j}(x_1,\cdots , x_k)\]\[g(x_1,\cdots , x_k )=\max_{i=1,\cdots , m_2} \min_{j=1,\cdots , n_2}Q_{i,j}(x_1,\cdots , x_k).\]Let $\sigma_p = ((p_1, \dots, p_{m_1}), i_p)$ where $1\le i_p\le m_1$ and $1\le p_i\le n_1$; define $\sigma_q$ similarly. Say $\sigma_p$ is "attained" if $P_{i, p_i} = \min_{j=1,\cdots , n_1}P_{i,j}$ for all $i$ and $P_{i_p, p_{i_p}} = \max_{i=1,\cdots, m_1} P_{i, p_i}$. We will define a set $S_{\sigma_p, \sigma_q}$ of polynomials such that:
- if $\sigma_p$ and $\sigma_q$ are attained, then $\min(S_{\sigma_p, \sigma_q}) = fg$;
- in all cases, $\min(S_{\sigma_p, \sigma_q}) \le fg$.

Let $\alpha(x, y_1, y_2) = (y_2-y_1)\cdot\frac{1+x^2}2$. Define $S_{\sigma_p, \sigma_q}$ to consist of all elements of the form \[ F_{r_1, r_2, s_1, s_2, j_1, j_2} = P_{i_p, p_{i_p}}Q_{i_q, q_{i_q}}\]\[+ \alpha(Q_{i_q, q_{i_q}}, P_{i_p, p_{i_p}}, P_{i_p, j_1}) + \alpha(Q_{i_q, q_{i_q}}, P_{r_1, p_{r_1}}, P_{i_p, j_1}) + \alpha(Q_{i_q, q_{i_q}}, P_{r_1, p_{r_1}}, P_{r_1, s_1})\]\[+ \alpha(P_{r_1, s_1}, Q_{i_q, q_{i_q}}, Q_{i_q, j_2}) + \alpha(P_{r_1, s_1}, Q_{r_2, q_{r_2}}, Q_{i_q, j_2}) + \alpha(P_{r_1, s_1}, Q_{r_2, q_{r_2}}, Q_{r_2, s_2})\] where $1\le j_1, s_1 \le n_1$, $1\le r_1\le m_1$, $1\le j_2, s_2 \le n_2$, and $1\le r_2\le m_2$.

Claim: If $\sigma_p$ and $\sigma_q$ are attained, then $\min(S_{\sigma_p, \sigma_q}) = P_{i_p, p_{i_p}}Q_{i_q, q_{i_q}}$.

Proof: If $\sigma_p$ and $\sigma_q$ are attained, then all of the $\alpha$ terms of $F_{r_1, r_2, s_1, s_2, j_1, j_2}$ are nonnegative, and they all vanish for the choice $(r_1, r_2, s_1, s_2, j_1, j_2) = (i_p, i_q, p_{i_p}, q_{i_q}, p_{i_p}, q_{i_q})$. So, \[\min(S_{\sigma_p, \sigma_q}) = F_{i_p, i_q, p_{i_p}, q_{i_q}, p_{i_p}, q_{i_q}} = P_{i_p, p_{i_p}}Q_{i_q, q_{i_q}} = fg.\]

Claim: In all cases, $\min(S_{\sigma_p, \sigma_q}) \le fg$.

Proof: Choose $r_1, s_1$ so that $f=P_{r_1, s_1}$ at the given point, and $r_2, s_2$ so that $g=Q_{r_2, s_2}$. Let $j_1$ be an index minimizing $P_{i_p, j_1}$ and define $j_2$ analogously. Then all of the $\alpha$ terms in $F_{r_1, r_2, s_1, s_2, j_1, j_2}$ are nonpositive, and since $\frac{1+x^2}2 \ge x$, each satisfies $\alpha(x, y_1, y_2) \le x(y_2-y_1)$.
So, we have \[F_{r_1, r_2, s_1, s_2, j_1, j_2} = P_{i_p, p_{i_p}}Q_{i_q, q_{i_q}}\]\[+ \alpha(Q_{i_q, q_{i_q}}, P_{i_p, p_{i_p}}, P_{i_p, j_1}) + \alpha(Q_{i_q, q_{i_q}}, P_{r_1, p_{r_1}}, P_{i_p, j_1}) + \alpha(Q_{i_q, q_{i_q}}, P_{r_1, p_{r_1}}, P_{r_1, s_1})\]\[+ \alpha(P_{r_1, s_1}, Q_{i_q, q_{i_q}}, Q_{i_q, j_2}) + \alpha(P_{r_1, s_1}, Q_{r_2, q_{r_2}}, Q_{i_q, j_2}) + \alpha(P_{r_1, s_1}, Q_{r_2, q_{r_2}}, Q_{r_2, s_2})\]\[\le P_{i_p, p_{i_p}}Q_{i_q, q_{i_q}} + Q_{i_q, q_{i_q}}(P_{i_p, j_1} - P_{i_p, p_{i_p}}) + Q_{i_q, q_{i_q}}(P_{r_1, p_{r_1}} - P_{i_p, j_1}) + Q_{i_q, q_{i_q}}(P_{r_1, s_1} - P_{r_1, p_{r_1}})\]\[+ P_{r_1, s_1}(Q_{i_q, j_2} - Q_{i_q, q_{i_q}}) + P_{r_1, s_1}(Q_{r_2, q_{r_2}} - Q_{i_q, j_2}) + P_{r_1, s_1}(Q_{r_2, s_2} - Q_{r_2, q_{r_2}})\]\[= P_{r_1, s_1}Q_{r_2, s_2}.\] It is always the case that some pair $\sigma_p, \sigma_q$ is attained. Thus, \[fg = \max_{\substack{\sigma_p\in (\{1,\dots, n_1\}^{m_1})\times \{1,\dots, m_1\}\\ \sigma_q\in (\{1,\dots, n_2\}^{m_2})\times \{1,\dots, m_2\}}} \min(S_{\sigma_p, \sigma_q}).\]
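The two sign properties of $\alpha$ used above both come from $\frac{1+x^2}{2} \ge x$ (equivalently $(x-1)^2 \ge 0$); a numeric spot-check of mine:

```python
import random

def alpha(x, y1, y2):
    # alpha(x, y1, y2) = (y2 - y1) * (1 + x^2) / 2
    return (y2 - y1) * (1 + x * x) / 2

# Since (1 + x^2)/2 >= max(0, x) for all real x:
#   y2 >= y1  =>  alpha(x, y1, y2) >= 0             (terms vanish or help the max)
#   y2 <= y1  =>  alpha(x, y1, y2) <= x*(y2 - y1)   (the telescoping upper bound)
random.seed(2)
for _ in range(1000):
    x = random.uniform(-10, 10)
    y1 = random.uniform(-10, 10)
    y2 = random.uniform(-10, 10)
    if y2 >= y1:
        assert alpha(x, y1, y2) >= 0
    else:
        assert alpha(x, y1, y2) <= x * (y2 - y1) + 1e-12  # epsilon for float rounding
```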