Suppose $f,g \in \mathbb{R}[x]$ are non-constant polynomials. Suppose neither of $f,g$ is the square of a real polynomial but $f(g(x))$ is. Prove that $g(f(x))$ is not the square of a real polynomial.
Problem
Source: India TST 2017 D2 P1
Tags: algebra, polynomial
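Before the solutions, a concrete instance of the setup may help (the pair below is my own illustration, not from the thread): $f(x)=x-1$ and $g(x)=x^2+1$ are both non-squares, yet $f(g(x))=x^2$ is a square, while $g(f(x))=(x-1)^2+1$ is not. A quick numeric sanity check:

```python
# Illustrative instance (my own choice): f(x) = x - 1 and g(x) = x^2 + 1.
# Neither is the square of a real polynomial (f has odd degree; g has no
# real double root), yet f(g(x)) = x^2 is a perfect square.

def f(x):
    return x - 1

def g(x):
    return x * x + 1

# f(g(x)) agrees with x^2 at more points than its degree, hence identically.
assert all(f(g(x)) == x * x for x in range(-100, 101))

# g(f(x)) = x^2 - 2x + 2. A monic quadratic is the square of a real linear
# polynomial iff its discriminant is zero; here it is 4 - 8 = -4.
assert all(g(f(x)) == x * x - 2 * x + 2 for x in range(-100, 101))
assert (-2) ** 2 - 4 * 2 != 0
```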
09.12.2017 22:02
Throughout this solution, for polynomials $P,S\in \mathbb{C}[x]$, the statement $P(x)\equiv S(x)$ means that $P(x)=S(x)$ for infinitely many values of $x\in \mathbb{C}$ (equivalently, as polynomials). We'll prove the following equivalent statement instead: if $f,g \in \mathbb{R}[x]$ are non-constant polynomials neither of which is the square of a real polynomial, then $f(g(x))$ and $g(f(x))$ can't both be squares of real polynomials.

Suppose to the contrary that there exist such polynomials $f,g$ for which $f(g(x))$ and $g(f(x))$ are both squares of real polynomials. Since $f$ is not the square of a real polynomial, there exist $z\in \mathbb{C}$, a polynomial $Q\in \mathbb{C}[x]$ with $Q(z)\neq 0$, and $h\in \mathbb{Z}_{\geq 0}$ such that $f(x)=(x-z)^{2h+1}Q(x)$.

The following argument works for any root $r$ of the equation $g(x)-z=0$. Write $g(x)-z=(x-r)^qK(x)$ for some $K\in \mathbb{C}[x]$ with $K(r)\neq 0$ and $q\in \mathbb{Z}_{\geq 1}$. There exists $t\in \mathbb{R}[x]$ with $f(g(x))\equiv t(x)^2$, so $t(r)=0$, and hence $t(x)=(x-r)^kR(x)$ for some positive integer $k$ and $R\in \mathbb{C}[x]$ with $R(r)\neq 0$.

From $f(g(x))=t(x)^2$ we get $(g(x)-z)^{2h+1}Q(g(x))=t(x)^2$, and so $$(x-r)^{q(2h+1)}K(x)^{2h+1}A(x)=(x-r)^{2k}R(x)^2,$$ where $A\in \mathbb{C}[x]$ denotes the polynomial $Q(g(x))$; note that $A(r)=Q(g(r))=Q(z)\neq 0$. Differentiating this equation gives $$(x-r)^{q(2h+1)}K(x)^{2h+1}A'(x)+(2h+1)A(x)g'(x)(x-r)^{2qh}K(x)^{2h} \equiv 2(x-r)^{2k}R(x)R'(x)+2kR(x)^2(x-r)^{2k-1},$$ or in clearer form, $$(x-r)^{2qh}K(x)^{2h}\Big( (x-r)^qK(x)A'(x)+(2h+1)A(x)g'(x)\Big) =(x-r)^{2k-1}\Big( 2(x-r)R(x)R'(x)+2kR(x)^2\Big).$$ Since $R(r)\neq 0$, we have $2(r-r)R(r)R'(r)+2kR(r)^2\neq 0$, so the multiplicity of the factor $(x-r)$ on each side must equal $2k-1$. Note that $g'(x)=(x-r)^qK'(x)+qK(x)(x-r)^{q-1}$.

So $2k-1=2qh+w$, where $w$ is the multiplicity of the factor $(x-r)$ in $$(x-r)^qK(x)A'(x)+(2h+1)A(x)\Big( (x-r)^qK'(x)+qK(x)(x-r)^{q-1}\Big) =(x-r)^{q-1}\Big( (x-r)K(x)A'(x)+(2h+1)A(x)\big( (x-r)K'(x)+qK(x)\big)\Big).$$ Since $A(r)K(r)\neq 0$, the factor $(x-r)K(x)A'(x)+(2h+1)A(x)\big( (x-r)K'(x)+qK(x)\big)$ does not vanish at $x=r$, so $w=q-1$. This gives $2k-1=2qh+q-1=q(2h+1)-1$, i.e. $2k=q(2h+1)$; since $2h+1$ is odd, $q$ must be even.

We conclude that for any root $r$ of $g(x)-z=0$, the multiplicity of the factor $(x-r)$ in $g(x)-z$ is even. In other words, there exists $M\in \mathbb{C}[x]$ with $g(x)-z=M(x)^2$. Since $g(f(x))$ is the square of a real polynomial, there exists $T\in \mathbb{R}[x]$ with $g(f(x))\equiv T(x)^2$. So $$z\equiv T(x)^2-M(f(x))^2=\Big( T(x)-M(f(x))\Big) \Big( T(x)+M(f(x))\Big).$$ Clearly $\deg T(x) =\deg M(f(x))$, and the leading coefficients of $T(x)-M(f(x))$ and $T(x)+M(f(x))$ can't both be zero, so the right-hand side is non-constant unless one factor is identically zero; the only way the equation can hold is therefore $T\equiv \pm M(f)$ and $z=0$. In that case $g(x)=M(x)^2$ with $M\in \mathbb{C}[x]$.

Write $M(x)\equiv M_1(x)+iM_2(x)$ with $M_1,M_2\in \mathbb{R}[x]$. Then $g(x)\equiv \Big( M_1(x)^2-M_2(x)^2\Big) +2iM_1(x)M_2(x)$. Since $g\in \mathbb{R}[x]$, we need $M_1M_2\equiv 0$, so at least one of $M_1$ and $M_2$ must be the zero polynomial. But since $g(f(x))$ is the square of a real polynomial, $g(x)$ can't be $-M_2(x)^2$ (this would force $g(x)\equiv -M_2(x)^2\equiv 0$). Hence the only possibility left is $g(x)\equiv M_1(x)^2$, contradicting the condition on $g$. Done.
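The exponent identity $2k=q(2h+1)$ above can be spot-checked on a toy instance (my own, not from the post): take $f(x)=x-1$, so $z=1$, $2h+1=1$, $Q\equiv 1$, and $g(x)=x^2+1$, so $g(x)-z=x^2$ has the root $r=0$ with $q=2$, while $f(g(x))=x^2=t(x)^2$ with $t(x)=x$, giving $k=1$:

```python
# Spot-check of 2k = q(2h+1) from the solution above, on the toy pair
# f(x) = x - 1 (z = 1, h = 0) and g(x) = x^2 + 1 (my own example).
# Polynomials are coefficient lists [c0, c1, ...].

def mult_at_zero(coeffs):
    """Multiplicity of the root r = 0, i.e. number of initial zero c_i."""
    m = 0
    while m < len(coeffs) and coeffs[m] == 0:
        m += 1
    return m

g_minus_z = [0, 0, 1]   # g(x) - 1 = x^2
t = [0, 1]              # t(x) = x, since f(g(x)) = x^2 = t(x)^2
h = 0                   # f(x) = (x - 1)^(2h+1) with 2h + 1 = 1

q = mult_at_zero(g_minus_z)   # q = 2, and indeed even
k = mult_at_zero(t)           # k = 1
assert q % 2 == 0
assert 2 * k == q * (2 * h + 1)
```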
09.12.2017 22:24
Suppose both $f\circ g,g\circ f$ are squares; we'll prove the claim for all $f,g \in \mathbb{C}[X]$. Suppose $f$ has more than one root of odd multiplicity, say $\alpha,\alpha_1$. Note that $\gcd(g(x)+m,\,g(x)+n)=1$ for $m\not=n$ in $\mathbb{C}[X]$, and hence $g(x)-\alpha_1$ and $g(x)-\alpha$ are both squares, which is impossible: the difference of two non-constant squares cannot be the nonzero constant $\alpha-\alpha_1$. So each of $f,g$ has exactly one root of odd multiplicity, say $\alpha$ and $\beta$ respectively. Let $f=p^2\cdot(x-\alpha)$ and $g=q^2\cdot(x-\beta)$; then $f-\beta$ is a square (as $g\circ f$ is), which is wrong as it has odd degree. EDIT: This needs to exclude polynomials that are squares in $\mathbb{C}[X]$ but not in $\mathbb{R}[X]$, i.e. a negative real times the square of a polynomial, though this is easily handled ($f\circ g$ obviously isn't a square then).
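The coprimality step above, $\gcd(g(x)+m,\,g(x)+n)=1$ for $m\neq n$, holds because the two polynomials differ by a nonzero constant. A small Euclidean-algorithm check on an example pair (the polynomials and instance are my own illustration):

```python
from fractions import Fraction

def strip(p):
    """Drop zero coefficients at the leading (high-degree) end."""
    while p and p[-1] == 0:
        p = p[:-1]
    return p

def poly_mod(a, b):
    """Remainder of a modulo b; polynomials as [c0, c1, ...] lists, b != 0."""
    a = strip([Fraction(c) for c in a])
    b = strip([Fraction(c) for c in b])
    while len(a) >= len(b):
        q, shift = a[-1] / b[-1], len(a) - len(b)
        a = strip([c - q * b[i - shift] if i >= shift else c
                   for i, c in enumerate(a)])
    return a

def poly_gcd(a, b):
    """Euclidean algorithm; returns a gcd up to a constant factor."""
    while b:
        a, b = b, poly_mod(a, b)
    return a

# g(x) + 0 = x^2 + 1 and g(x) + (-2) = x^2 - 1 differ by the constant 2,
# so their gcd is a nonzero constant, i.e. the two are coprime.
d = poly_gcd([1, 0, 1], [-1, 0, 1])
assert len(d) == 1 and d[0] != 0
```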
09.12.2017 22:25
Suppose $f,g$ are non-square polynomials and $f(g(x)), g(f(x))$ are both perfect squares as polynomials.

Lemma. For any polynomials $A,B \in \mathbb{C}[x]$ with $\min \{\deg A, \deg B\} \ge 1$, the polynomial $A^2-B^2$ is non-constant unless $A^2=B^2$. (Proof) Suppose $c=(A(x)-B(x))(A(x)+B(x))$ is a constant. If $A^2 \ne B^2$, neither of $A-B$, $A+B$ is the zero polynomial, and at least one of them is non-constant (otherwise $A,B$ would both be constant); hence $\deg (A^2-B^2) \ge 1$, a contradiction!

Write $g(x)=r(x)h(x)^2$ where $r$ is square-free. Then $r(f(x))=t(x)^2$ for some $t \in \mathbb{C}[x]$. Suppose $r$ has $k>1$ roots $z_1, \dots, z_k$. Then $\gcd(f(x)-z_1, f(x)-z_2)=1$, hence we can find $a, b \in \mathbb{C}[x]$ with $f(x)-z_1=a(x)^2$ and $f(x)-z_2=b(x)^2$. Consequently, $a^2-b^2$ is a nonzero constant, contradicting the lemma. Hence $r$ has exactly one root or it is a constant polynomial.

Case 1. $r$ has exactly one root $z$. Then $f(x)-z=\ell(x)^2$ for some $\ell \in \mathbb{C}[x]$. However, $f(g(x))$ is a square polynomial too, so $z$ is the difference of two square polynomials. Again this cannot hold, by the lemma, unless $z=0$, yielding $f=\ell^2$.

Case 2. $r$ is a constant $c$. Then $c=t(x)^2$ forces $c$ to be the square of a complex number, so $g=\ell^2$ for some $\ell \in \mathbb{C}[x]$.

In either case one of $f,g$ is the square of a polynomial in $\mathbb{C}[x]$; say $f=\ell^2$ (the other case is identical with the roles of $f,g$ swapped). Now we need to show that this implies $f$ is the square of a real polynomial. Suppose it isn't. Since each root of $f$ has an even multiplicity and $f \in \mathbb{R}[x]$, it can only mean that $f=-v^2$ for some polynomial $v \in \mathbb{R}[x]$. Consequently, $-v(g(x))^2$ is the square of a real polynomial. However, as $|x| \rightarrow \infty$ we see $-v(g(x))^2<0$, yielding the desired contradiction!
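The final sign argument above can be illustrated numerically (the pair $v(x)=x$, $g(x)=x^2+1$ is my own choice): if $f=-v^2$, then $f(g(x))=-v(g(x))^2$ takes negative values, which no square of a real polynomial can match:

```python
# Illustration of the sign argument above with v(x) = x and g(x) = x^2 + 1
# (an example of my own): f = -v^2 gives f(g(x)) = -(x^2 + 1)^2 < 0 for
# every real x, while squares of real polynomials are nonnegative.

def f(x):
    return -(x * x)          # f = -v^2 with v(x) = x

def g(x):
    return x * x + 1

assert all(f(g(x)) < 0 for x in range(-1000, 1001))
```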
10.12.2017 10:52
My solution: Lemma: If $p(x)$ is a non-constant square and $a\in \mathbb{R}\setminus\{0\},$ then $p(x)-a$ isn't a square. Proof: Suppose $p(x)=q(x)^2\not= \text{const}$ and $p(x)-a=h(x)^2;$ then $a=(q(x)-h(x))(q(x)+h(x)),$ which is impossible: neither factor is the zero polynomial (else $a=0$), and at least one of $q(x)-h(x), q(x)+h(x)$ is non-constant, so the product has positive degree. Now write $f(x)=l(x)^2(x-b_1)(x-b_2)\cdots(x-b_k),$ where $l(x)\in\mathbb{R} [x]$ and $b_1,b_2,\dots,b_k\in\mathbb{R}$ are distinct. Then $f(g(x))=l(g(x))^2(g(x)-b_1)(g(x)-b_2)\cdots(g(x)-b_k),$ where $(g(x)-b_1)\cdots(g(x)-b_k)=h(x)^2$ for some $h$. Suppose $\exists t$ with $g(t)=b_1;$ then $h(t)=0,$ so $h(x)=(x-t)j(x),$ and therefore $$(x-t)^2\mid g(x)-b_1,$$ since the other factors $g(x)-b_i$ don't vanish at $t$. Hence every root of $g(x)-b_i$ has even multiplicity, i.e. $g(x)-b_i$ is a square for each $i=1,\dots,k.$ From the lemma it follows that $k=1,$ so $f(x)=l(x)^2(x-b_1)$ and $g(x)=m(x)^2+b_1$ for some real $m;$ note $b_1\neq 0,$ since otherwise $g$ itself would be a square. It follows that $g(f(x))=m(f(x))^2+b_1,$ and from the lemma $g(f(x))$ isn't a square.
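With the concrete pair $f(x)=x-1$ and $g(x)=x^2+1$ (the instance is my own), the solution's conclusion takes the shape $f(x)=l(x)^2(x-c)$ and $g(x)=m(x)^2+c$ with $l\equiv 1$, $m(x)=x$, $c=1$, and indeed $g(f(x))=(x-1)^2+1=x^2-2x+2$ is not a square:

```python
# Checking the last step of the solution above on the pair f(x) = x - 1,
# g(x) = x^2 + 1 (my own instance): g(f(x)) = x^2 - 2x + 2. Matching it
# against (x + c)^2 = x^2 + 2cx + c^2 coefficient by coefficient fails.

c = -2 / 2            # forced by the linear coefficient: 2c = -2
assert c ** 2 != 2    # but then the constant coefficient cannot match
```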
26.11.2021 18:23
I read about the lemma below from am09's post in Kurschak Competition 2016 p3, so I am not attaching a proof. LEMMA: For non-constant polynomials $A,B$, $\deg(A^2-B^2)>0$ unless $A^2=B^2$.

Let $f=A^2\cdot B$ and $g=P^2\cdot Q$, where all roots of $B$ and $Q$ have multiplicity $1$ and $Q,B$ are non-constant. Then $B(P(x)^2Q(x))$ is a perfect square, and $$B(P(x)^2Q(x))=\prod_i \big(P(x)^2Q(x)-a_i\big),$$ where the $a_i$ are the roots of $B$. Since $\gcd\big(P(x)^2Q(x)-a_i,\ P(x)^2Q(x)-a_j\big)=1$ for $i\neq j$, there exists a sequence of polynomials $t_i$ such that $$P(x)^2Q(x)-a_i=t_i(x)^2.$$ By the lemma (since none of the $t_i$ can be constant), only one $a_i$ can occur in the factorization of $B$; then $B(x)=x-a$. FTSOC, let $g(f(x))$ be the square of a real polynomial; then likewise $Q(x)=x-b$. This implies $$P(x)^2(x-b)=H(x)^2-a,$$ a clear contradiction because the degrees of the two sides have different parities.
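The closing parity argument is a one-line degree count: $\deg\big(P(x)^2(x-b)\big)=2\deg P+1$ is always odd, while $\deg\big(H(x)^2-a\big)=2\deg H$ is always even. A trivial check of this arithmetic:

```python
# The degree-parity contradiction above, spelled out: the left side
# P(x)^2 (x - b) has odd degree 2*deg(P) + 1, while the right side
# H(x)^2 - a has even degree 2*deg(H), so the two can never be equal.

for deg_P in range(50):
    for deg_H in range(1, 50):
        assert (2 * deg_P + 1) % 2 == 1
        assert (2 * deg_H) % 2 == 0
        assert 2 * deg_P + 1 != 2 * deg_H
```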
26.10.2023 18:45
Does the following work?

Claim: For distinct real numbers $a$ and $b$ and a polynomial $P$, the equations $P(x) = a$ and $P(x) = b$ together have at least $\deg P + 1$ distinct roots.

Proof: Let $S_a$ and $S_b$ denote the sets of roots. Assume there are at most $n$ distinct roots in total, where $n$ is the degree of $P$. Counted with multiplicity the two equations have $2n$ roots, so there must be at least $n$ repetitions in total. But every repeated root is also a root of the derivative $P'(x)$, counted with one lower multiplicity, and $P'$ has at most $n-1$ roots, a contradiction.

Now $g(x)-\alpha$ must be a perfect square for every root $\alpha$ of $f$ of odd multiplicity. If the degree of $g$ is $n$, then $g(x) = \alpha$ has at most $\frac{n}{2}$ distinct roots. So by the claim, $g(x) = \beta$ for any other $\beta$ must have at least $\frac{n+2}{2}$ distinct roots, making $g(x)-\beta$ a non-square. Therefore, there is at most one such $\alpha$. This means we can write $f = (x-\alpha)f_1(x)^2$ for some real polynomial $f_1$, so $f$ has odd degree. If $g(f(x))$ is also a perfect square, the same argument shows $g$ has odd degree too; but then $f(g(x))$ has odd degree $\deg f \cdot \deg g$ and cannot be a perfect square, a contradiction.

Edit: Comparing with the other AoPS solutions, my proof is quite similar; I only used an alternate method to prove the lemma.
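The root-counting claim is sharp: for $P(x)=x^3-3x$ (my own example) with $a=2$ and $b=-2$, one finds $P(x)-2=(x+1)^2(x-2)$ and $P(x)+2=(x-1)^2(x+2)$, giving exactly $\deg P+1=4$ distinct roots in total:

```python
# A sharpness check of the root-counting claim (example mine):
# P(x) = x^3 - 3x with a = 2, b = -2 has P - 2 = (x + 1)^2 (x - 2) and
# P + 2 = (x - 1)^2 (x + 2), so exactly deg P + 1 = 4 distinct roots.

def P(x):
    return x ** 3 - 3 * x

roots_a = {x for x in range(-10, 11) if P(x) == 2}    # all roots are integers here
roots_b = {x for x in range(-10, 11) if P(x) == -2}
assert roots_a == {-1, 2} and roots_b == {1, -2}
assert len(roots_a | roots_b) == 3 + 1                # deg P + 1: the bound is tight
```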