Find all nonconstant polynomials $P(z)$ with complex coefficients for which all complex roots of the polynomials $P(z)$ and $P(z) - 1$ have absolute value 1. Ankan Bhattacharya
Problem
Source: USA TSTST 2020 Problem 7
Tags: algebra, USA TST, polynomial, complex numbers
25.01.2021 20:02
Let $u \neq v$ be distinct complex numbers with $|u| = |v| = 1$. We claim that all such polynomials must be of the form $$P(z) = \frac{1}{u-v}z^n - \frac{v}{u-v}.$$ It is easy to verify that these work. Now we show the other direction.

Let $P$ have roots $z_1, z_2, \dots, z_n$ and $Q = P - 1$ have roots $z_1', z_2', \dots, z_n'$. Since $Q(z_i) = P(z_i) - 1 = -1$ for every $i$, we observe that $$1 = \frac{Q(z_i)}{Q(z_1)} = \frac{c\prod_{j=1}^{n} (z_i - z_j')}{c\prod_{j=1}^{n} (z_1 - z_j')} = \prod_{j=1}^{n} \frac{z_i - z_j'}{z_1 - z_j'}.$$ Now take the argument of both sides: $$0 \equiv \text{arg}\left(\prod_{j=1}^{n} \frac{z_i - z_j'}{z_1 - z_j'}\right) \equiv \sum_{j=1}^{n} \text{arg}\left(\frac{z_i - z_j'}{z_1 - z_j'}\right) \equiv \sum_{j=1}^{n} \frac{1}{2}\text{arg}\left(\frac{z_i}{z_1}\right) \equiv \frac{n}{2} \text{arg}\left(\frac{z_i}{z_1}\right) \bmod{\pi},$$ where the important middle step follows from the Inscribed Angle Theorem, since all $z_i$ and $z_i'$ lie on the unit circle. We therefore conclude that the $z_i$ are spaced out at multiples of $\tfrac{2\pi}{n}$ radians from each other. Thus either the $z_i$ form a regular $n$-gon or there exist multiple roots.

Ugly sideshow on multiple roots: as our argument holds for any $z$ on the unit circle, not just the $z_i$, we have in fact proved that $F(t) = \text{arg}(Q(e^{it}))$ is nonconstant and linear in $t$. So if $z_0 = e^{it_0}$ is a multiple root of $Q(z) = -1$, then it is also a multiple root of $Q(e^{it}) = -1$ and therefore also a multiple root of $F(t) = \text{arg}(Q(e^{it})) = 0$. But $F'(t_0) \neq 0$, so it cannot in fact be a multiple root. To check that we may pass from $Q$ to $\text{arg}(Q)$ here, one may simply compute the first derivatives of each in polar form and note that the first being zero forces the second to be zero as well.

Thus the $z_i$ do form a regular $n$-gon, and the same logic holds for the $z_i'$ as well. We see then that there exist distinct complex numbers $u$, $v$ of magnitude $1$ so that $z_i^n = v$ while $z_i'^n = u$. With some algebra, this directly implies the result. $\blacksquare$
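A quick numerical sanity check of the claimed family, for what it's worth. The snippet below is only a sketch (it assumes numpy, and the particular $u$, $v$, $n$ are arbitrary choices), but it confirms that both $P$ and $P-1$ have all their roots on the unit circle.

```python
import numpy as np

n = 5
u, v = np.exp(2.6j), np.exp(0.8j)        # any two distinct points on the unit circle

# P(z) = (z^n - v)/(u - v), so P(z) - 1 = (z^n - u)/(u - v)
P = np.zeros(n + 1, dtype=complex)
P[0], P[-1] = 1 / (u - v), -v / (u - v)  # leading and constant coefficients
P1 = P.copy()
P1[-1] -= 1                              # coefficients of P(z) - 1

print(np.abs(np.roots(P)))    # all entries ~ 1
print(np.abs(np.roots(P1)))   # all entries ~ 1
```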
25.01.2021 20:07
Heh... cute. Notice that the polynomial obtained by conjugating the coefficients of $P$ and the polynomial obtained by reversing the coefficients of $P$ have the same multiset of roots (both consist of the reciprocals of the roots of $P$, using that every root has absolute value $1$), and thus there is $\omega \in \mathbb{C}$ such that $a_{n-k} = \omega \overline{a_k}$, where $a_i$ is the coefficient of $x^i$ in $P$. Since $a_k = \omega \overline{a_{n-k}} = \omega \overline{\omega}a_k$, we have $|\omega| = 1$. Similarly, there exists $\omega'$ for the polynomial $P - 1$. Since the leading coefficients of $P$ and $P - 1$ agree while their constant terms differ, we have $\omega \neq \omega'$. But this implies that all coefficients of $P$ except the first and last are $0$: for $1 \le k \le n-1$ the coefficients $a_k$ and $a_{n-k}$ are shared by $P$ and $P-1$, so $\omega\overline{a_k} = a_{n-k} = \omega'\overline{a_k}$ forces $a_k = 0$. The answer is now easy to extract.
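The coefficient relation here is easy to spot-check numerically. A rough sketch (assuming numpy; the random polynomial below is purely illustrative): build a polynomial from points on the unit circle, read a candidate $\omega$ off the outer coefficients, and compare the reversed coefficient list against $\omega$ times the conjugated one.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
roots = np.exp(1j * rng.uniform(0, 2 * np.pi, n))    # n points on the unit circle
a = (2 - 1j) * np.poly(roots)                        # a[0] x^n + a[1] x^(n-1) + ... + a[n]

omega = a[-1] / np.conj(a[0])                        # candidate omega from the outer coefficients
print(np.abs(omega))                                 # ~ 1
print(np.max(np.abs(a[::-1] - omega * np.conj(a))))  # ~ 0: reversed coefficients equal omega times conjugated ones
```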
26.01.2021 00:33
I think this works. Let $P=a_nx^n+a_{n-1}x^{n-1}+\cdots + a_0$ with roots $\alpha_1,\alpha_2,\cdots$ and let $P-1$ have roots $\beta_1,\beta_2,\cdots$. As all roots have absolute value $1$, taking absolute values of the products of the roots of $P$ and of $P-1$ gives $|a_0|=|a_n|=|a_0-1|$. Thus $a_0$ is equidistant from $0$ and $1$, so $Re(a_0)=0.5$, and hence $a_0=\overline{1-a_0}$.

Claim: $\forall i\in \{1,2,\cdots, n-1\}, a_i=0$.

Proof: Suppose $a_i\ne 0$. Comparing the coefficient of $x^i$ in $P$ and $P-1$, we have \[\sum_{1\le x_1<x_2\cdots <x_{n-i}\le n} \alpha_{x_1}\alpha_{x_2}\cdots \alpha_{x_{n-i}}=\sum_{1\le x_1<x_2\cdots <x_{n-i}\le n} \beta_{x_1}\beta_{x_2}\cdots \beta_{x_{n-i}},\]and this common value is nonzero since $a_i\ne 0$. But we also have $\alpha_1\alpha_2\cdots \alpha_n\ne \beta_1\beta_2\cdots \beta_n$. Thus, \[\sum_{1\le x_1<x_2\cdots <x_i\le n} \frac{1}{\alpha_{x_1}\alpha_{x_2}\cdots \alpha_{x_i}}\ne \sum_{1\le x_1<x_2\cdots <x_{i}\le n} \frac{1}{\beta_{x_1}\beta_{x_2}\cdots \beta_{x_{i}}}.\]Now take conjugates, using $\overline{\alpha_j}=\frac{1}{\alpha_j}$ and the same for $\beta_j$: \[ \sum_{1\le x_1<x_2\cdots <x_i\le n} \alpha_{x_1}\alpha_{x_2}\cdots \alpha_{x_i}\ne \sum_{1\le x_1<x_2\cdots <x_{i}\le n} \beta_{x_1}\beta_{x_2}\cdots \beta_{x_{i}}.\]But this cannot happen, as the coefficients $a_{n-i}$ and $a_n$ are common to $P$ and $P-1$. Hence $a_i=0$ and the claim follows.

Now we just have that $P=ax^n+0.5+bi$ with $|0.5+bi|=|a|$, and the answer is easy to extract.
26.01.2021 08:35
The answer is that $P(x)$ should be a polynomial of the form $P(x) = \lambda x^n - \mu$ where $|\lambda| = |\mu|$ and $\operatorname{Re} \mu = -\frac{1}{2}$. One may check these all work; let's prove they are the only solutions. We introduce the following notation: \begin{align*} P(x) &= c_n x^n + c_{n-1}x^{n-1} + \dots + c_1 x + c_0 \\ &= c_n (x+\alpha_1) \dots (x+\alpha_n) \\ P(x)-1 &= c_n (x+\beta_1) \dots (x+\beta_n) \end{align*}Dividing by $c_n$ and then taking conjugates (using $\overline{\alpha_i} = \alpha_i^{-1}$ and $\overline{\beta_i} = \beta_i^{-1}$), \begin{align*} (x + \alpha_1) \cdots (x + \alpha_n) &= (x + \beta_1) \cdots (x + \beta_n) + c_n^{-1} \\ \implies \left(x + \frac{1}{\alpha_1}\right) \cdots \left(x + \frac{1}{\alpha_n}\right) & = \left(x + \frac{1}{\beta_1}\right) \cdots \left(x + \frac{1}{\beta_n}\right) + (\overline{c_n})^{-1} \qquad (\spadesuit) \end{align*}The equation $(\spadesuit)$ is the main player: Claim: We have $c_k = 0$ for all $k = 1, \dots, n-1$. Proof. By comparing coefficients of $x^k$ in $(\spadesuit)$ we obtain \[ \frac{c_{n-k}}{\prod_i \alpha_i} = \frac{c_{n-k}}{\prod_i \beta_i} \]but $\prod_i \alpha_i - \prod_i \beta_i = \frac{1}{c_n} \neq 0$. Hence $c_{n-k} = 0$; as $k$ ranges over $1, \dots, n-1$, so does $n-k$. $\blacksquare$ It follows that $P(x)$ must be of the form $P(x) = \lambda x^n - \mu$, so that $P(x) - 1 = \lambda x^n - (\mu + 1)$. This requires $|\mu| = |\mu+1| = |\lambda|$, which is equivalent to the stated conditions.
26.01.2021 22:20
This problem can be solved straightforwardly if one knows how to characterize the polynomials with complex coefficients that have all roots lying on the unit circle. If one knows that there is a constraint the coefficients comply with, there will most probably be success, since this is the key idea.

Claim. Let $ P(z)=a_0z^n+a_1z^{n-1}+\dots + a_n, a_k\in\mathbb{C}$ be a polynomial with all roots lying on the unit circle $ |z|=1.$ Then there exists $ \omega\in\mathbb{C}, |\omega|=1$ such that $ a_{n-k}=\omega \overline{a_k}, k=0,1,\dots,n.$

The proof follows below, but first let us show how this claim helps. Let $ a_k, b_k,k=0,1,\dots,n$ be the coefficients of $ P(z),$ resp. $ P(z)-1.$ So, $ b_k=a_k,k=0,1,\dots,n-1$ and $ b_n=a_n-1.$ Also $ a_{n-k}=\omega_1 \overline{a_k}, b_{n-k}=\omega_2 \overline{b_k},k=0,1,\dots,n, |\omega_1|=|\omega_2|=1. $ Since $ a_0=b_0\ne 0$ and $ a_n\neq b_n$ it follows that $ \omega_1\neq \omega_2.$ Then $ a_{n-k}=\omega_1 \overline{a_k}, k=1,2,\dots,n-1$ and $ a_{n-k}=\omega_2 \overline{a_k}, k=1,2,\dots,n-1,$ which means $ a_1=a_2=\dots=a_{n-1}=0.$ Thus $ P(z)=a_0z^n+a_n,$ which easily finishes the rest of the problem.

Proof of the Claim. The idea is to apply a Möbius transform that converts a polynomial with all roots lying on $ |z|=1$ into a polynomial with all roots being real. So, we can use it to obtain some constraints on the coefficients of the initial polynomial. Let $$ \displaystyle f: \mathbb{C}\to \mathbb{C}, f(z):=\frac{z-i}{z+i}.$$Note that $ f$ sends the real line $ \{z:z\in\mathbb{R}\}$ onto the unit circle $ \{z:|z|=1, z\ne 1\}$. Consider $ \displaystyle P_1(z):=P\big( f(z)\big).$ So, when $ z$ runs through $ \mathbb{R},$ $ f(z)$ runs through the unit circle, and since $ P$ has $ n$ zeroes there, $ P_1$ has $ n$ real roots (with their multiplicities). Further, $$ \displaystyle P_1(z)=\frac{a_0(z-i)^n +a_1(z-i)^{n-1}(z+i)+\dots +a_n(z+i)^n}{(z+i)^n}$$Thus, the following polynomial $$ Q(z):=a_0(z-i)^n +a_1(z-i)^{n-1}(z+i)+\dots + a_n(z+i)^n\qquad (1)$$has $ n$ real roots, hence $ Q(z)=aR(z)$, where $ a\in\mathbb{C}$ and $ R$ has real coefficients. It yields $$ \displaystyle Q(z)= \omega \overline{Q(\overline{z})} \qquad (2)$$where $ \displaystyle \omega=\frac{a}{\overline a}$ (hence $ |\omega|=1$). Expanding the condition $ (2),$ it follows that $$ Q(z)=\displaystyle \omega \overline{a_n}(z-i)^n +\omega \overline{a_{n-1}}(z-i)^{n-1}(z+i)+\dots + \omega \overline{a_0}(z+i)^n.$$Now, comparing this with $ (1),$ and because the terms (polynomials) $ t_k(z):=(z-i)^{n-k}(z+i)^k,k=0,1,\dots,n$ are linearly independent, it yields $ a_{n-k}=\omega \overline{a_k}, k=0,1,\dots,n.$

Remark. Some additional comments (there is a small issue with the above proof, which was omitted here), as well as a proof of why the $ t_k (z) $ are linearly independent, can be found in my blog.
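A numerical illustration of the claim's mechanism, for what it's worth. The sketch below (assuming numpy; the sample polynomial is an arbitrary one with unit-circle roots, which almost surely avoids the edge case $P(1)=0$ alluded to in the remark) builds $Q$ as in $(1)$ and checks that its roots are essentially real.

```python
import numpy as np

def power_poly(root, m):
    # coefficients (highest degree first) of (z - root)^m
    p = np.array([1.0 + 0j])
    for _ in range(m):
        p = np.polymul(p, [1.0, -root])
    return p

rng = np.random.default_rng(1)
n = 5
a = (3 + 2j) * np.poly(np.exp(1j * rng.uniform(0, 2 * np.pi, n)))   # a[k] = coeff of z^(n-k), as in the post

# Q(z) = sum_k a[k] * (z - i)^(n-k) * (z + i)^k, as in (1)
Q = sum(a[k] * np.polymul(power_poly(1j, n - k), power_poly(-1j, k)) for k in range(n + 1))

print(np.max(np.abs(np.roots(Q).imag)))   # ~ 0: Q has only real roots
```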
28.01.2021 01:02
This problem seems to give a proof of the following interesting claim (the exact value of $1$ doesn't matter, since you can always scale). Claim. Let $\mathcal S$ be a set of $2n$ (distinct?) complex numbers with magnitude $1$. Suppose $\mathcal S$ can be split into two groups of $n$ complex numbers whose first $n-1$ power sums all coincide (that is, the sums are equal, the sums of the squares are equal, and so on). Then these power sums all equal zero. Much to my chagrin, it looks like this doesn't actually make the original problem any easier.
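For what it's worth, the claim is easy to test numerically on the two groups of roots coming from the problem itself. A small sketch (assuming numpy; the polynomial below is an arbitrary member of the solution family described above):

```python
import numpy as np

n = 6
d = 0.5 + 1.5j                      # any constant term with real part 1/2
c = abs(d) * np.exp(0.7j)           # any leading coefficient with |c| = |d|

r = np.roots([c] + [0] * (n - 1) + [d])        # roots of P(z) = c z^n + d
s = np.roots([c] + [0] * (n - 1) + [d - 1])    # roots of P(z) - 1

for k in range(1, n):               # the first n-1 power sums of each group
    print(k, np.round(np.sum(r**k), 10), np.round(np.sum(s**k), 10))
# every printed sum is ~ 0, consistent with the claim
```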
07.02.2021 16:30
Posting $\#7, \#8, \#9$ for momentum --- I'll enjoy my free time while I can. This third test definitely has the hardest problems (and here I claim that A and C are my favorite subjects ...) When the statement is in symmetric polynomials but doesn't work from an algebraic standpoint --- can't help but consider a $\textit{polynomial to be geometric}$.

$\color{green} \rule{25cm}{2pt}$

$\color{green} \textbf{Conjugates and Zeroes.}$ First we prove that all coefficients aside from the leftmost and rightmost are equal to zero. Assume that there exists a polynomial $P(x) = c(x-r_1)(x-r_2) \ldots (x-r_n)$ with all $r_i \in \mathbb{C}, |r_i| = 1$ so that $S(x) = P(x)-1 = c(x-s_1)\ldots(x-s_n)$. So, \begin{align*} e_k(r_1,r_2,\ldots,r_n) &= \text{sum of all products of}\ r_i \ \text{with} \ k \ \text{terms} \\ &= \text{previous statement, but with} \ r_i \ \text{replaced by} \ s_i \\ &= e_k(s_1,s_2,\ldots,s_n) \end{align*}for each $1 \leq k \leq n-1$. Here $e_k$ denotes the $k^{th}$ elementary symmetric polynomial with $n$ variables. Assume there exists $e_k \ne 0$. We now consider the value of $\overline{e_k(r_1,r_2,\ldots,r_n)}=\overline{e_k(s_1,s_2,\ldots,s_n)}$. (From this point on, write the symmetric polynomials' values as $e_k(r),e_k(s)$ and $\overline{e_k(r)},\overline{e_k(s)}$ for simplicity.) The main observation is that $e_k$ is the sum of $\binom{n}{k}$ terms with absolute value $1$; and $e_{n-k}$ also consists of $\binom{n}{k}$ terms with absolute value $1$. Indeed, the conjugate of any term, as a product of $k$ roots, is the product of the remaining $n-k$ roots divided by the product of all $n$ roots --- and the addition of conjugates is Cauchily-separable! (sorry, couldn't find a better term for this) So, \[ \dfrac{e_{n-k}(r)}{e_n(r)}=\overline{e_k(r)}=\overline{e_k(s)}=\dfrac{e_{n-k}(s)}{e_n(s)} \]As we know that $e_n(r) = e_n(s)+\dfrac{(-1)^{n}}{c} \ne e_n(s)$, while $e_{n-k}(r)=e_{n-k}(s)$ by the same coefficient comparison as before, we can infer that $e_{n-k}(r)=e_{n-k}(s)=0$. Thus, $\overline{e_k(r)}=\overline{e_k(s)}$ are also zero, and $e_k(r)=0$. So, all terms except for the leading term and $e_n(r),e_n(s)$ are zero. $\blacksquare$

$\color{red} \rule{25cm}{2pt}$

$\color{red} \textbf{Finishing.}$ We now only work with \[ P(x) = z_1x^n+z_2 \]So, in order for all roots of $P(x)$ to have absolute value $1$, $e_n(r) = (-1)^n\dfrac{z_2}{z_1}$ needs to have absolute value $1$. Here note that $e_n(r)$ having absolute value $1$ is both a necessary and sufficient condition, since all roots of $z_1x^n+z_2$ share the same absolute value. In the same way, $e_n(s) = (-1)^n\dfrac{z_2-1}{z_1}$ needs to have absolute value $1$. The final set of solutions is all $z_1x^n+z_2$ where the three complex numbers $z_1$, $z_2$ and $z_2-1$ have equal absolute values. A general construction can be obtained by first sketching the value of $z_2$, followed by selecting a $z_1$ which has absolute value equal to the already established $|z_2|$. $\blacksquare$ $\blacksquare$ $\blacksquare$
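The conjugation identity driving this argument, $\overline{e_k(r)} = e_{n-k}(r)/e_n(r)$ for unit-modulus $r_i$, can be spot-checked numerically. A small sketch (assuming numpy; the points are random and only for illustration), reading the elementary symmetric values off the coefficients of $\prod (x - r_i)$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 7
r = np.exp(1j * rng.uniform(0, 2 * np.pi, n))        # n points on the unit circle

coeffs = np.poly(r)                                  # monic; coefficient of x^(n-k) is (-1)^k e_k(r)
e = [(-1) ** k * coeffs[k] for k in range(n + 1)]    # e[k] = e_k(r), with e[0] = 1

for k in range(1, n):
    print(k, np.abs(np.conj(e[k]) - e[n - k] / e[n]))   # ~ 0 for every k
```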
12.02.2021 13:03
The problem relies on the necessity (not sufficiency!) of $P(z) = \omega \cdot z^n \overline{P}(z^{-1})$ for polynomials $P$ with all roots on the unit circle. It looks like these polynomials are known as self-inversive polynomials in the literature (see Cohn's theorem & On the zeroes of self-inversive polynomials). The second link gives an iff condition for polynomials whose roots all lie on the unit circle.
14.03.2021 04:09
darn i just spent 45 minutes trying to re-find a sol. I claim our answer is all $P(z) = a_1z^n + a_2$ for $n \in \mathbb{Z}^+$ and $a_2$ with real part $\tfrac12$ and any $|a_1| = |a_2|$.

First, we show that all of the $z^1 \to z^{n-1}$ coefficients are $0$. Write $P(z) = a_1(z - w_1)\ldots (z - w_n)$ and $P(z) - 1 = a_1(z - u_1)\ldots (z - u_n)$, where\[\sum_{sym} w_1w_2\ldots w_k = \sum_{sym} u_1u_2\ldots u_k\]for every $k \in [1, n-1]$ and $w_1w_2\ldots w_n = u_1u_2\ldots u_n \pm \tfrac{1}{a_1}$ depending on the parity of $n$. Consider\[Q(z) = \overline{P(\overline{z})} = \overline{a_1}\left(z - \tfrac{1}{w_1}\right)\ldots \left(z - \tfrac{1}{w_n}\right)\]and\[Q(z) - 1 = \overline{P(\overline{z})} - 1 = \overline{P(\overline{z}) - 1} = \overline{a_1}\left(z - \tfrac{1}{u_1}\right)\ldots \left(z - \tfrac{1}{u_n}\right).\]Subtract to get that\[\frac{1}{w_1w_2\ldots w_n}\left(\sum_{sym} w_1w_2 \ldots w_{n-k}\right) = \sum_{sym} \frac{1}{w_1w_2\ldots w_k} = \sum_{sym} \frac{1}{u_1u_2\ldots u_k} = \frac{1}{u_1u_2\ldots u_n}\left(\sum_{sym} u_1u_2 \ldots u_{n-k}\right)\]for all $k \in [1, n-1]$. But $w_1w_2\ldots w_n \neq u_1u_2\ldots u_n$, and the two sums\[\sum_{sym} w_1w_2 \ldots w_{n-k} = \sum_{sym} u_1u_2 \ldots u_{n-k}\]are equal by the very first display, hence both are $0$. Thus the $z^{n-k}$ coefficient in $P$ is $0$. This holds for all $k \in [1, n-1]$, so indeed, all $z^1 \to z^{n-1}$ coefficients are $0$.

Next, we show the necessary constraints on $a_1, a_2$. Indeed, $|a_1| = |a_2|$ is necessary, else the roots have magnitude $|a_2/a_1|^{1/n} \neq 1$. Furthermore, since the constant term of $P - 1$ must also have absolute value $|a_1|$, we need $|a_2| = |a_2 - 1|$, so we must have $\text{Re}(a_2) = \tfrac 12$. It is clear that all of the polynomials of the aforementioned form work, hence these are our only answers. $\blacksquare$
30.04.2021 18:18
The answer is that $P(x)$ should be a polynomial of the form $P(x) = \lambda x^n - \mu$ where $|\lambda| = |\mu|$ and $\operatorname{Re} \mu = -1/2$. One may check these all work; let's prove they are the only solutions. We introduce the following notation: \begin{align*} P(x) &= c_n x^n + c_{n-1}x^{n-1} + \dots + c_1 x + c_0 = c_n (x+\alpha_1) \dots (x+\alpha_n) \\ P(x)-1 &= c_n (x+\beta_1) \dots (x+\beta_n) \\ & (x+\alpha_1)\cdots(x+\alpha_n) - (x+ \beta_1)\cdots(x+\beta_n) = c_n^{-1} &~~~~ (1) \end{align*}Taking $x=0$ in $(1)$ gives $\alpha_1 \cdots \alpha_n - \beta_1 \cdots \beta_n = c_n^{-1}$. So if we consider the complex numbers $\alpha_1 \cdots \alpha_n$ and $\beta_1 \cdots \beta_n$, they have the same magnitude and their difference is a real number, so we get that $\alpha_1 \cdots \alpha_n$ and $-\beta_1 \cdots \beta_n$ are conjugates of each other, which also yields $\alpha_1 \cdots \alpha_n \cdot \beta_1 \cdots \beta_n = -1$. Let $\omega = e^{2\pi i/n}$. Fix any $\delta \in \{1,\omega,\omega^2,\cdots,\omega^{n-1}\}$. Then $\delta^n = 1$ and $|\delta|=1$. Putting $x = \delta$ in $(1)$ and taking conjugates on both sides gives \begin{align*} (\delta + \alpha_1)\cdots (\delta + \alpha_n) - (\delta + \beta_1)\cdots(\delta + \beta_n) = c_n^{-1} &~~~~~~ (2) \\ \implies \left( \dfrac{1}{\delta }+\dfrac{1}{\alpha _{1}}\right) \ldots \left( \dfrac{1}{\delta }+\dfrac{1}{\alpha _{n}}\right) - \left( \dfrac{1}{\delta }+\dfrac{1}{\beta _{1}}\right) \ldots \left( \dfrac{1}{\delta }+\dfrac{1}{\beta _{n}}\right) = c_n^{-1} \end{align*}Multiplying both sides by $\delta^n \cdot \alpha_1 \cdots \alpha_n \cdot \beta_1 \cdots \beta_n$ and using $\delta^n = 1 ~;~ \alpha_1 \cdots \alpha_n \cdot \beta_1 \cdots \beta_n = -1$, we get \begin{align*} \beta_1 \cdots \beta_n (\delta + \alpha_1) \cdots (\delta + \alpha_n) - \alpha_1 \cdots \alpha_n (\delta + \beta_1) \cdots (\delta + \beta_n) = -c_n^{-1} & ~~~~~ (3) \end{align*}So using $(2),(3)$ we obtain that $(\delta + \alpha_1) \cdots (\delta + \alpha_n) = \alpha_1 \cdots \alpha_n + 1$. This holds for all $\delta \in \{1,\omega,\cdots,\omega^{n-1} \}$. Hence $$Q(x) = (x+\alpha_1)\cdots (x + \alpha_n) - (\alpha_1 \cdots \alpha_n + 1) = P(x)/c_n - (\alpha_1 \cdots \alpha_n + 1) = (x-1)(x-\omega) \cdots (x - \omega^{n-1}) = x^n - 1$$Thus, $P(x) = c_n x^n + c_n \cdot \alpha_1 \cdots \alpha_n$. Clearly, $|c_n| = |c_n \cdot \alpha_1 \cdots \alpha_n|$. Also, as $c_n \cdot \alpha_1 \cdots \alpha_n - c_n \cdot \beta_1 \cdots \beta_n = 1$, and since $|c_n \cdot \alpha_1 \cdots \alpha_n| = |c_n \cdot \beta_1 \cdots \beta_n|$, we get that $\operatorname{Re} (c_n \cdot \alpha_1 \cdots \alpha_n) = 1/2$, so $P(x)$ is of the stated form, as desired.$\blacksquare$ Can someone please check my solution and tell whether it is correct or not?
01.01.2022 02:03
Let $n=\deg P$, and define complex numbers $a$ and $r_1,\ldots,r_n$ and $s_1,\ldots,s_n$ such that \begin{align*} P(z) &= a(z-r_1)\cdots (z-r_n) \\ P(z)-1 &= a(z-s_1)\cdots (z-s_n). \end{align*}Define $f_i(x_1,\ldots,x_n)$ to be the coefficient of $z^i$ in $(z-x_1)\cdots(z-x_n)$. Then \begin{align*} P(z) &= \sum_{i=0}^n af_i(r_1,\ldots,r_n)z^i, \\ P(z)-1 &= \sum_{i=0}^n af_i(s_1,\ldots,s_n) z^i. \end{align*}Hence \[ f_i(r_1,\ldots,r_n) = f_i(s_1,\ldots,s_n) \quad \forall i\in [1,n]. \qquad (\diamondsuit)\]

Define $Q(\overline{z})=\overline{P(z)}$ to be a polynomial in $\overline{z}$. Since $|r_1|=\cdots=|r_n|=1$, this is $Q(\overline{z})=\overline{a}(\overline{z}-\tfrac{1}{r_1})\cdots (\overline{z}-\tfrac{1}{r_n})$. Note $\overline{P(z)-1}=\overline{P(z)}-1=Q(\overline{z})-1$. Hence \[ \overline{a}\left(\overline{z}-\tfrac{1}{s_1}\right)\cdots \left(\overline{z}-\tfrac{1}{s_n}\right)=\overline{a}\left(\overline{z}-\tfrac{1}{r_1}\right)\cdots \left(\overline{z}-\tfrac{1}{r_n}\right)-1. \]Therefore, just as before, \[ f_i\left(\tfrac{1}{s_1},\ldots,\tfrac{1}{s_n}\right) = f_i\left(\tfrac{1}{r_1},\ldots,\tfrac{1}{r_n}\right) \quad \forall i\in [1,n]. \qquad (\heartsuit)\]

It is a well-known fact about symmetric sums that \[ f_{n-i}\left(\tfrac{1}{x_1},\ldots,\tfrac{1}{x_n}\right) = f_i(x_1,\ldots,x_n)\cdot (-1)^n \cdot \frac{1}{x_1\cdots x_n} \]for any nonzero $x_1,\ldots,x_n$. Fix some $i\in [1,n-1]$. By $(\heartsuit)$, $f_{n-i}(1/s_1,\ldots,1/s_n)=f_{n-i}(1/r_1,\ldots,1/r_n)$. Then by the above fact, \[ f_i(r_1,\ldots,r_n)\cdot (-1)^n\cdot \frac{1}{r_1\cdots r_n} = f_i(s_1,\ldots,s_n)\cdot (-1)^n\cdot \frac{1}{s_1\cdots s_n}. \]Now $(\diamondsuit)$ implies $f_i(r_1,\ldots,r_n)=f_i(s_1,\ldots,s_n)$, so unless this common value is $0$, cancelling gives $r_1\cdots r_n=s_1\cdots s_n$. But the constant coefficient of $P(z)$ is $a(-1)^n r_1\cdots r_n$, and the constant coefficient of $P(z)-1$ is $a(-1)^n s_1\cdots s_n$, which is a contradiction: they would then be equal, but they should be $1$ apart. Therefore, $f_i(r_1,\ldots,r_n)=0$ for all $i\in [1,n-1]$.

Hence $P(z)=az^n+b$ for some $a,b\in \mathbb{C}$. Now it is easy to see that any $P(z)$ of this form works iff $|a|=|b|=|b-1|$, i.e. $|a|=|b|$ and $\text{Re}(b)=1/2$.
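The "well-known fact" holds for arbitrary nonzero $x_i$ (no unit-circle hypothesis is needed) and is easy to spot-check. A sketch assuming numpy, with $f_i$ read off as the coefficient of $z^i$ in $(z-x_1)\cdots(z-x_n)$; the random points are only for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
x = rng.normal(size=n) + 1j * rng.normal(size=n)   # arbitrary nonzero complex numbers

def f(vals, i):
    # coefficient of z^i in (z - vals[0]) ... (z - vals[-1])
    return np.poly(vals)[len(vals) - i]

for i in range(n + 1):
    lhs = f(1 / x, n - i)
    rhs = f(x, i) * (-1) ** n / np.prod(x)
    print(i, np.abs(lhs - rhs))   # ~ 0 for every i
```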
01.04.2024 00:14
The answer is $P(x)=cx^n+d$ where $d=\frac 12 + yi$ for real $y$ and $|c|=|d|$; one may check that these work. Let $P(z)=a_nz^n+\dots+a_1z+a_0$. Define \[\overline{P}(z)=a_0z^n+a_1z^{n-1}+\dots+a_n=z^nP\left(\frac 1z\right).\]If $r$ is a root of $\overline P$, then $P\left(\frac 1r\right)=0$, so $\frac 1r$ is a root of $P$; since every root of $P$ has absolute value $1$, this gives $|r|=1$ and $\frac 1r=\overline{r}$. Thus the roots of $\overline P$ are exactly the conjugates of $P$'s roots, so $\overline P$ is a constant multiple of the polynomial with conjugated coefficients, implying that $a_i=C_1\overline{a_{n-i}}$ for some fixed complex $C_1$. A similar relation holds for $P(z)-1$: if $i\neq 0, n$ then $a_i=C_2\overline{a_{n-i}}$ for some fixed complex $C_2$, and also $a_n=C_2\overline{a_0-1}$ and $a_0-1 =C_2\overline{a_n}$. As \[C_2=\frac{a_n}{\overline{a_0-1}} \neq \frac{a_n}{\overline{a_0}}=C_1,\]we find $C_1 \neq C_2$, so $C_1\overline{a_{n-i}}=a_i=C_2\overline{a_{n-i}}$ for $i=1,\dots,n-1$ gives $a_1=a_2=\dots=a_{n-1}=0$. Then, for the outer coefficients, we know that \[a_0=C_1\overline{a_n}=C_1\overline{C_1\overline{a_0}}=C_1\overline{C_1}a_0,\]so $|C_1|$ and similarly $|C_2|$ are $1$. Thus $|a_n|=|a_0|=|a_0-1|$, so $a_0=\frac 12 +xi$ for some real $x$, and $a_n$ must have absolute value equal to $|a_0|$, as claimed.
10.04.2024 07:50
Claim: Our polynomial must be of the form $ax^n + b$.

Proof: Let $n$ be the degree of $P$ and let $f(i)$ denote the $i$th symmetric sum of the roots of $P$. Since its roots all have magnitude $1$, it is easy to verify that the equation \[ f(n)\overline{f(i)} = f(n-i)\]holds for all $1 \leq i \leq n-1$. If we modify the constant term of $P(z)$, only the $n$th symmetric sum (the product of the roots) changes, and it changes to a different value since the leading coefficient is fixed; in order for all of these equations to still hold with the same $f(i)$ and $f(n-i)$, we must have $f(i) = 0$ for all $1 \leq i \leq n-1$, which finishes.

Now, we can manually check to get all answers: clearly, if $P(x) = ax^n + b$ we must have $|a| = |b|$ and $\text{Re} (b) = \tfrac{1}{2}$. So, \[ P(x) = b \cdot \left( cx^n + 1\right)\]where $b$ is any complex number with $\text{Re} (b) = \tfrac{1}{2}$ and $c$ is any complex number with $|c| = 1$.

Remark: there's nothing special about the magnitude of $1$ in the problem, or the two polynomials differing by $1$; for any numbers the first claim still holds.