Do there exist polynomials $f(x)$, $g(x)$ with real coefficients and a positive integer $k$ satisfying the following condition? (Here, the equation $x^2 = 0$ is considered to have $1$ distinct real root. The equation $0 = 0$ has infinitely many distinct real roots.) For any real numbers $a, b$ with $(a,b) \neq (0,0)$, the number of distinct real roots of $a f(x) + b g(x) = 0$ is $k$.
Problem
Source: 2017 Korean Winter Program Practice Test 1 Day 1 #3
Tags: algebra, polynomial
18.01.2017 19:42
Let $g(x) = f(x)$, where $f(x)$ has $k$ distinct real roots. This satisfies the condition.
18.01.2017 20:02
sasu1ke wrote: Let $g(x) = f(x)$, where $f(x)$ has $k$ distinct real roots. This satisfies the condition. No. Choose in this case $(a,b)=(1,-1)$; the equation then has infinitely many roots, not $k$.
18.01.2017 21:22
drkim wrote: (Here, the equation $x^2 = 0$ is considered to have $1$ distinct roots. The equation $0 = 0$ has infinitely many distinct roots.) Real roots or complex roots?
19.01.2017 02:46
I believe it was real roots.
19.01.2017 13:46
sasu1ke wrote: Take $b=0$; then the number of roots of $f(x)$ (and likewise of $g(x)$) equals $k$. Now take $a$ equal to the leading coefficient of $g(x)$ and $b$ equal to minus the leading coefficient of $f(x)$; we get a polynomial with at most $k-1$ roots, so there are no such polynomials. No: we are presumably speaking of real roots, so the degrees of $f(x)$, $g(x)$ and $af(x)+bg(x)$ may all be much larger than $k$, and lowering the degree of $af(x)+bg(x)$ by $1$ relative to the larger of $\deg f$ and $\deg g$ proves nothing. (For instance, a polynomial of degree $100$ and one of degree $99$ can each have exactly $2$ distinct real roots.)
19.01.2017 23:43
No, it's impossible. Denote $\mathbf{v}(t)=\left(f(t), g(t)\right)$ and $\mathbf{c}=(a,b)$; that is, $\mathbf{v}(t)$ and $\mathbf{c}$ are vectors in $\mathbb{R}^2$. The number of solutions of $\mathbf{c}\cdot \mathbf{v}(t)=0$ equals the number of intersection points of the line $\ell$ orthogonal to $\mathbf{c}$ with the curve $\mathbf{v}(t)\,,\, t\in(-\infty, \infty)$; let's denote the latter by $F$. Thus, if such $f$ and $g$ existed, the number of intersection points $|F\cap \ell|$ would have to remain invariant as we rotate $\ell$ around the origin $O$. That could happen if $F$ were some bounded loop, but here $F$ is unbounded. We can imagine $F$ closing up into a loop at infinity; the invariance "breaks" when $\ell$ reaches the position at which it just "touches" $F$ at infinity.
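A minimal numerical sketch of this rotating-line picture, assuming the hypothetical pair $f(x)=x^3-x$, $g(x)=x^2+1$ (chosen only for illustration, not part of the argument above): it counts the distinct real roots of $af+bg$ as the direction $(a,b)$ rotates, and the count is visibly not constant.

```python
import numpy as np

# Hypothetical example pair: f(x) = x^3 - x has 3 real roots, g(x) = x^2 + 1 has none.
f = np.array([1.0, 0.0, -1.0, 0.0])   # coefficients of f, highest degree first
g = np.array([0.0, 1.0, 0.0, 1.0])    # coefficients of g, padded to the same length

def real_root_count(coeffs, tol=1e-7):
    """Count distinct real roots of the polynomial with the given coefficients."""
    coeffs = np.asarray(coeffs, dtype=float)
    while coeffs.size and abs(coeffs[0]) < tol:   # drop (near-)zero leading terms
        coeffs = coeffs[1:]
    if coeffs.size <= 1:
        return 0                                   # constant polynomial
    roots = np.roots(coeffs)
    reals = roots[np.abs(roots.imag) < tol].real
    return np.unique(np.round(reals, 6)).size

# Rotate the direction (a, b) and watch the root count change.
for t in np.linspace(0.0, np.pi, 7):
    a, b = np.cos(t), np.sin(t)
    print(f"(a,b) = ({a:+.2f}, {b:+.2f}) -> {real_root_count(a*f + b*g)} distinct real roots")
```

For this pair the printed counts vary between $3$, $1$ and $0$ over the sampled directions, so no single $k$ works, in line with the rotating-line heuristic.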
20.01.2017 01:52
sorry
20.01.2017 02:02
hmida99 wrote: Since both $a$ and $b$ vary in $\mathbb{R}$ and $x$ also varies in $\mathbb{R}$, we can take $a=b=x-c$ where $c\neq a_i$ for all $i\in \{1,2,\ldots,k\}$. This makes no sense, because $x-c$ is an element of $\mathbb{R}[x]$, not of $\mathbb{R}$.
20.01.2017 06:42
dgrozev wrote: No, it's impossible. [...] I am new to this kind of theory; can you suggest any book for this?
20.01.2017 06:47
sasu1ke wrote: I am new to this kind of theory; can you suggest any book for this? Frankly speaking, you will need to study somewhat more than a single book to master this.
20.01.2017 16:56
I'll try to put it into more elementary graphical terms.

WLOG one can assume that $\deg(g) < \deg(f)$. (This looks trivial, but it needs a small argument: if $\deg f = \deg g$, replace $f$ by $f - cg$ for a suitable constant $c$; the family $\{af+bg\}$ is unchanged.)

Firstly, suppose $f,g$ have no common roots. Let $x_1<\cdots<x_k$ be the real roots of $g$, and consider the graph of $y =\frac{f(x)}{g(x)}$. The domain is split into $k+1$ intervals $(-\infty,x_1),(x_1,x_2),\ldots,(x_k,\infty)$; write $x_0 = -\infty$ and $x_{k+1}=\infty$ for simplicity. Then, for each $0\leq i\leq k$, $\frac{f(x)}{g(x)}$ is continuous on $(x_i,x_{i+1})$ and
$$ \lim_{x\to x_{i}^{+}} \frac{f(x)}{g(x)} = \pm\infty, \qquad \lim_{x\to x_{i+1}^{-}} \frac{f(x)}{g(x)} = \pm\infty. $$
(The degree assumption is used here, at the two unbounded intervals.)

Therefore, for large enough $N$, the union of the two lines $y = N$ and $y = -N$ meets $y = \frac{f(x)}{g(x)}$ exactly twice in each interval $(x_i,x_{i+1})$, hence meets the whole graph $2(k+1)$ times in total. (To show that the lines meet the graph "exactly" twice, I think one must argue that the graph of a rational function is somehow "nice"; but all we really need is "at least" twice, which holds e.g. once $N$ exceeds $\left|\frac{f}{g}\right|$ at every critical point of $\frac{f}{g}$.)

But this is a contradiction: intersections with $y=N$ correspond to real roots of $f(x) - Ng(x)=0$ and intersections with $y=-N$ to real roots of $f(x)+Ng(x)=0$, and by hypothesis each of these equations has exactly $k$ real roots, giving at most $2k < 2(k+1)$ intersections.

One can extend this argument slightly to handle the case when $f,g$ have common roots. (Not trivial.)
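A small sympy check of this count, assuming a hypothetical pair with $\deg f > \deg g$ and no common roots ($f(x)=x^3-2x$, $g(x)=x^2-1$; here the number of real roots of $g$ plays the role of $k$):

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 2*x          # hypothetical example: real roots 0, +-sqrt(2)
g = x**2 - 1            # k = 2 real roots, +-1; no root shared with f
N = 100                 # a "large enough" pair of horizontal lines y = +-N

k = len(sp.real_roots(g))
hits_plus_line  = len(sp.real_roots(f - N*g))   # solutions of f/g = +N
hits_minus_line = len(sp.real_roots(f + N*g))   # solutions of f/g = -N
total = hits_plus_line + hits_minus_line
print(f"k = {k}; roots of f-N*g: {hits_plus_line}, of f+N*g: {hits_minus_line}; "
      f"total {total} >= 2*(k+1) = {2*(k + 1)}")
```

With these choices each auxiliary cubic has $3$ real roots, so the two lines pick up $6 = 2(k+1)$ intersections, matching the lower bound used in the argument.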
23.01.2017 16:47
dgrozev wrote: No, it's impossible. [...] Why does it have to be a loop? And I think we also have to consider the case where $\mathbf{v}(t)$ passes through $(0,0)$.
23.01.2017 18:48
I called it a loop because it closes the circuit at infinity. In fact this curve can intersect itself at many points, including the origin; if a line through the origin meets $F$ at such a multiple point, we count it with its multiplicity. One can map $F$ to a curve $F'$ lying on a sphere and passing through its north pole (infinity), the south pole being the origin. We rotate a meridian. Can the number of its intersection points with $F'$ be invariant?
23.01.2017 19:16
Well, unfortunately I'm a high school student and I'm not good at these things, so it'll take some time, but yes, I think it's right. Actually we can assume $\deg(f) > \deg(g)$, and then the curve's direction at infinity becomes the $x$-axis. So I think it's basically similar to jaydoubleuel's solution, actually.
23.01.2017 20:23
Well, I don't urge anyone to present a solution like mine in this thread at any math competition. It should be considered only as an insight into why it cannot happen.
05.05.2017 16:33
I hope this is right. As in the previous solutions, we assume that $\deg(f) > \deg(g)$ and that $f$, $g$ have no common roots. Now suppose that for a real $b$ we have $\gcd\left(f(x)+bg(x),\, f'(x)+bg'(x)\right)=Q(x)$; it is easy to see that $b\left(g(x)f'(x)-g'(x)f(x)\right)$ is divisible by $Q(x)$. Now $b\left(g(x)f'(x)-g'(x)f(x)\right)\neq 0$ for $b\neq 0$, because $f$ and $g$ have no common root, so we can find infinitely many $b$ for which $\gcd\left(f(x)+bg(x),\, f'(x)+bg'(x)\right)=1$. This means that for such $b$ all roots of $f(x)+bg(x)$ are distinct, which gives $k=\deg(f)$. Now plug in $a=0$ to get a contradiction. If $f$ and $g$ have a common root, we can use the above argument after dividing out their greatest common divisor.
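A short sympy illustration of the gcd step, assuming the same hypothetical pair $f(x)=x^3-2x$, $g(x)=x^2-1$ as above (not from the post): the resultant of $f+bg$ and its derivative is a nonzero polynomial in $b$, so only finitely many $b$ give a repeated root, and for generic sample values of $b$ the gcd is $1$.

```python
import sympy as sp

x, b = sp.symbols('x b')
f = x**3 - 2*x          # hypothetical pair: deg f > deg g, no common roots
g = x**2 - 1

h = f + b*g
hp = sp.diff(h, x)

# gcd(h, h') is nontrivial only when h and h' share a root, i.e. only when
# Res_x(h, h') vanishes; as a polynomial in b it is not identically zero,
# so only finitely many values of b are "bad".
print("Res_x(h, h') =", sp.expand(sp.resultant(h, hp, x)))

for bv in [1, 2, 5]:                        # sample generic values of b
    hv = h.subs(b, bv)
    print(f"b = {bv}: gcd(h, h') =", sp.gcd(hv, sp.diff(hv, x)))   # expect 1
```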
07.12.2017 19:40
This is not a new solution, but a way to understand dgrozev's elegant solution for readers who feel some unfamiliarity with it. The comments in parentheses are intended for readers with some experience in topology; one can safely skip them.

One possible approach is to consider the so-called `Gauss map'. For contradiction, let us begin by choosing a pair of polynomials $f(x)$ and $g(x)$ over the reals and a positive integer $k$ such that for any real numbers $a$, $b$ with $(a,b) \neq (0,0)$, the number of elements of the set $\left \{ x \in \mathbb{R}\mid af(x) + bg(x) = 0 \right \}$ is $k$. As noted, we may assume that $f(x)$ and $g(x)$ do not share zeros; one may simply divide $f(x)$ and $g(x)$ by a common linear factor until they have no more.

As dgrozev has argued, consider the curve $\mathbf{v}(t) = (f(t),g(t))$ and the vector $\mathbf{c} = (a,b)$. Then the number of solutions of $\mathbf{c} \cdot \mathbf{v}(t) =0$ is the number of times the moving point $\mathbf{v}(t)$ visits the line $l$ through the origin orthogonal to $\mathbf{c}$. Here, we have identified the vector $\mathbf{v}$ with the point it indicates. Now let $n(t) = \left| \mathbf{v}(t) \right|$ and consider the new curve $\mathbf{u}(t) = \frac{1}{n(t)} \mathbf{v}(t)$, which lives on the unit circle. Therefore, we can find a continuous function $\theta: \mathbb{R} \rightarrow \mathbb{R}$, $t \mapsto \theta(t)$, such that $\theta(t)$ measures a counterclockwise angle of $\mathbf{u}(t)$ from the positive $x$-axis. (It is actually smooth; the existence is intuitively clear but needs proof: the lifting lemma for the universal covering space, the helix.)

Note the following facts.
(i) $\mathbf{u}(t)$ converges as $t \rightarrow \infty$ and as $t \rightarrow -\infty$, say to $P$ and $Q$ respectively.
(ii) Either $P=Q$, or $Q$ is the antipode of $P$.
(iii) As $t \rightarrow \infty$, $\theta(t)$ either eventually increases or eventually decreases; the same is true as $t \rightarrow -\infty$. (There is some ambiguity when we consider $t \rightarrow -\infty$: we say $\theta(t)$ eventually increases as $t \rightarrow -\infty$ if $\mathbf{u}(t)$ approaches $Q$ counterclockwise. Note that this also means that the number of `turning points' of $\mathbf{u}(t)$, i.e. the points at which $\theta(t)$ attains a local maximum or minimum, is finite.)
(iv) The number of intersections of $l$ with $\mathbf{v}(t)$, as $t$ runs over the whole real line, is the same as the number of intersections of $l$ with $\mathbf{u}(t)$.

Now we should do some casework, but the following case should suffice to illustrate what is going on. Suppose $P=Q$ and $\theta(t)$ eventually increases both as $t \rightarrow \infty$ and as $t \rightarrow -\infty$. Divide $l$ into two rays at the origin; call the ray closer to $P$ blue and the other red. Consider what happens when the blue ray `passes' through $P$ as $l$ rotates counterclockwise. Suppose the number of intersection points on the blue ray is $m$ while it approaches $P$ sufficiently closely. It drops to $m-2$ at the moment the ray touches the point $P$, and stays the same for a while after passing $P$. This is the notion of semicontinuity: the value agrees with one of the one-sided limits. Now let us consider what could happen on the red ray. Let $T$ be the antipode of $P$. The number of intersection points on the red ray should be $k-m$ while it approaches $T$, but it would have to increase by $2$ exactly when it touches $T$.
However, this is not possible: away from those `turning points' the number of intersection points cannot change, and at the turning points we cannot get the required semicontinuity of the intersection number, since the value at $T$ always equals the average of the left and right limits of the intersection number. I believe the other cases can easily be resolved by the reader. For the advanced reader, one may reduce the number of cases by introducing the function $\eta(t) = 2\theta(t)$, i.e. by viewing the unit circle as a double cover of itself.
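A small numerical companion to the angle function $\theta(t)$, assuming once more the hypothetical pair $f(t)=t^3-2t$, $g(t)=t^2-1$ (not from the post): it unwraps $\theta(t)=\operatorname{atan2}(g(t),f(t))$ and checks facts (i), (ii) and (iv) numerically.

```python
import numpy as np

# Hypothetical pair: f(t) = t^3 - 2t, g(t) = t^2 - 1 (no common real roots).
t = np.linspace(-50.0, 50.0, 200001)
f = t**3 - 2*t
g = t**2 - 1

# theta(t): a continuous choice of the angle of v(t) = (f(t), g(t)),
# obtained by unwrapping the principal value of atan2.
theta = np.unwrap(np.arctan2(g, f))

# Facts (i)-(ii): u(t) = v(t)/|v(t)| converges at both ends, and the two limit
# directions are equal or antipodal; numerically, the total change of theta
# over the sampled range is therefore close to an integer multiple of pi.
print("total change of theta, in units of pi:", (theta[-1] - theta[0]) / np.pi)

# Fact (iv): intersections of v(t) with the x-axis (the line orthogonal to
# c = (0, 1)) are exactly the real roots of g; count its sign changes.
print("sign changes of g(t):",
      np.count_nonzero(np.signbit(g[:-1]) != np.signbit(g[1:])))
```

For this pair the total change of $\theta$ comes out close to $-3\pi$ (an odd multiple of $\pi$, so $P$ and $Q$ are antipodal here), and the sign-change count is $2$, matching the two real roots of $g$.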