Prove that for any positive integers $x, y, z$ with $xy-z^2 = 1$ one can find non-negative integers $a, b, c, d$ such that $x = a^2 + b^2, y = c^2 + d^2, z = ac + bd$. Set $z = (2q)!$ to deduce that for any prime number $p = 4q + 1$, $p$ can be represented as the sum of squares of two integers.
Problem
Source:
Tags: quadratics, number theory, prime, Sum of Squares, IMO Shortlist
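(Editorial note, not part of the original thread.) The hint $z = (2q)!$ rests on Wilson's theorem: for a prime $p = 4q+1$ we have $(p-1)! \equiv -1 \pmod p$, and pairing $k$ with $p-k$ gives $((2q)!)^2 \equiv (p-1)! \equiv -1 \pmod p$. So $z = (2q)!$, $x = p$, $y = (z^2+1)/p$ form a triple with $xy - z^2 = 1$, and the problem then yields $p = a^2 + b^2$. A minimal numerical sketch of this deduction in plain Python (the helper name `two_square_rep` is my own):

```python
from math import factorial, isqrt

def two_square_rep(p):
    """Brute-force non-negative a, b with a*a + b*b == p, or None."""
    for a in range(isqrt(p) + 1):
        b2 = p - a * a
        b = isqrt(b2)
        if b * b == b2:
            return a, b
    return None

# For each prime p = 4q + 1: Wilson's theorem gives ((2q)!)^2 = -1 (mod p),
# so (x, y, z) = (p, (z^2+1)/p, (2q)!) satisfies xy - z^2 = 1.
for p in [5, 13, 17, 29, 37, 41]:
    q = (p - 1) // 4
    z = factorial(2 * q)
    assert (z * z + 1) % p == 0          # p divides ((2q)!)^2 + 1
    x, y = p, (z * z + 1) // p
    assert x * y - z * z == 1            # the hypothesis of the problem
    print(p, two_square_rep(p))          # and p is indeed a sum of two squares
```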
21.09.2010 09:31
amparvardi wrote: Prove that for any positive integers $x, y, z$ with $xy-z^2 = 1$ one can find non-negative integers $a, b, c, d$ such that $x = a^2 + b^2, y = c^2 + d^2, z = ac + bd$. Set $z = (2q)!$ to deduce that for any prime number $p = 4q + 1$, $p$ can be represented as the sum of squares of two integers.

Let us take $a, b, c, d$ pairwise coprime and such that $ac - bd = 1$; Bezout's identity guarantees this is possible. Now $z^2 + 1$ is a sum of two squares, so $x$ and $y$ are sums of two squares as well. So let $x = a^2 + b^2$, $y = c^2 + d^2$. Then $z = ad + bc$ (from the Brahmagupta–Fibonacci identity). So such $a, b, c, d$ exist.

And if you are asking to prove that $p$ can be represented as a sum of two squares, then I have the following solution (though I think you asked to deduce it using $(2q)!$): $p = 4k + 1 \implies \left(\frac{-1}{p}\right) = 1$ (Legendre symbol), i.e. $-1$ is a quadratic residue mod $p$. So there exists an $n$ such that $p \mid n^2 + 1$, and every divisor of $n^2 + 1$ is a sum of two squares. So there exist $a, b$ such that $p = a^2 + b^2$.
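(Editorial note.) The step "every divisor of $n^2+1$ is a sum of two squares" is the crux of that last argument: any odd prime divisor $q$ of $n^2+1$ has $-1$ as a quadratic residue, so $q \equiv 1 \pmod 4$, each such prime (and $2$) is a sum of two squares, and the Brahmagupta–Fibonacci identity carries this over to products. A small Python check of the claim (a sketch of my own, not from the post):

```python
from math import isqrt

def is_sum_of_two_squares(m):
    """True if m = a^2 + b^2 for some non-negative integers a, b."""
    for a in range(isqrt(m) + 1):
        b2 = m - a * a
        if isqrt(b2) ** 2 == b2:
            return True
    return False

# Verify that every divisor of n^2 + 1 is a sum of two squares, for small n.
for n in range(1, 200):
    m = n * n + 1
    for d in range(1, m + 1):
        if m % d == 0:
            assert is_sum_of_two_squares(d), (n, d)
print("checked n = 1..199")
```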
23.08.2022 00:34
We have $xy = (z+i)(z-i)$. By unique factorization in $\mathbb{Z}[i]$ we can write $x=u_1v_1$, $y=u_2v_2$, $z+i = u_1v_2$, $z-i = u_2v_1$ for some (non-zero) $u_1,u_2,v_1,v_2 \in \mathbb{Z}[i]$. Hence $v_1 = \frac{x}{u_1} = \frac{x\overline{u_1}}{N(u_1)} = q_1\overline{u_1}$ and similarly $v_2 = \frac{y}{u_2} = \frac{y\overline{u_2}}{N(u_2)} = q_2\overline{u_2}$, where $q_1$, $q_2$ are positive rational numbers. Substituting into the equations for $z$ gives $z+i = q_2u_1\overline{u_2}$ and $z-i = q_1\overline{u_1}u_2$. Since $z+i$ and $z-i$ are conjugates (as $z$ is real), so are the corresponding right-hand sides, and since $q_1$ and $q_2$ are real we must have $q_1 = q_2$.

If the common value is $\frac{k}{\ell}$ for coprime positive integers $k$ and $\ell$, then $\ell z + \ell i = k\,\mbox{Re}(u_1\overline{u_2}) + ik\,\mbox{Im}(u_1\overline{u_2})$; from the imaginary parts we get that $k$ divides $\ell$, and since $\gcd(k,\ell)=1$ this forces $k=1$. Thus $\ell = \mbox{Im}(u_1\overline{u_2})$ and $\ell z = \mbox{Re}(u_1\overline{u_2})$, so $\ell \mid \mbox{Re}(u_1\overline{u_2})$. Write $u_1 = a_0 + b_0i$, $u_2 = c_0 + d_0i$. Then $\ell = b_0c_0 - a_0d_0$ divides $a_0c_0 + b_0d_0 = z\ell$, and $x = \frac{u_1\overline{u_1}}{\ell} = \frac{a_0^2+b_0^2}{\ell}$, so $v_1 = \frac{x}{u_1} = \frac{a_0-b_0i}{\ell}$; since $v_1 \in \mathbb{Z}[i]$ it follows that $\ell \mid a_0,b_0$, and similarly $\ell \mid c_0,d_0$. Writing $a_0 = \ell a$, $b_0 = \ell b$, $c_0 = \ell c$, $d_0 = \ell d$ gives $\ell = \ell^2(bc - ad)$, i.e. $\ell(bc-ad) = 1$, thus $\ell = 1$, implying $x = a^2 + b^2$, $y = c^2 + d^2$ and $z = \mbox{Re}(u_1\overline{u_2}) = ac + bd$.

The condition $xy = z^2 + 1$ forces $(ad-bc)^2 = xy - z^2 = 1$, i.e. $|ad - bc| = 1$. Conversely, the same computation shows that any $a,b,c,d$ with $|ad-bc| = 1$ give rise to a solution $x=a^2 + b^2$, $y = c^2 + d^2$, $z = ac + bd$ of $xy = z^2 + 1$.
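(Editorial note.) Both directions are easy to spot-check by brute force; the Python sketch below (my own, with a hypothetical helper `find_abcd`) verifies the existence of $a,b,c,d$ for all admissible triples with small $z$, and the converse via the Brahmagupta–Fibonacci identity $(a^2+b^2)(c^2+d^2) = (ac+bd)^2 + (ad-bc)^2$:

```python
from itertools import product
from math import isqrt

def find_abcd(x, y, z):
    """Brute-force non-negative a, b, c, d with x=a^2+b^2, y=c^2+d^2, z=ac+bd."""
    for a, b in product(range(isqrt(x) + 1), repeat=2):
        if a * a + b * b != x:
            continue
        for c, d in product(range(isqrt(y) + 1), repeat=2):
            if c * c + d * d == y and a * c + b * d == z:
                return a, b, c, d
    return None

# Forward direction: every (x, y, z) with xy - z^2 = 1 admits such a, b, c, d,
# and any such representation automatically satisfies |ad - bc| = 1.
for z in range(1, 30):
    for x in range(1, z * z + 2):
        if (z * z + 1) % x == 0:
            y = (z * z + 1) // x
            rep = find_abcd(x, y, z)
            assert rep is not None, (x, y, z)
            a, b, c, d = rep
            assert abs(a * d - b * c) == 1

# Converse: |ad - bc| = 1 forces xy - z^2 = 1 for the induced x, y, z.
for a, b, c, d in product(range(6), repeat=4):
    if abs(a * d - b * c) == 1:
        assert (a*a + b*b) * (c*c + d*d) - (a*c + b*d) ** 2 == 1
print("ok")
```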
23.08.2022 00:38
Let us also give an elementary descent-type approach to this problem. We will show that we can actually require $a$, $b$, $c$, $d$ to be non-negative. Suppose otherwise and let $(x,y,z)$ be a triple of positive integers with minimal $z$ for which the desired statement does not hold. Clearly $x\neq 1$ (for $x=1$ we have $y=z^2+1$, and $a=0$, $b=c=1$, $d=z$ give a valid representation). Hence we may assume $2\leq x \leq y$. Consider the integers $x' = x$, $y' = x+y-2z$ and $z' = z-x$; clearly $z' < z$. These are positive: if $z=x$, then $xy=x^2+1$ forces $x=1$, a contradiction, and if $z < x$, then $xy = z^2 + 1 \leq (x-1)^2+1 < x^2 \leq xy$, contradicting $x \leq y$; hence $z > x$ and $z' \geq 1$. Also, $x+y \geq 2\sqrt{xy} = 2\sqrt{z^2+1} > 2z$, so $y' \geq 1$, and $x'y'-z'^2 = x^2+xy-2xz-z^2+2zx-x^2 = xy-z^2 = 1$. Therefore $x',y',z'$ are positive integers with $x'y'-z'^2 = 1$. Since $z' < z$, the minimality of $z$ provides non-negative integers $a',b',c',d'$ with $x'=a'^2+b'^2$, $y'=c'^2+d'^2$, $z'=a'c'+b'd'$. Then $x=x'=a'^2 + b'^2$, $y=x'+y'+2z'=(a'+c')^2+(b'+d')^2$ and $z=x'+z'=a'(a'+c')+b'(b'+d')$ give a valid representation for $(x,y,z)$, a contradiction, and thus the problem is solved.
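(Editorial note.) For completeness, here is the descent written out as a recursive procedure, a Python sketch of my own assuming the reduction $(x,y,z)\mapsto(x,\,x+y-2z,\,z-x)$ and the base case $x=1$ exactly as above:

```python
def descend(x, y, z):
    """Return non-negative (a, b, c, d) with x = a^2+b^2, y = c^2+d^2, z = ac+bd,
    assuming x, y, z are positive integers with x*y - z*z == 1."""
    assert x * y - z * z == 1
    if x > y:                      # keep x <= y; swapping x, y swaps (a, b) with (c, d)
        c, d, a, b = descend(y, x, z)
        return a, b, c, d
    if x == 1:                     # base case: 1 = 0^2 + 1^2, y = 1^2 + z^2, z = 0*1 + 1*z
        return 0, 1, 1, z
    # reduction step: z' = z - x < z, and the new triple again satisfies x'y' - z'^2 = 1
    a, b, c1, d1 = descend(x, x + y - 2 * z, z - x)
    return a, b, a + c1, b + d1

# Quick check over all admissible triples with z up to 30.
for z in range(1, 31):
    for x in range(1, z * z + 2):
        if (z * z + 1) % x == 0:
            y = (z * z + 1) // x
            a, b, c, d = descend(x, y, z)
            assert (a * a + b * b, c * c + d * d, a * c + b * d) == (x, y, z)
print("ok")
```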