Let $a,b$ be two positive integers with $a>b$. We know that $\gcd(a-b,ab+1)=1$ and $\gcd(a+b,ab-1)=1$. Prove that $(a-b)^2+(ab+1)^2$ is not a perfect square.
Problem
Source:
Tags: algorithm, number theory, Euclidean algorithm, number theory unsolved
30.04.2010 19:37
amparvardi wrote: Let $a,b$ be two positive integers with $a>b$. We know that $\gcd(a-b,ab+1)=1$ and $\gcd(a+b,ab-1)=1$. Prove that $(a-b)^2+(ab+1)^2$ is not a perfect square.
Suppose that $(a-b)^2+(ab+1)^2$ is the square of a positive integer. We have $(a-b)^2+(ab+1)^2=(a^2+1)(b^2+1)$. Let $\gcd(a^2+1,b^2+1)=\lambda$; then $\lambda \mid (a^2+1)-(b^2+1) \Rightarrow \lambda \mid (a-b)(a+b)$. Assume that $\lambda \neq 1$ and let $p$ be a prime divisor of $\lambda$; then $p \mid a-b$ or $p\mid a+b$. If $p\mid a-b$, then $p\mid (a^2+1)(b^2+1) \Rightarrow p\mid (a-b)^2+(ab+1)^2 \Rightarrow p\mid ab+1$, so $p\mid \gcd(a-b,ab+1)=1$, a contradiction. If $p\mid a+b$, then $p\mid (a-b)^2+(ab+1)^2=(a+b)^2+(ab-1)^2 \Rightarrow p\mid ab-1$, so $p\mid \gcd(a+b,ab-1)=1$, again a contradiction. Hence $\lambda=1$. Therefore both $a^2+1$ and $b^2+1$ would have to be perfect squares, which is impossible since $n^2<n^2+1<(n+1)^2$ for every positive integer $n$, and we are done.
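As a quick sanity check of the two expansions used in this argument, here is a minimal Python sketch (the helper name `check_identity` and the bound are my own choices, not part of the solution):

```python
from itertools import product

def check_identity(limit=50):
    """Verify numerically that, for 1 <= b < a <= limit,
    (a-b)^2 + (ab+1)^2 = (a+b)^2 + (ab-1)^2 = (a^2+1)(b^2+1)."""
    for a, b in product(range(1, limit + 1), repeat=2):
        if a <= b:
            continue
        lhs1 = (a - b) ** 2 + (a * b + 1) ** 2
        lhs2 = (a + b) ** 2 + (a * b - 1) ** 2
        rhs = (a * a + 1) * (b * b + 1)
        assert lhs1 == lhs2 == rhs, (a, b)
    print("both expansions verified for all pairs up to", limit)

check_identity()
```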
30.04.2010 20:00
Congrats! There are many solutions to this problem, in fact many ways to prove that $\gcd(a^2+1,b^2+1)=1$. For example, in the exam I used the identity \[a^2+b^2+2=(a+b)^2-2(ab-1).\]
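A small numerical spot-check of this identity (a sketch only; the sampling range is arbitrary):

```python
import random

# Spot-check of the identity a^2 + b^2 + 2 = (a+b)^2 - 2(ab - 1)
# on randomly sampled positive integers (a finite check, not a proof).
for _ in range(1000):
    a, b = random.randint(1, 10**6), random.randint(1, 10**6)
    assert a * a + b * b + 2 == (a + b) ** 2 - 2 * (a * b - 1)
print("identity holds on all sampled pairs")
```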
01.05.2010 15:43
Indeed it has many solutions. I did it with a Pythagorean triple: since $\gcd(a-b,ab+1)=1$, assuming the sum is a square gives a primitive triple, and I found a contradiction with the general formula for such a triple.
03.05.2010 08:22
masum billal wrote: Indeed it has many solutions. I did it with a Pythagorean triple: since $\gcd(a-b,ab+1)=1$, assuming the sum is a square gives a primitive triple, and I found a contradiction with the general formula for such a triple.
Yes, it has many solutions. For example, it can be solved using the integer solutions of the equation $x^2+y^2=z^2$.
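Independently of the Pythagorean-triple argument, the statement can also be brute-force checked for small values; a minimal Python sketch (the `search` helper and the bound are mine, and a finite check is of course not a proof):

```python
from math import gcd, isqrt

def search(limit=200):
    """For every pair 1 <= b < a <= limit satisfying gcd(a-b, ab+1) = 1 and
    gcd(a+b, ab-1) = 1, confirm that (a-b)^2 + (ab+1)^2 is not a perfect square."""
    for a in range(2, limit + 1):
        for b in range(1, a):
            if gcd(a - b, a * b + 1) != 1 or gcd(a + b, a * b - 1) != 1:
                continue
            n = (a - b) ** 2 + (a * b + 1) ** 2
            assert isqrt(n) ** 2 != n, (a, b)
    print("no perfect square found for a <=", limit)

search()
```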
04.05.2010 18:18
That is what I used.
30.01.2011 00:03
I think this solution works.
Solution. Define $P$ as follows:
\[P = (a-b)^2 + (ab+1)^2 = a^2 b^2 + a^2 + b^2 + 1 = (a^2 +1)(b^2 + 1).\]
Let $m$ denote the squarefree part of $a^2+1$, i.e. the product of the primes dividing $a^2+1$ to an odd power, and let $n$ be defined similarly for $b^2+1$. Assume for contradiction that $P$ is a perfect square. Then it must follow that $m=n$, and, since neither $a^2+1$ nor $b^2+1$ is a perfect square (because $a^2<a^2+1<(a+1)^2$), it must follow that $m=n>1$. Define $\alpha = m = n$. Since $m\mid (a^2+1)$ and $n\mid (b^2+1)$, we get $\alpha \mid \gcd{(a^2+1, b^2+1)}$. Now we prove the following lemma.
Lemma: $\gcd{(a^2+1, b^2+1)} = 1$.
By the Euclidean algorithm, we have
\[\gcd{(a^2+1, b^2+1)} = \gcd{(a^2-b^2, a^2+1)}.\]
Now observe that
\[\gcd{(a-b, a^2+1)} = \gcd{(a-b, a^2 - a(a-b)+1)} = \gcd{(a-b, ab+1)} = 1\]
and
\[\gcd{(a+b, a^2+1)} = \gcd{(a+b, a^2+1 - a(a+b))} = \gcd{(a+b, ab-1)} = 1.\]
Therefore $\gcd{(a^2-b^2, a^2+1)} = 1$. This forces $\alpha = 1$, contradicting $\alpha > 1$, so $P$ is not a perfect square.
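The step "$P$ a perfect square $\Rightarrow m = n$" can be illustrated numerically. A minimal Python sketch (the function name `squarefree_part` and the bound are mine), checking that whenever $uv$ is a perfect square the squarefree parts of $u$ and $v$ coincide:

```python
from math import isqrt

def squarefree_part(n):
    """Product of the primes dividing n to an odd power (trial division)."""
    part, d = 1, 2
    while d * d <= n:
        exp = 0
        while n % d == 0:
            n //= d
            exp += 1
        if exp % 2 == 1:
            part *= d
        d += 1
    return part * n  # any leftover n > 1 is a prime with exponent 1

# If u*v is a perfect square, the squarefree parts of u and v must agree.
LIMIT = 300
for u in range(1, LIMIT + 1):
    for v in range(1, LIMIT + 1):
        if isqrt(u * v) ** 2 == u * v:
            assert squarefree_part(u) == squarefree_part(v), (u, v)
print("squarefree-part criterion confirmed for all u, v <=", LIMIT)
```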
03.04.2011 16:11
To prove that $\gcd(a^2+1,b^2+1)=1$, I did the following. We have
\[\gcd(a-b,ab+1)=\gcd(a-b,a^2+1)=\gcd(a-b,b^2+1)=1\quad\mbox{and}\]
\[\gcd(a+b,ab-1)=\gcd(a+b,a^2+1)=\gcd(a+b,b^2+1)=1.\]
Notice that I used the fact that $\gcd(a,b)=\gcd(a,b+ak)=\gcd(a+bk,b)$ for $k\in{\mathbb{Z}}$. Now we have
\[\gcd(a-b,a^2+1)=1\quad\mbox{and}\quad\gcd(a+b,a^2+1)=1\quad\Longrightarrow\]
\[\gcd((a-b)(a+b),a^2+1)=\gcd(a^2-b^2,a^2+1)=1.\]
Now suppose that $\gcd(a^2+1,b^2+1)=d$. We have
\[d\mid (a^2+1)-(b^2+1)=a^2-b^2\quad\mbox{and}\quad d\mid a^2+1\quad\Longrightarrow\]
\[d\mid\gcd(a^2-b^2,a^2+1)=1\quad\Longrightarrow\quad\boxed{d=1},\]
as desired.
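For what it's worth, both gcd chains can be checked numerically; a short Python sketch (the bound 300 is an arbitrary choice of mine) verifying them for all pairs $1\le b<a\le 300$, without using the problem's hypotheses:

```python
from math import gcd

# Check both gcd chains for all pairs 1 <= b < a <= 300:
#   gcd(a-b, ab+1) = gcd(a-b, a^2+1) = gcd(a-b, b^2+1)
#   gcd(a+b, ab-1) = gcd(a+b, a^2+1) = gcd(a+b, b^2+1)
for a in range(2, 301):
    for b in range(1, a):
        assert gcd(a - b, a * b + 1) == gcd(a - b, a * a + 1) == gcd(a - b, b * b + 1)
        assert gcd(a + b, a * b - 1) == gcd(a + b, a * a + 1) == gcd(a + b, b * b + 1)
print("both gcd chains confirmed for a <= 300")
```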
08.05.2017 20:01
masum billal wrote: Indeed it has many solutions. I did it with a Pythagorean triple: since $\gcd(a-b,ab+1)=1$, assuming the sum is a square gives a primitive triple, and I found a contradiction with the general formula for such a triple.
Can you explain more clearly?
29.03.2022 16:44
$(a-b)^2+(ab+1)^2 = a^2b^2 + a^2 + b^2 + 1 = (a^2+1)(b^2+1)$, so it suffices to prove $\gcd{(a^2+1,b^2+1)} = 1$: a product of two coprime positive integers is a perfect square only if both factors are, and $a^2+1$ is never a perfect square for $a\ge 1$.
$\gcd{(a-b,ab+1)} = \gcd{(a-b,a(a-b)+ab+1)} = \gcd{(a-b,ab+1-b(a-b))} \implies \gcd{(a-b,a^2+1)} = \gcd{(a-b,b^2+1)} = 1$.
$\gcd{(a+b,ab-1)} = \gcd{(a+b,a(a+b)-ab+1)} = \gcd{(a+b,b(a+b)-ab+1)} \implies \gcd{(a+b,a^2+1)} = \gcd{(a+b,b^2+1)} = 1$.
$\gcd{(a^2+1,b^2+1)} = \gcd{((a-b)(a+b),a^2+1)} = 1$, since $(a^2+1)-(b^2+1)=(a-b)(a+b)$. We're done.
08.08.2022 22:14
Let $q = (a-b)^{2}+(ab+1)^{2}$. Notice that: $$ q = (a-b)\cdot(a+ b) + (ab+1)\cdot(ab-1) + 2 \cdot (b^{2} +ab - ab +1) \textbf{(1)} $$ Now suppose for the sake of contradiction that $ q = x^{2} + 2xy + y ^{2} $ for $x, y \in \mathbb{Z} $. Note that the only way for $q$ to be a perfect square is when $a=b$, but by hypothesis this is not possible. Contradiction.
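The algebraic identity (1) itself does check out numerically; a minimal Python sketch (random sampling only, verifying the identity and nothing beyond it):

```python
from random import randint

# Spot-check of identity (1):
# (a-b)^2 + (ab+1)^2 = (a-b)(a+b) + (ab+1)(ab-1) + 2(b^2+1).
for _ in range(1000):
    a, b = randint(1, 10**6), randint(1, 10**6)
    q = (a - b) ** 2 + (a * b + 1) ** 2
    assert q == (a - b) * (a + b) + (a * b + 1) * (a * b - 1) + 2 * (b * b + 1)
print("identity (1) holds on all sampled pairs")
```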