Problem

Source:

Tags: algorithm, number theory, Euclidean algorithm, number theory unsolved



Let $a,b$ be two positive integers with $a>b$. We know that $\gcd(a-b,ab+1)=1$ and $\gcd(a+b,ab-1)=1$. Prove that $(a-b)^2+(ab+1)^2$ is not a perfect square.
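
For anyone who wants to experiment before attempting a proof, here is a small brute-force sanity check (not a proof, of course). It scans all pairs $a>b$ up to an arbitrary bound `LIMIT` that satisfy both gcd conditions and confirms that $(a-b)^2+(ab+1)^2$ is never a perfect square in that range; the helper `is_square` and the bound are just my own illustrative choices.

```python
from math import gcd, isqrt

LIMIT = 200  # arbitrary search bound, chosen only for illustration


def is_square(n: int) -> bool:
    """Return True if n is a perfect square."""
    r = isqrt(n)
    return r * r == n


for a in range(2, LIMIT + 1):
    for b in range(1, a):
        # Keep only pairs satisfying both gcd conditions from the problem.
        if gcd(a - b, a * b + 1) == 1 and gcd(a + b, a * b - 1) == 1:
            value = (a - b) ** 2 + (a * b + 1) ** 2
            # The claim: this value is never a perfect square.
            assert not is_square(value), (a, b)

print("No counterexample found for a, b <=", LIMIT)
```

Running it prints the final message with no assertion failure, which is consistent with the statement for small values; it also makes the smallest valid pair easy to inspect, e.g. $a=2$, $b=1$ gives $(a-b)^2+(ab+1)^2 = 1 + 9 = 10$, not a square.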