Problem

Source:

Tags: inequalities, limit, induction, Miscellaneous Problems



Let $\alpha(n)$ denote the number of digits equal to one in the binary (dyadic) representation of a positive integer $n$. Prove that:

(a) the inequality $\alpha(n^2) \le \frac{1}{2}\,\alpha(n)\bigl(1+\alpha(n)\bigr)$ holds for all $n$;

(b) equality is attained for infinitely many $n \in \mathbb{N}$;

(c) there exists a sequence $\{n_i\}$ such that $\lim_{i \to \infty} \frac{\alpha(n_i^2)}{\alpha(n_i)} = 0$.
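Before attempting a proof, one can sanity-check the inequality numerically. The sketch below (an illustration only, not part of the problem) computes $\alpha(n)$ from the binary expansion and verifies the bound for small $n$, collecting a few values of $n$ where equality holds; the function name `alpha` is our own choice.

```python
def alpha(n: int) -> int:
    """Number of 1-digits in the binary expansion of n."""
    return bin(n).count("1")

# Check 2 * alpha(n^2) <= alpha(n) * (alpha(n) + 1) for small n,
# and record the n for which the bound is attained with equality.
equality_cases = []
for n in range(1, 5000):
    a = alpha(n)
    assert 2 * alpha(n * n) <= a * (a + 1)
    if 2 * alpha(n * n) == a * (a + 1):
        equality_cases.append(n)

# Powers of two trivially give equality (alpha = 1 on both sides),
# but other values such as n = 5 (101 -> 25 = 11001) appear as well.
print(equality_cases[:10])
```

The bound itself comes from squaring: if $n = \sum_{i=1}^{k} 2^{a_i}$ with $k = \alpha(n)$ ones, then $n^2$ is a sum of at most $k + \binom{k}{2} = \frac{1}{2}k(k+1)$ powers of two, so at most that many ones can survive after carries.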