Problem

Source: Bulgarian Autumn Math Tournament 12.1

Tags: algebra, sequence, limit



Let $a_0, a_1, a_2, \dots, a_n, \dots$ be an infinite sequence of real numbers defined by $$a_0 = c, \qquad a_{n+1} = a_n^2 + \frac{a_n}{2} + c$$ for some real $c > 0$. Find all values of $c$ for which the sequence converges, and determine its limit for those values.
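Not part of the original problem, but a quick numerical sketch can suggest where the cutoff in $c$ lies. The helper names `iterate` and `fixed_point` below are my own; the closed-form root only assumes that a limit $L$, if it exists, must satisfy the fixed-point equation $L = L^2 + \frac{L}{2} + c$, i.e. $L^2 - \frac{L}{2} + c = 0$, which has a real root only when $c \le \frac{1}{16}$.

```python
import math

def iterate(c, n=200):
    """Iterate a_{n+1} = a_n^2 + a_n/2 + c starting from a_0 = c."""
    a = c
    for _ in range(n):
        a = a * a + a / 2 + c
    return a

def fixed_point(c):
    """Smaller root of L^2 - L/2 + c = 0 (real only for c <= 1/16)."""
    return (1 - math.sqrt(1 - 16 * c)) / 4

# For c below 1/16 the iterates appear to settle on the smaller root:
for c in (0.01, 0.05):
    print(c, iterate(c), fixed_point(c))

# For c above 1/16 the terms grow without bound:
print(iterate(0.07, n=50))
```

This is only an exploration, not a proof; the actual solution still needs a monotonicity/boundedness argument for $0 < c \le \frac{1}{16}$ and a divergence argument for $c > \frac{1}{16}$.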