### Is the largest root of a random real polynomial more likely real or complex?


*This question might be hard because it got $35$ upvotes in MSE and also had a $200$ points bounty by Jyrki Lahtonen but it was unanswered. So I am posting it in MO.*

The number of real roots of a random polynomial with real coefficients is much smaller than the number of complex roots. Assume that the coefficients are independent and uniformly distributed in $(-1,1)$; if not, we can divide each coefficient by the one with the largest absolute value to scale all coefficients into $(-1,1)$. The expected number of real roots of a polynomial of degree $n$ is asymptotic to $\displaystyle \frac{2\log n}{\pi} + o(1)$. This means that the expected number of complex roots is approximately $\displaystyle n - \frac{2\log n}{\pi}$. Similar asymptotics hold for other distributions of the coefficients.
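As a sanity check (my own minimal sketch, not from the original post), this asymptotic can be observed numerically with `numpy.roots`; the degree, trial count, and real-root tolerance below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_real_roots(coeffs, tol=1e-8):
    """Count the roots whose imaginary part is numerically negligible."""
    roots = np.roots(coeffs)
    return int(np.sum(np.abs(roots.imag) < tol))

n = 100       # degree (illustrative choice)
trials = 500  # illustrative choice
counts = [count_real_roots(rng.uniform(-1, 1, n + 1)) for _ in range(trials)]

# The asymptotic predicts roughly (2/pi) log n real roots on average.
print(np.mean(counts), 2 * np.log(n) / np.pi)
```

The empirical mean sits near the $\frac{2\log n}{\pi}$ prediction (the $o(1)$ term and the constant correction make it slightly larger at finite $n$).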

**Definition**:

The largest (or smallest) root of a polynomial is the root with the largest (or smallest) modulus.

The above graph shows the roots of one such polynomial of degree $101$. The largest root, shown in green, is in the top right corner.

We can ask: is the largest (or the smallest) root more likely to be real or complex? Since there are far more complex roots than real roots, as seen from the above asymptotics, my naive guess was that the largest (or the smallest) root is more likely to be complex. However, the experimental data proved to be quite counterintuitive.

The data shows that:

- The probability that the largest (or smallest) root is real is greater than the probability that it is complex.
- This probability decreases to some value near $1/2$ as $n \to \infty$, as shown in the above graph (created using a Monte Carlo simulation with $10^5$ trials for each value of $n$).

**Note**: Instead of the uniform distribution, if we assume that the coefficients are normally distributed with mean $0$ and standard deviation $1$ and scaled to $(-1,1)$, the above observation and limiting probabilities still hold.
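A Monte Carlo experiment along these lines can be sketched as follows (my own sketch, not the original simulation; the degree, trial count, and real-root tolerance are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def extreme_roots_real(coeffs, tol=1e-8):
    """Return whether the largest- and smallest-modulus roots are real."""
    roots = np.roots(coeffs)
    mods = np.abs(roots)
    largest = roots[np.argmax(mods)]
    smallest = roots[np.argmin(mods)]
    return abs(largest.imag) < tol, abs(smallest.imag) < tol

n = 50         # degree (illustrative)
trials = 2000  # illustrative; the post used 10^5 per degree
largest_real = smallest_real = 0
for _ in range(trials):
    L, S = extreme_roots_real(rng.uniform(-1, 1, n + 1))
    largest_real += L
    smallest_real += S

print(largest_real / trials, smallest_real / trials)
```

Note that a complex largest (or smallest) root always ties in modulus with its conjugate; `argmax` picks one of the pair, and it is classified as complex by the imaginary-part test either way.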

It is counterintuitive that, despite being far fewer in number, the real roots are more likely to include both the largest and the smallest roots of a random polynomial. *In this sense, the largest and the smallest roots are both biased towards the reals*.

**Question 1**: What is the reason for this bias?

**Question 2**: Does the probability that the largest (or the smallest) root of a polynomial of degree $n$ is real converge (to some value near $\frac{1}{2}$) as $n \to \infty$?

**Note**: We can quantify the observed bias as follows. Let $P(L|R)$ be the probability that a root is the largest given that it is real and let $P(L|C)$ be the probability that a root is the largest given that it is complex. Similarly, let $P(S|R)$ be the probability that a root is the smallest given that it is real and let $P(S|C)$ be the probability that a root is the smallest given that it is complex. Then the experimental data says that

$$
P(L|R) = P(S|R) \approx \frac{\pi}{4\log n},
$$

$$
P(L|C) = P(S|C) \approx \frac{\pi}{2n\pi - 4\log n}.
$$
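These expressions admit a simple consistency check (again my own sketch, with illustrative parameters): each polynomial has exactly one largest root, so $P(L|R)$ can be estimated as the fraction of trials in which the largest root is real, divided by the average number of real roots.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 50         # degree (illustrative)
trials = 2000  # illustrative
tol = 1e-8
largest_real = 0
real_counts = []
for _ in range(trials):
    roots = np.roots(rng.uniform(-1, 1, n + 1))
    real_counts.append(int(np.sum(np.abs(roots.imag) < tol)))
    largest_real += abs(roots[np.argmax(np.abs(roots))].imag) < tol

# P(L|R) = P(largest root is real) / E[number of real roots]
p_L_given_R = (largest_real / trials) / np.mean(real_counts)
print(p_L_given_R, np.pi / (4 * np.log(n)))
```

If the probability that the largest root is real is close to $1/2$ and the expected number of real roots is $\frac{2\log n}{\pi}$, this ratio is indeed $\approx \frac{\pi}{4\log n}$, matching the first display above.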

**Update**: In the linked MSE post, it has now been proved that the probability that the largest root is real is at least

$$
\frac{23-16\sqrt{2}}{6} \approx 6.2\%.
$$