Fit Y = aX² + bX + c to data with R², vertex, roots, discriminant, residual analysis, and comparison to linear fit.
When data follows a curved pattern rather than a straight-line trend, quadratic regression can capture that curvature with a model of the form Y = aX² + bX + c.
This calculator fits the three coefficients, reports R² and adjusted R², and then summarizes the features that usually matter in practice: the vertex, the axis of symmetry, the discriminant, and the real roots when they exist. It also compares the fit against a linear model so you can see whether the quadratic term is actually earning its keep.
That makes the page useful for optimization problems, curved physical relationships, and any dataset where a straight line visibly misses the pattern.
Quadratic regression is useful when the practical question is not just "is there a relationship?" but "where is the peak or trough?" In those cases the vertex matters more than the coefficients themselves.
Seeing the fitted equation beside the vertex, the roots, and the linear-model comparison makes it easier to judge whether the curve is meaningful or whether a simpler model would be enough.
Y = aX² + bX + c (least squares via normal equations). Vertex: (−b/(2a), f(−b/(2a))). Discriminant: Δ = b² − 4ac. Roots: X = (−b ± √Δ)/(2a).
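Given fitted coefficients, all of these features follow directly from the formulas above. A minimal sketch (the helper name and example coefficients are illustrative, not part of the calculator):

```python
import math

def parabola_features(a, b, c):
    """Vertex, discriminant, and real roots of y = a*x^2 + b*x + c."""
    if a == 0:
        raise ValueError("a = 0 gives a line, not a parabola")
    x_v = -b / (2 * a)                  # axis of symmetry
    y_v = a * x_v**2 + b * x_v + c      # vertex height f(-b/(2a))
    disc = b**2 - 4 * a * c             # discriminant
    if disc > 0:
        r = math.sqrt(disc)
        roots = ((-b - r) / (2 * a), (-b + r) / (2 * a))
    elif disc == 0:
        roots = (x_v,)                  # one repeated root at the vertex
    else:
        roots = ()                      # parabola never crosses y = 0
    return (x_v, y_v), disc, roots

# Example: y = -x^2 + 4x - 3 peaks at (2, 1) with roots {1, 3}
(vx, vy), d, roots = parabola_features(-1, 4, -3)
```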
Result: Y = −0.6488X² + 28.5595X − 62.5000, R² = 0.9958, Vertex: (22.01, 251.85), Linear R² = 0.5816
Revenue peaks at a price of ~22 with predicted revenue of 252. The quadratic model (R²=0.996) vastly outperforms linear (R²=0.582), confirming the data is genuinely curved.
Quadratic regression minimizes Σ(yᵢ − axᵢ² − bxᵢ − c)². Taking partial derivatives with respect to a, b, c and setting them to zero yields three simultaneous equations (the normal equations). These involve sums of x, x², x³, x⁴, y, xy, and x²y. Solving the 3×3 system (Cramer's rule, LU decomposition, or matrix inversion) gives exact coefficients.
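The derivation above can be sketched in plain Python: build the sums, assemble the 3×3 normal-equation system, and solve it with Cramer's rule. The function name and the test data are illustrative:

```python
def quadratic_fit(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via the normal equations."""
    S = [sum(x**k for x in xs) for k in range(5)]                     # S[k] = sum of x^k
    T = [sum((x**k) * y for x, y in zip(xs, ys)) for k in range(3)]   # T[k] = sum of x^k * y
    # Normal equations: M @ [a, b, c] = [T2, T1, T0]
    M = [[S[4], S[3], S[2]],
         [S[3], S[2], S[1]],
         [S[2], S[1], S[0]]]
    rhs = [T[2], T[1], T[0]]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(M)
    coeffs = []
    for col in range(3):          # Cramer's rule: swap in rhs column by column
        Mc = [row[:] for row in M]
        for i in range(3):
            Mc[i][col] = rhs[i]
        coeffs.append(det3(Mc) / D)
    return coeffs                 # [a, b, c]

# Points generated from y = 2x^2 - 3x + 1, so the fit recovers [2, -3, 1]
a2, b2, c2 = quadratic_fit([0, 1, 2, 3], [1, 0, 3, 10])
```

For larger datasets or higher degrees, a library solver (e.g. matrix decomposition) is better conditioned than raw power sums, but for degree 2 this direct approach matches the derivation exactly.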
If residuals from the quadratic fit still show a systematic pattern, consider: cubic regression (degree 3) for S-shaped curves, logarithmic regression for rapid initial growth followed by leveling, or exponential regression for unlimited acceleration. Always use the simplest model that adequately fits the data — adding unnecessary polynomial terms overfits.
In economics, quadratic regression is the backbone of revenue optimization: R(p) = ap² + bp + c, where p is price. Optimal price = −b/(2a). In agriculture, yield response to fertilizer is often quadratic — too much reduces yield. In pharmacology, the effective dose is the vertex of a quadratic dose-response curve. The vertex is the practical answer; the equation is just the means to find it.
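As an illustration of the optimal-price formula, with made-up coefficients (not from any real dataset):

```python
# Hypothetical fitted revenue curve R(p) = a*p^2 + b*p + c
a, b, c = -2.0, 80.0, 0.0               # a < 0, so the vertex is a maximum
p_opt = -b / (2 * a)                    # optimal price: 20.0
r_max = a * p_opt**2 + b * p_opt + c    # peak revenue: 800.0
```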
Use quadratic when (1) a scatter plot shows curvature, (2) residuals from linear regression show a U or inverted-U pattern, (3) domain knowledge suggests a peak/trough (e.g., optimal dosage, price optimization), or (4) R² improves substantially over linear.
If a < 0 (parabola opens down), the vertex is the maximum — the X value that maximizes Y. If a > 0 (opens up), the vertex is the minimum. In business, the vertex often represents the optimal price, dosage, or resource allocation.
Minimum 3 (since we solve for 3 parameters), but that gives zero residual degrees of freedom. Realistically, 8+ points give reasonable R² estimates and residual diagnostics. More data near the vertex improves accuracy there.
If a is very close to zero, the quadratic term adds nothing — use linear regression instead. Check if the R² improvement over linear is < 1 percentage point; if so, the simpler linear model is preferable (parsimony principle).
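One way to run that comparison is to compute R² for both models on the same data. A sketch with a hand-computed linear fit on deliberately curved data (all names and numbers here are illustrative):

```python
def r_squared(ys, preds):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(ys) / len(ys)
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    return 1 - ss_res / ss_tot

xs = [0, 1, 2, 3, 4]
ys = [0, 1, 4, 9, 16]                 # exactly y = x^2, so quadratic fits perfectly
lin_preds = [4 * x - 2 for x in xs]   # least-squares line for this data (slope 4, intercept -2)
quad_preds = [x ** 2 for x in xs]
r2_lin = r_squared(ys, lin_preds)     # about 0.92
r2_quad = r_squared(ys, quad_preds)   # 1.0
```

Here the gap is large, so the quadratic term is clearly justified; when the two R² values are nearly equal, the parsimony principle favors the line.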
Extrapolating quadratic models is especially risky because parabolas diverge rapidly. A model that fits well for X = 0–10 might predict absurdly high or negative values at X = 50. Limit predictions to within or near the observed X range.
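Using the coefficients from the revenue example above, the danger is easy to demonstrate: a model that is sensible near the vertex predicts impossible negative revenue at X = 50:

```python
# Fitted revenue curve from the example: Y = -0.6488*X^2 + 28.5595*X - 62.5
a, b, c = -0.6488, 28.5595, -62.5

def predict(x):
    return a * x**2 + b * x + c

near = predict(22)   # about 251.8: plausible, close to the observed vertex
far = predict(50)    # about -256.5: an impossible negative revenue
```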
Δ = b² − 4ac determines where the parabola crosses Y = 0. If Δ > 0: two X-intercepts (roots). Δ = 0: one repeated root, where the parabola is tangent to Y = 0. Δ < 0: no real roots (the parabola never crosses Y = 0). This matters when Y represents profit or quantity that can't be negative.