Math Problem Statement
If the roots of the equation p(q - r)x^2 + q(r - p)x + r(p - q) = 0 are equal, then show that 1/p + 1/r = 2/q.
Solution
Given the quadratic equation:

p(q - r)x^2 + q(r - p)x + r(p - q) = 0

We are told that the roots of the equation are equal, meaning the left-hand side is a perfect square. For a quadratic equation of the form:

Ax^2 + Bx + C = 0

the condition for equal roots is that the discriminant must be zero, i.e.,

B^2 - 4AC = 0

Here, comparing the given equation with the standard form, we have:

A = p(q - r), B = q(r - p), C = r(p - q)

Now, applying the condition for equal roots:

[q(r - p)]^2 - 4p(q - r) · r(p - q) = 0

Expanding both terms:

q^2(r - p)^2 = q^2r^2 - 2pq^2r + p^2q^2
4pr(q - r)(p - q) = 4p^2qr - 4pq^2r - 4p^2r^2 + 4pqr^2

Subtracting the second expansion from the first and collecting terms:

p^2q^2 + q^2r^2 + 4p^2r^2 + 2pq^2r - 4p^2qr - 4pqr^2 = 0

The left-hand side is a perfect square:

[q(p + r) - 2pr]^2 = 0

Taking square roots of both sides:

q(p + r) = 2pr

Now, divide both sides by pqr:

(p + r)/(pr) = 2/q, that is, 1/r + 1/p = 2/q

Hence, we have proved the required result:

1/p + 1/r = 2/q
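The key algebraic step above, that the discriminant equals the perfect square [q(p + r) - 2pr]^2, can be spot-checked numerically. This is a minimal sketch (the function names `discriminant` and `perfect_square_form` are illustrative, not from the problem) that tests the identity at many random integer triples:

```python
from random import randint

def discriminant(p, q, r):
    # B^2 - 4AC for p(q - r)x^2 + q(r - p)x + r(p - q) = 0
    A = p * (q - r)
    B = q * (r - p)
    C = r * (p - q)
    return B * B - 4 * A * C

def perfect_square_form(p, q, r):
    # The claimed factorization of the discriminant
    return (q * (p + r) - 2 * p * r) ** 2

# The two sides are degree-4 polynomials in p, q, r, so agreement at
# many random points is strong evidence of the identity.
for _ in range(1000):
    p, q, r = (randint(-50, 50) for _ in range(3))
    assert discriminant(p, q, r) == perfect_square_form(p, q, r)
print("discriminant == [q(p + r) - 2pr]^2 at all sampled points")
```

Since the square of a real quantity is zero only when the quantity itself is zero, the check mirrors the step q(p + r) = 2pr in the proof.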
Let me know if you'd like further details on any of these steps!
Here are 5 follow-up questions you might explore:
- What is the discriminant, and why does it determine if a quadratic has equal roots?
- How can this relationship between roots and coefficients be used in real-world applications?
- Can this method of proof be generalized for other quadratic equations?
- How would you solve the equation if the roots were not equal?
- How does symmetry in the equation help simplify the calculation?
Tip: Always double-check your work for common factorization mistakes when dealing with quadratic equations!
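One way to double-check is with concrete numbers. Using the illustrative values p = 2, q = 3, r = 6 (chosen here because they satisfy 1/p + 1/r = 2/q; they are not part of the original problem), the equation should have a zero discriminant and a repeated root:

```python
from fractions import Fraction

# Illustrative values satisfying 1/p + 1/r = 2/q (1/2 + 1/6 = 2/3)
p, q, r = 2, 3, 6

# Coefficients of p(q - r)x^2 + q(r - p)x + r(p - q) = 0
A = p * (q - r)   # -6
B = q * (r - p)   # 12
C = r * (p - q)   # -6

# Equal roots <=> zero discriminant; the repeated root is then -B / (2A)
disc = B * B - 4 * A * C
root = Fraction(-B, 2 * A)
print("discriminant:", disc)   # 0
print("repeated root:", root)  # 1
print("1/p + 1/r == 2/q:", Fraction(1, p) + Fraction(1, r) == Fraction(2, q))
```

Exact `Fraction` arithmetic avoids floating-point rounding, so the check either passes exactly or fails exactly.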
Math Problem Analysis
Mathematical Concepts
Algebra
Quadratic Equations
Discriminant
Equal Roots
Formulas
Quadratic equation formula: ax^2 + bx + c = 0
Discriminant formula: Δ = b^2 - 4ac = 0
Simplification: q(p + r) = 2pr
Final Result: 1/p + 1/r = 2/q
Theorems
Discriminant Theorem
Quadratic Roots Condition
Suitable Grade Level
Grades 9-12