Answer
Prerequisites
- Linear Discriminant Functions: From part (a), we know that for any class $k$ the discriminant function is linear in $x$: $\delta_k(x) = \mu_k^T \Sigma^{-1} x - \frac{1}{2}\mu_k^T \Sigma^{-1} \mu_k + \log \pi_k$.
- Decision Boundary: The boundary between two classes $k$ and $l$ is the set of points where the classifier is indifferent between the two classes, meaning their discriminant functions are equal: $\delta_k(x) = \delta_l(x)$.
- Hyperplane Equation: A hyperplane in $d$-dimensional space can be defined by the equation $w^T x + b = 0$, where $w$ is the normal vector to the hyperplane and $b$ is the bias (or offset).
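To make the hyperplane prerequisite concrete, here is a minimal sketch with made-up numbers: a point on the plane satisfies $w^T x + b = 0$, and adding any direction orthogonal to $w$ keeps you on the plane.

```python
import numpy as np

# Hypothetical hyperplane in 3D: normal w and bias b (made-up numbers)
w = np.array([1.0, -2.0, 0.5])
b = 3.0

# A point on the plane: x0 = -b * w / ||w||^2 satisfies w.x0 + b = 0
x0 = -b * w / np.dot(w, w)
print(np.isclose(w @ x0 + b, 0.0))  # True

# A direction d orthogonal to w stays on the plane when added to x0
d = np.array([2.0, 1.0, 0.0])       # w @ d = 1*2 + (-2)*1 + 0.5*0 = 0
print(np.isclose(w @ (x0 + d) + b, 0.0))  # True
```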
Step-by-Step Derivation
- Set Discriminant Functions Equal: The decision boundary is defined by the condition where the scores for class $k$ and class $l$ are identical:
  $$\delta_k(x) = \delta_l(x)$$
- Substitute the Linear Forms: Using the results from part (a), substitute the linear expressions for $\delta_k(x)$ and $\delta_l(x)$:
  $$\mu_k^T \Sigma^{-1} x - \tfrac{1}{2}\mu_k^T \Sigma^{-1} \mu_k + \log \pi_k = \mu_l^T \Sigma^{-1} x - \tfrac{1}{2}\mu_l^T \Sigma^{-1} \mu_l + \log \pi_l$$
- Rearrange into Hyperplane Form: Move all terms involving $x$ to one side and the constant terms to the other side to match the standard hyperplane equation $w^T x + b = 0$:
  $$(\mu_k - \mu_l)^T \Sigma^{-1} x + \left(-\tfrac{1}{2}\mu_k^T \Sigma^{-1} \mu_k + \tfrac{1}{2}\mu_l^T \Sigma^{-1} \mu_l + \log \pi_k - \log \pi_l\right) = 0$$
  Let $w^T = (\mu_k - \mu_l)^T \Sigma^{-1}$ and $b = -\tfrac{1}{2}\left(\mu_k^T \Sigma^{-1} \mu_k - \mu_l^T \Sigma^{-1} \mu_l\right) + \log \pi_k - \log \pi_l$.
- Derive the Expression for $w$: Substitute the coefficients of $x$ from part (a) and factor out $\Sigma^{-1}$:
  $$w^T x = \left(\mu_k^T \Sigma^{-1} - \mu_l^T \Sigma^{-1}\right) x = (\mu_k - \mu_l)^T \Sigma^{-1} x$$
  Transposing, and using the symmetry of $\Sigma^{-1}$, gives $w = \Sigma^{-1}(\mu_k - \mu_l)$. This matches the required expression for $w$.
- Derive the Expression for $b$: Substitute the constant terms of $\delta_k$ and $\delta_l$ from part (a):
  $$b = \left(-\tfrac{1}{2}\mu_k^T \Sigma^{-1} \mu_k + \log \pi_k\right) - \left(-\tfrac{1}{2}\mu_l^T \Sigma^{-1} \mu_l + \log \pi_l\right)$$
  Group the quadratic terms and the logarithmic terms:
  $$b = -\tfrac{1}{2}\left(\mu_k^T \Sigma^{-1} \mu_k - \mu_l^T \Sigma^{-1} \mu_l\right) + \left(\log \pi_k - \log \pi_l\right)$$
- Simplify the Logarithmic Term: Using the quotient rule for logarithms ($\log a - \log b = \log \frac{a}{b}$):
  $$\log \pi_k - \log \pi_l = \log \frac{\pi_k}{\pi_l}$$
- Simplify the Quadratic Term: We need to show that $\mu_k^T \Sigma^{-1} \mu_k - \mu_l^T \Sigma^{-1} \mu_l = (\mu_k + \mu_l)^T \Sigma^{-1} (\mu_k - \mu_l)$. Let's expand the right side of this proposed equality:
  $$(\mu_k + \mu_l)^T \Sigma^{-1} (\mu_k - \mu_l) = \mu_k^T \Sigma^{-1} \mu_k - \mu_k^T \Sigma^{-1} \mu_l + \mu_l^T \Sigma^{-1} \mu_k - \mu_l^T \Sigma^{-1} \mu_l$$
  Since $\Sigma$ is a symmetric covariance matrix, its inverse $\Sigma^{-1}$ is also symmetric. Therefore, the scalar value $\mu_k^T \Sigma^{-1} \mu_l$ is equal to its transpose $\mu_l^T \Sigma^{-1} \mu_k$. This means the two middle terms cancel each other out, leaving us with:
  $$\mu_k^T \Sigma^{-1} \mu_k - \mu_l^T \Sigma^{-1} \mu_l$$
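The cancellation argument above can be sanity-checked numerically. This is a sketch with made-up means and covariance (any symmetric positive-definite $\Sigma$ works), verifying both the identity and the symmetry fact it relies on:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical class means and a symmetric positive-definite covariance
mu_k = rng.normal(size=3)
mu_l = rng.normal(size=3)
A = rng.normal(size=(3, 3))
Sigma = A @ A.T + 3 * np.eye(3)   # symmetric PD, so Sigma^{-1} is symmetric
Sinv = np.linalg.inv(Sigma)

# Quadratic identity: mu_k' S^-1 mu_k - mu_l' S^-1 mu_l
#                   = (mu_k + mu_l)' S^-1 (mu_k - mu_l)
lhs = mu_k @ Sinv @ mu_k - mu_l @ Sinv @ mu_l
rhs = (mu_k + mu_l) @ Sinv @ (mu_k - mu_l)
print(np.isclose(lhs, rhs))  # True

# The cancellation of the cross terms relies on symmetry of Sigma^{-1}
print(np.isclose(mu_k @ Sinv @ mu_l, mu_l @ Sinv @ mu_k))  # True
```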
- Final Substitution: Substitute the simplified logarithmic and quadratic terms back into the equation for $b$:
  $$b = -\tfrac{1}{2}(\mu_k + \mu_l)^T \Sigma^{-1} (\mu_k - \mu_l) + \log \frac{\pi_k}{\pi_l}$$
  This matches the required expression for $b$, completing the proof.
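The whole derivation can be verified end to end: for any point $x$, the gap $\delta_k(x) - \delta_l(x)$ should equal $w^T x + b$, so the two vanish together on the boundary. A sketch with hypothetical parameters (made-up means, shared covariance, and priors):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters: made-up means, shared covariance, priors
mu_k, mu_l = rng.normal(size=3), rng.normal(size=3)
A = rng.normal(size=(3, 3))
Sigma = A @ A.T + 3 * np.eye(3)
Sinv = np.linalg.inv(Sigma)
pi_k, pi_l = 0.3, 0.7

def delta(x, mu, pi):
    # Discriminant function in the form used in part (a)
    return mu @ Sinv @ x - 0.5 * mu @ Sinv @ mu + np.log(pi)

# Hyperplane parameters derived above
w = Sinv @ (mu_k - mu_l)
b = -0.5 * (mu_k + mu_l) @ Sinv @ (mu_k - mu_l) + np.log(pi_k / pi_l)

# At any point, the score gap equals the signed hyperplane value
x = rng.normal(size=3)
gap = delta(x, mu_k, pi_k) - delta(x, mu_l, pi_l)
print(np.isclose(gap, w @ x + b))  # True
```

Because the gap and $w^T x + b$ agree everywhere, the zero set of one is the zero set of the other, which is exactly the claimed decision boundary.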