-
Expanding the Objective Function:
We are given the optimization problem from (3.63):
$$J(\theta^+, \theta^-) = \tfrac{1}{2}\left\|y - \Phi^T(\theta^+ - \theta^-)\right\|^2 + \lambda \sum_i \left(\theta_i^+ + \theta_i^-\right)$$
First, let's expand the squared L2-norm term $\tfrac{1}{2}\|y - \Phi^T(\theta^+ - \theta^-)\|^2$:
$$\tfrac{1}{2}\big(y - \Phi^T(\theta^+ - \theta^-)\big)^T\big(y - \Phi^T(\theta^+ - \theta^-)\big)$$
$$= \tfrac{1}{2} y^T y - y^T \Phi^T(\theta^+ - \theta^-) + \tfrac{1}{2}(\theta^+ - \theta^-)^T \Phi\Phi^T (\theta^+ - \theta^-)$$
Now, we express this in terms of the concatenated vector $x = \begin{bmatrix} \theta^+ \\ \theta^- \end{bmatrix}$. Note that:
$$\theta^+ - \theta^- = \begin{bmatrix} I & -I \end{bmatrix} \begin{bmatrix} \theta^+ \\ \theta^- \end{bmatrix} = \begin{bmatrix} I & -I \end{bmatrix} x$$
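The concatenation identity above can be checked numerically. This is a small sketch using NumPy with an arbitrary dimension `D = 4`; the variable names (`theta_plus`, `theta_minus`, `x`) are my own labels for the quantities in the derivation:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4
theta_plus = rng.random(D)
theta_minus = rng.random(D)

# Stack the two non-negative parts into the expanded variable x = [theta+; theta-].
x = np.concatenate([theta_plus, theta_minus])

# The block matrix [I, -I] recovers the difference theta+ - theta- from x.
A = np.hstack([np.eye(D), -np.eye(D)])

assert np.allclose(A @ x, theta_plus - theta_minus)
```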
-
Formulating the Quadratic Term H:
Look at the quadratic part of the expansion:
$$\tfrac{1}{2}\left(\begin{bmatrix} I & -I \end{bmatrix} x\right)^T \Phi\Phi^T \left(\begin{bmatrix} I & -I \end{bmatrix} x\right)$$
$$= \tfrac{1}{2}\, x^T \begin{bmatrix} I \\ -I \end{bmatrix} \Phi\Phi^T \begin{bmatrix} I & -I \end{bmatrix} x$$
$$= \tfrac{1}{2}\, x^T \begin{bmatrix} \Phi\Phi^T & -\Phi\Phi^T \\ -\Phi\Phi^T & \Phi\Phi^T \end{bmatrix} x$$
Thus, we identify the Hessian matrix $H$:
$$H = \begin{bmatrix} \Phi\Phi^T & -\Phi\Phi^T \\ -\Phi\Phi^T & \Phi\Phi^T \end{bmatrix}$$
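The block structure of $H$ can be verified numerically: the quadratic form $\tfrac{1}{2}x^T H x$ in the expanded variables must equal $\tfrac{1}{2}(\theta^+ - \theta^-)^T \Phi\Phi^T (\theta^+ - \theta^-)$. A sketch with arbitrary small dimensions (`D = 3` features, `N = 5` observations, both my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
D, N = 3, 5
Phi = rng.standard_normal((D, N))  # Phi^T maps theta in R^D to predictions in R^N
theta_plus, theta_minus = rng.random(D), rng.random(D)
x = np.concatenate([theta_plus, theta_minus])

G = Phi @ Phi.T                    # Gram matrix Phi Phi^T (D x D)
H = np.block([[G, -G], [-G, G]])   # Hessian in the expanded 2D variables

# Quadratic form in the original difference variable vs. the expanded variable.
d = theta_plus - theta_minus
assert np.isclose(0.5 * d @ G @ d, 0.5 * x @ H @ x)
```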
-
Formulating the Linear Term f:
The linear parts come from the cross term in the squared norm and the L1 penalty term.
Cross term:
$$-y^T \Phi^T(\theta^+ - \theta^-) = -(\Phi y)^T(\theta^+ - \theta^-) = \begin{bmatrix} -\Phi y \\ \Phi y \end{bmatrix}^T \begin{bmatrix} \theta^+ \\ \theta^- \end{bmatrix}$$
Notice that $-\begin{bmatrix} \Phi y \\ -\Phi y \end{bmatrix}^T x$ matches this exactly.
Regularization term:
$$\lambda \sum_i \left(\theta_i^+ + \theta_i^-\right) = \lambda \mathbf{1}^T \theta^+ + \lambda \mathbf{1}^T \theta^- = \left(\lambda \begin{bmatrix} \mathbf{1} \\ \mathbf{1} \end{bmatrix}\right)^T \begin{bmatrix} \theta^+ \\ \theta^- \end{bmatrix} = \left(\lambda \mathbf{1}_{2D}\right)^T x$$
Combining these linear pieces gives fTx:
$$f^T x = \left(\lambda \mathbf{1}_{2D} - \begin{bmatrix} \Phi y \\ -\Phi y \end{bmatrix}\right)^T x$$
So we identify exactly:
$$f = \lambda \mathbf{1}_{2D} - \begin{bmatrix} \Phi y \\ -\Phi y \end{bmatrix}$$
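The identification of $f$ can likewise be checked numerically: $f^T x$ must reproduce the cross term plus the L1 penalty. A sketch under the same illustrative dimensions as before, with an arbitrary regularization weight `lam = 0.1`:

```python
import numpy as np

rng = np.random.default_rng(2)
D, N, lam = 3, 5, 0.1
Phi = rng.standard_normal((D, N))
y = rng.standard_normal(N)
theta_plus, theta_minus = rng.random(D), rng.random(D)
x = np.concatenate([theta_plus, theta_minus])

# f = lambda * 1_{2D} - [Phi y; -Phi y]
f = lam * np.ones(2 * D) - np.concatenate([Phi @ y, -(Phi @ y)])

# Cross term of the squared norm plus the L1 penalty, in the original variables.
linear = -y @ Phi.T @ (theta_plus - theta_minus) + lam * np.sum(theta_plus + theta_minus)
assert np.isclose(f @ x, linear)
```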
-
Final Check:
The objective becomes:
$$J(x) = \tfrac{1}{2} x^T H x + f^T x + \tfrac{1}{2} y^T y$$
Since $\tfrac{1}{2} y^T y$ is a constant with respect to $x$, it can be dropped from the minimization objective. The constraints $\theta^+ \ge 0$ and $\theta^- \ge 0$ compactly become $x \ge 0$.
Hence, the problem is identical to:
$$\min_x \; \tfrac{1}{2} x^T H x + f^T x \quad \text{s.t.} \quad x \ge 0$$
This matches the required quadratic program form.
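The whole derivation can be verified end to end: at any feasible point, the original objective $J(\theta^+, \theta^-)$ should equal $\tfrac{1}{2}x^T H x + f^T x + \tfrac{1}{2}y^T y$. A sketch with the same illustrative dimensions and regularization weight as above:

```python
import numpy as np

rng = np.random.default_rng(3)
D, N, lam = 3, 5, 0.1
Phi = rng.standard_normal((D, N))
y = rng.standard_normal(N)
theta_plus, theta_minus = rng.random(D), rng.random(D)
x = np.concatenate([theta_plus, theta_minus])

# QP ingredients derived above.
G = Phi @ Phi.T
H = np.block([[G, -G], [-G, G]])
f = lam * np.ones(2 * D) - np.concatenate([Phi @ y, -(Phi @ y)])

# Original objective from (3.63).
r = y - Phi.T @ (theta_plus - theta_minus)
J_original = 0.5 * r @ r + lam * np.sum(theta_plus + theta_minus)

# QP form, including the constant term (1/2) y^T y.
J_qp = 0.5 * x @ H @ x + f @ x + 0.5 * y @ y

assert np.isclose(J_original, J_qp)
```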