Transcription of CS 229, Public Course, Problem Set #1 Solutions: Supervised Learning
CS 229, Public Course
Problem Set #1 Solutions: Supervised Learning

1. Newton's method for computing least squares

In this problem, we will prove that if we use Newton's method to solve the least squares optimization problem, then we only need one iteration to converge to θ∗.

(a) Find the Hessian of the cost function

    J(θ) = (1/2) Σ_{i=1}^m (θᵀx⁽ⁱ⁾ − y⁽ⁱ⁾)².

As shown in the class notes,

    ∂J(θ)/∂θ_j = Σ_{i=1}^m (θᵀx⁽ⁱ⁾ − y⁽ⁱ⁾) x_j⁽ⁱ⁾

    ∂²J(θ)/∂θ_j∂θ_k = Σ_{i=1}^m ∂/∂θ_k [(θᵀx⁽ⁱ⁾ − y⁽ⁱ⁾) x_j⁽ⁱ⁾]
                    = Σ_{i=1}^m x_j⁽ⁱ⁾ x_k⁽ⁱ⁾
                    = (XᵀX)_{jk}

Therefore, the Hessian of J(θ) is H = XᵀX. This can also be derived by simply applying rules from the lecture notes on Linear Algebra.

(b) Show that the first iteration of Newton's method gives us θ∗ = (XᵀX)⁻¹Xᵀ~y, the solution to our least squares problem.

Given any θ⁽⁰⁾, Newton's method finds θ⁽¹⁾ according to

    θ⁽¹⁾ = θ⁽⁰⁾ − H⁻¹ ∇_θ J(θ⁽⁰⁾)
         = θ⁽⁰⁾ − (XᵀX)⁻¹ (XᵀX θ⁽⁰⁾ − Xᵀ~y)
         = θ⁽⁰⁾ − θ⁽⁰⁾ + (XᵀX)⁻¹Xᵀ~y
         = (XᵀX)⁻¹Xᵀ~y.

Therefore, no matter what θ⁽⁰⁾ we pick, Newton's method always finds θ∗ after one iteration.
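The one-step convergence shown in part (b) is easy to check numerically. The sketch below is not part of the original solutions; the synthetic data and variable names are invented for illustration. It takes a single Newton step from a random starting point and compares it with the normal-equations solution:

```python
# Sketch: one Newton step on the least-squares cost lands exactly on
# theta* = (X^T X)^{-1} X^T y, regardless of the starting point.
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 3
X = rng.normal(size=(m, n))          # design matrix (synthetic)
y = rng.normal(size=m)               # targets (synthetic)

H = X.T @ X                          # Hessian of J(theta), as derived above
theta0 = rng.normal(size=n)          # arbitrary theta^{(0)}
grad = X.T @ (X @ theta0 - y)        # gradient of J at theta^{(0)}
theta1 = theta0 - np.linalg.solve(H, grad)   # single Newton update

theta_star = np.linalg.solve(H, X.T @ y)     # normal-equations solution
print(np.allclose(theta1, theta_star))       # True
```

Because J(θ) is exactly quadratic, its second-order Taylor expansion is exact, so one Newton step reaches the minimizer up to floating-point error.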
2. Locally-weighted logistic regression

In this problem you will implement a locally-weighted version of logistic regression, where we weight different training examples differently according to the query point. The locally-weighted logistic regression problem is to maximize

    ℓ(θ) = −(λ/2) θᵀθ + Σ_{i=1}^m w⁽ⁱ⁾ [ y⁽ⁱ⁾ log h_θ(x⁽ⁱ⁾) + (1 − y⁽ⁱ⁾) log(1 − h_θ(x⁽ⁱ⁾)) ].