Chapter 11 Tutorial: The Kalman Filter
The Kalman filter [1] has long been regarded as the optimal solution to many tracking and data prediction tasks [2]. The filter is constructed as a mean squared error minimiser, but an alternative derivation of the filter is also provided. The purpose of filtering is to extract the required information from a signal, and this defines the goal of the filter. The observed signal is modelled as

    y_k = a_k x_k + n_k

where y_k is the time dependent observed signal, a_k is a gain term, and n_k is the additive noise. The difference between the estimate x̂_k and x_k itself is termed the error:

    f(e_k) = f(x_k − x̂_k)

The particular shape of f(e_k) is dependent upon the application; however, it is clear that the function should be both positive and increase monotonically [3]. An error function which exhibits these characteristics is the squared error function:

    f(e_k) = (x_k − x̂_k)²

Since it is necessary to consider the ability of the filter to predict many data over a period of time, a more meaningful metric is the expected value of the error function:

    loss function = E(f(e_k))

This results in the mean squared error (MSE) function:

    ε(t) = E(e_k²)

This redefines the goal of the filter to finding the x̂ which achieves max[P(y|x̂)] …
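As an illustration of the squared error loss defined above, the following sketch (hypothetical signal values and a naive running-mean estimator chosen purely for illustration; NumPy assumed) generates observations from the model y_k = a_k x_k + n_k and evaluates the empirical MSE of an estimate x̂_k:

```python
import numpy as np

rng = np.random.default_rng(0)

# Signal model y_k = a_k * x_k + n_k (all concrete values are illustrative)
K = 1000
x = np.full(K, 2.0)            # constant information-bearing signal x_k
a = np.ones(K)                 # unit gain term a_k
n = rng.normal(0.0, 0.5, K)    # additive white noise n_k
y = a * x + n                  # observed signal y_k

# A naive estimator x_hat_k: running mean of the observations so far
x_hat = np.cumsum(y) / np.arange(1, K + 1)

# Squared error f(e_k) = (x_k - x_hat_k)^2 and its expectation, the MSE loss
e2 = (x - x_hat) ** 2
mse = e2.mean()
print(mse)
```

Because the squared error is averaged over many samples, the result approximates the expectation E(e_k²) that the text identifies as the loss function; any estimator can be compared on the same footing by swapping out the `x_hat` line.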
… as a Gaussian distribution. In such a case the MSE serves to provide the value of x̂_k which maximises the likelihood of the signal y_k. In the following derivation the … and is assumed stationary over time, (n × m); w_k is the associated white noise process with known covariance, (n × 1). Observations on this variable can be …
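The chapter goes on to derive the full filter from this state model; as a hedged preview only, here is a minimal scalar sketch of the predict/update recursion (a random-walk state with process noise w_k and measurement noise n_k; the model, noise levels, and variable names are illustrative assumptions, not the chapter's own derivation; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative scalar state-space model:
#   x_k = x_{k-1} + w_k,  w_k ~ N(0, Q)   (process noise)
#   y_k = x_k + n_k,      n_k ~ N(0, R)   (measurement noise)
Q, R = 1e-4, 0.25
K = 200
x_true = np.cumsum(rng.normal(0.0, np.sqrt(Q), K)) + 2.0
y = x_true + rng.normal(0.0, np.sqrt(R), K)

x_hat, P = 0.0, 1.0            # initial estimate and its error variance
estimates = []
for yk in y:
    # Predict: the state is a random walk, so only the variance grows
    P += Q
    # Update: blend prediction and measurement via the Kalman gain
    gain = P / (P + R)
    x_hat += gain * (yk - x_hat)
    P *= (1.0 - gain)
    estimates.append(x_hat)

# The filtered estimate should achieve a lower MSE than the raw observations
mse_filter = np.mean((np.array(estimates) - x_true) ** 2)
mse_raw = np.mean((y - x_true) ** 2)
print(mse_filter, mse_raw)
```

The comparison of `mse_filter` against `mse_raw` ties back to the MSE loss of the previous section: the filter is precisely the estimator that minimises this loss under the stated Gaussian assumptions.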