
The Unscented Kalman Filter for Nonlinear Estimation



Transcription of The Unscented Kalman Filter for Nonlinear Estimation

1

The relationship between the Kalman Filter (KF) and Recursive Least Squares (RLS) is given in [3]. The use of the EKF for training neural networks has been developed by Singhal and Wu [9] and Puskorius and Feldkamp [8].

Dual Estimation. A special case of machine learning arises when the input $x_k$ is unobserved, and requires coupling both state estimation and parameter estimation. For these dual estimation problems, we again consider a discrete-time nonlinear dynamic system,

$x_{k+1} = F(x_k, v_k, w)$  (6)
$y_k = H(x_k, n_k, w)$  (7)

where both the system state $x_k$ and the set of model parameters $w$ must be estimated from only the observed noisy signal $y_k$.

In the next section we review the basic assumptions and flaws of the EKF. In Section 3 we introduce the Unscented Kalman Filter (UKF) as a method to amend these flaws, and in Section 4 we present results of applying the UKF to nonlinear estimation problems.

The EKF and its flaws. Given a noisy observation $y_k$, a recursive estimation for $x_k$ can be expressed in the form (see [6])

$\hat{x}_k = (\text{prediction of } x_k) + \mathcal{K}_k \,[\, y_k - (\text{prediction of } y_k)\,]$.  (8)

This recursion provides the optimal minimum mean-squared error (MMSE) estimate for $x_k$ assuming the prior estimate $\hat{x}_{k-1}$ and current observation $y_k$ are Gaussian Random Variables (GRV). The optimal terms in this recursion are

$\hat{x}_k^- = E[\,F(\hat{x}_{k-1}, v_{k-1})\,]$  (9)
$\mathcal{K}_k = P_{x_k y_k}\,P_{\tilde{y}_k \tilde{y}_k}^{-1}$  (10)
$\hat{y}_k^- = E[\,H(\hat{x}_k^-, n_k)\,]$  (11)

where the optimal prediction of $x_k$ is written as $\hat{x}_k^-$, and corresponds to the expectation of a nonlinear function of the random variables $\hat{x}_{k-1}$ and $v_{k-1}$ (similar interpretation for the optimal prediction of $y_k$). The optimal gain term is expressed as a function of posterior covariance matrices (with $\tilde{y}_k = y_k - \hat{y}_k^-$).

The Kalman filter computes these terms exactly in the linear case. For nonlinear models, however, the EKF approximates the optimal terms as:

$\hat{x}_k^- \approx F(\hat{x}_{k-1}, \bar{v})$  (12)
$\mathcal{K}_k \approx \hat{P}_{x_k y_k}\,\hat{P}_{\tilde{y}_k \tilde{y}_k}^{-1}$  (13)
$\hat{y}_k^- \approx H(\hat{x}_k^-, \bar{n})$  (14)

where predictions are approximated as simply the function of the prior mean value of the estimates (no expectation taken)¹, and the covariances are determined by linearizing the dynamic equations. In other words, in the EKF the state distribution is approximated by a GRV which is then propagated analytically through the first-order linearization of the nonlinear system [6]; the EKF can thus be viewed as providing "first-order" approximations to the optimal terms². These approximations, however, can introduce large errors in the true posterior mean and covariance of the transformed (Gaussian) random variable. It is these flaws which the UKF amends: the state distribution is again represented by a GRV, but is now specified using a minimal set of carefully chosen sample points which, when propagated through the true non-linear system, capture the posterior mean and covariance accurately to the 3rd order (Taylor series expansion).

¹ The noise means are denoted by $\bar{v} = E[v_k]$ and $\bar{n} = E[n_k]$, and are usually assumed to equal zero.
² "Second-order" versions of the EKF exist, but their added complexity tends to prohibit their use.

The unscented transformation (UT) is a method for calculating the statistics of a random variable which undergoes a nonlinear transformation [5].
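Before the unscented transformation is developed, the structure of Equations 8 and 12-14 can be made concrete in a few lines of code. The sketch below is not from the paper: it is a minimal additive-noise EKF step in Python/NumPy, with illustrative names (`ekf_step`, `A_jac`, `C_jac`) and user-supplied Jacobians standing in for the linearized dynamics.

```python
import numpy as np

def ekf_step(x_hat, P, y, F, H, A_jac, C_jac, Rv, Rn):
    """One EKF predict/update cycle in the spirit of Equations 8 and 12-14.

    x_hat  : prior state estimate
    P      : prior state covariance
    y      : current observation
    F, H   : process and observation functions (zero-mean additive noise assumed)
    A_jac, C_jac : callables returning the Jacobians of F and H at a point
    Rv, Rn : process and measurement noise covariances
    """
    # Predictions are a plain function of the prior mean, no expectation taken (Eq. 12, 14)
    x_pred = F(x_hat)
    A = A_jac(x_hat)                 # linearization of the dynamics
    P_pred = A @ P @ A.T + Rv        # first-order covariance propagation

    y_pred = H(x_pred)
    C = C_jac(x_pred)                # linearization of the observation
    P_yy = C @ P_pred @ C.T + Rn     # innovation covariance
    P_xy = P_pred @ C.T              # cross covariance

    # Gain and measurement update (Eq. 8, 13)
    K = P_xy @ np.linalg.inv(P_yy)
    x_new = x_pred + K @ (y - y_pred)
    P_new = P_pred - K @ P_yy @ K.T
    return x_new, P_new
```

Note how every statistic above is computed from a single linearization point; this is exactly the first-order approximation that the UT replaces with propagated sample points.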

2

Consider propagating a random variable $x$ (dimension $L$) through a nonlinear function, $y = g(x)$. Assume $x$ has mean $\bar{x}$ and covariance $P_x$. To calculate the statistics of $y$, we form a matrix $\mathcal{X}$ of $2L+1$ sigma vectors $\mathcal{X}_i$ (with corresponding weights $W_i$), according to the following:

$\mathcal{X}_0 = \bar{x}$
$\mathcal{X}_i = \bar{x} + \big(\sqrt{(L+\lambda)P_x}\big)_i, \quad i = 1, \ldots, L$
$\mathcal{X}_i = \bar{x} - \big(\sqrt{(L+\lambda)P_x}\big)_{i-L}, \quad i = L+1, \ldots, 2L$
$W_0^{(m)} = \lambda/(L+\lambda)$
$W_0^{(c)} = \lambda/(L+\lambda) + (1 - \alpha^2 + \beta)$
$W_i^{(m)} = W_i^{(c)} = 1/\{2(L+\lambda)\}, \quad i = 1, \ldots, 2L$  (15)

where $\lambda = \alpha^2(L+\kappa) - L$ is a scaling parameter. $\alpha$ determines the spread of the sigma points around $\bar{x}$ and is usually set to a small positive value (e.g., 1e-3). $\kappa$ is a secondary scaling parameter which is usually set to 0, and $\beta$ is used to incorporate prior knowledge of the distribution of $x$ (for Gaussian distributions, $\beta = 2$ is optimal). These sigma vectors are propagated through the nonlinear function,

$\mathcal{Y}_i = g(\mathcal{X}_i), \quad i = 0, \ldots, 2L$  (16)

and the mean and covariance for $y$ are approximated using a weighted sample mean and covariance of the posterior sigma points,

$\bar{y} \approx \sum_{i=0}^{2L} W_i^{(m)} \mathcal{Y}_i$  (17)
$P_y \approx \sum_{i=0}^{2L} W_i^{(c)} (\mathcal{Y}_i - \bar{y})(\mathcal{Y}_i - \bar{y})^T$  (18)

Note that this method differs substantially from general "sampling" methods (e.g., Monte-Carlo methods such as particle filters [1]), which require orders of magnitude more sample points in an attempt to propagate an accurate (possibly non-Gaussian) distribution of the state. The approximations above are accurate to at least the second order, with the accuracy of third and higher order moments determined by the choice of $\alpha$ and $\beta$ (see [4] for a detailed discussion of the UT). A simple example is shown in Figure 1 for a 2-dimensional system: the left plot shows the true mean and covariance propagation using Monte-Carlo sampling; the center plots show the results using a linearization approach as would be done in the EKF; the right plots show the performance of the UT (note that only 5 sigma points are required).

Figure 1: Example of the UT for mean and covariance propagation: a) actual (sampling), b) first-order linearization (EKF), c) UT (sigma points, UT mean and covariance vs. true mean and covariance).
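To make Equations 15-18 concrete, here is a small Python/NumPy sketch of the unscented transformation under the scaling parameters described above. It is an illustration, not the paper's implementation; the polar-to-Cartesian example at the end is an arbitrary choice and not the 2-D system of Figure 1.

```python
import numpy as np

def unscented_transform(x_bar, Px, g, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a GRV (mean x_bar, covariance Px) through g using Eq. 15-18."""
    L = x_bar.size
    lam = alpha**2 * (L + kappa) - L                  # composite scaling parameter
    S = np.linalg.cholesky((L + lam) * Px)            # matrix square root (columns used)

    # Sigma points (Eq. 15): the mean, plus/minus the columns of S
    sigmas = np.column_stack([x_bar, x_bar[:, None] + S, x_bar[:, None] - S])

    # Weights for the mean and covariance
    Wm = np.full(2 * L + 1, 1.0 / (2 * (L + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (L + lam)
    Wc[0] = lam / (L + lam) + (1 - alpha**2 + beta)

    # Propagate through the nonlinearity (Eq. 16), then weighted statistics (Eq. 17-18)
    Y = np.column_stack([g(sigmas[:, i]) for i in range(2 * L + 1)])
    y_bar = Y @ Wm
    D = Y - y_bar[:, None]
    Py = (Wc * D) @ D.T
    return y_bar, Py

# Usage: a polar-to-Cartesian conversion, a common UT illustration
g = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
x_bar = np.array([1.0, np.pi / 2])
Px = np.diag([0.02, 0.3])
print(unscented_transform(x_bar, Px, g))
```

Only $2L+1$ deterministic points are evaluated, which is the key contrast with Monte-Carlo propagation noted above.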

3

The Unscented Kalman Filter (UKF) is a straightforward extension of the UT to the recursive estimation in Equation 8, where the state RV is redefined as the concatenation of the original state and noise variables: $x_k^a = [\,x_k^T \; v_k^T \; n_k^T\,]^T$. The UT sigma point selection scheme (Equation 15) is applied to this new augmented state RV to calculate the corresponding sigma matrix, $\mathcal{X}_k^a$. The complete UKF equations are given below. Note that no explicit calculations of Jacobians or Hessians are necessary to implement this algorithm.

Algorithm: Unscented Kalman Filter (UKF)

Initialize with $\hat{x}_0 = E[x_0]$, $P_0 = E[(x_0-\hat{x}_0)(x_0-\hat{x}_0)^T]$, $\hat{x}_0^a = [\hat{x}_0^T \; 0 \; 0]^T$, $P_0^a = \mathrm{diag}(P_0, R^v, R^n)$.

For $k \in \{1, \ldots, \infty\}$:

Calculate sigma points:
$\mathcal{X}_{k-1}^a = [\,\hat{x}_{k-1}^a \;\;\; \hat{x}_{k-1}^a \pm \sqrt{(L+\lambda)P_{k-1}^a}\,]$

Time update:
$\mathcal{X}_{k|k-1}^x = F(\mathcal{X}_{k-1}^x, \mathcal{X}_{k-1}^v)$
$\hat{x}_k^- = \sum_{i=0}^{2L} W_i^{(m)} \mathcal{X}_{i,k|k-1}^x$
$P_k^- = \sum_{i=0}^{2L} W_i^{(c)} [\mathcal{X}_{i,k|k-1}^x - \hat{x}_k^-][\mathcal{X}_{i,k|k-1}^x - \hat{x}_k^-]^T$
$\mathcal{Y}_{k|k-1} = H(\mathcal{X}_{k|k-1}^x, \mathcal{X}_{k-1}^n)$
$\hat{y}_k^- = \sum_{i=0}^{2L} W_i^{(m)} \mathcal{Y}_{i,k|k-1}$

Measurement update equations:
$P_{\tilde{y}_k \tilde{y}_k} = \sum_{i=0}^{2L} W_i^{(c)} [\mathcal{Y}_{i,k|k-1} - \hat{y}_k^-][\mathcal{Y}_{i,k|k-1} - \hat{y}_k^-]^T$
$P_{x_k y_k} = \sum_{i=0}^{2L} W_i^{(c)} [\mathcal{X}_{i,k|k-1}^x - \hat{x}_k^-][\mathcal{Y}_{i,k|k-1} - \hat{y}_k^-]^T$
$\mathcal{K}_k = P_{x_k y_k} P_{\tilde{y}_k \tilde{y}_k}^{-1}$
$\hat{x}_k = \hat{x}_k^- + \mathcal{K}_k (y_k - \hat{y}_k^-)$
$P_k = P_k^- - \mathcal{K}_k P_{\tilde{y}_k \tilde{y}_k} \mathcal{K}_k^T$

where $x^a = [x^T \; v^T \; n^T]^T$, $\mathcal{X}^a = [(\mathcal{X}^x)^T \; (\mathcal{X}^v)^T \; (\mathcal{X}^n)^T]^T$, $\lambda$ = composite scaling parameter, $L$ = dimension of the augmented state, $R^v$ = process noise cov., $R^n$ = measurement noise cov., $W_i$ = weights as calculated in Equation 15.

The UKF was originally designed for state estimation, and has been applied in nonlinear control applications requiring full-state feedback [5]. In these applications, the dynamic model represents a physically based parametric model which is assumed known. In this paper, we extend the use of the UKF to a broader class of nonlinear estimation problems.

As an illustration of state estimation, the UKF is used to estimate an underlying clean time-series corrupted by additive Gaussian white noise. The clean time-series is modeled as a nonlinear autoregression,

$x_k = f(x_{k-1}, \ldots, x_{k-M}, w) + v_k$  (19)

where the model $f$ (parameterized by $w$) is approximated by a neural network. With the state defined as the vector of lagged values, $\mathbf{x}_k = [x_k \; x_{k-1} \; \cdots \; x_{k-M+1}]^T$, the corresponding state-space representation is

$\mathbf{x}_{k+1} = F(\mathbf{x}_k, w) + B\,v_{k+1}, \qquad y_k = C\,\mathbf{x}_k + n_k, \qquad B = C^T = [\,1 \; 0 \; \cdots \; 0\,]^T$  (20)

where the first component of $F$ is the network prediction $f(\mathbf{x}_k, w)$ and the remaining components shift the delayed values. In the estimation problem, the noisy time-series $y_k$ is the only observed input to either the EKF or UKF algorithms (both utilize the known neural network model). Figure 2 shows a sub-segment of the estimates generated by the EKF and the UKF (the original noisy time-series has a 3 dB SNR), along with the resulting estimation errors.

Figure 2: Estimation of the Mackey-Glass time series with the EKF and UKF (clean, noisy, and estimated signals, samples 200-300), and estimation error (MSE): EKF vs. UKF.
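For readers who want to trace the algorithm box above in code, the following Python/NumPy sketch implements one UKF predict/update cycle. It uses the common additive-noise simplification (redrawing sigma points after the time update) rather than the paper's augmented-state formulation, and the function names (`sigma_points`, `ukf_step`) are illustrative.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Sigma points and weights for the scaled UT (Equation 15)."""
    L = x.size
    lam = alpha**2 * (L + kappa) - L
    S = np.linalg.cholesky((L + lam) * P)
    X = np.column_stack([x, x[:, None] + S, x[:, None] - S])
    Wm = np.full(2 * L + 1, 1.0 / (2 * (L + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (L + lam)
    Wc[0] = Wm[0] + (1 - alpha**2 + beta)
    return X, Wm, Wc

def ukf_step(x_hat, P, y, F, H, Rv, Rn):
    """One UKF time/measurement update with zero-mean additive noise."""
    # Time update: propagate sigma points through the process model
    X, Wm, Wc = sigma_points(x_hat, P)
    Xp = np.column_stack([F(X[:, i]) for i in range(X.shape[1])])
    x_pred = Xp @ Wm
    Dx = Xp - x_pred[:, None]
    P_pred = (Wc * Dx) @ Dx.T + Rv

    # Redraw sigma points around the prediction and push them through H
    X2, Wm, Wc = sigma_points(x_pred, P_pred)
    Yp = np.column_stack([H(X2[:, i]) for i in range(X2.shape[1])])
    y_pred = Yp @ Wm
    Dy = Yp - y_pred[:, None]
    P_yy = (Wc * Dy) @ Dy.T + Rn
    P_xy = (Wc * (X2 - x_pred[:, None])) @ Dy.T

    # Measurement update: same form as Equation 8, no Jacobians required
    K = P_xy @ np.linalg.inv(P_yy)
    x_new = x_pred + K @ (y - y_pred)
    P_new = P_pred - K @ P_yy @ K.T
    return x_new, P_new
```

The augmented form in the algorithm box additionally carries the noise terms through the nonlinearities, which matters when the noise enters non-additively; for purely additive noise the simplified step above captures the same structure.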

4

As expressed earlier, the dual estimation problem consists of simultaneously estimating the clean signal state and the model parameters from the noisy observations, and a number of algorithmic approaches exist for this problem. We present results for the Dual UKF and Joint UKF; development of an Unscented Smoother for an EM approach [2] was presented in [13]. As in the prior state-estimation example, we utilize a noisy time-series application modeled with neural networks for illustration of the approaches.

In the dual extended Kalman filter [11], a separate state-space representation is used for the signal and the weights. The signal state-space is as in Equation 20, while the state-space representation for the weights is given by

$w_{k+1} = w_k + u_k$  (21)
$y_k = f(\mathbf{x}_{k-1}, w_k) + n_k$  (22)

where the weights evolve as a stationary process driven by the artificial process noise $u_k$³, and the noisy observation plays the role of the desired output for the network $f$ evaluated at the current signal-state estimate. Two filters are run concurrently: at every time step, the current estimate of the weights is used in the signal filter, and the current estimate of the signal state is used in the weight filter.

In the joint extended Kalman filter [7], the signal-state and weight vectors are concatenated into a single, joint state vector, $[\,\mathbf{x}_k^T \; w_k^T\,]^T$. Estimation is done recursively by writing the state-space equations for the joint state as

$\begin{bmatrix} \mathbf{x}_{k+1} \\ w_{k+1} \end{bmatrix} = \begin{bmatrix} F(\mathbf{x}_k, w_k) + B\,v_{k+1} \\ w_k + u_k \end{bmatrix}$  (23)
$y_k = C\,\mathbf{x}_k + n_k$  (24)

and running a single EKF or UKF on the joint state.

Two time-series were used for comparison. The first is the Mackey-Glass chaotic series used earlier, corrupted by additive white Gaussian noise (SNR 3 dB). The second time series (also chaotic) comes from an autoregressive neural network with random weights driven by Gaussian process noise, and is also corrupted by additive white Gaussian noise (SNR 3 dB). For a data length of 1000 points, Figure 3 shows comparative learning curves and a segment of the resulting signal estimates.

³ The variance of $u_k$ is usually set to a small constant which can be related to the time-constant for RLS weight decay [3].

Figure 3: Comparative learning curves (MSE) for the Dual EKF, Dual UKF, Joint EKF, and Joint UKF on the chaotic AR neural network and Mackey-Glass time series, and estimation of the Mackey-Glass series with the dual UKF (clean, noisy, and dual-UKF estimates).

The UKF may also be used for pure parameter estimation, in which case the complexity is of order $L^2$, where $L$ is the number of weights. The advantage of the UKF over the EKF in this case is also not as obvious, since the state-transition function for the weights is linear; however, as pointed out earlier, the observation equation remains nonlinear. Figure 4 shows learning curves (averaged over 100 experiments with different initial weights).

Figure 4: Learning curves (mean MSE per epoch, UKF vs. EKF) for a) the Mackay-Robot-Arm problem (2-12-2 MLP) and b) the Ikeda chaotic time series.

The UKF consistently performs at least as well as, and typically better than, the EKF across the applications shown here, including state estimation, dual estimation, and parameter estimation. Future work includes extensions to batch learning and non-MSE cost functions, as well as application to other neural and non-neural (e.g., parametric) architectures. In addition, we are also exploring the use of the UKF as a method to improve Particle Filters [10], as well as an extension of the UKF itself that avoids the linear update assumption by using a direct Bayesian update [12].
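As a closing illustration of the joint formulation in Equations 23 and 24, here is a small Python sketch that folds a signal model and its weights into one transition/observation pair that can be handed to any filter step (such as the `ukf_step` sketch earlier). The helper name `make_joint_model`, the dimensions, and the noise levels are illustrative assumptions, not values from the paper.

```python
import numpy as np

def make_joint_model(F, H, n_states, n_weights):
    """Wrap signal and weight dynamics into a single joint model (cf. Eq. 23-24).

    F(x, w): signal transition; H(x, w): observation model.
    The weights follow a random walk (driven by a small artificial process noise),
    so the joint filter estimates states and weights simultaneously.
    """
    def F_joint(z):
        x, w = z[:n_states], z[n_states:]
        return np.concatenate([F(x, w), w])   # weights carried forward unchanged

    def H_joint(z):
        x, w = z[:n_states], z[n_states:]
        return H(x, w)

    return F_joint, H_joint

# Usage sketch: the joint process-noise covariance pairs the signal noise with a
# small weight-noise term, which acts like the RLS-style forgetting noted in the footnote.
n_states, n_weights = 4, 25                   # illustrative sizes only
Rv = 1e-2 * np.eye(n_states)
Rw = 1e-5 * np.eye(n_weights)                 # small constant variance for u_k
R_joint = np.block([[Rv, np.zeros((n_states, n_weights))],
                    [np.zeros((n_weights, n_states)), Rw]])
```

The dual formulation instead keeps two separate filters of sizes `n_states` and `n_weights`, exchanging their current estimates at every time step, which is often cheaper than filtering the concatenated state.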

5

References

[1] J. F. G. de Freitas, M. Niranjan, A. H. Gee, and A. Doucet. Sequential Monte Carlo methods to train neural network models. Technical report, Cambridge University Engineering Department, University of Cambridge, Nov. 1998.
[2] A. P. Dempster, N. M. Laird, and D. B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, B39:1-38, 1977.
[3] S. Haykin. Adaptive Filter Theory. Prentice-Hall, Inc., 3rd edition, 1996.
[4] S. J. Julier. The Scaled Unscented Transformation. To appear in Automatica, February 2000.
[5] S. J. Julier and J. K. Uhlmann. A New Extension of the Kalman Filter to Nonlinear Systems. In Proc. of AeroSense: The 11th Int. Symp. on Aerospace/Defence Sensing, Simulation and Controls, 1997.
[6] F. L. Lewis. Optimal Estimation. John Wiley & Sons, Inc., New York, 1986.
[7] M. B. Matthews. A state-space approach to adaptive nonlinear filtering using recurrent neural networks. In Proc. IASTED Int. Symp. on Artificial Intelligence Application and Neural Networks, pages 197-200, 1990.
[8] G. V. Puskorius and L. A. Feldkamp. Decoupled extended Kalman filter training of feedforward layered networks. In Proc. IJCNN, volume 1, pages 771-777, 1991.
[9] S. Singhal and L. Wu. Training multilayer perceptrons with the extended Kalman filter. In Advances in Neural Information Processing Systems 1, pages 133-140, San Mateo, CA, 1989. Morgan Kaufmann.
[10] R. van der Merwe, A. Doucet, N. de Freitas, and E. Wan. The Unscented Particle Filter. Technical report, Cambridge University Engineering Department, University of Cambridge, 2000.
[11] E. A. Wan and A. T. Nelson. Neural dual extended Kalman filtering: applications in speech enhancement and monaural blind signal separation. In Proc. IEEE Workshop on Neural Networks for Signal Processing, 1997.
[12] E. A. Wan and R. van der Merwe. The Unscented Bayes Filter. Technical report, CSLU, Oregon Graduate Institute of Science and Technology, 2000 (in preparation).
[13] E. A. Wan, R. van der Merwe, and A. T. Nelson. Dual estimation and the unscented transformation. In S. A. Solla, T. K. Leen, and K.-R. Müller, editors, Advances in Neural Information Processing Systems 12, pages 666-672. MIT Press, 2000.

