
Adaptive Filtering - Theory and Applications




Transcription of Adaptive Filtering - Theory and Applications

1

Adaptive Filtering - Theory and Applications
José C. M. Bermudez
Department of Electrical Engineering
Federal University of Santa Catarina, Florianópolis, SC, Brazil
IRIT - INP-ENSEEIHT, Toulouse, May 2011

Outline:
1 Introduction
2 Adaptive Filtering Applications
3 Adaptive Filtering Principles
4 Iterative Solutions for the Optimum Filtering Problem
5 Stochastic Gradient Algorithms
6 Deterministic Algorithms
7 Analysis

Introduction

Estimation Techniques
Several techniques are available to solve estimation problems:
- Maximum Likelihood (ML), Least Squares (LS), Moments
- Minimum MSE (MMSE), Maximum A Posteriori (MAP)

Linear Estimators
- Frequently used in practice when computational complexity is limited (real-time operation)
- Simpler to determine

2

- Depend only on the first two moments of the data

Statistical approach - optimal linear filters:
- Minimum mean-square error criterion
- Require second-order statistics of the signals

Deterministic approach - least squares estimators:
- Minimum least-squares error criterion
- Require handling of a data observation matrix

Limitations of Optimal Filters and LS Estimators
- Statistics of the signals may not be available or cannot be accurately estimated
- There may be no time available for statistical estimation (real-time operation)
- Signals and systems may be nonstationary
- The required memory may be prohibitive
- The computational load may be prohibitive

Iterative Solutions
- Search for the optimal solution starting from an initial guess
- Based on classical optimization algorithms
- Require reduced computational effort per iteration
- Need several iterations to converge to the optimal solution
- Form the basis for the development of adaptive algorithms
- Still require knowledge of the signal statistics

Adaptive Filters
Usually approximate iterative algorithms and:
- Do not require previous knowledge of the signal statistics
- Have a small computational complexity per iteration
- Converge to a neighborhood of the optimal solution
Adaptive filters are good for:

3

- Real-time applications, when there is no time for statistical estimation
- Applications with nonstationary signals and/or systems

Properties of Adaptive Filters
- They can operate satisfactorily in unknown and possibly time-varying environments without user intervention
- They improve their performance during operation by learning statistical characteristics from the current signal observations
- They can track variations in the signal operating environment (SOE)

Adaptive Filtering Applications

Basic Classes of Adaptive Filtering Applications
- System identification
- Inverse system modeling
- Signal prediction
- Interference cancelation

System Identification
Figure (block diagram): the unknown system and the adaptive filter share the input x(n); the desired signal d(n) adds other signals to the unknown system output, and the error e(n) = d(n) - y(n) drives the adaptive algorithm.

Applications - System Identification
Channel estimation:
- Communications systems
- Objective: model the channel to design distortion compensation
- x(n): training sequence
Plant identification:
- Control systems
- Objective: model the plant to design a compensator
- x(n): training sequence

Echo Cancellation
- Telephone systems and VoIP
- Echo caused by network impedance mismatches or by the acoustic environment
- Objective: model the echo path impulse response
- x(n): transmitted signal
- d(n): echo + noise
Figure: Network Echo Cancellation

4

Inverse System Modeling
- The adaptive filter attempts to estimate the unknown system's inverse
- The adaptive filter input is usually corrupted by noise
- The desired response d(n) may not be available
Figure (block diagram): the adaptive filter follows the unknown system; the system input s(n) passes through a delay to form d(n), and z(n) denotes the noise added at the adaptive filter input.

Applications - Inverse System Modeling
Channel equalization:
- Objective: reduce intersymbol interference
- Initially, a training sequence is used in d(n)
- After training, d(n) is generated from previous decisions

Signal Prediction
- Most widely used case: forward prediction
- The signal x(n) is predicted from the samples {x(n - n_o), x(n - n_o - 1), ..., x(n - n_o - L)}
Figure (block diagram): the adaptive filter input is the delayed signal x(n - n_o); the desired response is x(n).

Application - Signal Prediction
DPCM speech quantizer - linear predictive coding:
- Objective: reduce the speech transmission bandwidth
- Signal transmitted all the time: the quantization error
- The predictor coefficients are transmitted at a low rate

Interference Cancelation
- One or more sensor signals are used to remove interference and noise
- Reference signals correlated with the interference should also be available
- Applications: array processing for radar and communications, biomedical sensing systems, active noise control systems
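All of the application classes above share the same structure: an input x(n), a desired signal d(n), and an error e(n) = d(n) - y(n). System identification is the easiest to reproduce numerically. Below is a minimal, hypothetical NumPy sketch (the 4-tap "unknown" system, the noise level, and the signal length are illustrative assumptions, not values from the slides): a perfect model of the unknown response leaves only the "other signals" in e(n).

```python
import numpy as np

rng = np.random.default_rng(0)

h_unknown = np.array([0.5, -0.3, 0.2, 0.1])   # "unknown" system (assumed FIR)
x = rng.standard_normal(1000)                  # input / training sequence x(n)
noise = 1e-3 * rng.standard_normal(1000)       # "other signals" at the output
d = np.convolve(x, h_unknown)[:1000] + noise   # desired signal d(n)

# If the adaptive filter matched the unknown system exactly (w = h_unknown),
# its output y(n) would cancel everything in d(n) except the noise:
y = np.convolve(x, h_unknown)[:1000]
e = d - y                                      # residual error e(n)
print(float(np.max(np.abs(e))))                # small: only the noise remains
```

In practice w is not known in advance; the adaptive algorithm drives e(n) toward this noise floor by updating the filter weights.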

5

Application - Interference Cancelation: Active Noise Control
Ref.: Manolakis, Ingle and Kogon, Statistical and Adaptive Signal Processing.
- Cancelation of acoustic noise using destructive interference
- A secondary system between the adaptive filter and the cancelation point is unavoidable
- The cancelation is performed in the acoustic environment

Active Noise Control - Block Diagrams
Figure (block diagrams): the adaptive filter output y(n) reaches the cancelation point through a secondary system S; the adaptive algorithm uses the filtered input x_f(n) and the error e(n).

Adaptive Filtering Principles

Adaptive Filter Features
Adaptive filters are composed of three basic modules:
Filtering structure:
- Determines the output of the filter given its input samples
- Its weights are periodically adjusted by the adaptive algorithm
- Can be linear or nonlinear, depending on the application
- Linear filters can be FIR or IIR
Performance criterion:
- Defined according to the application and to mathematical tractability
- Used to derive the adaptive algorithm
- Its value at each iteration affects the adaptive weight updates
Adaptive algorithm:
- Uses the performance criterion value and the current signals
- Modifies the adaptive weights to improve performance
- Its form and complexity are functions of the structure and of the performance criterion

Signal Operating Environment (SOE)
Comprises all information regarding the properties of the signals and systems:
- Input signals
- Desired signal
- Unknown systems
If the SOE is nonstationary:
- Acquisition (convergence) mode: from the start until close to the best performance
- Tracking mode: readjustment following the SOE's time variations

6

Adaptation can be:
- Supervised: the desired signal is available, so e(n) can be evaluated
- Unsupervised: the desired signal is unavailable

Performance Evaluation
- Convergence rate
- Misadjustment
- Tracking
- Robustness (to disturbances and numerical effects)
- Computational requirements (operations and memory)
- Structure (ease of implementation, performance surface, stability)

Optimum versus Adaptive Filters in Linear Estimation
Conditions for this study:
- Stationary SOE
- The filter structure is transversal FIR
- All signals are real-valued
- Performance criterion: mean-square error E[e^2(n)]

The Linear Estimation Problem
x(n) = [x(n), x(n-1), ..., x(n-N+1)]^T
y(n) = x^T(n) w
e(n) = d(n) - y(n) = d(n) - x^T(n) w
J_ms = E[e^2(n)] = σ_d^2 - 2 p^T w + w^T R_xx w
where p = E[x(n) d(n)] and R_xx = E[x(n) x^T(n)].

Normal Equations
R_xx w_o = p, so w_o = R_xx^{-1} p for R_xx > 0
J_ms,min = σ_d^2 - p^T R_xx^{-1} p

What if d(n) is nonstationary?
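Before turning to the nonstationary case, the normal equations can be checked numerically. A minimal sketch (NumPy; the 2-tap target response, the noise level, and the sample size are illustrative assumptions): estimate R_xx and p from data, solve R_xx w_o = p, and verify that the resulting error is orthogonal to the input vector, as the MMSE solution requires.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2                                      # filter length
x = rng.standard_normal(50_000)            # stationary white input x(n)
w_true = np.array([1.0, -0.5])             # underlying linear relation (assumed)
d = np.convolve(x, w_true)[:x.size] + 0.1 * rng.standard_normal(x.size)

# Regressor matrix: row n holds x(n) = [x(n), x(n-1)]^T (zero initial conditions)
X = np.column_stack([np.r_[np.zeros(k), x[:x.size - k]] for k in range(N)])

Rxx = X.T @ X / x.size                     # sample estimate of R_xx = E[x(n) x^T(n)]
p = X.T @ d / x.size                       # sample estimate of p = E[x(n) d(n)]
wo = np.linalg.solve(Rxx, p)               # normal equations: R_xx w_o = p

e = d - X @ wo                             # error at the optimum
print(wo)                                  # close to w_true
print(X.T @ e / x.size)                    # ~0: error orthogonal to the input
```

Solving the normal equations once like this is exactly what an optimum (Wiener) filter does; the adaptive algorithms that follow reach the same w_o iteratively, without forming R_xx and p explicitly.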

7

With a nonstationary d(n):
x(n) = [x(n), x(n-1), ..., x(n-N+1)]^T
y(n) = x^T(n) w(n)
e(n) = d(n) - y(n) = d(n) - x^T(n) w(n)
J_ms(n) = E[e^2(n)] = σ_d^2(n) - 2 p^T(n) w(n) + w^T(n) R_xx w(n)
where p(n) = E[x(n) d(n)] and R_xx = E[x(n) x^T(n)].
Normal equations: R_xx w_o(n) = p(n), so w_o(n) = R_xx^{-1} p(n) for R_xx > 0
J_ms,min(n) = σ_d^2(n) - p^T(n) R_xx^{-1} p(n)

Optimum Filters versus Adaptive Filters
Optimum filters:
- Compute p(n) = E[x(n) d(n)]
- Solve R_xx w_o(n) = p(n)
- Filter with w_o(n): y(n) = x^T(n) w_o(n)
- Nonstationary SOE: the optimum filter must be determined for each value of n
Adaptive filters:
- Filtering: y(n) = x^T(n) w(n)
- Error evaluation: e(n) = d(n) - y(n)
- Adaptive algorithm: w(n+1) = w(n) + Δw[x(n), e(n)]
- Δw(n) is chosen so that w(n) is close to w_o(n) for large n

Characteristics of Adaptive Filters
- Search for the optimum solution on the performance surface
- Follow the principles of optimization techniques
- Implement a recursive optimization solution
- Convergence speed may depend on the initialization
- Have stability regions
- The steady-state solution fluctuates about the optimum
- Can track time-varying SOEs better than optimum filters
- Performance depends on the performance surface

Iterative Solutions for the Optimum Filtering Problem

Performance (Cost) Functions
Mean-square error criterion (the most popular): J_ms = E[e^2(n)]
- Adaptive algorithms: Least-Mean Square (LMS), Normalized LMS (NLMS), Affine Projection (AP), Recursive Least Squares (RLS), ...
Regularized MSE criterion: J_rms = E[e^2(n)] + α ||w(n)||^2

8

- Adaptive algorithm: leaky least-mean square (leaky LMS)
L1-norm criterion: J_1 = E[|e(n)|]
- Adaptive algorithm: sign-error

Performance (Cost) Functions (continued)
Least-mean fourth (LMF) criterion: J_LMF = E[e^4(n)]
- Adaptive algorithm: Least-Mean Fourth (LMF)
Least-mean-mixed-norm (LMMN) criterion: J_LMMN = E[δ e^2(n) + (1/2)(1 - δ) e^4(n)]
- Adaptive algorithm: Least-Mean-Mixed-Norm (LMMN)
Constant-modulus criterion: J_CM = E[(|x^T(n) w(n)|^2 - γ)^2]
- Adaptive algorithm: Constant-Modulus (CM)

MSE Performance Surface
Figure: MSE performance surface over (w1, w2) for small input correlation.
Figure: MSE performance surface over (w1, w2) for large input correlation.

The Steepest Descent Algorithm - Stationary SOE
Cost function: J_ms(n) = E[e^2(n)] = σ_d^2 - 2 p^T w(n) + w^T(n) R_xx w(n)
Weight update equation: w(n+1) = w(n) + μ c(n)
- μ: step size
- c(n): correction term (determines the direction of Δw(n))
The steepest-descent adjustment c(n) = -∇J_ms(n) yields J_ms(n+1) ≤ J_ms(n) and
w(n+1) = w(n) + μ [p - R_xx w(n)]

Weight Update Equation About the Optimum Weights
w(n+1) = w(n) + μ [p - R_xx w(n)]
Using p = R_xx w_o,
w(n+1) = (I - μ R_xx) w(n) + μ R_xx w_o
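The steepest-descent recursion w(n+1) = w(n) + μ[p - R_xx w(n)] uses only the statistics R_xx and p, not the data itself. A small sketch (the 2x2 R_xx and the vector p are assumed values chosen only for illustration):

```python
import numpy as np

Rxx = np.array([[1.0, 0.5],
                [0.5, 1.0]])            # assumed input autocorrelation matrix
p = np.array([0.9, 0.2])                # assumed cross-correlation vector
wo = np.linalg.solve(Rxx, p)            # optimum (Wiener) solution

lam_max = np.linalg.eigvalsh(Rxx).max()
mu = 1.0 / lam_max                      # step size inside 0 < mu < 2/lambda_max

w = np.zeros(2)                         # initial guess w(0) = 0
for _ in range(200):
    w = w + mu * (p - Rxx @ w)          # steepest-descent update

print(w, wo)                            # w has converged to wo
```

With μ inside the stability region the iteration converges to w_o; the next slides quantify how the eigenvalues of R_xx set the speed of that convergence.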

9

Weight error vector: v(n) = w(n) - w_o
v(n+1) = (I - μ R_xx) v(n)
The matrix I - μ R_xx must be stable for convergence (all its eigenvalues must satisfy |1 - μ λ_i| < 1). Assuming convergence, lim_{n→∞} v(n) = 0.

Convergence Conditions
v(n+1) = (I - μ R_xx) v(n), with R_xx positive definite.
Eigendecomposition of R_xx: R_xx = Q Λ Q^T
v(n+1) = (I - μ Q Λ Q^T) v(n)
Q^T v(n+1) = Q^T v(n) - μ Λ Q^T v(n)
Defining ṽ(n) = Q^T v(n),
ṽ(n+1) = (I - μ Λ) ṽ(n)

Convergence Properties
ṽ_k(n+1) = (1 - μ λ_k) ṽ_k(n), k = 1, ..., N
ṽ_k(n) = (1 - μ λ_k)^n ṽ_k(0)
Convergence modes:
- monotonic if 0 < 1 - μ λ_k < 1
- oscillatory if -1 < 1 - μ λ_k < 0
Convergence requires |1 - μ λ_k| < 1 for all k, i.e. 0 < μ < 2/λ_max.

Optimal Step Size
ṽ_k(n) = (1 - μ λ_k)^n ṽ_k(0); among the factors 1 - μ λ_k:
- max_k |1 - μ λ_k|: slowest mode
- min_k |1 - μ λ_k|: fastest mode

Optimal Step Size (continued)
The optimal step size μ_o minimizes the slowest mode max_k |1 - μ λ_k|:
1 - μ_o λ_min = -(1 - μ_o λ_max), so μ_o = 2/(λ_max + λ_min)
Optimal slowest modes: ±(χ - 1)/(χ + 1), where χ = λ_max/λ_min.
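The stability bound μ < 2/λ_max and the optimal step size μ_o = 2/(λ_max + λ_min) can be checked directly on an assumed R_xx (the matrix entries are illustrative):

```python
import numpy as np

Rxx = np.array([[2.0, 0.9],
                [0.9, 1.0]])                 # assumed positive-definite R_xx
lam = np.linalg.eigvalsh(Rxx)                # eigenvalues, ascending order
lam_min, lam_max = lam[0], lam[-1]

mu_max = 2.0 / lam_max                       # stability bound on the step size
mu_o = 2.0 / (lam_max + lam_min)             # optimal step size

# At mu_o the extreme modes are balanced:
# |1 - mu_o*lam_min| == |1 - mu_o*lam_max| == (chi - 1)/(chi + 1)
modes = np.abs(1.0 - mu_o * lam)
chi = lam_max / lam_min                      # eigenvalue spread
print(modes, (chi - 1.0) / (chi + 1.0))
```

Balancing the slowest modes this way is what makes μ_o optimal: raising μ beyond μ_o speeds up the λ_min mode but slows down (and eventually destabilizes) the λ_max mode.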

10

χ = λ_max/λ_min is the eigenvalue spread of R_xx.

The Learning Curve J_ms(n)
J_ms(n) = J_ms,min + excess MSE
        = J_ms,min + ṽ^T(n) Λ ṽ(n) = J_ms,min + Σ_{k=1}^{N} λ_k ṽ_k^2(n)
Since ṽ_k(n) = (1 - μ λ_k)^n ṽ_k(0),
J_ms(n) = J_ms,min + Σ_{k=1}^{N} λ_k (1 - μ λ_k)^{2n} ṽ_k^2(0)
- λ_k (1 - μ λ_k)^2 > 0: monotonic convergence
- The stability limit is again 0 < μ < 2/λ_max
- J_ms(n) converges faster than w(n)
- The algorithm converges faster as χ = λ_max/λ_min → 1

Simulation Results
x(n) = α x(n-1) + v(n)
Figure: steepest-descent mean-square error (dB) vs. iteration, for a white-noise input (χ = 1) and for an AR(1) input.
- Linear system identification: FIR with 20 coefficients
- Noise power σ_v^2 = 10^{-6}

The Newton Algorithm
- Steepest descent: linear approximation of J_ms about the operating point
- Newton's method: quadratic approximation of J_ms
Expanding J_ms(w) in a Taylor series about w(n):
J_ms(w) ≈ J_ms[w(n)] + ∇^T J_ms [w - w(n)] + (1/2) [w - w(n)]^T H(n) [w - w(n)]
Differentiating and equating to zero at w = w(n+1):
∇J_ms[w(n+1)] = ∇J_ms[w(n)] + H[w(n)] [w(n+1) - w(n)] = 0
w(n+1) = w(n) - H^{-1}[w(n)] ∇J_ms[w(n)]

The Newton Algorithm (continued)
∇J_ms[w(n)] = -2p + 2 R_xx w(n)
H[w(n)] = 2 R_xx
Thus, adding a step-size control,
w(n+1) = w(n) + μ R_xx^{-1} [p - R_xx w(n)]
On the quadratic MSE surface, the update with μ = 1 reaches w_o in a single iteration.
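For the quadratic MSE surface, the Newton update w(n+1) = w(n) + μ R_xx^{-1}[p - R_xx w(n)] with μ = 1 lands on w_o in one step, regardless of the eigenvalue spread. A sketch with assumed statistics (the R_xx, p, and starting point are chosen only for illustration):

```python
import numpy as np

Rxx = np.array([[2.0, 0.9],
                [0.9, 1.0]])            # assumed autocorrelation matrix
p = np.array([0.7, -0.1])               # assumed cross-correlation vector
wo = np.linalg.solve(Rxx, p)            # Wiener solution

mu = 1.0
w = np.array([5.0, -3.0])               # arbitrary initial guess
w = w + mu * np.linalg.solve(Rxx, p - Rxx @ w)   # one Newton step

print(w, wo)                            # equal after a single iteration
```

The price for this one-step convergence is the R_xx^{-1} factor, which whitens the gradient direction; approximating that inverse cheaply is what distinguishes the faster adaptive algorithms (e.g. RLS-type) from gradient-only ones (e.g. LMS-type).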