Covariance and Correlation
Math 217 Probability and Statistics
Prof. D. Joyce, Fall


Covariance. Let X and Y be joint random variables. Their covariance Cov(X,Y) is defined by

    Cov(X,Y) = E((X - \mu_X)(Y - \mu_Y)).

Notice that the variance of X is just the covariance of X with itself:

    Var(X) = E((X - \mu_X)^2) = Cov(X,X).

Analogous to the identity for variance,

    Var(X) = E(X^2) - \mu_X^2,

there is an identity for covariance:

    Cov(X,Y) = E(XY) - \mu_X \mu_Y.

Here's the proof:

    Cov(X,Y) = E((X - \mu_X)(Y - \mu_Y))
             = E(XY - \mu_X Y - X \mu_Y + \mu_X \mu_Y)
             = E(XY) - \mu_X E(Y) - E(X) \mu_Y + \mu_X \mu_Y
             = E(XY) - \mu_X \mu_Y.

Covariance can be positive, zero, or negative. Positive covariance indicates an overall tendency that when one variable increases, so does the other, while negative covariance indicates an overall tendency that when one increases, the other decreases.

If X and Y are independent variables, then their covariance is 0:

    Cov(X,Y) = E(XY) - \mu_X \mu_Y = E(X)E(Y) - \mu_X \mu_Y = 0.

The converse, however, is not always true: Cov(X,Y) can be 0 for variables that are not independent. For an example where the covariance is 0 but X and Y aren't independent, let (X,Y) take the three values (-1,1), (0,-2), and (1,1), each with the same probability 1/3.
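The zero-covariance example can be checked numerically. Here is a small Python sketch (not part of the original notes) that computes the means and the covariance of the three-outcome distribution directly from the definition:

```python
# Numerical check of the zero-covariance example from the notes:
# (X, Y) takes the values (-1, 1), (0, -2), (1, 1), each with probability 1/3.
outcomes = [(-1, 1), (0, -2), (1, 1)]
p = 1 / 3

mu_x = sum(p * x for x, _ in outcomes)  # E(X)
mu_y = sum(p * y for _, y in outcomes)  # E(Y)

# Cov(X,Y) = E((X - mu_X)(Y - mu_Y)), computed term by term.
cov = sum(p * (x - mu_x) * (y - mu_y) for x, y in outcomes)

print(mu_x, mu_y, cov)  # 0.0 0.0 0.0
```

The covariance comes out to 0 even though Y is completely determined by X, matching the point made in the text.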

They're clearly not independent, since the value of X determines the value of Y. Note that \mu_X = 0 and \mu_Y = 0, so

    Cov(X,Y) = E((X - \mu_X)(Y - \mu_Y)) = E(XY)
             = (1/3)(-1) + (1/3)(0) + (1/3)(1) = 0.

We've already seen that when X and Y are independent, the variance of their sum is the sum of their variances. There's a general formula to deal with their sum when they aren't independent. A covariance term appears in that formula:

    Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y).

Here's the proof:

    Var(X+Y) = E((X+Y)^2) - E(X+Y)^2
             = E(X^2 + 2XY + Y^2) - (\mu_X + \mu_Y)^2
             = E(X^2) + 2E(XY) + E(Y^2) - \mu_X^2 - 2\mu_X\mu_Y - \mu_Y^2
             = E(X^2) - \mu_X^2 + 2(E(XY) - \mu_X\mu_Y) + E(Y^2) - \mu_Y^2
             = Var(X) + 2 Cov(X,Y) + Var(Y).

Bilinearity of covariance. Covariance is linear in each coordinate. That means two things. First, you can pass constants through either coordinate:

    Cov(aX,Y) = a Cov(X,Y) = Cov(X,aY).

Second, it preserves sums in each coordinate:

    Cov(X_1 + X_2, Y) = Cov(X_1, Y) + Cov(X_2, Y)

and

    Cov(X, Y_1 + Y_2) = Cov(X, Y_1) + Cov(X, Y_2).
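The variance-of-a-sum formula can likewise be verified numerically. A short Python sketch (not from the original notes) checks Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) on the same three-outcome distribution:

```python
# Check Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) on the joint distribution
# with outcomes (-1, 1), (0, -2), (1, 1), each with probability 1/3.
outcomes = [(-1, 1), (0, -2), (1, 1)]
p = 1 / 3

def E(f):
    """Expectation of f(X, Y) under the joint distribution."""
    return sum(p * f(x, y) for x, y in outcomes)

mu_x, mu_y = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mu_x) ** 2)
var_y = E(lambda x, y: (y - mu_y) ** 2)
cov = E(lambda x, y: (x - mu_x) * (y - mu_y))

# Variance of X+Y computed directly from the definition.
var_sum = E(lambda x, y: (x + y - mu_x - mu_y) ** 2)

print(var_sum, var_x + var_y + 2 * cov)
```

Both sides come out to 8/3 here; since Cov(X,Y) = 0 for this example, the sum-of-variances rule happens to hold even though X and Y are not independent.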

Here's a proof of the first equation in the first condition:

    Cov(aX,Y) = E((aX - E(aX))(Y - E(Y)))
              = E(a(X - E(X))(Y - E(Y)))
              = a E((X - E(X))(Y - E(Y)))
              = a Cov(X,Y).

The proof of the second condition is similar.

Correlation. The correlation \rho_{XY} of two joint variables X and Y is a normalized version of their covariance. It's defined by the equation

    \rho_{XY} = Cov(X,Y) / (\sigma_X \sigma_Y).

Note that independent variables have 0 correlation as well as 0 covariance.

By dividing by the product \sigma_X \sigma_Y of the standard deviations, the correlation becomes bounded between plus and minus 1:

    -1 \le \rho_{XY} \le 1.

There are various ways you can prove that inequality. Here's one. We'll start by proving

    0 \le Var(X/\sigma_X \pm Y/\sigma_Y) = 2(1 \pm \rho_{XY}).

There are actually two equations there, and we can prove them at the same time. First note that the "0 \le" parts follow from the fact that variance is nonnegative. Next use the property proved above about the variance of a sum:

    Var(X/\sigma_X \pm Y/\sigma_Y) = Var(X/\sigma_X) + Var(\pm Y/\sigma_Y) + 2 Cov(X/\sigma_X, \pm Y/\sigma_Y).

Now use the fact that Var(cX) = c^2 Var(X) to rewrite that as

    (1/\sigma_X^2) Var(X) + (1/\sigma_Y^2) Var(\pm Y) + 2 Cov(X/\sigma_X, \pm Y/\sigma_Y).

But Var(X) = \sigma_X^2 and Var(\pm Y) = Var(Y) = \sigma_Y^2, so that equals

    2 + 2 Cov(X/\sigma_X, \pm Y/\sigma_Y).

By the bilinearity of covariance, that equals

    2 \pm (2/(\sigma_X \sigma_Y)) Cov(X,Y) = 2(1 \pm \rho_{XY}),

and we've shown that

    0 \le 2(1 \pm \rho_{XY}).

Next, divide by 2 and move one term to the other side of the inequality to get

    \mp \rho_{XY} \le 1,

so -1 \le \rho_{XY} \le 1.

This exercise should remind you of the same kind of thing that goes on in linear algebra.
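A Python sketch (not from the original notes) can illustrate both the bound and the key identity Var(X/\sigma_X \pm Y/\sigma_Y) = 2(1 \pm \rho_{XY}). The three outcome pairs below are made up purely for illustration; any joint distribution with nonzero standard deviations would do:

```python
import math

# Illustrative joint distribution (values chosen arbitrarily), each outcome
# with probability 1/3. Any distribution with sigma_X, sigma_Y > 0 works.
outcomes = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0)]
p = 1 / 3

def E(f):
    """Expectation of f(X, Y) under the joint distribution."""
    return sum(p * f(x, y) for x, y in outcomes)

mu_x, mu_y = E(lambda x, y: x), E(lambda x, y: y)
sd_x = math.sqrt(E(lambda x, y: (x - mu_x) ** 2))
sd_y = math.sqrt(E(lambda x, y: (y - mu_y) ** 2))
cov = E(lambda x, y: (x - mu_x) * (y - mu_y))
rho = cov / (sd_x * sd_y)

# Var(X/sd_x + sign*Y/sd_y), computed directly, vs. 2(1 + sign*rho).
for sign in (+1, -1):
    v = E(lambda x, y: ((x - mu_x) / sd_x + sign * (y - mu_y) / sd_y) ** 2)
    print(sign, v, 2 * (1 + sign * rho))

print(-1 <= rho <= 1)  # True
```

For this particular distribution \rho_{XY} = 1/2, and the two variances come out to 3 and 1, matching 2(1 + \rho) and 2(1 - \rho).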

In fact, it is the same thing exactly. Take a set of real-valued random variables, not necessarily independent. Their linear combinations form a vector space. Their covariance is the inner product (also called the dot product or scalar product) of two vectors in that space:

    X \cdot Y = Cov(X,Y).

The norm \|X\| of X is the square root of \|X\|^2, defined by

    \|X\|^2 = X \cdot X = Cov(X,X) = Var(X) = \sigma_X^2,

and so the angle \theta between X and Y is defined by

    \cos\theta = (X \cdot Y) / (\|X\| \|Y\|) = Cov(X,Y) / (\sigma_X \sigma_Y) = \rho_{XY},

that is, \theta is the arccosine of the correlation \rho_{XY}.

Math 217 Home Page: ~djoyce/ma217/
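The inner-product picture can be sketched in Python as well (not part of the original notes; the outcome values are the same made-up ones used above for illustration). Treating Cov as a dot product, the angle between X and Y is the arccosine of the correlation:

```python
import math

# Same illustrative joint distribution as before, probabilities all 1/3.
outcomes = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0)]
p = 1 / 3

def E(f):
    return sum(p * f(x, y) for x, y in outcomes)

def inner(f, g):
    """Covariance-based inner product: Cov(f(X,Y), g(X,Y))."""
    mf, mg = E(f), E(g)
    return E(lambda x, y: (f(x, y) - mf) * (g(x, y) - mg))

X = lambda x, y: x
Y = lambda x, y: y

norm_x = math.sqrt(inner(X, X))  # = sigma_X
norm_y = math.sqrt(inner(Y, Y))  # = sigma_Y
cos_theta = inner(X, Y) / (norm_x * norm_y)  # = rho_XY
theta = math.acos(cos_theta)

print(cos_theta, math.degrees(theta))
```

Here cos \theta = 1/2, so the "angle" between these two random variables is 60 degrees; perfectly correlated variables (\rho = \pm 1) would sit at 0 or 180 degrees, and uncorrelated ones at 90.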

