
Regression with a Binary Dependent Variable - Chapter 9




Transcription of Regression with a Binary Dependent Variable - Chapter 9

Regression with a Binary Dependent Variable (Chapter 9). Michael Ash, CPPA, Lecture 22 course notes.

Endgame:
- Take-home final: distributed Friday 19 May; due Tuesday 23 May (paper or emailed PDF ok; no Word, Excel, etc.)
- Problem Set 7: optional, worth up to 2 percentage points of extra credit; due Friday 19 May.

Binary dependent variables:
- The outcome can be coded 1 or 0 (yes or no, approved or denied, success or failure). Examples?
- Interpret the regression as modeling the probability that the dependent variable equals one (Y = 1).
- Recall that for a binary variable, E(Y) = Pr(Y = 1).

HMDA example:
- Outcome: loan denial is coded 1, loan approval 0.
- Key explanatory variable: black.
- Other explanatory variables: P/I ratio, credit history, LTV, etc.

Linear Probability Model (LPM):
  Yi = β0 + β1·X1i + β2·X2i + ... + βk·Xki + ui
- β1 expresses the change in the probability that Y = 1 associated with a unit change in X1.
- Ŷi expresses the probability that Yi = 1:
  Pr(Y = 1 | X1, X2, ..., Xk) = β0 + β1·X1 + β2·X2 + ... + βk·Xk = Ŷ
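As a concrete sketch of the LPM, the snippet below fits it by OLS with heteroskedasticity-robust standard errors. The data are simulated and the variable names (pi_ratio, black, deny) are hypothetical stand-ins for the HMDA variables, not the actual sample.

```python
# Minimal LPM sketch on simulated data (hypothetical, not the actual HMDA sample).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
pi_ratio = rng.uniform(0.1, 0.6, size=n)      # payments-to-income ratio (simulated)
black = rng.integers(0, 2, size=n)            # 1 = Black applicant (simulated)
p_deny = np.clip(-0.10 + 0.60 * pi_ratio + 0.15 * black, 0, 1)
deny = rng.binomial(1, p_deny)                # binary outcome: 1 = loan denied

X = sm.add_constant(np.column_stack([pi_ratio, black]))
lpm = sm.OLS(deny, X).fit(cov_type="HC1")     # always use robust standard errors with the LPM
print(lpm.params)                             # each slope = change in Pr(deny = 1) per unit change in X
print(lpm.bse)                                # heteroskedasticity-robust standard errors
print(lpm.predict(X).min(), lpm.predict(X).max())  # fitted probabilities can leave [0, 1]
```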

Shortcomings of the LPM:
- "Nonconforming predicted probabilities": probabilities must logically be between 0 and 1, but LPM predictions are not constrained to that range.
- Heteroskedasticity by construction (always use robust standard errors).

Probit and logit regression:
- Addresses nonconforming predicted probabilities in the LPM.
- Basic strategy: bound predicted values between 0 and 1 by transforming the linear index, β0 + β1·X1 + β2·X2 + ... + βk·Xk, which can range over (−∞, +∞), into something that ranges over [0, 1].
- When the index is big and positive, Pr(Y = 1) → 1; when the index is big and negative, Pr(Y = 1) → 0.
- How to transform? Use Φ, the cumulative standard normal distribution. The index β0 + β1·X1 + ... + βk·Xk is treated as a z-value:
  Pr(Y = 1 | X1, X2, ..., Xk) = Φ(β0 + β1·X1 + β2·X2 + ... + βk·Xk)

Interpreting the results:
- βj positive (negative) means that an increase in Xj increases (decreases) the probability that Y = 1.
- βj reports how the index changes with a change in Xj, but the index is only an input to Φ.
- βj is hard to interpret directly because the change in probability for a given change in Xj is non-linear and depends on all of X1, X2, ..., Xk.
- The best interpretation is to compute the predicted probability Ŷ for alternative values of X.
- Standard errors, hypothesis tests, and confidence intervals have the same interpretation as with OLS.

HMDA probit estimates:
  Pr̂(deny = 1 | P/I, black) = Φ(−2.26 + 2.74·P/I + 0.71·black), with standard errors (0.16), (0.44), and (0.083).
- White applicant with P/I = 0.3: Pr̂(deny = 1) = Φ(−2.26 + 2.74·0.3 + 0.71·0) = Φ(−1.44) = 7.5%
- Black applicant with P/I = 0.3: Pr̂(deny = 1) = Φ(−2.26 + 2.74·0.3 + 0.71·1) = Φ(−0.73) = 23.3%
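The two predicted probabilities above come from plugging the index into the standard normal CDF. The short sketch below reproduces the calculation from the reported coefficients; only the rounding of the final percentages is added.

```python
# Reproduce the HMDA probit predicted probabilities from the reported coefficients.
from scipy.stats import norm

b0, b_pi, b_black = -2.26, 2.74, 0.71          # estimated probit index coefficients

def pr_deny(pi_ratio: float, black: int) -> float:
    """Pr(deny = 1 | P/I, black) = Phi(b0 + b_pi * P/I + b_black * black)."""
    return norm.cdf(b0 + b_pi * pi_ratio + b_black * black)

print(f"White applicant, P/I = 0.3: {pr_deny(0.3, 0):.3f}")   # about 0.075 (7.5%)
print(f"Black applicant, P/I = 0.3: {pr_deny(0.3, 1):.3f}")   # about 0.233 (23.3%)
```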

Logit or logistic regression:
Logit, or logistic regression, uses a slightly different functional form for the CDF (the logistic function) instead of the standard normal CDF. The coefficients of the index can look different, but the probability results are usually very similar to those from probit and from the LPM; the three models generate similar predicted probabilities.

Estimating logit and probit models:
- OLS (and the LPM, which is an application of OLS) has a closed-form formula for β̂.
- Logit and probit require numerical methods to find the β̂'s that best fit the data.
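To illustrate that probit and logit coefficients differ in scale while the fitted probabilities track each other closely, the sketch below estimates both models on the same simulated data (hypothetical variables; statsmodels fits both by maximum likelihood).

```python
# Probit vs. logit on the same simulated data: different coefficients, similar probabilities.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)
y = rng.binomial(1, norm.cdf(-0.5 + 1.0 * x))   # data generated from a probit model

X = sm.add_constant(x)
probit_res = sm.Probit(y, X).fit(disp=0)        # fit by maximum likelihood
logit_res = sm.Logit(y, X).fit(disp=0)

print(probit_res.params)                        # index coefficients differ in scale...
print(logit_res.params)                         # ...logit coefficients are roughly 1.6-1.8x larger
p_probit = probit_res.predict(X)
p_logit = logit_res.predict(X)
print(np.abs(p_probit - p_logit).max())         # ...but the predicted probabilities nearly coincide
```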

Nonlinear least squares:
One approach is to choose coefficients b0, b1, ..., bk that minimize the sum of squared differences between the actual outcome, Yi, and the prediction, Φ(b0 + b1·X1i + ... + bk·Xki):
  Σ (i = 1 to N) [Yi − Φ(b0 + b1·X1i + ... + bk·Xki)]²

Maximum likelihood estimation:
- An alternative approach is to choose coefficients b0, b1, ..., bk that make the observed sample, Y1, ..., Yn, as likely as possible to have occurred.
- For example, if you observe the data {4, 6, 8}, the predicted mean that makes this sample most likely to occur is μ̂_MLE = 6.
- In Stata, the probit and logit commands are under Statistics → Binary Outcomes.

Inference and measures of fit:
- Standard errors, hypothesis tests, and confidence intervals are exactly as in OLS, but they refer to the index coefficients and must be translated into probabilities by Φ.
- Choose a cut-off, e.g. 0.50, and check the fraction correctly predicted, e.g. sensitivity and specificity.
- Sensitivity is the fraction of observed positive outcomes that are correctly classified; specificity is the fraction of observed negative outcomes that are correctly classified.
- Pseudo-R² is analogous to the R²: it expresses the predictive quality of the model with explanatory variables relative to the predictive quality of the sample proportion p of cases where Yi = 1, and it adjusts for adding extra regressors.

[Figure: sensitivity and specificity plotted against the cut-off.]

Reviewing the HMDA results:
- LPM, logit, and probit show only minor differences.
- Four probit specifications, controlling for a wide range of other explanatory variables.
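The probit log-likelihood that these numerical routines maximize, and the fit measures just described, can be written out directly. The sketch below is a from-scratch illustration on simulated data (hypothetical variables): it computes the MLE with scipy.optimize, then the fraction correctly predicted, sensitivity, and specificity at a 0.50 cut-off, and one common pseudo-R² (McFadden's, assumed here since the notes do not give a specific formula).

```python
# Probit by maximum likelihood "by hand", plus the fit measures at a 0.50 cut-off.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=n)
y = rng.binomial(1, norm.cdf(-0.5 + 1.0 * x))           # simulated binary outcome
X = np.column_stack([np.ones(n), x])                     # intercept + one regressor

def neg_loglik(b, X, y):
    """Negative probit log-likelihood: -sum[y*ln Phi(Xb) + (1-y)*ln(1 - Phi(Xb))]."""
    p = np.clip(norm.cdf(X @ b), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

fit = minimize(neg_loglik, x0=np.zeros(X.shape[1]), args=(X, y))        # numerical MLE
fit0 = minimize(neg_loglik, x0=np.zeros(1), args=(np.ones((n, 1)), y))  # intercept-only model
print("MLE coefficients:", fit.x)

p_hat = norm.cdf(X @ fit.x)
pred = (p_hat >= 0.50).astype(int)                       # classify with a 0.50 cut-off
print("Fraction correctly predicted:", np.mean(pred == y))
print("Sensitivity:", np.mean(pred[y == 1] == 1))        # correct among observed positives
print("Specificity:", np.mean(pred[y == 0] == 0))        # correct among observed negatives
# McFadden's pseudo-R^2: 1 - lnL / lnL_0, where lnL_0 is the intercept-only log-likelihood.
print("Pseudo-R^2:", 1 - (-fit.fun) / (-fit0.fun))
```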

Other models with limited dependent variables (LDV):
- Count data (discrete non-negative integers), Y ∈ {0, 1, 2, ...}.
- Choice among several alternatives (e.g., mode of choice), handled with probit-type models; can sometimes be converted to several binary outcomes.
- Sample selection models.

