89782 03 c03 p073-122 - Cengage Learning



3  Multiple Regression Analysis: Estimation

In Chapter 2, we learned how to use simple regression analysis to explain a dependent variable, y, as a function of a single independent variable, x. The primary drawback in using simple regression analysis for empirical work is that it is very difficult to draw ceteris paribus conclusions about how x affects y: the key assumption, that all other factors affecting y are uncorrelated with x, is often unrealistic. Multiple regression analysis is more amenable to ceteris paribus analysis because it allows us to explicitly control for many other factors that simultaneously affect the dependent variable.

This is important both for testing economic theories and for evaluating policy effects when we must rely on nonexperimental data. Because multiple regression models can accommodate many explanatory variables that may be correlated, we can hope to infer causality in cases where simple regression analysis would be misleading. Moreover, if we add more factors to our model that are useful for explaining y, then more of the variation in y can be explained. Thus, multiple regression analysis can be used to build better models for predicting the dependent variable. An additional advantage of multiple regression analysis is that it can incorporate fairly general functional form relationships.

In the simple regression model, only one function of a single explanatory variable can appear in the equation. As we will see, the multiple regression model allows for much more flexibility. This chapter formally introduces the multiple regression model and further discusses the advantages of multiple regression over simple regression. We then demonstrate how to estimate the parameters in the multiple regression model using the method of ordinary least squares, and we describe various statistical properties of the OLS estimators, including unbiasedness and efficiency. The multiple regression model is still the most widely used vehicle for empirical analysis in economics and other social sciences.

Likewise, the method of ordinary least squares is popularly used for estimating the parameters of the multiple regression model.

Motivation for Multiple Regression

The Model with Two Independent Variables

We begin with some simple examples to show how multiple regression analysis can be used to solve problems that cannot be solved by simple regression. The first example is a simple variation of the wage equation introduced in Chapter 2 for obtaining the effect of education on hourly wage:

wage = β₀ + β₁educ + β₂exper + u,

where exper is years of labor market experience. Thus, wage is determined by the two explanatory or independent variables, education and experience, and by other unobserved factors, which are contained in u.

We are still primarily interested in the effect of educ on wage, holding fixed all other factors affecting wage; that is, we are interested in the parameter β₁. Compared with a simple regression analysis relating wage to educ, the equation above effectively takes exper out of the error term and puts it explicitly in the equation. Because exper appears in the equation, its coefficient, β₂, measures the ceteris paribus effect of exper on wage, which is also of some interest. Not surprisingly, just as with simple regression, we will have to make assumptions about how u is related to the independent variables, educ and exper.

However, as we will see later in the chapter, there is one thing of which we can be confident: because the wage equation contains experience explicitly, we will be able to measure the effect of education on wage, holding experience fixed. In a simple regression analysis, which puts exper in the error term, we would have to assume that experience is uncorrelated with education, a tenuous assumption.

As a second example, consider the problem of explaining the effect of per-student spending (expend) on the average standardized test score (avgscore) at the high school level. Suppose that the average test score depends on funding, average family income (avginc), and other unobservables:

avgscore = β₀ + β₁expend + β₂avginc + u.

The coefficient of interest for policy purposes is β₁, the ceteris paribus effect of expend on avgscore. By including avginc explicitly in the model, we are able to control for its effect on avgscore. This is likely to be important because average family income tends to be correlated with per-student spending: spending levels are often determined by both property and local income taxes. In simple regression analysis, avginc would be included in the error term, which would likely be correlated with expend, causing the OLS estimator of β₁ in the two-variable model to be biased.

In the two previous similar examples, we have shown how observable factors other than the variable of primary interest (educ in the wage equation and expend in the test-score equation) can be included in a regression model.
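The omitted-variable problem just described can be illustrated with a small simulation. The numbers below (true coefficients, the strength of the avginc–expend correlation) are made up for illustration and are not from the text; the point is only that regressing avgscore on expend alone picks up part of the avginc effect, while the two-regressor model recovers β₁.

```python
# A sketch, with made-up parameter values: when avginc is correlated with
# expend but left in the error term, the simple-regression slope on expend
# is biased; controlling for avginc removes the bias.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
expend = rng.normal(6000, 1000, n)                      # per-student spending
avginc = 30_000 + 5 * expend + rng.normal(0, 5000, n)   # correlated with expend
b0, b1, b2 = 500.0, 0.01, 0.001                         # hypothetical true parameters
avgscore = b0 + b1 * expend + b2 * avginc + rng.normal(0, 10, n)

# Simple regression: avgscore on expend only (avginc absorbed into the error).
X_simple = np.column_stack([np.ones(n), expend])
b_simple = np.linalg.lstsq(X_simple, avgscore, rcond=None)[0]

# Multiple regression: control for avginc explicitly.
X_mult = np.column_stack([np.ones(n), expend, avginc])
b_mult = np.linalg.lstsq(X_mult, avgscore, rcond=None)[0]

print(b_simple[1])   # biased upward, near b1 + b2*5 = 0.015
print(b_mult[1])     # close to the true b1 = 0.01
```

The size of the bias here is b₂ times the slope of avginc on expend, which is the omitted-variable bias formula discussed later in the chapter.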

Generally, we can write a model with two independent variables as

y = β₀ + β₁x₁ + β₂x₂ + u,

where β₀ is the intercept, β₁ measures the change in y with respect to x₁, holding other factors fixed, and β₂ measures the change in y with respect to x₂, holding other factors fixed.

Multiple regression analysis is also useful for generalizing functional relationships between variables. As an example, suppose family consumption (cons) is a quadratic function of family income (inc):

cons = β₀ + β₁inc + β₂inc² + u,

where u contains other factors affecting consumption.

In this model, consumption depends on only one observed factor, income; so it might seem that it can be handled in a simple regression framework. But the model falls outside simple regression because it contains two functions of income, inc and inc², and therefore three parameters, β₀, β₁, and β₂. Nevertheless, the consumption function is easily written as a regression model with two independent variables by letting x₁ = inc and x₂ = inc². Mechanically, there will be no difference in using the method of ordinary least squares to estimate equations as different as the wage equation and the consumption equation. Each can be written in the general two-variable form, which is all that matters for computation.
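The mechanical point above can be sketched in a few lines: the quadratic consumption function is fit with exactly the same OLS machinery as any two-regressor model, simply by building a design matrix whose columns are a constant, inc, and inc². The data and true coefficients below are invented for illustration.

```python
# A sketch with simulated data: OLS treats inc and inc**2 as just two
# regressors, x1 and x2, in the general two-variable model.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
inc = rng.uniform(10, 100, n)        # family income, in $1000s (made up)
b0, b1, b2 = 2.0, 0.8, -0.002        # hypothetical true parameters
cons = b0 + b1 * inc + b2 * inc**2 + rng.normal(0, 1, n)

# Same design-matrix recipe as any two-regressor model: intercept, x1, x2.
X = np.column_stack([np.ones(n), inc, inc**2])
beta_hat = np.linalg.lstsq(X, cons, rcond=None)[0]
print(beta_hat)    # approximately [2.0, 0.8, -0.002]
```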

There is, however, an important difference in how one interprets the parameters. In the wage equation, β₁ is the ceteris paribus effect of educ on wage. The parameter β₁ has no such interpretation in the consumption function. In other words, it makes no sense to measure the effect of inc on cons while holding inc² fixed, because if inc changes, then so must inc²! Instead, the change in consumption with respect to the change in income, the marginal propensity to consume, is approximated by

Δcons ≈ (β₁ + 2β₂inc)Δinc.

See Appendix A for the calculus needed to derive this equation. In other words, the marginal effect of income on consumption depends on β₂ as well as on β₁ and the level of income.
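A quick numerical check of the marginal-propensity formula above, using hypothetical parameter values: the derivative of cons = β₀ + β₁inc + β₂inc² with respect to inc is β₁ + 2β₂inc, which a finite-difference approximation confirms.

```python
# Hypothetical values, for illustration only.
b0, b1, b2 = 2.0, 0.8, -0.002

def cons(inc):
    return b0 + b1 * inc + b2 * inc**2

def mpc(inc):
    # Marginal propensity to consume: derivative of cons with respect to inc.
    return b1 + 2 * b2 * inc

# Central finite difference agrees with the formula, and the MPC varies
# with the income level rather than being a single number.
inc, h = 50.0, 1e-6
fd = (cons(inc + h) - cons(inc - h)) / (2 * h)
print(round(fd, 6), round(mpc(inc), 6))   # both ≈ 0.6 at inc = 50
```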

