Student Solutions Manual to accompany Applied Linear Statistical Models

Transcription of Student Solutions Manual to accompany Applied Linear Statistical Models

Student Solutions Manual to accompany Applied Linear Statistical Models, Fifth Edition

Michael H. Kutner, Emory University
Christopher J. Nachtsheim, University of Minnesota
John Neter, University of Georgia
William Li, University of Minnesota

2005 McGraw-Hill/Irwin, Chicago, IL; Boston, MA

PREFACE

This Student Solutions Manual gives intermediate and final numerical results for all starred (*) end-of-chapter Problems with computational elements contained in Applied Linear Statistical Models, 5th edition. No solutions are given for Exercises, Projects, or Case Studies. In presenting calculational results we frequently show, for ease in checking, more digits than are significant for the original data. Students and other users may obtain slightly different answers than those presented here, because of different rounding procedures.

When a problem requires a percentile (of the t or F distributions) not included in the Appendix B tables, users may either interpolate in the table or employ an available computer program for finding the needed value. Again, slightly different values may be obtained than the ones shown here. The data sets for all Problems, Exercises, Projects, and Case Studies are contained in the compact disk provided with the text to facilitate data entry. It is expected that the student will use a computer or have access to computer output for all but the simplest data sets, where use of a basic calculator would be adequate. For most students, hands-on experience in obtaining the computations by computer will be an important part of the educational experience in the course. Although we have checked the solutions very carefully, it is possible that some errors are still present.
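For the percentile lookups described above, a brief sketch along the following lines is one way to obtain values not tabulated in Appendix B. It assumes Python with SciPy, which is not part of the manual's materials; the degrees of freedom shown are taken from percentiles that appear later in these solutions.

    from scipy import stats

    # 95th percentile of the t distribution with 43 degrees of freedom
    t_95 = stats.t.ppf(0.95, df=43)

    # 90th percentile of the F distribution with 1 and 43 degrees of freedom
    f_90 = stats.f.ppf(0.90, dfn=1, dfd=43)

    print(f"t(.95; 43) = {t_95:.4f}")
    print(f"F(.90; 1, 43) = {f_90:.4f}")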

We would be most grateful to have any errors called to our attention. Errata can be reported via the website for the book. We acknowledge with thanks the assistance of Lexin Li and Yingwen Dong in the checking of Chapters 1-14 of this Manual. We, of course, are responsible for any errors or omissions that remain.

Michael H. Kutner
Christopher J. Nachtsheim
John Neter
William Li

Contents

1 LINEAR REGRESSION WITH ONE PREDICTOR VARIABLE
2 INFERENCES IN REGRESSION AND CORRELATION ANALYSIS
3 DIAGNOSTICS AND REMEDIAL MEASURES
4 SIMULTANEOUS INFERENCES AND OTHER TOPICS IN REGRESSION ANALYSIS
5 MATRIX APPROACH TO SIMPLE LINEAR REGRESSION ANALYSIS
6 MULTIPLE REGRESSION I
7 MULTIPLE REGRESSION II
8 MODELS FOR QUANTITATIVE AND QUALITATIVE PREDICTORS
9 BUILDING THE REGRESSION MODEL I: MODEL SELECTION AND VALIDATION
10 BUILDING THE REGRESSION MODEL II: DIAGNOSTICS
11 BUILDING THE REGRESSION MODEL III: REMEDIAL MEASURES
12 AUTOCORRELATION IN TIME SERIES DATA
13 INTRODUCTION TO NONLINEAR REGRESSION AND NEURAL NETWORKS
14 LOGISTIC REGRESSION, POISSON REGRESSION, AND GENERALIZED LINEAR MODELS
15 INTRODUCTION TO THE DESIGN OF EXPERIMENTAL AND OBSERVATIONAL STUDIES
16 SINGLE-FACTOR STUDIES
17 ANALYSIS OF FACTOR LEVEL MEANS
18 ANOVA DIAGNOSTICS AND REMEDIAL MEASURES
19 TWO-FACTOR STUDIES WITH EQUAL SAMPLE SIZES
20 TWO-FACTOR STUDIES - ONE CASE PER TREATMENT
21 RANDOMIZED COMPLETE BLOCK DESIGNS
22 ANALYSIS OF COVARIANCE
23 TWO-FACTOR STUDIES - UNEQUAL SAMPLE SIZES
24 MULTIFACTOR STUDIES
25 RANDOM AND MIXED EFFECTS MODELS
26 NESTED DESIGNS, SUBSAMPLING, AND PARTIALLY NESTED DESIGNS
27 REPEATED MEASURES AND RELATED DESIGNS
28 BALANCED INCOMPLETE BLOCK, LATIN SQUARE, AND RELATED DESIGNS
29 EXPLORATORY EXPERIMENTS: TWO-LEVEL FACTORIAL AND FRACTIONAL FACTORIAL DESIGNS
30 RESPONSE SURFACE METHODOLOGY

Chapter 1 LINEAR REGRESSION WITH ONE PREDICTOR VARIABLE
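The answers in this chapter center on least-squares estimation of the simple linear regression function. A minimal sketch of the basic computations (b0, b1, fitted values, residuals, and MSE) is given below; it assumes Python with NumPy, and the small data set is purely illustrative, not one of the textbook data sets.

    import numpy as np

    # Illustrative data only -- not a textbook data set
    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    n = len(X)
    b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
    b0 = Y.mean() - b1 * X.mean()

    Y_hat = b0 + b1 * X            # fitted values
    e = Y - Y_hat                  # residuals
    SSE = np.sum(e ** 2)           # error sum of squares
    MSE = SSE / (n - 2)            # error mean square

    print(f"b0 = {b0:.4f}, b1 = {b1:.4f}, MSE = {MSE:.4f}")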

The starred Chapter 1 answers report the estimated regression function (the least-squares estimates b0 and b1), point estimates of the mean response at specified Xh, the fitted values and residuals for each case, and the error sum of squares and MSE.

Chapter 2 INFERENCES IN REGRESSION AND CORRELATION ANALYSIS

The early Chapter 2 answers give confidence limits and two-sided and one-sided t tests for the slope β1 and the intercept β0 (with estimated standard errors such as s{b1} = .4831, s{b1} = .469, and s{b0} = .663, together with the test statistic, decision rule, and conclusion), power values for these tests, and interval estimates for the mean response E{Yh} and for a new observation Yh(new) at specified Xh levels.

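As a concrete illustration of the slope inferences just listed, the sketch below computes s{b1}, the t statistic for H0: β1 = 0, and a 95 percent confidence interval for β1. It assumes Python with NumPy and SciPy; the data are illustrative, not from the text.

    import numpy as np
    from scipy import stats

    # Illustrative data only
    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    Y = np.array([2.3, 3.8, 6.5, 7.4, 10.2, 11.9])

    n = len(X)
    Sxx = np.sum((X - X.mean()) ** 2)
    b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / Sxx
    b0 = Y.mean() - b1 * X.mean()
    MSE = np.sum((Y - (b0 + b1 * X)) ** 2) / (n - 2)

    s_b1 = np.sqrt(MSE / Sxx)               # estimated standard error of b1
    t_star = b1 / s_b1                      # test statistic for H0: beta1 = 0
    t_crit = stats.t.ppf(0.975, df=n - 2)   # two-sided test at the 5 percent level

    lower, upper = b1 - t_crit * s_b1, b1 + t_crit * s_b1
    print(f"t* = {t_star:.3f}, t(.975; {n - 2}) = {t_crit:.3f}, 95% CI: ({lower:.3f}, {upper:.3f})")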
Further Chapter 2 answers give confidence intervals for the mean time per machine and for the total number of broken ampules, Working-Hotelling confidence bands for the regression line, the ANOVA table (regression, error, and total sums of squares and mean squares, including the correction for the mean and the uncorrected total), the F test of H0: β1 = 0 and its equivalence to the two-sided t test (F* = (t*)²), the coefficient of determination r², and the correlation coefficient r.

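As one illustration of how the ANOVA quantities and the F test of H0: β1 = 0 summarized above can be computed, the following sketch is offered; Python with NumPy and SciPy is assumed, and the data are again illustrative only.

    import numpy as np
    from scipy import stats

    # Illustrative data only
    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
    Y = np.array([1.8, 4.1, 5.9, 8.2, 9.7, 12.1, 13.8, 16.2])

    n = len(X)
    b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
    b0 = Y.mean() - b1 * X.mean()
    Y_hat = b0 + b1 * X

    SSR = np.sum((Y_hat - Y.mean()) ** 2)   # regression sum of squares
    SSE = np.sum((Y - Y_hat) ** 2)          # error sum of squares
    SSTO = SSR + SSE                        # total sum of squares, corrected for the mean

    MSR, MSE = SSR / 1, SSE / (n - 2)
    F_star = MSR / MSE                      # F test statistic for H0: beta1 = 0
    F_crit = stats.f.ppf(0.95, dfn=1, dfd=n - 2)
    r_sq = SSR / SSTO                       # coefficient of determination

    print(f"F* = {F_star:.2f}, F(.95; 1, {n - 2}) = {F_crit:.2f}, r^2 = {r_sq:.4f}")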
Later Chapter 2 answers cover a one-sided t test for β1 (H0: β1 ≥ 0 versus Ha: β1 < 0, with s{b1} = .090197 and 58 degrees of freedom), confidence limits for β1, interval estimates of E{Yh} and prediction intervals for Yh(new), a Working-Hotelling confidence band, the fitted values and residuals, the ANOVA table and F test, and inferences about the correlation coefficient ρ12: the t test of H0: ρ12 = 0 using t* = r√(n - 2)/√(1 - r²) (for example with r = .95285 and 13 degrees of freedom, and with r = .87 and 101 degrees of freedom) and an interval estimate based on the Fisher z transformation.

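The correlation inferences mentioned above can be reproduced along the following lines; the sketch assumes Python with NumPy and SciPy, uses the r = .95285 and 13 degrees of freedom (n = 15) appearing in one of these answers, and the confidence levels shown are chosen only for illustration.

    import numpy as np
    from scipy import stats

    r, n = 0.95285, 15                      # sample correlation and sample size

    # t test of H0: rho12 = 0
    t_star = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
    t_crit = stats.t.ppf(0.995, df=n - 2)   # two-sided test at the 1 percent level

    # Interval estimate of rho12 via the Fisher z transformation
    z_prime = np.arctanh(r)                 # Fisher z transformation of r
    s_z = 1 / np.sqrt(n - 3)                # approximate standard deviation of z'
    z_crit = stats.norm.ppf(0.975)
    lower, upper = np.tanh(z_prime - z_crit * s_z), np.tanh(z_prime + z_crit * s_z)

    print(f"t* = {t_star:.3f}, t(.995; {n - 2}) = {t_crit:.3f}")
    print(f"95% interval for rho12: ({lower:.3f}, {upper:.3f})")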
A final Chapter 2 answer tests H0: there is no association between X and Y against Ha: there is an association between X and Y, using t* = r√(n - 2)/√(1 - r²) with 58 degrees of freedom.

Chapter 3 DIAGNOSTICS AND REMEDIAL MEASURES

The Chapter 3 answers list the fitted values and residuals, the ordered residuals together with their expected values under normality, the correlation test for normality (the correlation between the ordered residuals and their expected values is compared with a critical value such as .9785 or .879), and the Breusch-Pagan test for constancy of the error variance, X²BP = (SSR*/2) / (SSE/n)², compared with a chi-square percentile with one degree of freedom (for one problem SSR* = 15,155), with a statement of whether the error variance is judged constant.

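The Breusch-Pagan statistic in the form shown in these answers, X²BP = (SSR*/2) / (SSE/n)², can be computed by regressing the squared residuals on X. The sketch below assumes Python with NumPy and SciPy; the function name and the simulated data are ours.

    import numpy as np
    from scipy import stats

    def breusch_pagan(X, Y):
        """X^2_BP = (SSR*/2) / (SSE/n)^2, with SSR* from regressing e^2 on X."""
        n = len(X)
        # Original regression of Y on X
        b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
        b0 = Y.mean() - b1 * X.mean()
        e = Y - (b0 + b1 * X)
        SSE = np.sum(e ** 2)

        # Auxiliary regression of the squared residuals on X
        e2 = e ** 2
        g1 = np.sum((X - X.mean()) * (e2 - e2.mean())) / np.sum((X - X.mean()) ** 2)
        g0 = e2.mean() - g1 * X.mean()
        SSR_star = np.sum((g0 + g1 * X - e2.mean()) ** 2)

        return (SSR_star / 2) / (SSE / n) ** 2

    # Simulated illustrative data only
    rng = np.random.default_rng(0)
    X = np.linspace(1.0, 10.0, 30)
    Y = 2.0 + 3.0 * X + rng.normal(scale=1.0, size=30)

    chi2_bp = breusch_pagan(X, Y)
    print(f"X^2_BP = {chi2_bp:.3f}, chi-square(.95; 1) = {stats.chi2.ppf(0.95, df=1):.3f}")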
Further Chapter 3 answers repeat the correlation test for normality and the Breusch-Pagan test for other data sets (the error variance is concluded to be constant), and give the F test for lack of fit of the simple linear regression function, H0: E{Y} = β0 + β1X versus Ha: E{Y} ≠ β0 + β1X, with F* formed from the lack-of-fit and pure-error sums of squares and compared with F(.95; 8, 35), together with SSE values for a range of power transformations of Y (.3 to .7) and the regression fitted in the transformed scale.

Chapter 4 SIMULTANEOUS INFERENCES AND OTHER TOPICS IN REGRESSION ANALYSIS

The first Chapter 4 answers note that the estimation errors in b0 and b1 tend in opposite directions (negative correlation between b0 and b1) and give Bonferroni joint confidence intervals for β0 and β1, using the multiple t(1 - α/4; n - 2) (for example t(.9875; 43)) with the estimated standard errors s{b0} and s{b1}.

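A sketch of the Bonferroni joint interval computation of this form follows; Python with NumPy and SciPy is assumed, the helper function name is ours, and the data are illustrative.

    import numpy as np
    from scipy import stats

    def bonferroni_joint_ci(X, Y, alpha=0.05):
        """Joint (1 - alpha) Bonferroni intervals for beta0 and beta1 in simple regression."""
        n = len(X)
        Sxx = np.sum((X - X.mean()) ** 2)
        b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / Sxx
        b0 = Y.mean() - b1 * X.mean()
        MSE = np.sum((Y - (b0 + b1 * X)) ** 2) / (n - 2)

        s_b1 = np.sqrt(MSE / Sxx)                             # s{b1}
        s_b0 = np.sqrt(MSE * (1 / n + X.mean() ** 2 / Sxx))   # s{b0}

        B = stats.t.ppf(1 - alpha / 4, df=n - 2)              # Bonferroni multiple for two statements
        return (b0 - B * s_b0, b0 + B * s_b0), (b1 - B * s_b1, b1 + B * s_b1)

    # Illustrative data only
    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
    Y = np.array([2.2, 4.1, 5.8, 8.3, 9.6, 12.4, 13.7])
    ci_b0, ci_b1 = bonferroni_joint_ci(X, Y)
    print("beta0:", ci_b0, "beta1:", ci_b1)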
The remaining Chapter 4 answers give further Bonferroni joint intervals for β0 and β1 (with multiples such as t(.9975; 8) and t(.9975; 14), and estimated standard errors such as s{b0} = .6633 and s{b1} = .4690), simultaneous Working-Hotelling confidence intervals for E{Yh} at several Xh levels (multiple W, with W² = 2 F(1 - α; 2, n - 2)), the Bonferroni multiple B and the Scheffé multiple S for comparison, and simultaneous Scheffé and Bonferroni prediction intervals for new observations Yh(new), for example at Xh = 45, 55, and 65.
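The Working-Hotelling limits for the mean response at several Xh levels can be sketched as follows; Python with NumPy and SciPy is assumed, the function name is ours, and the data (though not the Xh values 45, 55, 65) are illustrative.

    import numpy as np
    from scipy import stats

    def working_hotelling_limits(X, Y, Xh_values, alpha=0.10):
        """Simultaneous (1 - alpha) Working-Hotelling limits for E{Yh} at the given Xh values."""
        n = len(X)
        Sxx = np.sum((X - X.mean()) ** 2)
        b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / Sxx
        b0 = Y.mean() - b1 * X.mean()
        MSE = np.sum((Y - (b0 + b1 * X)) ** 2) / (n - 2)

        W = np.sqrt(2 * stats.f.ppf(1 - alpha, dfn=2, dfd=n - 2))  # W^2 = 2 F(1 - alpha; 2, n - 2)
        limits = []
        for Xh in Xh_values:
            Yh_hat = b0 + b1 * Xh
            s_Yh = np.sqrt(MSE * (1 / n + (Xh - X.mean()) ** 2 / Sxx))  # s{Yh-hat}
            limits.append((Yh_hat - W * s_Yh, Yh_hat + W * s_Yh))
        return limits

    # Illustrative data only
    X = np.array([20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])
    Y = np.array([54.0, 50.0, 45.0, 42.0, 38.0, 33.0, 30.0])
    print(working_hotelling_limits(X, Y, Xh_values=[45, 55, 65]))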

