```
            Estimate Std. Error t value  Pr(>|t|)
(Intercept)  -259.6      17.32  -14.99 2.523e-19
x            3721.0      81.79   45.50 6.751e-40
```

Remember, we reject if our P-value is less than our desired type I error rate. In both cases the test is for whether or not the parameter is zero. This is almost always of interest for the slope, but frequently a zero intercept isn't of interest, so that P-value is often disregarded.

For the slope, a value of zero represents no linear relationship between the predictor and the response. So, the P-value is for performing a test of whether any (linear) relationship exists or not.

Getting a confidence interval

Recall from your inference class that a fair number of confidence intervals take the form of an estimate plus or minus a t quantile times a standard error. Let's use that formula to create confidence intervals for our regression parameters. Let's first do the intercept.

```
> sumCoef <- summary(fit)$coefficients
> sumCoef[1,1] + c(-1, 1) * qt(.975, df = fit$df) * sumCoef[1, 2]
[1] -294.5 -224.8
```

Now let's do the slope:

```
> (sumCoef[2,1] + c(-1, 1) * qt(.975, df = fit$df) * sumCoef[2, 2]) / 10
[1] 355.6 388.6
```

So, we would interpret this as: "with 95% confidence, we estimate that a 0.1 carat increase in diamond size results in a 355.6 to 388.6 increase in price in (Singapore) dollars".

Prediction of outcomes

Watch this before beginning.⁶⁹

Finally, let's consider prediction again.
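Before doing so, one quick check on the confidence intervals just computed: R's built-in confint() applies the same estimate plus or minus a t quantile times a standard error formula. Here is a minimal sketch; since the chapter's diamond data lives in the UsingR package, a simulated stand-in with similar coefficients is used.

```r
# Check: the manual t-interval matches R's built-in confint().
# Simulated data stands in for the diamond data from the UsingR package.
set.seed(42)
x <- runif(46, 0.1, 0.35)                  # carat-like predictor
y <- -260 + 3721 * x + rnorm(46, sd = 32)  # price-like response
fit <- lm(y ~ x)
sumCoef <- summary(fit)$coefficients
manual <- sumCoef[2, 1] + c(-1, 1) * qt(.975, df = fit$df) * sumCoef[2, 2]
manual
confint(fit)["x", ]  # same interval
```

This works because confint() on an lm fit uses exactly the t quantile at the residual degrees of freedom.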
Consider the problem of predicting Y at a value of X. In our example, this is predicting the price of a diamond given the carat.

We've already covered that the estimate for prediction at point x0 is:

$$\hat{\beta}_0 + \hat{\beta}_1 x_0$$

A standard error is needed to create a prediction interval. This is important, since predictions by themselves don't convey anything about how accurate we would expect the prediction to be. Take our diamond example. Because the model fits so well, we would be surprised if we tried to sell a diamond and the offers were well off our model prediction (since it seems to fit quite well).

There's a subtle, but important, distinction between intervals for the regression line at point x0 and the prediction of what a y would be at point x0.

⁶⁹https://www.youtube.com/watch?v=aMirqYW6VrY&index=18&list=PLpl-gQkQivXjqHAJd2t-J_One_fYE55tC
What differs is the standard error.

For the line at x0 the standard error is:

$$\hat{\sigma}\sqrt{\frac{1}{n} + \frac{(x_0 - \bar{X})^2}{\sum_{i=1}^n (X_i - \bar{X})^2}}$$

For the prediction interval at x0 the standard error is:

$$\hat{\sigma}\sqrt{1 + \frac{1}{n} + \frac{(x_0 - \bar{X})^2}{\sum_{i=1}^n (X_i - \bar{X})^2}}$$

Notice that the prediction interval standard error is a little larger than the standard error for the line. Think of it this way: if we want to predict a Y value at a particular X value, and we knew the actual true slope and intercept, there would still be error.
However, if we only wanted to predict the value of the line at that X value, there would be no variance, since we already know the line.

Thus, the variation for the line only considers how hard it is to estimate the regression line at that X value. The prediction interval includes that variation, as well as the extra variation unexplained by the relationship between Y and X. So, it has to be a little wider.

For the diamond example, here's both the mean value and prediction interval (code and plot). Notice that to get the various intervals, one has to use one of the options interval = "confidence" or interval = "prediction" in the predict function.

```
library(ggplot2)
newx = data.frame(x = seq(min(x), max(x), length = 100))
p1 = data.frame(predict(fit, newdata = newx, interval = ("confidence")))
p2 = data.frame(predict(fit, newdata = newx, interval = ("prediction")))
p1$interval = "confidence"
p2$interval = "prediction"
p1$x = newx$x
p2$x = newx$x
dat = rbind(p1, p2)
names(dat)[1] = "y"
g = ggplot(dat, aes(x = x, y = y))
g = g + geom_ribbon(aes(ymin = lwr, ymax = upr, fill = interval), alpha = 0.2)
g = g + geom_line()
g = g + geom_point(data = data.frame(x = x, y = y), aes(x = x, y = y), size = 4)
g
```

[Image of prediction and mean value interval.]

Summary notes

• Both intervals have varying widths.
  – Least width at the mean of the Xs.
• We are quite confident in the regression line, so that interval is very narrow.
  – If we knew β0 and β1 this interval would have zero width.
• The prediction interval must incorporate the variability in the data around the line.
  – Even if we knew β0 and β1 this interval would still have width.
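The standard error formulas and the summary notes above are easy to verify directly. A minimal sketch on simulated data (a stand-in for the diamond data): the line's standard error computed by hand matches predict()'s se.fit, the prediction standard error is larger, and the confidence interval width is smallest at the mean of the Xs.

```r
# Verify the two standard error formulas and the interval-width behavior
# on simulated data (hypothetical x, y standing in for the diamond data).
set.seed(1)
n <- 50
x <- runif(n)
y <- 1 + 2 * x + rnorm(n, sd = 0.3)
fit <- lm(y ~ x)
sigma <- summary(fit)$sigma
x0 <- 0.5
ssx <- sum((x - mean(x))^2)
seLine <- sigma * sqrt(1/n + (x0 - mean(x))^2 / ssx)      # SE for the line at x0
sePred <- sigma * sqrt(1 + 1/n + (x0 - mean(x))^2 / ssx)  # SE for prediction at x0
p <- predict(fit, newdata = data.frame(x = x0), se.fit = TRUE)
c(seLine, p$se.fit)  # the two agree

# Interval width is least at the mean of the Xs
newx <- data.frame(x = c(mean(x), min(x), max(x)))
ci <- predict(fit, newdata = newx, interval = "confidence")
widths <- ci[, "upr"] - ci[, "lwr"]
```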
Exercises

1. Test whether the slope coefficient for the father.son data is different from zero (father as predictor, son as outcome). Watch a video solution.⁷⁰
2. Refer to question 1. Form a confidence interval for the slope coefficient. Watch a video solution.⁷¹
3. Refer to question 1. Form a confidence interval for the intercept (center the fathers' heights first to get an intercept that is easier to interpret). Watch a video solution.⁷²
4. Refer to question 1. Form a mean value interval for the expected son's height at the average father's height. Watch a video solution.⁷³
5.
Refer to question 1. Form a prediction interval for the son's height at the average father's height. Watch a video solution.⁷⁴
6. Load the mtcars dataset. Fit a linear regression with miles per gallon as the outcome and horsepower as the predictor. Test whether or not the horsepower coefficient is statistically different from zero. Interpret your test.
7. Refer to question 6.
Form a confidence interval for the slope coefficient.
8. Refer to question 6. Form a confidence interval for the intercept (center the HP variable first).
9. Refer to question 6. Form a mean value interval for the expected MPG for the average HP.
10. Refer to question 6.
Form a prediction interval for the expected MPG for the average HP.
11. Refer to question 6. Create a plot that has the fitted regression line plus curves at the expected value and prediction intervals.

⁷⁰https://www.youtube.com/watch?v=6hkBsUAQU7E&list=PLpl-gQkQivXji7JK1OP1qS7zalwUBPrX0&index=32
⁷¹https://www.youtube.com/watch?v=eExHWvQImEE&list=PLpl-gQkQivXji7JK1OP1qS7zalwUBPrX0&index=33
⁷²https://www.youtube.com/watch?v=GeDmfhm2bhc&index=34&list=PLpl-gQkQivXji7JK1OP1qS7zalwUBPrX0
⁷³https://www.youtube.com/watch?v=dLV_Jopsbl4&list=PLpl-gQkQivXji7JK1OP1qS7zalwUBPrX0&index=35
⁷⁴https://www.youtube.com/watch?v=-rx-71QsUnY&list=PLpl-gQkQivXji7JK1OP1qS7zalwUBPrX0&index=36

Multivariable regression analysis

Watch this before beginning.⁷⁵

In this chapter we extend linear regression so that our models can contain more variables.
A natural first approach is to assume additive effects, basically extending our linear model to a plane or hyperplane. This technique represents one of the most widely used and successful methods in statistics.

Multivariable regression analyses: adjustment

If I were to present evidence of a relationship between breath mint usage (mints per day, X) and pulmonary function (measured in FEV), you would be skeptical. Likely, you would say, "smokers tend to use more breath mints than non-smokers, and smoking is related to a loss in pulmonary function. That's probably the culprit." If asked what would convince you, you would likely say, "If non-smoking breath mint users had lower lung function than non-smoking non-breath-mint users and, similarly, if smoking breath mint users had lower lung function than smoking non-breath-mint users, I'd be more inclined to believe you."
In other words, to even consider my results, I would have to demonstrate that they hold while holding smoking status fixed.

This is one of the main uses of multivariable regression: to consider a relationship between a predictor and a response while accounting for other variables.

Multivariable regression analyses: prediction

An insurance company is interested in how last year's claims can predict a person's time in the hospital this year. They want to use an enormous amount of data contained in claims to predict a single number.
Simple linear regression is not equipped to handle more than one predictor. How can one generalize SLR to incorporate lots of regressors for the purpose of prediction? What are the consequences of adding lots of regressors? Surely there must be consequences to throwing in variables that aren't related to Y? Surely there must also be consequences to omitting variables that are?

The linear model

The general linear model extends simple linear regression (SLR) by adding terms linearly into the model:

$$Y_i = \beta_1 X_{1i} + \beta_2 X_{2i} + \ldots$$

⁷⁵https://www.youtube.com/watch?v=qsXtdSNbg5E&index=19&list=PLpl-gQkQivXjqHAJd2t-J_One_fYE55tC
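As a preview of fitting such a model, here is a minimal simulated sketch echoing the breath-mint example from the adjustment section (mints, smoker, and fev are hypothetical variables invented for illustration): lm() with two regressors fits the additive model, and including the confounder moves the mint coefficient from clearly negative to near zero.

```r
# Simulated illustration of adjustment via an additive two-regressor model.
# mints, smoker, fev are hypothetical variables echoing the text's example.
set.seed(7)
n <- 500
smoker <- rbinom(n, 1, 0.5)
mints  <- 2 * smoker + rnorm(n)                # smokers use more mints
fev    <- 4 - 1 * smoker + rnorm(n, sd = 0.2)  # smoking lowers FEV; mints do not
unadjusted <- coef(lm(fev ~ mints))["mints"]           # spuriously negative
adjusted   <- coef(lm(fev ~ mints + smoker))["mints"]  # near zero after adjustment
c(unadjusted, adjusted)
```

The second fit holds smoking status fixed, which is exactly the demand made of the breath-mint evidence above.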