
Multiple Linear Regression

In the simple linear regression discussed previously, we only have one independent variable. If we include multiple independent variables in our analysis, we get a multiple linear regression model. Multiple linear regression is represented in a way that's similar to simple linear regression.

Let's consider a case where we want to fit a linear regression model that has three independent variables, X1, X2, and X3. The formula for the multiple linear regression equation will look like Figure 2.5:

Figure 2.5: Multiple linear regression equation
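
In the notation used here, this equation takes the standard form (ε denotes the random error term; this rendering is assumed to match Figure 2.5):

y = β0 + β1X1 + β2X2 + β3X3 + ε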

Each independent variable will have its own coefficient or parameter (that is, β1, β2, or β3). Each β coefficient tells us how a change in its respective independent variable influences the dependent variable when all the other independent variables are held constant.

Estimating the Regression Coefficients (β0, β1, β2 and β3)

The regression coefficients in Figure 2.5 are estimated using the same least squares approach that was discussed when simple linear regression was introduced. To satisfy the least squares method, the chosen coefficients must minimize the sum of squared residuals.

Later in the chapter, we will use the Python programming language to compute these coefficient estimates in practice.
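
As a preview, the following is a minimal sketch of how such estimates can be obtained with scikit-learn's LinearRegression, which fits the model by ordinary least squares. The simulated data and the coefficient values used here are hypothetical placeholders, not the chapter's actual dataset:

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical example: simulate 100 observations of three independent variables
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))  # columns play the role of X1, X2, and X3
y = 1.5 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(scale=0.1, size=100)

# Fit the model; the coefficients are chosen to minimize the sum of squared residuals
model = LinearRegression().fit(X, y)

print(model.intercept_)  # estimate of β0 (should be close to 1.5)
print(model.coef_)       # estimates of β1, β2, and β3 (close to 2.0, -0.5, 3.0)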

Logarithmic Transformations of Variables

As mentioned already, sometimes the relationship between the dependent and independent variables is not linear, and this limits the use of linear regression. To get around this, depending on the nature of the relationship, the logarithm function can be used to transform the variable of interest. The transformed variable then tends to have a linear relationship with the other, untransformed variables, enabling the use of linear regression to fit the data. This will be illustrated in practice with the dataset analyzed in the exercises later in the book, as well as in the short sketch that follows.
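
To illustrate the idea on hypothetical data (not the book's dataset), the sketch below simulates a variable that grows exponentially with x; the raw relationship is not linear, but log(y) is approximately a linear function of x:

import numpy as np

# Hypothetical example: y grows exponentially with x, so the raw relationship is not linear
x = np.linspace(1, 10, 50)
y = np.exp(0.8 * x) * np.random.default_rng(0).lognormal(sigma=0.05, size=50)

# After a log transformation, log(y) is approximately a linear function of x,
# so a linear regression can be fitted to the transformed variable
log_y = np.log(y)

# Compare the strength of the linear association (Pearson correlation)
print(np.corrcoef(x, y)[0, 1])      # weaker linear correlation on the raw scale
print(np.corrcoef(x, log_y)[0, 1])  # correlation close to 1 after the transformation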

Correlation Matrices

In Figure 2.3, we saw how a linear relationship between two variables can be analyzed using a straight-line graph. Another way of visualizing the linear relationship between variables is with a correlation matrix. A correlation matrix is a kind of cross-table of numbers showing the correlation between pairs of variables, that is, how strongly two variables are connected (this can be thought of as how a change in one variable tends to be accompanied by a change in the other variable). Raw figures in a table are not easy to analyze, so a correlation matrix is often converted to a form of "heatmap", in which the correlation between variables can easily be observed using different colors. An example of this is shown in Exercise 2.01, Loading and Preparing the Data for Analysis.
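
As a preview of that exercise, a minimal sketch is shown below. The DataFrame df and its column names are hypothetical placeholders, and the seaborn library is assumed to be available for drawing the heatmap:

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Hypothetical DataFrame; in the exercise, this would be the loaded dataset
df = pd.DataFrame({
    'crime_rate': [0.1, 0.3, 0.5, 0.7, 0.9],
    'rooms': [6.5, 6.1, 5.9, 5.6, 5.2],
    'price': [30.0, 27.5, 24.0, 22.0, 18.5],
})

# Compute the correlation matrix: pairwise correlations between all numeric columns
corr = df.corr()
print(corr)

# Visualize the matrix as a heatmap so strong and weak correlations stand out by color
sns.heatmap(corr, annot=True, cmap='coolwarm', vmin=-1, vmax=1)
plt.show()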
