
6 Analyses and Results

6.5 Regression Assumptions

6.5.3 Regression Assumption 3

“There is not perfect multicollinearity (i.e., there is no exact linear relationship between two or more of the independent variables)” (Berry, 1993: 12).

The third regression assumption requires that there is no exact linear relationship between two or more of the independent variables, also referred to as perfect multicollinearity. When there is perfect multicollinearity, the independent variables vary perfectly in relation to each other, making it impossible to increase one of them while holding the others constant. It then becomes difficult to distinguish the independent variables’ individual effects on the dependent variable and to obtain unique and correct coefficient estimates. Perfect multicollinearity can hence result in large standard errors and inaccurate estimations of the coefficients (Berry, 1993; Gujarati, 1995).

There are three common methods of examining the existence of multicollinearity. The first is a correlation analysis, the second is a Variance Inflation Factor (VIF) test, and the third is a Tolerance test. Correlation analyses show the correlation between the independent variables, where 1 indicates perfect correlation between the variables (Gujarati, 1995). The requirement for the correlations, and for no perfect multicollinearity, is that the Pearson correlation should be below 0.6, although values between 0.6 and 0.8 are acceptable if the population is large enough (N > 200) (Sandvik, 2013b). Further, the VIF and Tolerance tests are statistical measures of multicollinearity, where VIF values above 10 represent large multicollinearity and a value equal to 1 represents no multicollinearity. The Tolerance value is a function of the VIF value, and indicates perfect correlation if it equals 0 and no correlation if it equals 1 (Sandvik, 2013b). We will first examine multicollinearity between the variables in our first research model, and then discuss this assumption for the variables in our second research model, using these three methods.

138 Minu Singh and Cigdem Yavuz
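The Pearson-correlation screen described above can be sketched as follows. The data and variable names (d_tobins_q, d_pb, d_roe) are hypothetical stand-ins for the thesis variables, not the actual dataset; the construction of d_pb simply mimics a correlation of roughly 0.55 for illustration.

```python
# Sketch of the Pearson-correlation screen, assuming hypothetical data;
# the variable names (d_tobins_q, d_pb, d_roe) are illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 250  # N > 200, so correlations between 0.6 and 0.8 would still be acceptable

df = pd.DataFrame({
    "d_tobins_q": rng.normal(size=n),
    "d_roe": rng.normal(size=n),
})
# Construct d_pb to correlate with d_tobins_q at roughly r = 0.55,
# mimicking the 0.553 reported in the text
df["d_pb"] = 0.55 * df["d_tobins_q"] + 0.84 * rng.normal(size=n)

corr = df.corr(method="pearson")
print(corr.round(3))

# Flag pairs of independent variables above the 0.6 threshold
cols = corr.columns
for i, a in enumerate(cols):
    for b in cols[i + 1:]:
        if abs(corr.loc[a, b]) > 0.6:
            print(f"High correlation: {a} vs {b} = {corr.loc[a, b]:.3f}")
```

In this simulated case no pair exceeds the 0.6 threshold, matching the situation for the first research model without interactions.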

In subchapter 6.2, we have previously presented the Pearson correlations for the variables in our first research model, without interactions and dummy variables. The correlation values are not high, except for the correlation between the independent variables change in Tobin’s Q and change in P/B. This correlation has a value of 0.553, but we do not consider this problematic in our study as it is below the maximum requirement of 0.6. However, when we conduct correlation analyses for the interactions, we get high correlation values between the independent variable (change in ROE) and the interactions. This can be seen in appendix B.3.

The analyses show that change in ROE correlates highly with its interactions with beta, age, CEO change, board size and board gender, which also indicates high multicollinearity between these variables. We therefore expect problems in the regressions of the hypotheses that require these interactions, but we will conduct VIF and Tolerance tests to see whether the correlation values are problematic. We do, however, not examine the correlations between the interactions, as they are not used at the same time in the regressions.

Table 6.19 – VIF and Tolerance for hypotheses 1 and 2


Table 6.19 shows the VIF and Tolerance values for the independent variables in the regressions for hypotheses 1 and 2 in our first research model. We see that the requirements for both the VIF and Tolerance values are met, and hence that we do not have perfect multicollinearity between the independent variables in these regressions. However, when we examine the hypotheses with interactions (moderators), the requirements for the VIF and Tolerance values are met for some of the independent variables, but not for the independent variables in the regressions for hypotheses 6a, 8a and 8b, which can be seen in appendices E to L under the regression analyses for each of the hypotheses. The table below shows the VIF and Tolerance values for hypotheses 6a, 8a and 8b.

Table 6.20 – VIF and Tolerance for hypotheses 6a, 8a and 8b

The VIF values are quite high, and the Tolerance values correspondingly low, for the independent variables and interactions in these hypotheses, which indicates high multicollinearity. These variables also had high correlations in the Pearson-correlation analyses, which can be seen in appendix B.3. Hence, the assumption of no perfect multicollinearity is not fulfilled in some of the regressions with interactions in our study, which can result in inaccurate estimations of the regression coefficients because of large standard errors.
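As a sketch of how the VIF and Tolerance measures relate, the computation can be written out directly: VIF_j = 1/(1 − R²_j), where R²_j comes from regressing regressor j on the remaining regressors, and Tolerance_j = 1/VIF_j. The simulated data below is illustrative only, not the thesis dataset; x3 is constructed to be nearly collinear with x1 so that the VIF-above-10 rule triggers.

```python
# Sketch of a VIF/Tolerance check on simulated data; VIF_j is computed
# as 1 / (1 - R^2_j), where R^2_j comes from regressing column j on the
# remaining columns, and Tolerance_j = 1 / VIF_j.
import numpy as np

def vif_and_tolerance(X):
    """Return {column index: (VIF, Tolerance)} for design matrix X."""
    n, k = X.shape
    results = {}
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])  # intercept + other regressors
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        tol = 1.0 - r2
        results[j] = (1.0 / tol, tol)
    return results

rng = np.random.default_rng(1)
n = 250
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + 0.1 * rng.normal(size=n)  # nearly collinear with x1 -> high VIF

res = vif_and_tolerance(np.column_stack([x1, x2, x3]))
for j, (vif, tol) in res.items():
    print(f"x{j + 1}: VIF = {vif:.2f}, Tolerance = {tol:.3f}")
```

Here x1 and x3 get large VIF values (and Tolerance values near 0), while the independent regressor x2 stays close to VIF = 1, which is the pattern the requirements in the text are screening for.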

In order to solve this problem, we have chosen to center the interactions in hypotheses 6a, 8a and 8b that show high multicollinearity in all three methods. We are interested in examining the interaction effects in our hypotheses, so it would be problematic if we were not able to isolate the interactions’ individual effects on the dependent variable, or if we based our results on inaccurate regression coefficients. We are able to center these variables as they are continuous and unbounded. By centering the interactions, we subtract the mean values from the variables in the interactions, so that the interactions correlate less with the variables they are constructed from (Allison, 2012; Sandvik, 2013a).

140 Minu Singh and Cigdem Yavuz
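A minimal sketch of this centering step, assuming simulated stand-in variables (the names roe and beta here are illustrative, not the thesis data): subtracting each variable’s mean before forming the product reduces the product’s correlation with its component variables.

```python
# Sketch of mean-centering before forming an interaction term; the
# variables roe and beta are simulated stand-ins with non-zero means,
# which is what drives the collinearity in the uncentered product.
import numpy as np

rng = np.random.default_rng(2)
n = 250
roe = rng.normal(loc=5.0, scale=1.0, size=n)
beta = rng.normal(loc=1.0, scale=0.3, size=n)

raw_interaction = roe * beta
centered_interaction = (roe - roe.mean()) * (beta - beta.mean())

# Correlation between the main effect and each version of the interaction
r_raw = np.corrcoef(roe, raw_interaction)[0, 1]
r_centered = np.corrcoef(roe, centered_interaction)[0, 1]
print(f"correlation with main effect: raw = {r_raw:.3f}, centered = {r_centered:.3f}")
```

The centered product’s correlation with the main effect drops close to zero, which is why centering restores acceptable Pearson, VIF and Tolerance values in the regressions with interactions.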

After centering the variables in the interactions, we get satisfactory Pearson-correlation values between the independent variables and the interactions, as well as satisfactory VIF and Tolerance values that meet the requirements, which can be seen in appendices B.4, H, K and L. Hence, we have reduced the problem of perfect multicollinearity, and we are able to see the different variables’ effects on the dependent variable. We thereby use the centered interactions when we test hypotheses 6a, 8a and 8b in subchapter 6.3, to increase the likelihood of achieving accurate regression coefficients without large standard errors.

We have also presented a correlation analysis for the variables in our second research model, which can be seen in subchapter 6.2. The Pearson correlations are high between many of the independent variables, but they are below the maximum of 0.8 that applies when the population is large. The highest correlation value, 0.672, is between market value and revenue, but we do not consider this a problem in our study. The VIF and Tolerance tests for the regressions in the second research model, which can be seen in appendix M, also show low multicollinearity. The values meet the requirements for both VIF and Tolerance, and we can hence see that we do not have problems with correlation between the independent variables in our second research model. We can conclude that there is no indication of problematic multicollinearity between the independent variables in our analyses for either of our research models. Hence, the assumption of no perfect multicollinearity is met in our study.