# Inconsistent Beta Coefficient Sign Between Multiple and Simple Regression

4 messages

## Inconsistent Beta Coefficient Sign Between Multiple and Simple Regression

I am conducting a multiple linear regression with 3 predictors; all variables are continuous and N = 51. Before running the multiple regression, I first ran simple linear regressions and found that all the predictors correlate positively with the outcome variable. Surprisingly, in the multiple regression one of the predictors has a negative unstandardized coefficient (B), below -1.0, while its VIF is less than 10, so there is no indication of a multicollinearity problem. Is there anything wrong with the result? Please kindly give any advice on how to explain this case. Thank you. Joe

## Re: Inconsistent Beta Coefficient Sign Between Multiple and Simple Regression

It is important to know the intercorrelations among the predictors as well as the correlations between the dependent variable and each predictor. Google "suppression" in the context of multiple regression; you likely have such a situation!
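A small numeric sketch of the suppression pattern described above, using hypothetical data (not Joe's): both predictors are built to correlate positively with the outcome, yet one gets a negative coefficient in the multiple regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical construction: x2 is highly correlated with x1,
# and y depends positively on x1 but negatively on x2.
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
y = 2.0 * x1 - 1.0 * x2 + 0.5 * rng.standard_normal(n)

# Zero-order (simple) correlations: both come out positive
r1 = np.corrcoef(x1, y)[0, 1]
r2 = np.corrcoef(x2, y)[0, 1]

# Multiple regression via least squares (intercept + two slopes)
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"r(x1,y)={r1:.2f}  r(x2,y)={r2:.2f}")
print(f"b1={b[1]:.2f}  b2={b[2]:.2f}")  # b2 is negative despite r2 > 0
```

Nothing is "wrong" with such a result: once x1 is held constant, what remains of x2 works against the outcome, which is exactly the suppressor situation.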

## Re: Inconsistent Beta Coefficient Sign Between Multiple and Simple Regression

In reply to this post by Joe

There are two ways to regard multicollinearity. The strictly mathematical approach says "no multicollinearity exists if it doesn't create a near-zero divisor." Your comments reflect that tradition.

Multicollinearity that messes up the easiest conclusions is indicated, by a looser definition, when there are loadings in the opposite direction from the correlation, or when any standardized coefficient is greater than abs(1). I always use those guides rather than the VIF, but I think your results show that your use of the VIF isn't sensitive enough.

Your result shows that some difference between variables adds more to the prediction than taking their sum. (Sometimes the solution is as easy as computing the simple DIFF = A - B for two related variables on the same scale. If you also have scaling problems, taking a ratio may simplify the relations.)

See "suppressor variables". I get 91 hits when I Google Groups for < suppressor author:ulrich >.

-- Rich Ulrich