- Multiple Regression Coefficients

In multiple regression analysis, a regression coefficient is the expected change in the outcome variable associated with a one-unit change in a given predictor variable, holding constant the other predictor variables already in the regression model. (The regression equation is typically referred to as the regression model.)

For example: Y = a + b₁X₁ + b₂X₂ + b₃X₃ + e

In the above equation, “Y” is the outcome variable (often referred to as the “dependent” or “criterion” variable), “a” is a constant (often referred to as the “intercept”), and each “b” is the coefficient of its corresponding predictor variable “X.” (Predictor variables are often referred to as “explanatory” or “independent” variables.) The “e” in the equation is referred to as the residual, which is the difference between the actual value and the value predicted from the regression model.
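The model above can be sketched numerically. The following is a minimal illustration, not part of the original guide: it generates data from known (hypothetical) coefficients a, b₁, b₂, b₃, fits the model by ordinary least squares, and computes the residuals e as actual minus predicted values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical predictors and true coefficients, chosen for illustration.
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
a, b1, b2, b3 = 2.0, 0.5, -1.0, 3.0
e = rng.normal(scale=0.1, size=n)
Y = a + b1 * X1 + b2 * X2 + b3 * X3 + e

# Design matrix with a leading column of 1s for the intercept "a".
X = np.column_stack([np.ones(n), X1, X2, X3])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The residual "e" for each observation: actual value minus predicted value.
residuals = Y - X @ coef
print(coef)  # close to the true values [2.0, 0.5, -1.0, 3.0]
```

Because the simulated noise is small, the fitted coefficients land very near the true values used to generate the data.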

If the regression coefficient is standardized, it is typically referred to as a “Beta” coefficient. If the coefficient is unstandardized, it is typically referred to as a “b” coefficient.
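The b-versus-Beta distinction can be shown with a short sketch (an illustration with made-up data, not from the original guide). A Beta coefficient is what ordinary least squares returns when every variable is first converted to z-scores; equivalently, each Beta equals the unstandardized b multiplied by the ratio of the predictor's standard deviation to the outcome's standard deviation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X1 = rng.normal(scale=2.0, size=n)
X2 = rng.normal(scale=0.5, size=n)
Y = 1.0 + 0.8 * X1 + 3.0 * X2 + rng.normal(size=n)

# Unstandardized "b" coefficients: OLS on the raw variables.
X = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.lstsq(X, Y, rcond=None)[0]

def z(v):
    # Convert a variable to z-scores (mean 0, standard deviation 1).
    return (v - v.mean()) / v.std()

# Standardized "Beta" coefficients: OLS after z-scoring every variable.
Xs = np.column_stack([np.ones(n), z(X1), z(X2)])
beta = np.linalg.lstsq(Xs, z(Y), rcond=None)[0]

# Equivalent shortcut: Beta_j = b_j * sd(X_j) / sd(Y).
beta_from_b = b[1:] * np.array([X1.std(), X2.std()]) / Y.std()
print(beta[1:], beta_from_b)  # the two methods agree
```

Because Betas are expressed in standard-deviation units, they let you compare the relative influence of predictors measured on different scales, which raw b coefficients do not.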