# What constitutes a significant association?

Correlation coefficients (here denoted as *r*) range from -1 to +1, with -1 signifying a perfect negative linear association, and +1 signifying a perfect positive linear association. The square of the coefficient (or *r*-squared) denotes the proportion of the variance in one variable that is related to the variance in the other. Thus, an *r* of 0.60 (*r*-squared = 0.36) means that 36% of the variance in the dependent variable is related to the variance in the independent variable. The statistical significance of a given correlation coefficient indicates the likelihood that a coefficient of that size would be found in a sample by chance when no true association actually exists between the variables.

In regression-based analyses, the overall model “fit” with the observed data is described in terms of the proportion of variance in the dependent variable that is explained by the independent variables (the square of the *multiple correlation coefficient,* or R² value). Statistical significance is used to indicate whether the overall model provides better-than-chance prediction of the dependent variable.
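The R² computation can be sketched with ordinary least squares on simulated data. This is an illustrative example, not a prescribed procedure: the two predictors, the coefficients, and the noise level are all invented for the demonstration, and NumPy's `lstsq` stands in for whatever regression routine one would actually use.

```python
import numpy as np

# Simulated data: two hypothetical predictors and one outcome.
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = rng.normal(size=50)
y = 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.5, size=50)

# Design matrix with an intercept column, fit by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# R²: proportion of variance in y explained by the fitted model.
residuals = y - X @ beta
ss_res = np.sum(residuals ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R-squared = {r2:.3f}")
```

Since the simulated noise is small relative to the signal, most of the variance in `y` is explained and R² comes out close to 1.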

To further understand how each independent variable contributes to the prediction of the dependent variable, one examines the set of regression *coefficients* for the independent variables. The size (and sign) of the coefficient for each independent variable indicates how much the dependent variable is expected to increase (if positive) or decrease (if negative) when that independent variable increases by one unit, while holding all the other independent variables constant.
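The one-unit interpretation above can be checked numerically: in a fitted linear model, raising one predictor by a unit while the others are held fixed changes the prediction by exactly that predictor's coefficient. The data and coefficients below are again invented for illustration:

```python
import numpy as np

# Simulated data with known coefficients (3.0 and 2.0) plus small noise.
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 3.0 * x1 + 2.0 * x2 + rng.normal(scale=0.1, size=100)

X = np.column_stack([np.ones(100), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, b1, b2 = beta

# Prediction at a baseline point, then with x1 raised by one unit
# while x2 is held constant at the same value.
base = intercept + b1 * 0.5 + b2 * 0.5
plus_one = intercept + b1 * 1.5 + b2 * 0.5
print(f"change in prediction = {plus_one - base:.3f}")
```

The change in the prediction equals `b1`, the coefficient on the variable that moved, which is what "holding all the other independent variables constant" means in practice.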