  1. How should outliers be dealt with in linear regression analysis?

    What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multiple linear regression?
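
One common rule of thumb for the question above is to flag points whose standardized residual exceeds 3 in absolute value. A minimal sketch with simulated data and one injected outlier (all numbers invented for illustration; this simple version ignores leverage, which studentized residuals or Cook's distance would account for):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=50)
y = 2.0 * x + rng.normal(size=50)
y[0] += 10.0                      # inject a single gross outlier

# Ordinary least-squares fit of y on x
b, a = np.polyfit(x, y, 1)
resid = y - (a + b * x)

# Rule of thumb: flag points whose standardized residual exceeds 3
std_resid = resid / resid.std()
outliers = np.flatnonzero(np.abs(std_resid) > 3)
```

Here only the contaminated point is flagged; in practice such flags are a prompt for investigation, not automatic deletion.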

  2. When conducting multiple regression, when should you center your ...

    Jun 5, 2012 · In some literature, I have read that a regression with multiple explanatory variables, if in different units, needed to be standardized. (Standardizing consists in subtracting the mean and dividin...
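
A minimal sketch of the standardization step the snippet describes, using made-up predictors in different units (all values hypothetical):

```python
import numpy as np

# Two predictors on very different scales, e.g. age in years, income in dollars
X = np.array([[25.0, 40_000.0],
              [32.0, 55_000.0],
              [47.0, 72_000.0],
              [51.0, 61_000.0]])

# Standardize each column: subtract its mean, divide by its standard deviation
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# Every standardized column now has mean 0 and unit variance,
# so the regression coefficients become comparable across predictors
```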

  3. In linear regression, when is it appropriate to use the log of an ...

    Aug 24, 2021 · This is because any regression coefficients involving the log-transformed variable - whether it is the dependent or the independent variable - will have a percentage change interpretation.
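
A small numerical check of that percentage-change reading, using a noise-free log-log relationship (the constants are invented for illustration):

```python
import numpy as np

# Hypothetical log-log model: log(y) = a + b * log(x), with elasticity b = 0.3
a, b = 2.0, 0.3
x = np.linspace(1.0, 10.0, 100)
y = np.exp(a + b * np.log(x))

# Fitting on the log scale recovers b exactly (no noise here)
b_hat = np.polyfit(np.log(x), np.log(y), 1)[0]

# A 1% increase in x multiplies y by 1.01**b, i.e. roughly a b% increase
ratio = np.exp(a + b * np.log(1.01 * x)) / y
```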

  4. What's the difference between correlation and simple linear regression ...

    Aug 1, 2013 · Regression is an analysis (estimation of parameters of a model and statistical test of their significance) of the adequacy of a particular functional relationship.

  5. Choosing variables to include in a multiple linear regression model

    Is using a correlation matrix to select predictors for regression correct? A correlation analysis is quite different from multiple regression, because in the latter case we need to think about "partialling out" …
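
One way to see why the two differ: a predictor can correlate substantially with the response yet contribute nothing once a correlated predictor is partialled out. A simulated sketch (all data and coefficients invented):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
z = rng.normal(size=n)
x1 = z + rng.normal(scale=0.5, size=n)   # x1 and x2 overlap through z
x2 = z + rng.normal(scale=0.5, size=n)
y = 1.0 * x1 + rng.normal(size=n)        # y truly depends only on x1

# Marginally, x2 looks like a useful predictor of y ...
r_marginal = np.corrcoef(x2, y)[0, 1]

# ... but its multiple-regression coefficient, with x1 partialled out,
# is close to its true value of zero
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

So a correlation matrix alone would keep x2; the regression reveals it as redundant.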

  6. How does the correlation coefficient differ from regression slope?

    Jan 10, 2015 · The regression slope measures the "steepness" of the linear relationship between two variables and can take any value from $-\infty$ to $+\infty$. Slopes near zero mean that the …
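
The exact link between the two quantities is that the slope rescales the correlation by the ratio of standard deviations, $b = r \, s_y / s_x$. A quick numerical check on simulated data (the data-generating numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.5 * x + rng.normal(size=200)

r = np.corrcoef(x, y)[0, 1]       # bounded in [-1, 1]
slope = np.polyfit(x, y, 1)[0]    # unbounded

# Slope = correlation times the ratio of standard deviations
check = r * y.std() / x.std()
```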

  7. regression - When is R squared negative? - Cross Validated

    Also, for OLS regression, R^2 is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is equivalent to …
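
That identity is easy to verify numerically for OLS with an intercept (simulated data; the coefficients are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 1.0 + 0.5 * x + rng.normal(size=100)

# OLS fit with an intercept
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

# R^2 from the sums of squares
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Equals the squared correlation between predicted and observed values,
# hence non-negative for this model
r2_from_corr = np.corrcoef(y, y_hat)[0, 1] ** 2
```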

  8. correlation - What is the difference between linear regression on y ...

    The Pearson correlation coefficient of x and y is the same, whether you compute pearson(x, y) or pearson(y, x). This suggests that doing a linear regression of y given x or x given y should be the ...
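
The asymmetry shows up in the slopes: the two fitted lines differ, and the product of the two slopes is $r^2$, not 1. A simulated illustration (numbers invented):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=300)
y = 0.8 * x + rng.normal(size=300)

# Correlation is symmetric in its arguments ...
r = np.corrcoef(x, y)[0, 1]

# ... but the two regressions give different, non-reciprocal slopes
b_yx = np.polyfit(x, y, 1)[0]   # slope of y regressed on x
b_xy = np.polyfit(y, x, 1)[0]   # slope of x regressed on y

# Their product is r^2, which is 1 only for a perfect linear relationship
product = b_yx * b_xy
```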

  9. Why is ANOVA equivalent to linear regression? - Cross Validated

    Oct 3, 2015 · ANOVA and linear regression are equivalent when the two models test against the same hypotheses and use an identical encoding. The models differ in their basic aim: ANOVA is mostly …
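
A sketch of the equivalence for a one-way layout: dummy-coding the groups and fitting OLS recovers exactly the group means that a one-way ANOVA compares (group sizes and means are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
groups = np.repeat([0, 1, 2], 10)                    # three groups of 10
y = np.array([0.0, 1.0, 2.5])[groups] + rng.normal(size=30)

# Dummy encoding: intercept plus indicators for groups 1 and 2
X = np.column_stack([np.ones(30), groups == 1, groups == 2]).astype(float)

# OLS fit of the same model ANOVA tests
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Intercept = mean of group 0; slopes = mean differences from group 0
means = np.array([y[groups == g].mean() for g in range(3)])
```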

  10. How to derive the standard error of linear regression coefficient

    Another way of thinking about the n-2 df is that it's because we use 2 means to estimate the slope coefficient (the mean of Y and of X). df from Wikipedia: "...In general, the degrees of freedom of an …
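
The n-2 divisor can be checked by computing the slope's standard error two ways: the textbook formula $\sqrt{s^2 / S_{xx}}$ with $s^2 = \mathrm{RSS}/(n-2)$, and the general matrix form $s^2 (X'X)^{-1}$ (simulated data; all constants invented):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

# OLS fit and residuals
b, a = np.polyfit(x, y, 1)
resid = y - (a + b * x)

# Residual variance uses n - 2 degrees of freedom: two parameters
# (intercept and slope) were estimated from the data
s2 = np.sum(resid ** 2) / (n - 2)

# Textbook standard error of the slope: sqrt(s^2 / Sxx)
se_b = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))

# Same number from the general covariance formula s^2 (X'X)^{-1}
X = np.column_stack([np.ones(n), x])
cov = s2 * np.linalg.inv(X.T @ X)
```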