  1. How should outliers be dealt with in linear regression analysis ...

    What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multilinear regression?

  2. How to determine which variables are statistically significant in ...

    How to determine which variables are statistically significant in multiple regression?

  3. Why not approach classification through regression?

    86 "..approach classification problem through regression.." by "regression" I will assume you mean linear regression, and I will compare this approach to the "classification" approach of fitting a logistic …

  4. When is it ok to remove the intercept in a linear regression model ...

    Hence, if the sum of squared errors is to be minimized, the constant must be chosen such that the mean of the errors is zero. In a simple regression model, the constant represents the Y-intercept of the … (a one-line derivation of the zero-mean-error claim follows the results list)

  5. Can I merge multiple linear regressions into one regression?

    Oct 3, 2021 · Although one can compute a single regression for all data points, if you include model assumptions such as i.i.d. normal errors, the for all points combined can't be "correct" if the four …

  6. Poisson regression to estimate relative risk for binary outcomes

    Brief Summary: Why is it more common for logistic regression (with odds ratios) to be used in cohort studies with binary outcomes, as opposed to Poisson regression (with relative risks)? Background … (a Poisson-with-robust-errors sketch follows the results list)

  7. In linear regression, when is it appropriate to use the log of an ...

    Aug 24, 2021 · This is because any regression coefficients involving the original variable - whether it is the dependent or the independent variable - will have a percentage point change interpretation. (the usual log-coefficient reading is worked after the results list)

  8. regression - What does it mean to regress a variable against another ...

    Dec 4, 2014 · When we say, to regress Y against X, do we mean that X is the independent variable and Y the dependent variable? i.e. Y = aX + b.

  9. regression - When is R squared negative? - Cross Validated

    With linear regression with no constraints, R² must be positive (or zero) and equals the square of the correlation coefficient, r. A negative R² is only possible with linear regression when either … (a small negative-R² demonstration follows the results list)

  10. Interpretation of log transformed predictors in logistic regression

    Mar 3, 2020 · One of the predictors in my logistic model has been log-transformed. How do you interpret the estimated coefficient of the log-transformed predictor and how do you calculate the impact of that … (the usual odds-ratio reading of such a coefficient is worked after the results list)
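
Result 3 contrasts treating a binary label as a numeric target for linear regression with fitting a logistic model. A minimal sketch of that comparison, assuming synthetic data and scikit-learn (neither comes from the thread itself):

```python
# Sketch for result 3: 0/1 outcome fit with linear regression vs. logistic
# regression on made-up data (illustrative only, not the thread's example).
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
# True model: probability of class 1 rises with x.
p = 1 / (1 + np.exp(-2 * x[:, 0]))
y = rng.binomial(1, p)

# Linear regression treats the 0/1 labels as numbers; its predictions can
# fall outside [0, 1].
lin = LinearRegression().fit(x, y)
print("linear predictions range:", lin.predict(x).min(), lin.predict(x).max())

# Logistic regression models P(y = 1 | x) directly, so its fitted
# probabilities stay inside [0, 1].
logit = LogisticRegression().fit(x, y)
print("logistic probabilities range:",
      logit.predict_proba(x)[:, 1].min(), logit.predict_proba(x)[:, 1].max())
```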
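
Result 4 states that the constant must be chosen so that the mean of the errors is zero. The standard least-squares argument behind that claim (not quoted from the thread) is the first-order condition for the intercept a in y_i = a + b x_i + e_i:

\[
\frac{\partial}{\partial a}\sum_{i=1}^{n}\bigl(y_i - a - b x_i\bigr)^2
  = -2\sum_{i=1}^{n}\bigl(y_i - a - b x_i\bigr) = 0
  \quad\Longrightarrow\quad
  \frac{1}{n}\sum_{i=1}^{n} e_i = 0 ,
\]

so including the intercept is exactly what forces the residuals to average to zero; removing it drops that guarantee.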
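
Result 6 asks about Poisson regression as a way to estimate relative risks for a binary outcome. A minimal sketch of the usual "Poisson with robust standard errors" recipe, assuming statsmodels and made-up cohort data (both are illustrative assumptions, not taken from the thread):

```python
# Sketch for result 6: Poisson regression with a sandwich covariance to
# estimate a relative risk for a binary outcome (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
exposure = rng.binomial(1, 0.5, size=1000)
# True risks: 0.10 among unexposed, 0.20 among exposed (relative risk = 2).
risk = np.where(exposure == 1, 0.20, 0.10)
y = rng.binomial(1, risk)

X = sm.add_constant(exposure)
# Poisson family with a log link; the robust (HC) covariance compensates for
# treating a 0/1 outcome as if it were Poisson.
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC1")
print(np.exp(fit.params))        # exponentiated slope ~ relative risk
print(np.exp(fit.conf_int()))    # confidence interval on the same scale
```

Exponentiating the slope gives a risk ratio rather than an odds ratio, which is the contrast the question draws with logistic regression.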
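
Result 7 concerns regressions where the independent variable enters in logs. The usual reading of the coefficient (a standard approximation, not quoted from the thread) is:

\[
y = \beta_0 + \beta_1 \log x + \varepsilon
\quad\Longrightarrow\quad
\Delta y \approx \frac{\beta_1}{100} \text{ for a 1\% increase in } x ,
\]

so the slope is interpreted in terms of proportional changes in x rather than unit changes.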
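
Result 9 notes that R² can only go negative when the regression is constrained (or judged on data it was not fit to). A small demonstration, assuming scikit-learn and toy numbers chosen to make the effect obvious:

```python
# Sketch for result 9: R^2 = 1 - SS_res / SS_tot can go negative when the
# fitted line is constrained, e.g. forced through the origin.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

x = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([10.0, 11.0, 12.0, 13.0])   # large positive offset from the origin

unconstrained = LinearRegression().fit(x, y)
through_origin = LinearRegression(fit_intercept=False).fit(x, y)

print(r2_score(y, unconstrained.predict(x)))    # 1.0: exact linear fit
print(r2_score(y, through_origin.predict(x)))   # negative: worse than predicting the mean
```

Forcing the line through the origin makes its fit worse than simply predicting the mean of y, and a negative R² is exactly that comparison.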
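
Result 10 asks how to read a coefficient on a log-transformed predictor in a logistic regression. The usual interpretation (standard, not quoted from the thread) is multiplicative on the odds scale:

\[
\log\frac{p}{1-p} = \beta_0 + \beta_1 \log x
\quad\Longrightarrow\quad
\text{multiplying } x \text{ by a factor } k \text{ multiplies the odds of the outcome by } k^{\beta_1} ,
\]

so, for example, doubling x multiplies the odds by 2 raised to the power of the estimated coefficient.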