https://stats.stackexchange.com/questions/12900/wh…
regression - When is R squared negative? - Cross Validated
Also, for OLS regression, R^2 is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is equivalent to the squared correlation between the predictor and the dependent variable -- again, this must be non-negative.
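The claim above is easy to verify numerically: for OLS with an intercept, R² computed from the residual sum of squares matches the squared correlation between fitted and observed values, and (with a single predictor) the squared correlation between x and y. A minimal sketch on synthetic data (variable names are illustrative):

```python
import numpy as np

# Synthetic data: y depends linearly on x plus noise.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(size=100)

# Fit simple OLS (intercept + slope) via least squares.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
resid = y - fitted

r2 = 1.0 - resid.var() / y.var()             # usual R^2 definition
r2_corr = np.corrcoef(fitted, y)[0, 1] ** 2  # squared corr(fitted, observed)
r2_xy = np.corrcoef(x, y)[0, 1] ** 2         # squared corr(x, y), one predictor

print(r2, r2_corr, r2_xy)  # all three agree, and are non-negative
```

All three quantities coincide up to floating-point error, which is why R² cannot be negative in this setting.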
https://stats.stackexchange.com/questions/633091/s…
Support Vector Regression vs. Linear Regression - Cross Validated
Linear regression can use the same kernels used in SVR, and SVR can also use the linear kernel. Given only the coefficients from such models, it would be impossible to distinguish between them in the general case (with SVR, you might get sparse coefficients depending on the penalization, due to $\epsilon$-insensitive loss).
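The sparsity mentioned in the answer comes from SVR's $\epsilon$-insensitive loss, which charges nothing for residuals inside the tube $|r| \le \epsilon$, unlike the squared loss of OLS. A minimal sketch of the two losses (the value $\epsilon = 0.5$ is an arbitrary illustrative choice):

```python
import numpy as np

def eps_insensitive(residual, eps=0.5):
    # Zero penalty inside the tube |r| <= eps, linear outside it.
    return np.maximum(np.abs(residual) - eps, 0.0)

def squared(residual):
    # OLS penalizes every nonzero residual.
    return residual ** 2

r = np.array([-1.0, -0.25, 0.0, 0.25, 1.0])
print(eps_insensitive(r))  # small residuals cost nothing
print(squared(r))          # all nonzero residuals are penalized
```

Because points inside the tube contribute zero loss, only points on or outside it end up as support vectors, which is where the sparsity comes from.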
https://stats.stackexchange.com/questions/175/how-…
How should outliers be dealt with in linear regression analysis ...
What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multilinear regression?
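One conventional diagnostic of the kind this question asks about is Cook's distance, with the rule of thumb of flagging points where $D_i > 4/n$. A minimal numpy sketch on synthetic data with one injected outlier (the data and the $4/n$ cutoff are illustrative choices, not a universal rule):

```python
import numpy as np

# Synthetic data with one gross outlier injected at index 0.
rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 1.5 * x + rng.normal(size=50)
y[0] += 8.0

X = np.column_stack([np.ones_like(x), x])
n, p = X.shape
H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix
h = np.diag(H)                         # leverages
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (n - p)           # residual variance estimate

# Cook's distance: D_i = e_i^2 / (p * s^2) * h_ii / (1 - h_ii)^2
cooks = resid**2 / (p * s2) * h / (1 - h)**2

flagged = np.where(cooks > 4 / n)[0]
print(flagged)  # the injected outlier at index 0 should appear here
```

Such diagnostics identify influential points for inspection; whether to actually exclude them is a separate modeling decision.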
https://stats.stackexchange.com/questions/447455/m…
Multivariable vs multivariate regression - Cross Validated
Multivariable regression is any regression model where there is more than one explanatory variable. For this reason it is often simply known as "multiple regression". In the simple case of just one explanatory variable, this is sometimes called univariable regression. Unfortunately multivariable regression is often mistakenly called multivariate regression, or vice versa. Multivariate ...
https://stats.stackexchange.com/questions/253035/t…
regression - Trying to understand the fitted vs residual plot? - Cross ...
A good residual vs fitted plot has three characteristics: The residuals "bounce randomly" around the 0 line. This suggests that the assumption that the relationship is linear is reasonable. The res...
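A residual-vs-fitted plot with those characteristics can be produced in a few lines; a sketch for a well-specified linear model, assuming matplotlib is available (data and plot choices are illustrative):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; we only save the figure
import matplotlib.pyplot as plt

# Well-specified linear model: residuals should bounce randomly around 0.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 200)
y = 3.0 + 0.5 * x + rng.normal(scale=1.0, size=200)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
resid = y - fitted

fig, ax = plt.subplots()
ax.scatter(fitted, resid, s=10)
ax.axhline(0.0, color="red")  # residuals should scatter around this line
ax.set_xlabel("Fitted values")
ax.set_ylabel("Residuals")
fig.savefig("resid_vs_fitted.png")
```

For this correctly specified model the residuals have mean zero and show no trend against the fitted values, which is the "good" pattern the answer describes.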
https://stats.stackexchange.com/questions/866/when…
regression - When should I use lasso vs ridge? - Cross Validated
Ridge regression is useful as a general shrinking of all coefficients together. It is shrinking to reduce the variance and overfitting. It relates to the prior belief that coefficient values shouldn't be too large (and these can become large in fitting when there is collinearity). Lasso is useful as a shrinking of a selection of the coefficients.
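The contrast in the answer shows up directly in the fitted coefficients: lasso tends to set some exactly to zero, while ridge only shrinks them. A sketch on synthetic data, assuming scikit-learn is available (the alpha values and feature counts are arbitrary illustrative choices):

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: only 3 of 10 features actually matter.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 10))
true_beta = np.array([3.0, -2.0, 1.5] + [0.0] * 7)
y = X @ true_beta + rng.normal(size=200)

lasso = Lasso(alpha=0.5).fit(X, y)
ridge = Ridge(alpha=0.5).fit(X, y)

# Lasso's L1 penalty zeros out irrelevant coefficients;
# ridge's L2 penalty shrinks all of them but leaves none exactly zero.
print("lasso zero coefficients:", int(np.sum(lasso.coef_ == 0.0)))
print("ridge zero coefficients:", int(np.sum(ridge.coef_ == 0.0)))
```

This is why lasso doubles as a variable-selection tool, while ridge is the better fit for the "shrink everything together" prior the answer describes.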
https://stats.stackexchange.com/questions/581563/i…
Interpreting Z-Scores of Linear Regression Coefficients
Well, under the hypothetical scenario that the true regression coefficient is equal to 0, statisticians have figured out how likely a given Z-score is (using the normal distribution curve). Z-scores greater than 2 (in absolute value) only occur about 5% of the time when the true regression coefficient is equal to 0.
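The "about 5%" figure quoted above is just the two-sided tail probability of a standard normal beyond 2, which can be computed exactly from the complementary error function:

```python
import math

# Under the null (true coefficient = 0), the Z-score is standard normal,
# so P(|Z| > 2) = 2 * (1 - Phi(2)) = erfc(2 / sqrt(2)).
p_tail = math.erfc(2 / math.sqrt(2))
print(round(p_tail, 4))  # 0.0455, i.e. "about 5% of the time"
```

Using the conventional 1.96 instead of 2 gives exactly 5%, which is where the usual two-sided test at the 0.05 level comes from.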
https://stats.stackexchange.com/questions/76226/in…
regression - Interpreting the residuals vs. fitted values plot for ...
None of the three plots show correlation (at least not linear correlation, which is the relevant meaning of 'correlation' in the sense in which it is being used in "the residuals and the fitted values are uncorrelated").
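The statement that residuals and fitted values are uncorrelated is an algebraic property of least squares with an intercept: the residual vector is orthogonal to the column space of the design, hence to the fitted values. A quick numerical check on synthetic data:

```python
import numpy as np

# Synthetic data; any linear model fit by OLS with an intercept works.
rng = np.random.default_rng(4)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(size=100)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
resid = y - fitted

# Residuals are orthogonal to the fitted values, so their linear
# correlation is zero up to floating-point error.
corr = np.corrcoef(resid, fitted)[0, 1]
print(corr)
```

Note this guarantees only zero *linear* correlation; a curved pattern in the residual plot can still reveal nonlinearity, which is the point the answer is making.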
https://stats.stackexchange.com/questions/54533/wh…
regression - What do normal residuals mean and what does this tell me ...
Pretty basic question: What does a normal distribution of residuals from a linear regression mean? In terms of, how does this reflect on my original data from the regression? I'm totally stumped,
https://stats.stackexchange.com/questions/29781/wh…
When conducting multiple regression, when should you center your ...
In some literature, I have read that in a regression with multiple explanatory variables, predictors in different units need to be standardized. (Standardizing consists in subtracting the mean and dividin...
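The standardization the question describes, subtracting each column's mean and dividing by its standard deviation, is a two-line operation; a sketch with two predictors in different units (the columns and their units are illustrative assumptions):

```python
import numpy as np

# Two predictors on very different scales, e.g. heights in cm, weights in kg.
rng = np.random.default_rng(5)
X = np.column_stack([rng.normal(170, 10, size=100),
                     rng.normal(70, 15, size=100)])

# Standardize: subtract the column mean, divide by the column std.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_std.mean(axis=0))  # each column now has mean ~0
print(X_std.std(axis=0))   # and standard deviation 1
```

After this transformation the regression coefficients are in comparable (per-standard-deviation) units, which is the usual motivation for standardizing; centering alone (subtracting the mean without rescaling) only shifts the intercept.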