0 votes

Using scikit-learn to fit a one dimensional model, without an intercept:

lm = sklearn.linear_model.LinearRegression(fit_intercept=False)
lm.fit(x, y)

When evaluating the score using the training data I get a negative .score().

lm.score(x, y)

 -0.00256

Why? Does the R2 score compare the variance of my intercept-less model with that of a model that has an intercept?

(Note that it is the same data that I used to fit the model.)
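A minimal sketch that reproduces the symptom (the data here is synthetic and assumed, not the asker's): when y is roughly constant with a large offset, the best line through the origin fits far worse than the horizontal line at mean(y) that R2 uses as its baseline, so the training-set score comes out negative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: y hovers around 10 regardless of x.
x = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([10.0, 10.1, 9.9, 10.2])

# Force the fit through the origin, as in the question.
lm = LinearRegression(fit_intercept=False)
lm.fit(x, y)

# R2 = 1 - SS_res / SS_tot, with SS_tot measured about mean(y)
# even though the model was denied an intercept -- so the score
# here is strongly negative.
print(lm.score(x, y))
```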

Duplicate of this question on Cross Validated. Since you are not fitting an intercept, pay attention to the 2nd answer there. - lanenok
Thanks; however, the only constraint is not to use an intercept. Does scikit-learn compare my model (without intercept) to a "horizontal line" with, or without, an intercept? - LearnOPhile
In R2 calculations, the model is compared with the mean value of y, which can be plotted as a horizontal line. It is hard to help you more without any info about your dataset and the nature of your problem. Why, for example, are you sure that your intercept is zero? - lanenok

1 Answer

1 vote

From the Wikipedia article on R2:

Important cases where the computational definition of R2 can yield negative values, depending on the definition used, arise [...] where linear regression is conducted without including an intercept.

(emphasis mine).
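The definition sklearn uses can be checked by hand: `score` returns 1 - SS_res / SS_tot, where SS_tot is always taken about mean(y), whether or not the model was allowed an intercept. A hedged sketch on illustrative (assumed) data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: the true relationship is y = 2x + 1.
x = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])

lm = LinearRegression(fit_intercept=False).fit(x, y)
pred = lm.predict(x)

ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)  # baseline: horizontal line at mean(y)
r2 = 1.0 - ss_res / ss_tot

# Matches lm.score(x, y): same definition, same mean-of-y baseline.
print(np.isclose(r2, lm.score(x, y)))
```

Because the baseline is the mean of y, a model forced through the origin can do worse than that baseline, which is exactly when R2 goes negative.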