Weighted Logistic Regression In Python
Solution 1:
I notice that this question is quite old now, but hopefully this can still help someone. With sklearn, you can use the SGDClassifier class to create a logistic regression model by simply passing 'log' as the loss:
sklearn.linear_model.SGDClassifier(loss='log', ...).
This class implements weighted samples in the fit() function:
classifier.fit(X, Y, sample_weight=weights)
where weights is an array of sample weights that must (obviously) have the same length as the number of data points in X.
See http://scikit-learn.org/dev/modules/generated/sklearn.linear_model.SGDClassifier.html for full documentation.
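For a self-contained sketch of what that looks like (the data here is invented for illustration; note that recent scikit-learn releases spell the logistic loss 'log_loss', while older versions use 'log' as described above):
import numpy as np
from sklearn.linear_model import SGDClassifier

# Toy data: 100 samples, 3 features, binary labels (purely illustrative)
rng = np.random.RandomState(0)
X = rng.randn(100, 3)
Y = (X[:, 0] + rng.randn(100) > 0).astype(int)

# One weight per sample, e.g. up-weighting the positive class
weights = np.where(Y == 1, 2.0, 1.0)

# loss='log_loss' on scikit-learn >= 1.1; use loss='log' on older versions
classifier = SGDClassifier(loss='log_loss')
classifier.fit(X, Y, sample_weight=weights)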
Solution 2:
The “balanced” mode uses the values of y to automatically adjust weights inversely proportional to class frequencies in the input data as n_samples / (n_classes * np.bincount(y))
from sklearn.linear_model import LogisticRegression
model = LogisticRegression(class_weight='balanced')
model = model.fit(X, y)
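To see what the formula above produces, here is a minimal sketch on a toy label vector (the numbers are made up for illustration):
import numpy as np

# Imbalanced toy labels: 8 zeros, 2 ones
y = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])

n_samples = len(y)
n_classes = len(np.unique(y))

# n_samples / (n_classes * np.bincount(y))
class_weights = n_samples / (n_classes * np.bincount(y))
print(class_weights)  # [0.625 2.5] (the rare class gets the larger weight)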
EDIT
Sample weights can be added in the fit method; you just have to pass an array of length n_samples. Check out the documentation for details; a minimal sketch is shown below.
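A minimal sketch, assuming X and y are your feature matrix and labels and that the per-sample weights live in an array of length n_samples (the data here is invented for illustration):
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.randn(100, 3)
y = (X[:, 0] + rng.randn(100) > 0).astype(int)

# One weight per sample (length n_samples)
sample_weights = rng.uniform(0.5, 2.0, size=len(y))

model = LogisticRegression()
model = model.fit(X, y, sample_weight=sample_weights)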
Hope this does it...
Solution 3:
I think what you want is statsmodels. It has great support for GLM and other linear methods. If you're coming from R, you'll find the syntax very familiar.
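A minimal sketch using statsmodels' GLM interface with frequency weights (freq_weights is one of the weighting options GLM accepts; the data and weights below are invented for illustration):
import numpy as np
import statsmodels.api as sm

rng = np.random.RandomState(0)
X = rng.randn(100, 3)
y = (X[:, 0] + rng.randn(100) > 0).astype(int)

# Integer frequency weights: count each positive example twice
weights = np.where(y == 1, 2, 1)

# Add an intercept column, then fit a binomial GLM (logit link) with frequency weights
X_const = sm.add_constant(X)
model = sm.GLM(y, X_const, family=sm.families.Binomial(), freq_weights=weights)
result = model.fit()
print(result.summary())
If you prefer R-style formulas, statsmodels also exposes a formula interface in statsmodels.formula.api, which is where the R-like syntax mentioned above comes in.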
Solution 4:
Have a look at the scikits.learn logistic regression implementation.
Solution 5:
Do you know NumPy? If not, also take a look at SciPy and matplotlib.