Version 1.6#
For a short description of the main highlights of the release, please refer to Release Highlights for scikit-learn 1.6.
Legend for changelogs
Major Feature something big that you couldn’t do before.
Feature something that you couldn’t do before.
Efficiency an existing feature now may not require as much computation or memory.
Enhancement a miscellaneous minor improvement.
Fix something that previously didn’t work as documented – or according to reasonable expectations – should now work.
API Change you will need to change your code to have the same effect in the future; or a feature will be removed in the future.
Version 1.6.1#
January 2025
Changed models#
Fix The tags.input_tags.sparse flag was corrected for a majority of estimators. By Antoine Baker #30187
Changes impacting many modules#
Fix _more_tags, _get_tags, and _safe_tags now raise a DeprecationWarning instead of a FutureWarning, so that only developers, rather than end-users, are notified. By Guillaume Lemaitre in #30573
sklearn.metrics#
Fix Fixed a regression where a scikit-learn metric called on PyTorch CPU tensors would raise an error (with array API dispatch disabled, which is the default). By Loïc Estève #30454
sklearn.model_selection#
Fix cross_validate, cross_val_predict, and cross_val_score now accept params=None when metadata routing is enabled. By Adrin Jalali #30451
sklearn.tree#
Fix Use log2 instead of ln for building trees to maintain behavior of previous versions. By Thomas Fan #30557
sklearn.utils#
Enhancement utils.estimator_checks.check_estimator_sparse_tag ensures that the estimator tag input_tags.sparse is consistent with its fit method (accepting sparse input X or raising the appropriate error). By Antoine Baker #30187
Fix Raise a DeprecationWarning when there is no concrete implementation of __sklearn_tags__ in the MRO of the estimator. Estimators should inherit from BaseEstimator, which implements __sklearn_tags__. By Guillaume Lemaitre #30516
Version 1.6.0#
December 2024
Changes impacting many modules#
Enhancement __sklearn_tags__ was introduced for setting tags in estimators. More details in Estimator Tags. By Thomas Fan and Adrin Jalali #29677
Enhancement Scikit-learn classes and functions can be used with only an import sklearn import line. For example, import sklearn; sklearn.svm.SVC() now works. By Thomas Fan #29793
Fix Classes metrics.ConfusionMatrixDisplay, metrics.RocCurveDisplay, calibration.CalibrationDisplay, metrics.PrecisionRecallDisplay, metrics.PredictionErrorDisplay and inspection.PartialDependenceDisplay now properly handle Matplotlib aliases for style parameters (e.g., c and color, ls and linestyle, etc.). By Joseph Barbier #30023
API Change utils.validation.validate_data is introduced and replaces the previously private base.BaseEstimator._validate_data method. It is intended for third-party estimator developers, who should use this function in most cases instead of utils.check_array and utils.check_X_y. By Adrin Jalali #29696
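For third-party estimator developers, here is a minimal sketch of how the new public validate_data helper might be used inside a custom estimator. The MeanRegressor class is purely illustrative, and the call signature shown (estimator first, then X and y, plus a reset flag) is an assumption based on the former private helper.

```python
# Minimal sketch of using the public validate_data helper in a
# hypothetical third-party estimator.
import numpy as np
from sklearn.base import BaseEstimator, RegressorMixin
from sklearn.utils.validation import validate_data


class MeanRegressor(RegressorMixin, BaseEstimator):
    def fit(self, X, y):
        # Replaces the former private self._validate_data call; validates
        # X and y and records n_features_in_ on the estimator.
        X, y = validate_data(self, X, y)
        self.mean_ = float(np.mean(y))
        return self

    def predict(self, X):
        # reset=False keeps the n_features_in_ recorded during fit.
        X = validate_data(self, X, reset=False)
        return np.full(X.shape[0], self.mean_)
```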
Support for Array API#
Additional estimators and functions have been updated to include support for all Array API compliant inputs.
See Array API support (experimental) for more details.
Feature model_selection.GridSearchCV, model_selection.RandomizedSearchCV, model_selection.HalvingGridSearchCV and model_selection.HalvingRandomSearchCV now support Array API compatible inputs when their base estimators do. By Tim Head and Olivier Grisel #27096
Feature sklearn.metrics.f1_score now supports Array API compatible inputs. By Omar Salman #27369
Feature preprocessing.LabelEncoder now supports Array API compatible inputs. By Omar Salman #27381
Feature sklearn.metrics.mean_absolute_error now supports Array API compatible inputs. By Edoardo Abati #27736
Feature sklearn.metrics.mean_tweedie_deviance now supports Array API compatible inputs. By Thomas Li #28106
Feature sklearn.metrics.pairwise.cosine_similarity now supports Array API compatible inputs. By Edoardo Abati #29014
Feature sklearn.metrics.pairwise.paired_cosine_distances now supports Array API compatible inputs. By Edoardo Abati #29112
Feature sklearn.metrics.cluster.entropy now supports Array API compatible inputs. By Yaroslav Korobko #29141
Feature sklearn.metrics.mean_squared_error now supports Array API compatible inputs. By Yaroslav Korobko #29142
Feature sklearn.metrics.pairwise.additive_chi2_kernel now supports Array API compatible inputs. By Yaroslav Korobko #29144
Feature sklearn.metrics.d2_tweedie_score now supports Array API compatible inputs. By Emily Chen #29207
Feature sklearn.metrics.max_error now supports Array API compatible inputs. By Edoardo Abati #29212
Feature sklearn.metrics.mean_poisson_deviance now supports Array API compatible inputs. By Emily Chen #29227
Feature sklearn.metrics.mean_gamma_deviance now supports Array API compatible inputs. By Emily Chen #29239
Feature sklearn.metrics.pairwise.cosine_distances now supports Array API compatible inputs. By Emily Chen #29265
Feature sklearn.metrics.pairwise.chi2_kernel now supports Array API compatible inputs. By Yaroslav Korobko #29267
Feature sklearn.metrics.mean_absolute_percentage_error now supports Array API compatible inputs. By Emily Chen #29300
Feature sklearn.metrics.pairwise.paired_euclidean_distances now supports Array API compatible inputs. By Emily Chen #29389
Feature sklearn.metrics.pairwise.euclidean_distances and sklearn.metrics.pairwise.rbf_kernel now support Array API compatible inputs. By Omar Salman #29433
Feature sklearn.metrics.pairwise.linear_kernel, sklearn.metrics.pairwise.sigmoid_kernel, and sklearn.metrics.pairwise.polynomial_kernel now support Array API compatible inputs. By Omar Salman #29475
Feature sklearn.metrics.mean_squared_log_error and sklearn.metrics.root_mean_squared_log_error now support Array API compatible inputs. By Virgil Chan #29709
Feature preprocessing.MinMaxScaler with clip=True now supports Array API compatible inputs. By Shreekant Nandiyawar #29751
Support for the soon-to-be-deprecated cupy.array_api module has been removed in favor of directly supporting the top-level cupy module, possibly via the array_api_compat.cupy compatibility wrapper. By Olivier Grisel #29639
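As a hedged illustration of how these Array API additions are typically exercised: dispatch is opt-in via the global configuration, and any Array API compatible namespace can be used. The PyTorch import below is an assumption about the environment (any compliant array library would do), and the array-api-compat package is needed for dispatch.

```python
# Sketch: calling an Array API enabled metric on non-NumPy inputs.
import sklearn
from sklearn.metrics import mean_absolute_error

import torch  # assumption: PyTorch is installed

y_true = torch.tensor([3.0, -0.5, 2.0, 7.0])
y_pred = torch.tensor([2.5, 0.0, 2.0, 8.0])

with sklearn.config_context(array_api_dispatch=True):
    # The computation is dispatched to the input's array namespace.
    mae = mean_absolute_error(y_true, y_pred)
```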
Metadata routing#
Refer to the Metadata Routing User Guide for more details.
Feature semi_supervised.SelfTrainingClassifier now supports metadata routing. The fit method now accepts **fit_params which are passed to the underlying estimators via their fit methods. In addition, the predict, predict_proba, predict_log_proba, score and decision_function methods also accept **params which are passed to the underlying estimators via their respective methods. By Adam Li #28494
Feature ensemble.StackingClassifier and ensemble.StackingRegressor now support metadata routing and pass **fit_params to the underlying estimators via their fit methods (see the sketch below). By Stefanie Senger #28701
Feature model_selection.learning_curve now supports metadata routing for the fit method of its estimator and for its underlying CV splitter and scorer. By Stefanie Senger #28975
Feature compose.TransformedTargetRegressor now supports metadata routing in its fit and predict methods and routes the corresponding params to the underlying regressor. By Omar Salman #29136
Feature feature_selection.SequentialFeatureSelector now supports metadata routing in its fit method and passes the corresponding params to the model_selection.cross_val_score function. By Omar Salman #29260
Feature model_selection.permutation_test_score now supports metadata routing for the fit method of its estimator and for its underlying CV splitter and scorer. By Adam Li #29266
Feature feature_selection.RFE and feature_selection.RFECV now support metadata routing. By Omar Salman #29312
Feature model_selection.validation_curve now supports metadata routing for the fit method of its estimator and for its underlying CV splitter and scorer. By Stefanie Senger #29329
Fix Metadata is routed correctly to grouped CV splitters via linear_model.RidgeCV and linear_model.RidgeClassifierCV, and UnsetMetadataPassedError is fixed for linear_model.RidgeClassifierCV with default scoring. By Stefanie Senger #29634
Fix Many method arguments which shouldn't be included in the routing mechanism are now excluded, and the set_{method}_request methods are not generated for them. By Adrin Jalali #29920
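A hedged sketch of the StackingClassifier entry above: with metadata routing enabled, a fit parameter such as sample_weight is routed to the base estimators that explicitly request it. The data, weights, and the explicit request/decline calls below are illustrative; the assumption is that every consumer of a passed metadata must state its request.

```python
# Sketch: routing sample_weight to a base estimator of a stacking model.
import numpy as np
import sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression

sklearn.set_config(enable_metadata_routing=True)

X, y = make_classification(random_state=0)
sample_weight = np.ones(len(y))

# The base estimator requests sample_weight; the final estimator declines it.
base = LogisticRegression().set_fit_request(sample_weight=True)
final = LogisticRegression().set_fit_request(sample_weight=False)

stack = StackingClassifier(estimators=[("lr", base)], final_estimator=final)
stack.fit(X, y, sample_weight=sample_weight)
```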
Dropping official support for PyPy#
Due to limited maintainer resources and a small number of users, official PyPy support has been dropped. Some parts of scikit-learn may still work, but PyPy is no longer tested in the scikit-learn Continuous Integration. By Loïc Estève #29128
Dropping support for building with setuptools#
From scikit-learn 1.6 onwards, support for building with setuptools has been removed. Meson is the only supported way to build scikit-learn, see Building from source for more details. By Loïc Estève #29400
Free-threaded CPython 3.13 support#
scikit-learn has preliminary support for free-threaded CPython, in particular free-threaded wheels are available for all of our supported platforms.
Free-threaded (also known as nogil) CPython 3.13 is an experimental version of CPython 3.13 which aims at enabling efficient multi-threaded use cases by removing the Global Interpreter Lock (GIL).
For more details about free-threaded CPython see py-free-threading doc, in particular how to install a free-threaded CPython and Ecosystem compatibility tracking.
Feel free to try free-threaded CPython on your use case and report any issues!
By Loïc Estève and many other people in the wider Scientific Python and CPython ecosystem, for example Nathan Goldbaum, Ralf Gommers, Edgar Andrés Margffoy Tuay. #30360
sklearn.base#
Enhancement Added a function base.is_clusterer which determines whether a given estimator is of category clusterer. By Christian Veenhuis #28936
API Change Passing a class object to is_classifier, is_regressor, and is_outlier_detector is now deprecated. Pass an instance instead. By Adrin Jalali #30122
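A brief illustration of the deprecation above; these helpers now expect an instance rather than the class object itself.

```python
from sklearn.base import is_classifier
from sklearn.svm import SVC

is_classifier(SVC())  # pass an instance: returns True
# is_classifier(SVC)  # deprecated: passing the class object
```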
sklearn.calibration#
API Change cv="prefit" is deprecated for CalibratedClassifierCV. Use FrozenEstimator instead, as CalibratedClassifierCV(FrozenEstimator(estimator)). By Adrin Jalali #30171
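A minimal sketch of this migration, using an already fitted classifier; the estimator choice and the data split are illustrative.

```python
# Sketch: migrating from cv="prefit" to FrozenEstimator.
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.frozen import FrozenEstimator
from sklearn.linear_model import LogisticRegression

X, y = make_classification(random_state=0)
clf = LogisticRegression().fit(X[:60], y[:60])  # pre-fitted classifier

# Before (deprecated): CalibratedClassifierCV(clf, cv="prefit")
calibrated = CalibratedClassifierCV(FrozenEstimator(clf)).fit(X[60:], y[60:])
proba = calibrated.predict_proba(X[60:])
```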
sklearn.cluster#
API Change The copy parameter of cluster.Birch was deprecated in 1.6 and will be removed in 1.8. It has no effect as the estimator does not perform in-place operations on the input data. By Yao Xiao #29124
sklearn.compose#
Enhancement The verbose_feature_names_out parameter of sklearn.compose.ColumnTransformer now accepts a string format or a callable to generate feature names. By Marc Bresson #28934
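A hedged sketch of the new option. The template fields shown ({transformer_name} and {feature_name}) are an assumption about the accepted format string; consult the ColumnTransformer documentation for the exact contract.

```python
# Sketch: customizing prefixed output feature names of a ColumnTransformer.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({"age": [10, 20, 30], "city": ["a", "b", "a"]})
ct = ColumnTransformer(
    [("scale", StandardScaler(), ["age"]),
     ("onehot", OneHotEncoder(), ["city"])],
    # Assumed template fields; a callable can reportedly be passed instead.
    verbose_feature_names_out="{feature_name}__{transformer_name}",
)
ct.fit(df)
print(ct.get_feature_names_out())
```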
sklearn.covariance#
Efficiency covariance.MinCovDet fitting is now slightly faster. By Antony Lee #29835
sklearn.cross_decomposition#
Fix cross_decomposition.PLSRegression properly raises an error when n_components is larger than n_samples. By Thomas Fan #29710
sklearn.datasets#
Feature datasets.fetch_file allows downloading an arbitrary data file from the web. It handles local caching, integrity checks with SHA256 digests and automatic retries in case of HTTP errors. By Olivier Grisel #29354
sklearn.decomposition#
Enhancement LatentDirichletAllocation now has a normalize parameter in the transform and fit_transform methods to control whether the document topic distribution is normalized. By Adrin Jalali #30097
Fix IncrementalPCA will now only raise a ValueError when the number of samples in the input data to partial_fit is less than the number of components on the first call to partial_fit. Subsequent calls to partial_fit no longer face this restriction. By Thomas Gessey-Jones #30224
sklearn.discriminant_analysis#
Fix discriminant_analysis.QuadraticDiscriminantAnalysis will now raise a LinAlgWarning in case of collinear variables. These warnings can be silenced using the reg_param parameter. By Alihan Zihna #19731
sklearn.ensemble#
Feature ensemble.ExtraTreesClassifier and ensemble.ExtraTreesRegressor now support missing values in the data matrix X. Missing values are handled by randomly moving all of the samples to the left or right child node as the tree is traversed. By Adam Li #28268
Efficiency Small runtime improvement of fitting ensemble.HistGradientBoostingClassifier and ensemble.HistGradientBoostingRegressor by parallelizing the initial search for bin thresholds. By Christian Lorentzen #28064
Efficiency ensemble.IsolationForest now runs parallel jobs during predict, offering a speedup of up to 2-4x on sample sizes larger than 2000 using joblib. By Adam Li and Sérgio Pereira #28622
Enhancement The verbosity of ensemble.HistGradientBoostingClassifier and ensemble.HistGradientBoostingRegressor now has more granular control: verbose = 1 prints only summary messages, while verbose >= 2 prints the full information as before. By Christian Lorentzen #28179
API Change The parameter algorithm of ensemble.AdaBoostClassifier is deprecated and will be removed in 1.8. By Jérémie du Boisberranger #29997
sklearn.feature_extraction#
Fix feature_extraction.text.TfidfVectorizer now correctly preserves the dtype of idf_ based on the input data. By Guillaume Lemaitre #30022
sklearn.frozen#
Major Feature FrozenEstimator is introduced, which allows freezing an estimator: calling .fit on it has no effect, and clone(frozenestimator) returns the same estimator instead of an unfitted clone. By Adrin Jalali #29705
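A minimal sketch of the freezing behavior described above:

```python
# Sketch: fit is a no-op on a frozen estimator and clone keeps it as-is.
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.frozen import FrozenEstimator
from sklearn.linear_model import LogisticRegression

X, y = make_classification(random_state=0)
frozen = FrozenEstimator(LogisticRegression().fit(X, y))

frozen.fit(X, y)          # no effect: the wrapped model stays untouched
same = clone(frozen)      # returns the same (still fitted) estimator
pred = frozen.predict(X)  # delegates to the already fitted model
```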
sklearn.impute#
Fix impute.KNNImputer excludes samples with nan distances when computing the mean value for uniform weights. By Xuefeng Xu #29135
Fix When min_value and max_value are array-like and some features are dropped due to keep_empty_features=False, impute.IterativeImputer no longer raises an error and now indexes correctly. By Guntitat Sawadwuthikul #29451
Fix Fixed impute.IterativeImputer to make sure that it does not skip the iterative process when keep_empty_features is set to True. By Arif Qodari #29779
API Change Add a warning in impute.SimpleImputer when keep_empty_features=False and strategy="constant". In this case empty features are not dropped and this behaviour will change in 1.8. By Arthur Courselle and Simon Riou #29950
sklearn.linear_model#
Enhancement The solver="newton-cholesky" in linear_model.LogisticRegression and linear_model.LogisticRegressionCV is extended to support the full multinomial loss in a multiclass setting. By Christian Lorentzen #28840
Fix In linear_model.Ridge and linear_model.RidgeCV, after fit, the coef_ attribute is now of shape (n_features,) like other linear models. By Maxwell Liu, Guillaume Lemaitre, and Adrin Jalali #19746
Fix linear_model.LogisticRegressionCV corrects sample weight handling for the calculation of test scores. By Shruti Nath #29419
Fix linear_model.LassoCV and linear_model.ElasticNetCV now take sample weights into account to define the search grid for the internally tuned alpha hyper-parameter. By John Hopfensperger and Shruti Nath #29442
Fix linear_model.LogisticRegression, linear_model.PoissonRegressor, linear_model.GammaRegressor, and linear_model.TweedieRegressor now take sample weights into account to decide when to fall back to solver='lbfgs' whenever solver='newton-cholesky' becomes numerically unstable. By Antoine Baker #29818
Fix linear_model.RidgeCV now properly uses predictions on the same scale as the target seen during fit. These predictions are stored in cv_results_ when scoring != None. Previously, the predictions were rescaled by the square root of the sample weights and offset by the mean of the target, leading to an incorrect estimate of the score. By Guillaume Lemaitre, Jérôme Dockès and Hanmin Qin #29842
Fix linear_model.RidgeCV now properly supports custom multioutput scorers by letting the scorer manage the multioutput averaging. Previously, the predictions and true targets were both squeezed to a 1D array before computing the error. By Guillaume Lemaitre #29884
Fix linear_model.LinearRegression now sets the cond parameter when calling the scipy.linalg.lstsq solver on dense input data. This ensures more numerically robust results on rank-deficient data. In particular, it empirically fixes the expected equivalence property between fitting with reweighted or with repeated data points. By Antoine Baker #30040
Fix linear_model.LogisticRegression and other linear models that accept solver="newton-cholesky" now report the correct number of iterations when they fall back to the "lbfgs" solver because of a rank-deficient Hessian matrix. By Olivier Grisel #30100
Fix SGDOneClassSVM now correctly inherits from OutlierMixin and the tags are correctly set. By Guillaume Lemaitre #30227
API Change Deprecates copy_X in linear_model.TheilSenRegressor as the parameter has no effect. copy_X will be removed in 1.8. By Adam Li #29105
sklearn.manifold#
Efficiency manifold.locally_linear_embedding and manifold.LocallyLinearEmbedding now allocate the memory of sparse matrices more efficiently in the Hessian, Modified and LTSA methods. By Giorgio Angelotti #28096
sklearn.metrics#
Efficiency sklearn.metrics.classification_report is now faster by caching classification labels. By Adrin Jalali #29738
Enhancement metrics.RocCurveDisplay.from_estimator, metrics.RocCurveDisplay.from_predictions, metrics.PrecisionRecallDisplay.from_estimator, and metrics.PrecisionRecallDisplay.from_predictions now accept a new keyword despine to remove the top and right spines of the plot in order to make it clearer. By Yao Xiao #26367
Enhancement sklearn.metrics.check_scoring now accepts raise_exc to specify whether to raise an exception if a subset of the scorers in multimetric scoring fails, or to return an error code. By Stefanie Senger #28992
Fix metrics.roc_auc_score will now correctly return np.nan and warn the user if only one class is present in the labels. By Hleb Levitski and Janez Demšar #27412, #30013
Fix The functions metrics.mean_squared_log_error and metrics.root_mean_squared_log_error now check whether the inputs are within the correct domain for the function \(y=\log(1+x)\), rather than \(y=\log(x)\). The functions metrics.mean_absolute_error, metrics.mean_absolute_percentage_error, metrics.mean_squared_error and metrics.root_mean_squared_error now explicitly check whether a scalar will be returned when multioutput="uniform_average". By Virgil Chan #29709
API Change The force_all_finite parameter of the functions metrics.pairwise.check_pairwise_arrays and metrics.pairwise_distances is renamed to ensure_all_finite. force_all_finite will be removed in 1.8. By Jérémie du Boisberranger #29404
API Change scoring="neg_max_error" should be used instead of scoring="max_error", which is now deprecated. By Farid “Freddie” Taba #29462
API Change The default value of the response_method parameter of metrics.make_scorer will change from None to "predict" and None will be removed in 1.8. In the meantime, None is equivalent to "predict". By Jérémie du Boisberranger #30001
sklearn.model_selection#
Enhancement GroupKFold now has the ability to shuffle groups into different folds when shuffle=True. By Zachary Vealey #28519
Enhancement There is no need to call fit on a FixedThresholdClassifier if the underlying estimator is already fitted (see the sketch below). By Adrin Jalali #30172
Fix Improved the error message when model_selection.RepeatedStratifiedKFold.split is called without a y argument. By Anurag Varma #29402
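A sketch of the FixedThresholdClassifier enhancement referenced above; the threshold value and estimator are illustrative.

```python
# Sketch: wrapping an already fitted classifier without calling fit again.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import FixedThresholdClassifier

X, y = make_classification(random_state=0)
clf = LogisticRegression().fit(X, y)

# No fit call on the wrapper is needed, since clf is already fitted.
thresholded = FixedThresholdClassifier(clf, threshold=0.7)
y_pred = thresholded.predict(X)
```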
sklearn.neighbors#
Enhancement neighbors.NearestNeighbors, neighbors.KNeighborsClassifier, neighbors.KNeighborsRegressor, neighbors.RadiusNeighborsClassifier, neighbors.RadiusNeighborsRegressor, neighbors.KNeighborsTransformer, neighbors.RadiusNeighborsTransformer, and neighbors.LocalOutlierFactor now work with metric="nan_euclidean", supporting nan inputs. By Carlo Lemos, Guillaume Lemaitre, and Adrin Jalali #25330
Enhancement Add neighbors.NearestCentroid.decision_function, neighbors.NearestCentroid.predict_proba and neighbors.NearestCentroid.predict_log_proba to the neighbors.NearestCentroid estimator class. Support the case when X is sparse and shrink_threshold is not None in neighbors.NearestCentroid. By Matthew Ning #26689
Enhancement Make predict, predict_proba, and score of neighbors.KNeighborsClassifier and neighbors.RadiusNeighborsClassifier accept X=None as input. In this case predictions for all training set points are returned, and points are not included in their own neighbors (see the sketch below). By Dmitry Kobak #30047
Fix neighbors.LocalOutlierFactor raises a warning in the fit method when duplicate values in the training data lead to inaccurate outlier detection. By Henrique Caroço #28773
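A sketch of the X=None behavior referenced above:

```python
# Sketch: leave-self-out predictions on the training set via X=None.
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(random_state=0)
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# Predictions for every training point, with each point excluded from
# its own neighborhood.
y_train_pred = knn.predict(None)
```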
sklearn.neural_network#
Fix neural_network.MLPRegressor no longer crashes when the model diverges and early_stopping is enabled. By Marc Bresson #29773
sklearn.pipeline#
Major Feature pipeline.Pipeline can now transform metadata up to the step requiring the metadata, which can be set using the transform_input parameter. By Adrin Jalali #28901
Enhancement pipeline.Pipeline now warns about not being fitted before calling methods that require the pipeline to be fitted. This warning will become an error in 1.8. By Adrin Jalali #29868
Fix Fixed an issue with the tags and estimator type of Pipeline when the pipeline is empty. This allows the HTML representation of an empty pipeline to be rendered correctly. By Gennaro Daniele Acciaro #30203
sklearn.preprocessing#
Enhancement Added a warn option to the handle_unknown parameter in preprocessing.OneHotEncoder (see the sketch below). By Hleb Levitski #28637
Enhancement The HTML representation of preprocessing.FunctionTransformer will show the function name in the label. By Yao Xiao #29158
Fix preprocessing.PowerTransformer now uses scipy.special.inv_boxcox to output nan if the input of the Box-Cox inverse is invalid. By Xuefeng Xu #27875
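A sketch of the new handle_unknown option referenced above; how the unknown category is encoded at transform time is left to the estimator's documentation.

```python
# Sketch: warn (rather than raise) on categories unseen during fit.
import numpy as np
from sklearn.preprocessing import OneHotEncoder

enc = OneHotEncoder(handle_unknown="warn").fit(np.array([["a"], ["b"]]))

# Transforming an unseen category emits a warning instead of an error.
out = enc.transform(np.array([["c"]]))
```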
sklearn.semi_supervised#
API Change semi_supervised.SelfTrainingClassifier deprecated the base_estimator parameter in favor of estimator. By Adam Li #28494
sklearn.tree#
Feature tree.ExtraTreeClassifier and tree.ExtraTreeRegressor now support missing values in the data matrix X. Missing values are handled by randomly moving all of the samples to the left or right child node as the tree is traversed. By Adam Li and Loïc Estève #27966, #30318
Fix Escape double quotes for labels and feature names when exporting trees to Graphviz format. By Santiago M. Mola #17575
sklearn.utils#
Enhancement utils.check_array now accepts ensure_non_negative to check for negative values in the passed array, until now only available through calling utils.check_non_negative. By Tamara Atanasoska #29540
Enhancement check_estimator and parametrize_with_checks now check and fail if the classifier has the tags.classifier_tags.multi_class = False tag but does not fail on multi-class data. By Adrin Jalali #29874
Enhancement utils.validation.check_is_fitted now passes on stateless estimators. An estimator can indicate it's stateless by setting the requires_fit tag. See Estimator Tags for more information. By Adrin Jalali #29880
Enhancement Changes to check_estimator and parametrize_with_checks: check_estimator introduces new arguments on_skip, on_fail, and callback to control the behavior of the check runner; refer to the API documentation for more details. generate_only=True is deprecated in check_estimator; use estimator_checks_generator instead. The _xfail_checks estimator tag is now removed; to indicate which tests are expected to fail, pass a dictionary to check_estimator as the expected_failed_checks parameter. Similarly, the expected_failed_checks parameter in parametrize_with_checks can be used, as a callable returning a dictionary of the form {"check_name": "reason to mark this check as xfail"}.
Fix utils.estimator_checks.parametrize_with_checks and utils.estimator_checks.check_estimator now support estimators that have set_output called on them. By Adrin Jalali #29869
API Change The force_all_finite parameter of the functions utils.check_array, utils.check_X_y, and utils.as_float_array is renamed to ensure_all_finite. force_all_finite will be removed in 1.8. By Jérémie du Boisberranger #29404
API Change utils.estimator_checks.check_sample_weights_invariance is replaced by utils.estimator_checks.check_sample_weight_equivalence_on_dense_data, which uses integer (including zero) weights, and utils.estimator_checks.check_sample_weight_equivalence_on_sparse_data, which does the same on sparse data. By Antoine Baker #29818, #30137
API Change Using _estimator_type to set the estimator type is deprecated. Inherit from ClassifierMixin, RegressorMixin, TransformerMixin, or OutlierMixin instead. Alternatively, you can set estimator_type in Tags in the __sklearn_tags__ method (see the sketch below). By Adrin Jalali #30122
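A hedged sketch tying together several tag-related entries above: a third-party estimator inherits from BaseEstimator (which provides a concrete __sklearn_tags__) and adjusts the returned tags object instead of relying on the removed _estimator_type and _xfail_checks mechanisms. The tag attributes touched below are the ones named in this changelog; anything beyond that about the Tags structure is an assumption.

```python
# Sketch: customizing estimator tags via __sklearn_tags__ in 1.6.
from sklearn.base import BaseEstimator, ClassifierMixin


class BinaryOnlyClassifier(ClassifierMixin, BaseEstimator):
    """Illustrative estimator; the mixin takes care of the estimator type."""

    def __sklearn_tags__(self):
        tags = super().__sklearn_tags__()
        # Declare that this classifier does not handle multi-class data,
        # so the common checks exercise the corresponding behavior.
        tags.classifier_tags.multi_class = False
        # Declare whether sparse input is supported
        # (see check_estimator_sparse_tag in the 1.6.1 notes).
        tags.input_tags.sparse = False
        return tags
```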
Code and documentation contributors
Thanks to everyone who has contributed to the maintenance and improvement of the project since version 1.5, including:
Aaron Schumacher, Abdulaziz Aloqeely, abhi-jha, Acciaro Gennaro Daniele, Adam J. Stewart, Adam Li, Adeel Hassan, Adeyemi Biola, Aditi Juneja, Adrin Jalali, Aisha, Akanksha Mhadolkar, Akihiro Kuno, Alberto Torres, alexqiao, Alihan Zihna, Aniruddha Saha, antoinebaker, Antony Lee, Anurag Varma, Arif Qodari, Arthur Courselle, ArthurDbrn, Arturo Amor, Aswathavicky, Audrey Flanders, aurelienmorgan, Austin, awwwyan, AyGeeEm, a.zy.lee, baggiponte, BlazeStorm001, bme-git, Boney Patel, brdav, Brigitta Sipőcz, Cailean Carter, Camille Troillard, Carlo Lemos, Christian Lorentzen, Christian Veenhuis, Christine P. Chai, claudio, Conrad Stevens, datarollhexasphericon, Davide Chicco, David Matthew Cherney, Dea María Léon, Deepak Saldanha, Deepyaman Datta, dependabot[bot], dinga92, Dmitry Kobak, Domenico, Drew Craeton, dymil, Edoardo Abati, EmilyXinyi, Eric Larson, Evelyn, fabianhenning, Farid “Freddie” Taba, Gael Varoquaux, Giorgio Angelotti, Hleb Levitski, Guillaume Lemaitre, Guntitat Sawadwuthikul, Haesun Park, Hanjun Kim, Henrique Caroço, hhchen1105, Hugo Boulenger, Ilya Komarov, Inessa Pawson, Ivan Pan, Ivan Wiryadi, Jaimin Chauhan, Jakob Bull, James Lamb, Janez Demšar, Jérémie du Boisberranger, Jérôme Dockès, Jirair Aroyan, João Morais, Joe Cainey, Joel Nothman, John Enblom, JorgeCardenas, Joseph Barbier, jpienaar-tuks, Julian Chan, K.Bharat Reddy, Kevin Doshi, Lars, Loic Esteve, Lucas Colley, Lucy Liu, lunovian, Marc Bresson, Marco Edward Gorelli, Marco Maggi, Marco Wolsza, Maren Westermann, MarieS-WiMLDS, Martin Helm, Mathew Shen, mathurinm, Matthew Feickert, Maxwell Liu, Meekail Zain, Michael Dawson, Miguel Cárdenas, m-maggi, mrastgoo, Natalia Mokeeva, Nathan Goldbaum, Nathan Orgera, nbrown-ScottLogic, Nikita Chistyakov, Nithish Bolleddula, Noam Keidar, NoPenguinsLand, Norbert Preining, notPlancha, Olivier Grisel, Omar Salman, ParsifalXu, Piotr, Priyank Shroff, Priyansh Gupta, Quentin Barthélemy, Rachit23110261, Rahil Parikh, raisadz, Rajath, renaissance0ne, Reshama Shaikh, Roberto Rosati, Robert Pollak, rwelsch427, Santiago Castro, Santiago M. Mola, scikit-learn-bot, sean moiselle, SHREEKANT VITTHAL NANDIYAWAR, Shruti Nath, Søren Bredlund Caspersen, Stefanie Senger, Stefano Gaspari, Steffen Schneider, Štěpán Sršeň, Sylvain Combettes, Tamara, Thomas, Thomas Gessey-Jones, Thomas J. Fan, Thomas Li, ThorbenMaa, Tialo, Tim Head, Tuhin Sharma, Tushar Parimi, Umberto Fasci, UV, vedpawar2254, Velislav Babatchev, Victoria Shevchenko, viktor765, Vince Carey, Virgil Chan, Wang Jiayi, Xiao Yuan, Xuefeng Xu, Yao Xiao, yareyaredesuyo, Zachary Vealey, Ziad Amerr