.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/linear_model/plot_ridge_path.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_linear_model_plot_ridge_path.py>`
        to download the full example code or to run this example in your browser via JupyterLite or Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_linear_model_plot_ridge_path.py:

===========================================================
Plot Ridge coefficients as a function of the regularization
===========================================================

Shows the effect of collinearity in the coefficients of an estimator.

.. currentmodule:: sklearn.linear_model

:class:`Ridge` regression is the estimator used in this example.
Each color represents a different feature of the coefficient vector,
displayed as a function of the regularization parameter.

This example also shows the usefulness of applying Ridge regression
to highly ill-conditioned matrices. For such matrices, a slight
change in the target variable can cause huge variations in the
calculated weights. In such cases, it is useful to set some
regularization (alpha) to reduce this variation (noise); a small
numerical sketch of this sensitivity is given after the plot below.

When alpha is very large, the regularization effect dominates the
squared loss function and the coefficients tend to zero. At the other
end of the path, as alpha tends toward zero and the solution tends
towards the ordinary least squares, the coefficients exhibit big
oscillations. In practice it is necessary to tune alpha so that a
balance is maintained between these two effects.

.. GENERATED FROM PYTHON SOURCE LINES 29-42

.. code-block:: Python


    # Author: Fabian Pedregosa --
    # License: BSD 3 clause

    import matplotlib.pyplot as plt
    import numpy as np

    from sklearn import linear_model

    # X is the 10x10 Hilbert matrix
    X = 1.0 / (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis])
    y = np.ones(10)

.. GENERATED FROM PYTHON SOURCE LINES 43-45

Compute paths
-------------

.. GENERATED FROM PYTHON SOURCE LINES 45-55

.. code-block:: Python


    n_alphas = 200
    alphas = np.logspace(-10, -2, n_alphas)

    coefs = []
    for a in alphas:
        ridge = linear_model.Ridge(alpha=a, fit_intercept=False)
        ridge.fit(X, y)
        coefs.append(ridge.coef_)

.. GENERATED FROM PYTHON SOURCE LINES 56-58

Display results
---------------

.. GENERATED FROM PYTHON SOURCE LINES 58-69

.. code-block:: Python


    ax = plt.gca()

    ax.plot(alphas, coefs)
    ax.set_xscale("log")
    ax.set_xlim(ax.get_xlim()[::-1])  # reverse axis
    plt.xlabel("alpha")
    plt.ylabel("weights")
    plt.title("Ridge coefficients as a function of the regularization")
    plt.axis("tight")
    plt.show()

.. image-sg:: /auto_examples/linear_model/images/sphx_glr_plot_ridge_path_001.png
   :alt: Ridge coefficients as a function of the regularization
   :srcset: /auto_examples/linear_model/images/sphx_glr_plot_ridge_path_001.png
   :class: sphx-glr-single-img
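Sensitivity check and tuning alpha
----------------------------------

The following sketch is not part of the original example; it is a minimal
illustration of the sensitivity described above. It reports the condition
number of the Hilbert matrix, compares how much the ordinary least squares
and ridge coefficients move when the target is slightly perturbed, and shows
one common way to tune alpha with :class:`RidgeCV`. The perturbation size
(1e-3), the ridge penalty (1e-6), and the candidate alpha grid are arbitrary
illustrative choices.

.. code-block:: Python


    import numpy as np

    from sklearn import linear_model

    # Rebuild the 10x10 Hilbert matrix and the constant target used above.
    X = 1.0 / (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis])
    y = np.ones(10)

    # The huge condition number is what makes the least squares weights unstable.
    print(f"condition number of X: {np.linalg.cond(X):.2e}")

    # Perturb the target slightly.
    rng = np.random.RandomState(0)
    y_perturbed = y + 1e-3 * rng.randn(10)

    # Ordinary least squares (the alpha -> 0 limit of ridge).
    ols = linear_model.LinearRegression(fit_intercept=False)
    coef_ols = ols.fit(X, y).coef_.copy()
    coef_ols_perturbed = ols.fit(X, y_perturbed).coef_

    # Ridge with a small but non-zero penalty.
    ridge = linear_model.Ridge(alpha=1e-6, fit_intercept=False)
    coef_ridge = ridge.fit(X, y).coef_.copy()
    coef_ridge_perturbed = ridge.fit(X, y_perturbed).coef_

    # The OLS weights move by orders of magnitude more than the ridge weights.
    print("OLS coefficient change:  ", np.linalg.norm(coef_ols - coef_ols_perturbed))
    print("Ridge coefficient change:", np.linalg.norm(coef_ridge - coef_ridge_perturbed))

    # One way to tune alpha in practice: efficient leave-one-out
    # cross-validation over a grid of candidates.
    ridge_cv = linear_model.RidgeCV(alphas=np.logspace(-10, -2, 50), fit_intercept=False)
    ridge_cv.fit(X, y)
    print("alpha selected by RidgeCV:", ridge_cv.alpha_)

Because the data here are noise-free, the cross-validated alpha is only a
rough guide; the point is the mechanism, not the particular value selected.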
.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 0.330 seconds)


.. _sphx_glr_download_auto_examples_linear_model_plot_ridge_path.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/1.4.X?urlpath=lab/tree/notebooks/auto_examples/linear_model/plot_ridge_path.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: lite-badge

      .. image:: images/jupyterlite_badge_logo.svg
        :target: ../../lite/lab/?path=auto_examples/linear_model/plot_ridge_path.ipynb
        :alt: Launch JupyterLite
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_ridge_path.ipynb <plot_ridge_path.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_ridge_path.py <plot_ridge_path.py>`

.. include:: plot_ridge_path.recommendations

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_