.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/linear_model/plot_bayesian_ridge_curvefit.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_linear_model_plot_bayesian_ridge_curvefit.py>`
        to download the full example code or to run this example in your browser via JupyterLite or Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_linear_model_plot_bayesian_ridge_curvefit.py:


============================================
Curve Fitting with Bayesian Ridge Regression
============================================

Computes a Bayesian Ridge Regression of Sinusoids.

See :ref:`bayesian_ridge_regression` for more information on the regressor.

In general, when fitting a curve with a polynomial by Bayesian ridge
regression, the selection of the initial values of the regularization
parameters (alpha, lambda) may be important, because the regularization
parameters are determined by an iterative procedure that depends on those
initial values.

In this example, the sinusoid is approximated by a polynomial using different
pairs of initial values.

When starting from the default values (alpha_init = 1.90, lambda_init = 1.0),
the bias of the resulting curve is large and the variance is small, so
lambda_init should be set relatively small (1e-3) to reduce the bias.

Also, by evaluating the log marginal likelihood (L) of these models, we can
determine which one is better: the model with the larger L is the more likely
one.

.. GENERATED FROM PYTHON SOURCE LINES 28-31

.. code-block:: Python


    # Author: Yoshihiro Uchida

.. GENERATED FROM PYTHON SOURCE LINES 32-34

Generate sinusoidal data with noise
-----------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 34-48

.. code-block:: Python

    import numpy as np


    def func(x):
        return np.sin(2 * np.pi * x)


    size = 25
    rng = np.random.RandomState(1234)
    x_train = rng.uniform(0.0, 1.0, size)
    y_train = func(x_train) + rng.normal(scale=0.1, size=size)
    x_test = np.linspace(0.0, 1.0, 100)

.. GENERATED FROM PYTHON SOURCE LINES 49-51

Fit by cubic polynomial
-----------------------

.. GENERATED FROM PYTHON SOURCE LINES 51-58

.. code-block:: Python

    from sklearn.linear_model import BayesianRidge

    n_order = 3
    X_train = np.vander(x_train, n_order + 1, increasing=True)
    X_test = np.vander(x_test, n_order + 1, increasing=True)
    reg = BayesianRidge(tol=1e-6, fit_intercept=False, compute_score=True)
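
The ``np.vander`` call above builds the polynomial design matrix: each row
expands one input value into the powers ``x**0, x**1, x**2, x**3``, so a
linear model fitted on these columns is a cubic polynomial in ``x``. As a
quick illustration (a minimal sketch that is not part of the generated
example; ``x_demo`` is a hypothetical input), the expansion looks like this:

.. code-block:: Python

    # Illustrative only: the Vandermonde expansion used for the cubic fit.
    import numpy as np

    x_demo = np.array([0.0, 0.5, 1.0])  # hypothetical sample inputs
    # Columns are x**0, x**1, x**2, x**3, i.e. each row is [1, x, x**2, x**3]
    X_demo = np.vander(x_demo, 4, increasing=True)
    print(X_demo)

Because the constant column already plays the role of an intercept, the
regressor above is constructed with ``fit_intercept=False``.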
Plot the true and predicted curves with log marginal likelihood (L)
--------------------------------------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 61-93

.. code-block:: Python

    import matplotlib.pyplot as plt

    fig, axes = plt.subplots(1, 2, figsize=(8, 4))
    for i, ax in enumerate(axes):
        # Bayesian ridge regression with different initial value pairs
        if i == 0:
            init = [1 / np.var(y_train), 1.0]  # Default values
        elif i == 1:
            init = [1.0, 1e-3]
            reg.set_params(alpha_init=init[0], lambda_init=init[1])
        reg.fit(X_train, y_train)
        # Posterior predictive mean and standard deviation on the test grid
        ymean, ystd = reg.predict(X_test, return_std=True)

        ax.plot(x_test, func(x_test), color="blue", label="sin($2\\pi x$)")
        ax.scatter(x_train, y_train, s=50, alpha=0.5, label="observation")
        ax.plot(x_test, ymean, color="red", label="predict mean")
        ax.fill_between(
            x_test, ymean - ystd, ymean + ystd, color="pink", alpha=0.5, label="predict std"
        )
        ax.set_ylim(-1.3, 1.3)
        ax.legend()
        title = "$\\alpha$_init$={:.2f},\\ \\lambda$_init$={}$".format(init[0], init[1])
        if i == 0:
            title += " (Default)"
        ax.set_title(title, fontsize=12)
        text = "$\\alpha={:.1f}$\n$\\lambda={:.3f}$\n$L={:.1f}$".format(
            reg.alpha_, reg.lambda_, reg.scores_[-1]
        )
        ax.text(0.05, -1.0, text, fontsize=12)

    plt.tight_layout()
    plt.show()



.. image-sg:: /auto_examples/linear_model/images/sphx_glr_plot_bayesian_ridge_curvefit_001.png
   :alt: $\alpha$_init$=1.90,\ \lambda$_init$=1.0$ (Default), $\alpha$_init$=1.00,\ \lambda$_init$=0.001$
   :srcset: /auto_examples/linear_model/images/sphx_glr_plot_bayesian_ridge_curvefit_001.png
   :class: sphx-glr-single-img


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 0.297 seconds)


.. _sphx_glr_download_auto_examples_linear_model_plot_bayesian_ridge_curvefit.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/1.4.X?urlpath=lab/tree/notebooks/auto_examples/linear_model/plot_bayesian_ridge_curvefit.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: lite-badge

      .. image:: images/jupyterlite_badge_logo.svg
        :target: ../../lite/lab/?path=auto_examples/linear_model/plot_bayesian_ridge_curvefit.ipynb
        :alt: Launch JupyterLite
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_bayesian_ridge_curvefit.ipynb <plot_bayesian_ridge_curvefit.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_bayesian_ridge_curvefit.py <plot_bayesian_ridge_curvefit.py>`

.. include:: plot_bayesian_ridge_curvefit.recommendations

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_