.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/gaussian_process/plot_gpr_noisy.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code, or to run
        this example in your browser via JupyterLite or Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_gaussian_process_plot_gpr_noisy.py:

=========================================================================
Ability of Gaussian process regression (GPR) to estimate data noise-level
=========================================================================

This example shows the ability of the
:class:`~sklearn.gaussian_process.kernels.WhiteKernel` to estimate the noise
level in the data. Moreover, we show the importance of kernel hyperparameter
initialization.

.. GENERATED FROM PYTHON SOURCE LINES 11-15

.. code-block:: Python

    # Authors: The scikit-learn developers
    # SPDX-License-Identifier: BSD-3-Clause

.. GENERATED FROM PYTHON SOURCE LINES 16-22

Data generation
---------------

We will work in a setting where `X` contains a single feature. We create a
function that generates the target to be predicted, with an option to add
some noise to the generated target.

.. GENERATED FROM PYTHON SOURCE LINES 22-33

.. code-block:: Python

    import numpy as np


    def target_generator(X, add_noise=False):
        target = 0.5 + np.sin(3 * X)
        if add_noise:
            rng = np.random.RandomState(1)
            target += rng.normal(0, 0.3, size=target.shape)
        return target.squeeze()

.. GENERATED FROM PYTHON SOURCE LINES 34-36

Let's have a look at the target generator without adding any noise, to
observe the signal that we would like to predict.

.. GENERATED FROM PYTHON SOURCE LINES 36-39

.. code-block:: Python

    X = np.linspace(0, 5, num=80).reshape(-1, 1)
    y = target_generator(X, add_noise=False)

.. GENERATED FROM PYTHON SOURCE LINES 40-47

.. code-block:: Python

    import matplotlib.pyplot as plt

    plt.plot(X, y, label="Expected signal")
    plt.legend()
    plt.xlabel("X")
    _ = plt.ylabel("y")

.. image-sg:: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_001.png
   :alt: plot gpr noisy
   :srcset: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 48-51

The target transforms the input `X` using a sine function. Now, we will
generate a few noisy training samples. To illustrate the noise level, we will
plot the true signal together with the noisy training samples.

.. GENERATED FROM PYTHON SOURCE LINES 51-55

.. code-block:: Python

    rng = np.random.RandomState(0)
    X_train = rng.uniform(0, 5, size=20).reshape(-1, 1)
    y_train = target_generator(X_train, add_noise=True)

.. GENERATED FROM PYTHON SOURCE LINES 56-68

.. code-block:: Python

    plt.plot(X, y, label="Expected signal")
    plt.scatter(
        x=X_train[:, 0],
        y=y_train,
        color="black",
        alpha=0.4,
        label="Observations",
    )
    plt.legend()
    plt.xlabel("X")
    _ = plt.ylabel("y")

.. image-sg:: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_002.png
   :alt: plot gpr noisy
   :srcset: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_002.png
   :class: sphx-glr-single-img
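As a quick sanity check (a minimal sketch reusing the objects defined above),
the residuals between the noisy training targets and the noise-free signal
should have a standard deviation close to the 0.3 passed to `rng.normal`
inside `target_generator`:

.. code-block:: Python

    # Noise injected by `target_generator`; its empirical standard deviation
    # should be close to the 0.3 used when generating the targets.
    noise_residuals = y_train - target_generator(X_train, add_noise=False)
    print(f"Empirical noise std: {noise_residuals.std():.3f}")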
.. GENERATED FROM PYTHON SOURCE LINES 69-87

Optimisation of kernel hyperparameters in GPR
---------------------------------------------

Now, we will create a
:class:`~sklearn.gaussian_process.GaussianProcessRegressor` using an additive
kernel composed of an :class:`~sklearn.gaussian_process.kernels.RBF` and a
:class:`~sklearn.gaussian_process.kernels.WhiteKernel`. The
:class:`~sklearn.gaussian_process.kernels.WhiteKernel` is able to estimate
the amount of noise present in the data, while the
:class:`~sklearn.gaussian_process.kernels.RBF` serves to fit the
non-linearity between the data and the target.

However, we will show that the hyperparameter space contains several local
minima, which highlights the importance of the initial hyperparameter values.

We will create a model using a kernel with a high noise level and a large
length scale, which will explain all variations in the data by noise.

.. GENERATED FROM PYTHON SOURCE LINES 87-97

.. code-block:: Python

    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    kernel = 1.0 * RBF(length_scale=1e1, length_scale_bounds=(1e-2, 1e3)) + WhiteKernel(
        noise_level=1, noise_level_bounds=(1e-10, 1e1)
    )
    gpr = GaussianProcessRegressor(kernel=kernel, alpha=0.0)
    gpr.fit(X_train, y_train)
    y_mean, y_std = gpr.predict(X, return_std=True)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/circleci/project/sklearn/gaussian_process/kernels.py:452: ConvergenceWarning: The optimal value found for dimension 0 of parameter k1__k2__length_scale is close to the specified upper bound 1000.0. Increasing the bound and calling fit again may find a better value.

.. GENERATED FROM PYTHON SOURCE LINES 98-111

.. code-block:: Python

    plt.plot(X, y, label="Expected signal")
    plt.scatter(x=X_train[:, 0], y=y_train, color="black", alpha=0.4, label="Observations")
    plt.errorbar(X, y_mean, y_std, label="Posterior mean ± std")
    plt.legend()
    plt.xlabel("X")
    plt.ylabel("y")
    _ = plt.title(
        (
            f"Initial: {kernel}\nOptimum: {gpr.kernel_}\nLog-Marginal-Likelihood: "
            f"{gpr.log_marginal_likelihood(gpr.kernel_.theta)}"
        ),
        fontsize=8,
    )

.. image-sg:: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_003.png
   :alt: Initial: 1**2 * RBF(length_scale=10) + WhiteKernel(noise_level=1) Optimum: 0.763**2 * RBF(length_scale=1e+03) + WhiteKernel(noise_level=0.525) Log-Marginal-Likelihood: -23.499266455424188
   :srcset: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_003.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 112-124

We see that the optimum kernel found still has a high noise level and an even
larger length scale. The length scale reaches the maximum bound that we
allowed for this parameter, and we got a warning as a result. More
importantly, we observe that the model does not provide useful predictions:
the mean prediction is almost constant and does not follow the expected
noise-free signal.
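Before re-initializing, it is instructive to inspect the fitted noise level
directly (a minimal sketch; the attribute access assumes the additive kernel
structure defined above, where `k2` is the
:class:`~sklearn.gaussian_process.kernels.WhiteKernel`):

.. code-block:: Python

    # The fitted noise_level is the estimated noise *variance*; for this
    # poorly initialized model it absorbs most of the target variability.
    print(f"Estimated noise variance: {gpr.kernel_.k2.noise_level:.3f}")
    print(f"Variance of the training targets: {y_train.var():.3f}")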
Now, we will initialize the :class:`~sklearn.gaussian_process.kernels.RBF`
with a smaller initial `length_scale` and the
:class:`~sklearn.gaussian_process.kernels.WhiteKernel` with a smaller initial
noise level, while keeping the parameter bounds unchanged.

.. GENERATED FROM PYTHON SOURCE LINES 124-131

.. code-block:: Python

    kernel = 1.0 * RBF(length_scale=1e-1, length_scale_bounds=(1e-2, 1e3)) + WhiteKernel(
        noise_level=1e-2, noise_level_bounds=(1e-10, 1e1)
    )
    gpr = GaussianProcessRegressor(kernel=kernel, alpha=0.0)
    gpr.fit(X_train, y_train)
    y_mean, y_std = gpr.predict(X, return_std=True)

.. GENERATED FROM PYTHON SOURCE LINES 132-146

.. code-block:: Python

    plt.plot(X, y, label="Expected signal")
    plt.scatter(x=X_train[:, 0], y=y_train, color="black", alpha=0.4, label="Observations")
    plt.errorbar(X, y_mean, y_std, label="Posterior mean ± std")
    plt.legend()
    plt.xlabel("X")
    plt.ylabel("y")
    _ = plt.title(
        (
            f"Initial: {kernel}\nOptimum: {gpr.kernel_}\nLog-Marginal-Likelihood: "
            f"{gpr.log_marginal_likelihood(gpr.kernel_.theta)}"
        ),
        fontsize=8,
    )

.. image-sg:: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_004.png
   :alt: Initial: 1**2 * RBF(length_scale=0.1) + WhiteKernel(noise_level=0.01) Optimum: 1.05**2 * RBF(length_scale=0.569) + WhiteKernel(noise_level=0.134) Log-Marginal-Likelihood: -18.429732528984054
   :srcset: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_004.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 147-157

First, we see that the model's predictions are more precise than the previous
model's: this new model is able to estimate the noise-free functional
relationship. Looking at the kernel hyperparameters, we see that the best
combination found has a smaller noise level and a shorter length scale than
the first model.

We can inspect the log-marginal-likelihood (LML) of
:class:`~sklearn.gaussian_process.GaussianProcessRegressor` for different
hyperparameters to get a sense of the local minima.

.. GENERATED FROM PYTHON SOURCE LINES 157-169

.. code-block:: Python

    from matplotlib.colors import LogNorm

    length_scale = np.logspace(-2, 4, num=80)
    noise_level = np.logspace(-2, 1, num=80)
    length_scale_grid, noise_level_grid = np.meshgrid(length_scale, noise_level)

    log_marginal_likelihood = [
        gpr.log_marginal_likelihood(theta=np.log([0.36, scale, noise]))
        for scale, noise in zip(length_scale_grid.ravel(), noise_level_grid.ravel())
    ]
    log_marginal_likelihood = np.reshape(log_marginal_likelihood, noise_level_grid.shape)

.. GENERATED FROM PYTHON SOURCE LINES 170-187

.. code-block:: Python

    vmin, vmax = (-log_marginal_likelihood).min(), 50
    level = np.around(np.logspace(np.log10(vmin), np.log10(vmax), num=20), decimals=1)
    plt.contour(
        length_scale_grid,
        noise_level_grid,
        -log_marginal_likelihood,
        levels=level,
        norm=LogNorm(vmin=vmin, vmax=vmax),
    )
    plt.colorbar()
    plt.xscale("log")
    plt.yscale("log")
    plt.xlabel("Length-scale")
    plt.ylabel("Noise-level")
    plt.title("Log-marginal-likelihood")
    plt.show()

.. image-sg:: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_005.png
   :alt: Log-marginal-likelihood
   :srcset: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_005.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 188-198

We see that there are two local minima of the negative log-marginal-likelihood
that correspond to the two combinations of hyperparameters found previously.
Depending on the initial values for the hyperparameters, the gradient-based
optimization might or might not converge to the best model. It is thus
important to repeat the optimization several times for different
initializations. This can be done by setting the `n_restarts_optimizer`
parameter of the :class:`~sklearn.gaussian_process.GaussianProcessRegressor`
class.
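As a complementary check, we can locate the best combination on the grid we
just evaluated (a sketch reusing the arrays computed above; it should land
close to the optimum found by the well-initialized model):

.. code-block:: Python

    # Index of the highest log-marginal-likelihood over the evaluated grid.
    best_idx = np.unravel_index(
        np.argmax(log_marginal_likelihood), log_marginal_likelihood.shape
    )
    print(f"Best grid length-scale: {length_scale_grid[best_idx]:.3f}")
    print(f"Best grid noise-level: {noise_level_grid[best_idx]:.3f}")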
Let's try again to fit our model with the bad initial values, but this time
with 10 random restarts.

.. GENERATED FROM PYTHON SOURCE LINES 199-209

.. code-block:: Python

    kernel = 1.0 * RBF(length_scale=1e1, length_scale_bounds=(1e-2, 1e3)) + WhiteKernel(
        noise_level=1, noise_level_bounds=(1e-10, 1e1)
    )
    gpr = GaussianProcessRegressor(
        kernel=kernel, alpha=0.0, n_restarts_optimizer=10, random_state=0
    )
    gpr.fit(X_train, y_train)
    y_mean, y_std = gpr.predict(X, return_std=True)

.. GENERATED FROM PYTHON SOURCE LINES 210-224

.. code-block:: Python

    plt.plot(X, y, label="Expected signal")
    plt.scatter(x=X_train[:, 0], y=y_train, color="black", alpha=0.4, label="Observations")
    plt.errorbar(X, y_mean, y_std, label="Posterior mean ± std")
    plt.legend()
    plt.xlabel("X")
    plt.ylabel("y")
    _ = plt.title(
        (
            f"Initial: {kernel}\nOptimum: {gpr.kernel_}\nLog-Marginal-Likelihood: "
            f"{gpr.log_marginal_likelihood(gpr.kernel_.theta)}"
        ),
        fontsize=8,
    )

.. image-sg:: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_006.png
   :alt: Initial: 1**2 * RBF(length_scale=10) + WhiteKernel(noise_level=1) Optimum: 1.05**2 * RBF(length_scale=0.569) + WhiteKernel(noise_level=0.134) Log-Marginal-Likelihood: -18.429732528970845
   :srcset: /auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_006.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 225-227

As we hoped, random restarts allow the optimization to find the best set of
hyperparameters despite the bad initial values.
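As a final side note (a sketch, not part of the fits above): when the noise
level is known a priori, it can be fixed through the `alpha` parameter of
:class:`~sklearn.gaussian_process.GaussianProcessRegressor`, which adds that
variance to the diagonal of the kernel matrix, making the
:class:`~sklearn.gaussian_process.kernels.WhiteKernel` unnecessary. Here we
assume the true noise standard deviation of 0.3 used by `target_generator`:

.. code-block:: Python

    # Fix the noise variance via `alpha` instead of learning it with a
    # WhiteKernel; `alpha` expects a variance, hence the squared std.
    kernel_fixed_noise = 1.0 * RBF(length_scale=1e-1, length_scale_bounds=(1e-2, 1e3))
    gpr_fixed = GaussianProcessRegressor(kernel=kernel_fixed_noise, alpha=0.3**2)
    gpr_fixed.fit(X_train, y_train)
    print(gpr_fixed.kernel_)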