.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/mixture/plot_gmm_selection.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code, or to run
        this example in your browser via JupyterLite or Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_mixture_plot_gmm_selection.py:

================================
Gaussian Mixture Model Selection
================================

This example shows that model selection can be performed with Gaussian
Mixture Models (GMM) using :ref:`information-theory criteria `. Model
selection concerns both the covariance type and the number of components in
the model. In this case, both the Akaike Information Criterion (AIC) and the
Bayesian Information Criterion (BIC) provide the right result, but we only
demo the latter as BIC is better suited to identify the true model among a
set of candidates. Unlike Bayesian procedures, such inferences are
prior-free.

.. GENERATED FROM PYTHON SOURCE LINES 16-20

.. code-block:: Python

    # Authors: The scikit-learn developers
    # SPDX-License-Identifier: BSD-3-Clause

.. GENERATED FROM PYTHON SOURCE LINES 21-28

Data generation
---------------

We generate two components (each one containing `n_samples`) by randomly
sampling the standard normal distribution as returned by
`numpy.random.randn`. One component is kept spherical yet shifted and
re-scaled. The other one is deformed to have a more general covariance
matrix.

.. GENERATED FROM PYTHON SOURCE LINES 28-39

.. code-block:: Python

    import numpy as np

    n_samples = 500
    np.random.seed(0)
    C = np.array([[0.0, -0.1], [1.7, 0.4]])
    component_1 = np.dot(np.random.randn(n_samples, 2), C)  # general
    component_2 = 0.7 * np.random.randn(n_samples, 2) + np.array([-4, 1])  # spherical

    X = np.concatenate([component_1, component_2])
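As a side check (not part of the original example), the population covariance of the linearly transformed component is `C.T @ C`: if `Z` has identity covariance, then `Z @ C` has covariance `C.T @ C`. A minimal sketch confirming this empirically (the variable names here are illustrative):

```python
import numpy as np

n_samples = 500
rng = np.random.default_rng(0)

# Same linear map as in the example above.
C = np.array([[0.0, -0.1], [1.7, 0.4]])
component = rng.standard_normal((n_samples, 2)) @ C

# If Z has identity covariance, then X = Z @ C has covariance C.T @ C.
theoretical = C.T @ C
empirical = np.cov(component, rowvar=False)
print(theoretical)
print(empirical)  # close to the theoretical value, up to sampling noise
```

With 500 samples, the empirical covariance matches the theoretical one to within sampling noise, which is why the fitted "full" component below recovers an elongated, tilted ellipse.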
.. GENERATED FROM PYTHON SOURCE LINES 40-41

We can visualize the different components:

.. GENERATED FROM PYTHON SOURCE LINES 41-50

.. code-block:: Python

    import matplotlib.pyplot as plt

    plt.scatter(component_1[:, 0], component_1[:, 1], s=0.8)
    plt.scatter(component_2[:, 0], component_2[:, 1], s=0.8)
    plt.title("Gaussian Mixture components")
    plt.axis("equal")
    plt.show()

.. image-sg:: /auto_examples/mixture/images/sphx_glr_plot_gmm_selection_001.png
   :alt: Gaussian Mixture components
   :srcset: /auto_examples/mixture/images/sphx_glr_plot_gmm_selection_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 51-70

Model training and selection
----------------------------

We vary the number of components from 1 to 6 and the type of covariance
parameters to use:

- `"full"`: each component has its own general covariance matrix.
- `"tied"`: all components share the same general covariance matrix.
- `"diag"`: each component has its own diagonal covariance matrix.
- `"spherical"`: each component has its own single variance.

We score the different models and keep the best model (the lowest BIC). This
is done by using :class:`~sklearn.model_selection.GridSearchCV` and a
user-defined score function which returns the negative BIC score, as
:class:`~sklearn.model_selection.GridSearchCV` is designed to **maximize** a
score (maximizing the negative BIC is equivalent to minimizing the BIC).

The best set of parameters and estimator are stored in `best_params_` and
`best_estimator_`, respectively.

.. GENERATED FROM PYTHON SOURCE LINES 70-90
.. code-block:: Python

    from sklearn.mixture import GaussianMixture
    from sklearn.model_selection import GridSearchCV


    def gmm_bic_score(estimator, X):
        """Callable to pass to GridSearchCV that will use the BIC score."""
        # Make it negative since GridSearchCV expects a score to maximize
        return -estimator.bic(X)


    param_grid = {
        "n_components": range(1, 7),
        "covariance_type": ["spherical", "tied", "diag", "full"],
    }
    grid_search = GridSearchCV(
        GaussianMixture(), param_grid=param_grid, scoring=gmm_bic_score
    )
    grid_search.fit(X)
.. code-block:: none

    GridSearchCV(estimator=GaussianMixture(),
                 param_grid={'covariance_type': ['spherical', 'tied', 'diag',
                                                 'full'],
                             'n_components': range(1, 7)},
                 scoring=<function gmm_bic_score at 0x7fdec0aa5ee0>)


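For reference, the BIC that `gmm_bic_score` negates can be reproduced by hand from its textbook definition, BIC = -2 log L + p ln(n), where p is the number of free parameters. A minimal sketch on synthetic stand-in data (the toy `X` below is illustrative, not the example's data):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2))  # toy stand-in for the example's data

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)

n, d, k = X.shape[0], X.shape[1], gmm.n_components
# Free parameters of a "full" GMM: weights (k - 1) + means (k * d)
# + one symmetric covariance matrix per component (k * d * (d + 1) / 2).
n_params = (k - 1) + k * d + k * d * (d + 1) // 2
log_likelihood = gmm.score(X) * n  # score() is the mean per-sample log-likelihood

bic_by_hand = -2 * log_likelihood + n_params * np.log(n)
print(bic_by_hand, gmm.bic(X))  # the two values agree
```

This also makes the model-selection trade-off explicit: "full" covariances fit the data best but pay a larger p ln(n) penalty than "diag", "tied", or "spherical" parameterizations.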
.. GENERATED FROM PYTHON SOURCE LINES 91-97

Plot the BIC scores
-------------------

To ease the plotting we can create a `pandas.DataFrame` from the results of
the cross-validation done by the grid search. Since the scorer returned the
negative BIC, we flip the sign back to recover the actual BIC values and show
the effect of minimizing them.

.. GENERATED FROM PYTHON SOURCE LINES 97-113

.. code-block:: Python

    import pandas as pd

    df = pd.DataFrame(grid_search.cv_results_)[
        ["param_n_components", "param_covariance_type", "mean_test_score"]
    ]
    df["mean_test_score"] = -df["mean_test_score"]
    df = df.rename(
        columns={
            "param_n_components": "Number of components",
            "param_covariance_type": "Type of covariance",
            "mean_test_score": "BIC score",
        }
    )
    df.sort_values(by="BIC score").head()
.. code-block:: none

        Number of components Type of covariance    BIC score
    19                     2               full  1046.829429
    20                     3               full  1084.038689
    21                     4               full  1114.517272
    22                     5               full  1148.512281
    23                     6               full  1179.977890
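The selected model can also be read directly off such a DataFrame rather than from the grid search object. A small sketch with made-up scores (the numbers below are illustrative only, not the example's results):

```python
import pandas as pd

# Toy stand-in for the cross-validation results table (made-up numbers).
df = pd.DataFrame(
    {
        "Number of components": [1, 2, 3],
        "Type of covariance": ["full", "full", "full"],
        "BIC score": [1251.3, 1046.8, 1084.0],
    }
)

# The grid search picks the candidate whose BIC is smallest.
best = df.loc[df["BIC score"].idxmin()]
print(best["Number of components"])  # -> 2
```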


.. GENERATED FROM PYTHON SOURCE LINES 114-125

.. code-block:: Python

    import seaborn as sns

    sns.catplot(
        data=df,
        kind="bar",
        x="Number of components",
        y="BIC score",
        hue="Type of covariance",
    )
    plt.show()

.. image-sg:: /auto_examples/mixture/images/sphx_glr_plot_gmm_selection_002.png
   :alt: plot gmm selection
   :srcset: /auto_examples/mixture/images/sphx_glr_plot_gmm_selection_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 126-142

In the present case, the model with 2 components and full covariance (which
corresponds to the true generative model) has the lowest BIC score and is
therefore selected by the grid search.

Plot the best model
-------------------

We plot an ellipse to show each Gaussian component of the selected model. For
this purpose, one needs to find the eigenvalues of the covariance matrices as
returned by the `covariances_` attribute. The shape of such matrices depends
on the `covariance_type`:

- `"full"`: (`n_components`, `n_features`, `n_features`)
- `"tied"`: (`n_features`, `n_features`)
- `"diag"`: (`n_components`, `n_features`)
- `"spherical"`: (`n_components`,)

.. GENERATED FROM PYTHON SOURCE LINES 142-177
.. code-block:: Python

    from matplotlib.patches import Ellipse
    from scipy import linalg

    color_iter = sns.color_palette("tab10", 2)[::-1]
    Y_ = grid_search.predict(X)

    fig, ax = plt.subplots()

    for i, (mean, cov, color) in enumerate(
        zip(
            grid_search.best_estimator_.means_,
            grid_search.best_estimator_.covariances_,
            color_iter,
        )
    ):
        v, w = linalg.eigh(cov)
        if not np.any(Y_ == i):
            continue
        plt.scatter(X[Y_ == i, 0], X[Y_ == i, 1], 0.8, color=color)

        angle = np.arctan2(w[0][1], w[0][0])
        angle = 180.0 * angle / np.pi  # convert to degrees
        v = 2.0 * np.sqrt(2.0) * np.sqrt(v)
        ellipse = Ellipse(mean, v[0], v[1], angle=180.0 + angle, color=color)
        ellipse.set_clip_box(fig.bbox)
        ellipse.set_alpha(0.5)
        ax.add_artist(ellipse)

    plt.title(
        f"Selected GMM: {grid_search.best_params_['covariance_type']} model, "
        f"{grid_search.best_params_['n_components']} components"
    )
    plt.axis("equal")
    plt.show()

.. image-sg:: /auto_examples/mixture/images/sphx_glr_plot_gmm_selection_003.png
   :alt: Selected GMM: full model, 2 components
   :srcset: /auto_examples/mixture/images/sphx_glr_plot_gmm_selection_003.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 1.173 seconds)

.. _sphx_glr_download_auto_examples_mixture_plot_gmm_selection.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/1.6.X?urlpath=lab/tree/notebooks/auto_examples/mixture/plot_gmm_selection.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: lite-badge

      .. image:: images/jupyterlite_badge_logo.svg
        :target: ../../lite/lab/index.html?path=auto_examples/mixture/plot_gmm_selection.ipynb
        :alt: Launch JupyterLite
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gmm_selection.ipynb `
    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gmm_selection.py `

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_gmm_selection.zip `

.. include:: plot_gmm_selection.recommendations

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_