.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/manifold/plot_lle_digits.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_manifold_plot_lle_digits.py>`
        to download the full example code or to run this example in your browser
        via JupyterLite or Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_manifold_plot_lle_digits.py:

=============================================================================
Manifold learning on handwritten digits: Locally Linear Embedding, Isomap...
=============================================================================

We illustrate various embedding techniques on the digits dataset.

.. GENERATED FROM PYTHON SOURCE LINES 9-18

.. code-block:: Python

    # Authors: Fabian Pedregosa
    #          Olivier Grisel
    #          Mathieu Blondel
    #          Gael Varoquaux
    #          Guillaume Lemaitre
    # License: BSD 3 clause (C) INRIA 2011

.. GENERATED FROM PYTHON SOURCE LINES 19-22

Load digits dataset
-------------------
We will load the digits dataset and only use the first six of the ten
available classes.

.. GENERATED FROM PYTHON SOURCE LINES 22-29

.. code-block:: Python

    from sklearn.datasets import load_digits

    digits = load_digits(n_class=6)
    X, y = digits.data, digits.target
    n_samples, n_features = X.shape
    n_neighbors = 30

.. GENERATED FROM PYTHON SOURCE LINES 30-31

We can plot the first hundred digits from this dataset.

.. GENERATED FROM PYTHON SOURCE LINES 31-39

.. code-block:: Python

    import matplotlib.pyplot as plt

    fig, axs = plt.subplots(nrows=10, ncols=10, figsize=(6, 6))
    for idx, ax in enumerate(axs.ravel()):
        ax.imshow(X[idx].reshape((8, 8)), cmap=plt.cm.binary)
        ax.axis("off")
    _ = fig.suptitle("A selection from the 64-dimensional digits dataset", fontsize=16)

.. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_001.png
   :alt: A selection from the 64-dimensional digits dataset
   :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 40-46

Helper function to plot embedding
---------------------------------
Below, we will use different techniques to embed the digits dataset. We will
plot the projection of the original data onto each embedding. This allows us
to check whether digits are grouped together in the embedding space or
scattered across it.

.. GENERATED FROM PYTHON SOURCE LINES 46-84

.. code-block:: Python

    import numpy as np
    from matplotlib import offsetbox
    from sklearn.preprocessing import MinMaxScaler


    def plot_embedding(X, title):
        _, ax = plt.subplots()
        X = MinMaxScaler().fit_transform(X)

        for digit in digits.target_names:
            ax.scatter(
                *X[y == digit].T,
                marker=f"${digit}$",
                s=60,
                color=plt.cm.Dark2(digit),
                alpha=0.425,
                zorder=2,
            )
        shown_images = np.array([[1.0, 1.0]])  # just something big
        for i in range(X.shape[0]):
            # plot every digit on the embedding
            # show an annotation box for a group of digits
            dist = np.sum((X[i] - shown_images) ** 2, 1)
            if np.min(dist) < 4e-3:
                # don't show points that are too close
                continue
            shown_images = np.concatenate([shown_images, [X[i]]], axis=0)
            imagebox = offsetbox.AnnotationBbox(
                offsetbox.OffsetImage(digits.images[i], cmap=plt.cm.gray_r), X[i]
            )
            imagebox.set(zorder=1)
            ax.add_artist(imagebox)
        ax.set_title(title)
        ax.axis("off")
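As a quick sanity check of this helper (not part of the original example), we
could feed it a simple linear projection before running the full comparison
below; the choice of :class:`~sklearn.decomposition.PCA` here is purely
illustrative.

.. code-block:: Python

    from sklearn.decomposition import PCA

    # Illustrative sanity check (not part of the original example): project the
    # digits onto their first two principal components and reuse the helper
    # defined above to visualize the result.
    X_pca = PCA(n_components=2).fit_transform(X)
    plot_embedding(X_pca, "PCA projection (sanity check)")
    plt.show()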
.. GENERATED FROM PYTHON SOURCE LINES 85-103

Embedding techniques comparison
-------------------------------

Below, we compare different techniques. However, there are a couple of things
to note:

* the :class:`~sklearn.ensemble.RandomTreesEmbedding` is not technically a
  manifold embedding method, as it learns a high-dimensional representation
  on which we then apply a dimensionality reduction method. However, it is
  often useful to cast a dataset into a representation in which the classes
  are linearly separable; a quick way to check this property is sketched
  after the code block below.
* the :class:`~sklearn.discriminant_analysis.LinearDiscriminantAnalysis` and
  the :class:`~sklearn.neighbors.NeighborhoodComponentsAnalysis` are
  supervised dimensionality reduction methods, i.e. they make use of the
  provided labels, contrary to the other methods.
* the :class:`~sklearn.manifold.TSNE` is initialized with the embedding that
  is generated by PCA in this example. This ensures global stability of the
  embedding, i.e. the embedding does not depend on random initialization.

.. GENERATED FROM PYTHON SOURCE LINES 103-158

.. code-block:: Python

    from sklearn.decomposition import TruncatedSVD
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.ensemble import RandomTreesEmbedding
    from sklearn.manifold import (
        MDS,
        TSNE,
        Isomap,
        LocallyLinearEmbedding,
        SpectralEmbedding,
    )
    from sklearn.neighbors import NeighborhoodComponentsAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.random_projection import SparseRandomProjection

    embeddings = {
        "Random projection embedding": SparseRandomProjection(
            n_components=2, random_state=42
        ),
        "Truncated SVD embedding": TruncatedSVD(n_components=2),
        "Linear Discriminant Analysis embedding": LinearDiscriminantAnalysis(
            n_components=2
        ),
        "Isomap embedding": Isomap(n_neighbors=n_neighbors, n_components=2),
        "Standard LLE embedding": LocallyLinearEmbedding(
            n_neighbors=n_neighbors, n_components=2, method="standard"
        ),
        "Modified LLE embedding": LocallyLinearEmbedding(
            n_neighbors=n_neighbors, n_components=2, method="modified"
        ),
        "Hessian LLE embedding": LocallyLinearEmbedding(
            n_neighbors=n_neighbors, n_components=2, method="hessian"
        ),
        "LTSA LLE embedding": LocallyLinearEmbedding(
            n_neighbors=n_neighbors, n_components=2, method="ltsa"
        ),
        "MDS embedding": MDS(n_components=2, n_init=1, max_iter=120, n_jobs=2),
        "Random Trees embedding": make_pipeline(
            RandomTreesEmbedding(n_estimators=200, max_depth=5, random_state=0),
            TruncatedSVD(n_components=2),
        ),
        "Spectral embedding": SpectralEmbedding(
            n_components=2, random_state=0, eigen_solver="arpack"
        ),
        "t-SNE embedding": TSNE(
            n_components=2,
            n_iter=500,
            n_iter_without_progress=150,
            n_jobs=2,
            random_state=0,
        ),
        "NCA embedding": NeighborhoodComponentsAnalysis(
            n_components=2, init="pca", random_state=0
        ),
    }
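To substantiate the linear-separability claim made for the random-trees
representation above, one could score a simple linear classifier on the
high-dimensional leaf encoding. The snippet below is only a sketch, not part
of the original example; the choice of
:class:`~sklearn.linear_model.LogisticRegression` and of 5-fold
cross-validation is an assumption made for illustration.

.. code-block:: Python

    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Sketch (not part of the original example): a linear classifier trained on
    # the sparse leaf encoding produced by RandomTreesEmbedding should reach a
    # high accuracy if the classes are (almost) linearly separable there.
    hasher = RandomTreesEmbedding(n_estimators=200, max_depth=5, random_state=0)
    linear_clf = make_pipeline(hasher, LogisticRegression(max_iter=1000))
    scores = cross_val_score(linear_clf, X, y, cv=5)
    print(f"Linear classifier on the hashed representation: {scores.mean():.3f}")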
.. GENERATED FROM PYTHON SOURCE LINES 159-162

Once we have declared all the methods of interest, we can run them and project
the original data. We will store the projected data as well as the
computational time needed to perform each projection.

.. GENERATED FROM PYTHON SOURCE LINES 162-177

.. code-block:: Python

    from time import time

    projections, timing = {}, {}
    for name, transformer in embeddings.items():
        if name.startswith("Linear Discriminant Analysis"):
            data = X.copy()
            # Make X invertible (breaks exact collinearity among pixel features,
            # which would otherwise make LDA's covariance estimate singular)
            data.flat[:: X.shape[1] + 1] += 0.01
        else:
            data = X

        print(f"Computing {name}...")
        start_time = time()
        projections[name] = transformer.fit_transform(data, y)
        timing[name] = time() - start_time

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Computing Random projection embedding...
    Computing Truncated SVD embedding...
    Computing Linear Discriminant Analysis embedding...
    Computing Isomap embedding...
    Computing Standard LLE embedding...
    Computing Modified LLE embedding...
    Computing Hessian LLE embedding...
    Computing LTSA LLE embedding...
    Computing MDS embedding...
    Computing Random Trees embedding...
    Computing Spectral embedding...
    Computing t-SNE embedding...
    Computing NCA embedding...

.. GENERATED FROM PYTHON SOURCE LINES 178-179

Finally, we can plot the resulting projection given by each method.

.. GENERATED FROM PYTHON SOURCE LINES 179-184

.. code-block:: Python

    for name in timing:
        title = f"{name} (time {timing[name]:.3f}s)"
        plot_embedding(projections[name], title)

    plt.show()

.. rst-class:: sphx-glr-horizontal


    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_002.png
         :alt: Random projection embedding (time 0.001s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_002.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_003.png
         :alt: Truncated SVD embedding (time 0.003s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_003.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_004.png
         :alt: Linear Discriminant Analysis embedding (time 0.006s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_004.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_005.png
         :alt: Isomap embedding (time 0.780s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_005.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_006.png
         :alt: Standard LLE embedding (time 0.172s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_006.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_007.png
         :alt: Modified LLE embedding (time 0.411s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_007.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_008.png
         :alt: Hessian LLE embedding (time 0.529s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_008.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_009.png
         :alt: LTSA LLE embedding (time 0.407s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_009.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_010.png
         :alt: MDS embedding (time 2.983s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_010.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_011.png
         :alt: Random Trees embedding (time 0.187s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_011.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_012.png
         :alt: Spectral embedding (time 0.159s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_012.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_013.png
         :alt: t-SNE embedding (time 2.558s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_013.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_014.png
         :alt: NCA embedding (time 2.626s)
         :srcset: /auto_examples/manifold/images/sphx_glr_plot_lle_digits_014.png
         :class: sphx-glr-multi-img
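Beyond visual inspection, one could also quantify how well each
two-dimensional projection groups the digits, for instance with the
silhouette score of the true labels in the projected space. The snippet below
is only a sketch, not part of the original example, and the silhouette score
is just one possible choice of metric.

.. code-block:: Python

    from sklearn.metrics import silhouette_score

    # Sketch (not part of the original example): higher silhouette scores mean
    # that the digit classes form tighter, better-separated groups in the
    # corresponding 2D embedding.
    for name, proj in projections.items():
        print(f"{name}: silhouette = {silhouette_score(proj, y):.2f}")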
.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 15.437 seconds)


.. _sphx_glr_download_auto_examples_manifold_plot_lle_digits.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/1.4.X?urlpath=lab/tree/notebooks/auto_examples/manifold/plot_lle_digits.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: lite-badge

      .. image:: images/jupyterlite_badge_logo.svg
        :target: ../../lite/lab/?path=auto_examples/manifold/plot_lle_digits.ipynb
        :alt: Launch JupyterLite
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_lle_digits.ipynb <plot_lle_digits.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_lle_digits.py <plot_lle_digits.py>`

.. include:: plot_lle_digits.recommendations

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_