.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/manifold/plot_manifold_sphere.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_manifold_plot_manifold_sphere.py>`
        to download the full example code or to run this example in your browser via Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_manifold_plot_manifold_sphere.py:


=============================================
Manifold Learning methods on a severed sphere
=============================================

An application of the different :ref:`manifold` techniques
on a spherical dataset. Dimensionality reduction is used here
to gain some intuition about the manifold learning methods.
In this dataset, the poles are cut from the sphere, as is a thin
slice down its side. This enables the manifold learning techniques
to 'spread it open' while projecting it onto two dimensions.

For a similar example where the methods are applied to the
S-curve dataset, see :ref:`sphx_glr_auto_examples_manifold_plot_compare_methods.py`.

Note that the purpose of :ref:`MDS <multidimensional_scaling>` is
to find a low-dimensional representation of the data (here 2D) in
which the distances respect well the distances in the original
high-dimensional space. Unlike the other manifold-learning algorithms,
it does not seek an isotropic representation of the data in the
low-dimensional space. Here the manifold problem closely matches that
of representing a flat map of the Earth, as in a
`map projection <https://en.wikipedia.org/wiki/Map_projection>`_.
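
To make the distance-preservation point concrete, here is a minimal
sketch (not part of the original example): it fits MDS on a small toy
point cloud and compares pairwise distances before and after embedding.
The variable names are illustrative only.

.. code-block:: python

    # Minimal sketch: how well does an MDS embedding preserve pairwise
    # distances? `X` is a toy 3D point cloud (illustrative only, not the
    # severed sphere built in the script below).
    import numpy as np
    from scipy.spatial.distance import pdist
    from sklearn import manifold

    rng = np.random.RandomState(0)
    X = rng.rand(200, 3)

    embedding = manifold.MDS(
        n_components=2, n_init=1, max_iter=100, random_state=0
    ).fit_transform(X)

    # A correlation close to 1 means the 2D distances track the 3D distances.
    print("distance correlation: %.3f" % np.corrcoef(pdist(X), pdist(embedding))[0, 1])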

.. GENERATED FROM PYTHON SOURCE LINES 28-163



.. image-sg:: /auto_examples/manifold/images/sphx_glr_plot_manifold_sphere_001.png
   :alt: Manifold Learning with 1000 points, 10 neighbors, LLE (0.052 sec), LTSA (0.077 sec), Hessian LLE (0.14 sec), Modified LLE (0.1 sec), Isomap (0.22 sec), MDS (0.49 sec), Spectral Embedding (0.042 sec), t-SNE (3.8 sec)
   :srcset: /auto_examples/manifold/images/sphx_glr_plot_manifold_sphere_001.png
   :class: sphx-glr-single-img


.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    standard: 0.052 sec
    ltsa: 0.077 sec
    hessian: 0.14 sec
    modified: 0.1 sec
    ISO: 0.22 sec
    MDS: 0.49 sec
    Spectral Embedding: 0.042 sec
    t-SNE: 3.8 sec






|

.. code-block:: default


    # Author: Jaques Grobler <jaques.grobler@inria.fr>
    # License: BSD 3 clause

    from time import time
    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.ticker import NullFormatter
    from sklearn import manifold
    from sklearn.utils import check_random_state

    # Unused but required import for doing 3d projections with matplotlib < 3.2
    import mpl_toolkits.mplot3d  # noqa: F401
    import warnings

    # Variables for manifold learning.
    n_neighbors = 10
    n_samples = 1000

    # Create our sphere: random azimuth p and polar angle t. Leaving a
    # 0.55 rad gap in p removes a thin slice down the side of the sphere.
    random_state = check_random_state(0)
    p = random_state.rand(n_samples) * (2 * np.pi - 0.55)
    t = random_state.rand(n_samples) * np.pi

    # Sever the poles from the sphere by dropping points too close to them.
    indices = (t < (np.pi - (np.pi / 8))) & (t > ((np.pi / 8)))
    colors = p[indices]
    x, y, z = (
        np.sin(t[indices]) * np.cos(p[indices]),
        np.sin(t[indices]) * np.sin(p[indices]),
        np.cos(t[indices]),
    )

    # Plot our dataset.
    fig = plt.figure(figsize=(15, 8))
    plt.suptitle(
        "Manifold Learning with %i points, %i neighbors" % (n_samples, n_neighbors),
        fontsize=14,
    )

    ax = fig.add_subplot(251, projection="3d")
    ax.scatter(x, y, z, c=p[indices], cmap=plt.cm.rainbow)
    ax.view_init(40, -10)

    sphere_data = np.array([x, y, z]).T

    # Perform Locally Linear Embedding Manifold learning
    methods = ["standard", "ltsa", "hessian", "modified"]
    labels = ["LLE", "LTSA", "Hessian LLE", "Modified LLE"]

    for i, method in enumerate(methods):
        t0 = time()
        trans_data = (
            manifold.LocallyLinearEmbedding(
                n_neighbors=n_neighbors, n_components=2, method=method
            )
            .fit_transform(sphere_data)
            .T
        )
        t1 = time()
        print("%s: %.2g sec" % (methods[i], t1 - t0))

        ax = fig.add_subplot(252 + i)
        plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
        plt.title("%s (%.2g sec)" % (labels[i], t1 - t0))
        ax.xaxis.set_major_formatter(NullFormatter())
        ax.yaxis.set_major_formatter(NullFormatter())
        plt.axis("tight")

    # Perform Isomap Manifold learning.
    t0 = time()
    trans_data = (
        manifold.Isomap(n_neighbors=n_neighbors, n_components=2)
        .fit_transform(sphere_data)
        .T
    )
    t1 = time()
    print("%s: %.2g sec" % ("ISO", t1 - t0))

    ax = fig.add_subplot(257)
    plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
    plt.title("%s (%.2g sec)" % ("Isomap", t1 - t0))
    ax.xaxis.set_major_formatter(NullFormatter())
    ax.yaxis.set_major_formatter(NullFormatter())
    plt.axis("tight")

    # Perform Multi-dimensional scaling.
    t0 = time()
    mds = manifold.MDS(n_components=2, max_iter=100, n_init=1)
    trans_data = mds.fit_transform(sphere_data).T
    t1 = time()
    print("MDS: %.2g sec" % (t1 - t0))

    ax = fig.add_subplot(258)
    plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
    plt.title("MDS (%.2g sec)" % (t1 - t0))
    ax.xaxis.set_major_formatter(NullFormatter())
    ax.yaxis.set_major_formatter(NullFormatter())
    plt.axis("tight")

    # Perform Spectral Embedding.
    t0 = time()
    se = manifold.SpectralEmbedding(n_components=2, n_neighbors=n_neighbors)
    trans_data = se.fit_transform(sphere_data).T
    t1 = time()
    print("Spectral Embedding: %.2g sec" % (t1 - t0))

    ax = fig.add_subplot(259)
    plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
    plt.title("Spectral Embedding (%.2g sec)" % (t1 - t0))
    ax.xaxis.set_major_formatter(NullFormatter())
    ax.yaxis.set_major_formatter(NullFormatter())
    plt.axis("tight")

    # Perform t-distributed stochastic neighbor embedding.
    # TODO(1.2) Remove warning handling.
    with warnings.catch_warnings():
        warnings.filterwarnings(
            "ignore", message="The PCA initialization", category=FutureWarning
        )
        t0 = time()
        tsne = manifold.TSNE(
            n_components=2, init="pca", random_state=0, learning_rate="auto"
        )
        trans_data = tsne.fit_transform(sphere_data).T
        t1 = time()
    print("t-SNE: %.2g sec" % (t1 - t0))

    ax = fig.add_subplot(2, 5, 10)
    plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
    plt.title("t-SNE (%.2g sec)" % (t1 - t0))
    ax.xaxis.set_major_formatter(NullFormatter())
    ax.yaxis.set_major_formatter(NullFormatter())
    plt.axis("tight")

    plt.show()
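
As a possible follow-up (not part of the original example), the quality of
the embeddings can also be compared quantitatively with
:func:`sklearn.manifold.trustworthiness`, which scores how well local
neighborhoods are preserved (1.0 is best). The sketch below rebuilds the
severed sphere from the script and scores two of the methods; the names and
parameter choices are illustrative.

.. code-block:: python

    # Minimal sketch: score embeddings by neighborhood preservation.
    import numpy as np
    from sklearn import manifold
    from sklearn.utils import check_random_state

    # Rebuild the severed sphere exactly as in the script above.
    rng = check_random_state(0)
    p = rng.rand(1000) * (2 * np.pi - 0.55)
    t = rng.rand(1000) * np.pi
    keep = (t > np.pi / 8) & (t < np.pi - np.pi / 8)
    X = np.c_[
        np.sin(t[keep]) * np.cos(p[keep]),
        np.sin(t[keep]) * np.sin(p[keep]),
        np.cos(t[keep]),
    ]

    for name, estimator in [
        ("Isomap", manifold.Isomap(n_neighbors=10, n_components=2)),
        ("MDS", manifold.MDS(n_components=2, n_init=1, max_iter=100)),
    ]:
        embedded = estimator.fit_transform(X)
        score = manifold.trustworthiness(X, embedded, n_neighbors=10)
        print("%s trustworthiness: %.3f" % (name, score))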


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes  5.426 seconds)


.. _sphx_glr_download_auto_examples_manifold_plot_manifold_sphere.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example


    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/1.1.X?urlpath=lab/tree/notebooks/auto_examples/manifold/plot_manifold_sphere.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_manifold_sphere.py <plot_manifold_sphere.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_manifold_sphere.ipynb <plot_manifold_sphere.ipynb>`


.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_