sklearn.cross_validation.LeaveOneOut
Warning: DEPRECATED
class sklearn.cross_validation.LeaveOneOut(n)
Leave-One-Out cross validation iterator.
Deprecated since version 0.18: This module will be removed in 0.20. Use sklearn.model_selection.LeaveOneOut instead.
Provides train/test indices to split data into train/test sets. Each sample is used once as a test set (singleton) while the remaining samples form the training set.
Note: LeaveOneOut(n) is equivalent to KFold(n, n_folds=n) and LeavePOut(n, p=1).
Due to the high number of test sets (which is the same as the number of samples) this cross-validation method can be very costly. For large datasets one should favor KFold, StratifiedKFold or ShuffleSplit.
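The equivalence noted above can be checked directly. The following is a minimal sketch, assuming a scikit-learn version older than 0.20 in which the deprecated cross_validation module is still importable:

import numpy as np
from sklearn.cross_validation import LeaveOneOut, KFold, LeavePOut

n = 4  # toy dataset with 4 samples
loo = LeaveOneOut(n)
kf = KFold(n, n_folds=n)    # one fold per sample, no shuffling
lpo = LeavePOut(n, p=1)     # leave exactly one sample out

# All three iterators yield the same (train, test) index pairs, in the same order.
for (tr_a, te_a), (tr_b, te_b), (tr_c, te_c) in zip(loo, kf, lpo):
    assert np.array_equal(te_a, te_b) and np.array_equal(te_a, te_c)
    assert np.array_equal(tr_a, tr_b) and np.array_equal(tr_a, tr_c)

For large n, a cheaper scheme such as KFold with a small n_folds, StratifiedKFold or ShuffleSplit keeps the number of fitted models manageable.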
Read more in the User Guide.
Parameters: n : int
Total number of elements in the dataset.
See also: LeaveOneLabelOut, for splitting the data according to explicit, domain-specific stratification of the dataset.
Examples
>>> import numpy as np
>>> from sklearn import cross_validation
>>> X = np.array([[1, 2], [3, 4]])
>>> y = np.array([1, 2])
>>> loo = cross_validation.LeaveOneOut(2)
>>> len(loo)
2
>>> print(loo)
sklearn.cross_validation.LeaveOneOut(n=2)
>>> for train_index, test_index in loo:
...    print("TRAIN:", train_index, "TEST:", test_index)
...    X_train, X_test = X[train_index], X[test_index]
...    y_train, y_test = y[train_index], y[test_index]
...    print(X_train, X_test, y_train, y_test)
TRAIN: [1] TEST: [0]
[[3 4]] [[1 2]] [2] [1]
TRAIN: [0] TEST: [1]
[[1 2]] [[3 4]] [1] [2]
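Because this class is deprecated, new code should use the sklearn.model_selection.LeaveOneOut class referenced above. A minimal sketch of the equivalent usage, assuming scikit-learn 0.18 or later:

import numpy as np
from sklearn.model_selection import LeaveOneOut

X = np.array([[1, 2], [3, 4]])
y = np.array([1, 2])

loo = LeaveOneOut()              # the new API takes no constructor argument
print(loo.get_n_splits(X))       # number of splits equals the number of samples
for train_index, test_index in loo.split(X):
    X_train, X_test = X[train_index], X[test_index]
    y_train, y_test = y[train_index], y[test_index]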