sklearn.preprocessing.power_transform

sklearn.preprocessing.power_transform(X, method='box-cox', copy=True)

Apply a power transform featurewise to make data more Gaussian-like.

Power transforms are a family of parametric, monotonic transformations that are applied to make data more Gaussian-like. This is useful for modeling issues related to heteroscedasticity (non-constant variance), or other situations where normality is desired. Note that power transforms do not result in standard normal distributions (i.e. the transformed data could be far from zero-mean, unit-variance).
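
As a quick illustration (a sketch reusing the data from the Examples section below), the transformed columns are in general not centered at zero:

>>> import numpy as np
>>> from sklearn.preprocessing import power_transform
>>> Xt = power_transform([[1, 2], [3, 2], [4, 5]], method='box-cox')
>>> bool(np.allclose(Xt.mean(axis=0), 0.0))
False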


Currently, power_transform() supports the Box-Cox transform. Box-Cox requires input data to be strictly positive. The optimal parameter for stabilizing variance and minimizing skewness is estimated through maximum likelihood.
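
Once the parameter lambda is fixed, the Box-Cox transform itself has a simple closed form: (x**lambda - 1) / lambda for lambda != 0, and log(x) for lambda == 0. A minimal sketch that reproduces it by hand, using scipy.stats.boxcox (which returns the maximum-likelihood estimate of lambda alongside the transformed data):

>>> import numpy as np
>>> from scipy import stats
>>> x = np.array([1.0, 3.0, 4.0])    # strictly positive input
>>> xt, lmbda = stats.boxcox(x)      # lmbda is the maximum-likelihood estimate
>>> manual = np.log(x) if lmbda == 0 else (x ** lmbda - 1) / lmbda
>>> bool(np.allclose(manual, xt))
True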

Read more in the User Guide.

Parameters:
X : array-like, shape (n_samples, n_features)

The data to be transformed using a power transformation.

method : str, (default='box-cox')

The power transform method. Currently, 'box-cox' (Box-Cox transform) is the only option available.

copy : boolean, optional, default=True

Set to False to perform in-place computation.

See also

PowerTransformer
Performs power transformation using the Transformer API (as part of a preprocessing sklearn.pipeline.Pipeline); see the sketch below.
quantile_transform
Maps data to a standard normal distribution with the parameter output_distribution='normal'.
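
A rough sketch of the Transformer-API route mentioned above; the Pipeline layout and the Ridge estimator are illustrative choices only, and PowerTransformer is assumed to accept method='box-cox' like the function documented here:

>>> import numpy as np
>>> from sklearn.pipeline import Pipeline
>>> from sklearn.preprocessing import PowerTransformer
>>> from sklearn.linear_model import Ridge
>>> X = np.array([[1., 2.], [3., 2.], [4., 5.]])
>>> y = np.array([0., 1., 2.])
>>> model = Pipeline([('power', PowerTransformer(method='box-cox')),
...                   ('ridge', Ridge())])
>>> _ = model.fit(X, y)    # Box-Cox is fit per feature, then Ridge on the output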

Notes

For a comparison of the different scalers, transformers, and normalizers, see examples/preprocessing/plot_all_scaling.py.

Examples

>>> import numpy as np
>>> from sklearn.preprocessing import power_transform
>>> data = [[1, 2], [3, 2], [4, 5]]
>>> print(power_transform(data, method='box-cox'))  
[[ 0...      0.342...]
 [ 2.068...  0.342...]
 [ 3.135...  0.416...]]
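
Because Box-Cox is only defined for strictly positive data, non-positive input is rejected. A sketch of the failure mode; the exact error message is an assumption and may vary between versions:

>>> power_transform([[0, 1], [1, 2]], method='box-cox')
Traceback (most recent call last):
    ...
ValueError: ...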