.. _example_linear_model_plot_logistic_l1_l2_sparsity.py:

==============================================
L1 Penalty and Sparsity in Logistic Regression
==============================================

Comparison of the sparsity (percentage of zero coefficients) of solutions when
the L1 and L2 penalties are used for different values of C. In scikit-learn's
logistic regression, C is the inverse of the regularization strength: larger
values of C give the model more freedom, while smaller values of C constrain
the model more. With the L1 penalty, stronger regularization drives
coefficients exactly to zero, which leads to sparser solutions.

We classify 8x8 images of digits into two classes: 0-4 against 5-9.
The visualization shows the coefficients of the models for varying C.

.. image:: images/plot_logistic_l1_l2_sparsity_001.png
:align: center

**Script output**::

  C=100.00
  Sparsity with L1 penalty: 6.25%
  score with L1 penalty: 0.9115
  Sparsity with L2 penalty: 4.69%
  score with L2 penalty: 0.9098
  C=1.00
  Sparsity with L1 penalty: 9.38%
  score with L1 penalty: 0.9098
  Sparsity with L2 penalty: 4.69%
  score with L2 penalty: 0.9093
  C=0.01
  Sparsity with L1 penalty: 85.94%
  score with L1 penalty: 0.8620
  Sparsity with L2 penalty: 4.69%
  score with L2 penalty: 0.8915
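
The output above can be reproduced with a condensed sketch along these lines
(a minimal version of what the full script does; the ``liblinear`` solver and
``tol=0.01`` are illustrative choices, and exact numbers may vary slightly
between scikit-learn versions):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# 8x8 digit images, relabeled into two classes: digits 0-4 vs 5-9
X, y = load_digits(return_X_y=True)
X = StandardScaler().fit_transform(X)
y = (y > 4).astype(int)

for C in (100.0, 1.0, 0.01):
    clf_l1 = LogisticRegression(C=C, penalty="l1", solver="liblinear", tol=0.01)
    clf_l2 = LogisticRegression(C=C, penalty="l2", solver="liblinear", tol=0.01)
    clf_l1.fit(X, y)
    clf_l2.fit(X, y)

    # Sparsity = percentage of coefficients that are exactly zero
    sparsity_l1 = np.mean(clf_l1.coef_ == 0) * 100
    sparsity_l2 = np.mean(clf_l2.coef_ == 0) * 100

    print("C=%.2f" % C)
    print("Sparsity with L1 penalty: %.2f%%" % sparsity_l1)
    print("score with L1 penalty: %.4f" % clf_l1.score(X, y))
    print("Sparsity with L2 penalty: %.2f%%" % sparsity_l2)
    print("score with L2 penalty: %.4f" % clf_l2.score(X, y))
```

Note that the L2 penalty shrinks coefficients toward zero but rarely makes
them exactly zero, which is why its sparsity stays low at every value of C.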

**Python source code:** :download:`plot_logistic_l1_l2_sparsity.py`

.. literalinclude:: plot_logistic_l1_l2_sparsity.py
    :lines: 15-

**Total running time of the example:** 0.44 seconds (0 minutes 0.44 seconds)