LassoLarsIC
Implements a Lasso model fitted with the LARS algorithm, with the regularization strength selected by an information criterion (AIC or BIC).
The coefficients are estimated by minimizing the objective function:
\min_w \; \frac{1}{2n} \|Xw - y\|_2^2 + \alpha \|w\|_1
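The regularization strength \alpha is then chosen as the point on the LARS path that minimizes the selected information criterion. In their usual textbook form (the exact expressions used by scikit-learn also involve an estimate of the noise variance):

\mathrm{AIC} = -2 \log(\hat{L}) + 2d, \qquad \mathrm{BIC} = -2 \log(\hat{L}) + d \log(n)

where \hat{L} is the maximized likelihood, d the number of non-zero coefficients, and n the number of samples.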
[1]:
from sklearn.linear_model import LassoLarsIC
lassoLarsIC = LassoLarsIC(
# --------------------------------------------------------------------------
# The type of criterion to use {'aic', 'bic'}
criterion='aic',
# --------------------------------------------------------------------------
# Whether to fit the intercept for this model.
fit_intercept=True,
# --------------------------------------------------------------------------
# The maximum number of iterations.
max_iter=500,
# --------------------------------------------------------------------------
# When set to True, forces the coefficients to be positive.
positive=False,
# --------------------------------------------------------------------------
# The estimated noise variance of the data. If None, an unbiased estimate is
# computed from an OLS model; this is only possible when
# n_samples > n_features + fit_intercept.
noise_variance=None,
)
[2]:
from sklearn.datasets import load_diabetes
X, y = load_diabetes(return_X_y=True)
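As a purely illustrative check, the diabetes dataset has more samples than features, so the default noise_variance=None can fall back on the OLS-based estimate mentioned above:
[ ]:
# 442 samples and 10 features, so n_samples > n_features + fit_intercept holds
X.shape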
[3]:
lassoLarsIC.fit(X, y)
lassoLarsIC.score(X, y)
[3]:
0.5134098914486394
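For a regressor, score returns the coefficient of determination R^2 on the given data; as a minimal sketch, the value above can be reproduced explicitly with r2_score:
[ ]:
from sklearn.metrics import r2_score

# R^2 computed from the model's predictions on the training data;
# should match the value returned by score(X, y) above
r2_score(y, lassoLarsIC.predict(X))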
[4]:
lassoLarsIC.alpha_
[4]:
0.045206256469784906
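This alpha_ was selected with the AIC criterion configured above. As an illustrative sketch (lassoLarsBIC is just a name chosen here), the same data can be refit with criterion='bic', which penalizes the number of active coefficients more strongly for n > e^2 ≈ 7.4 and therefore tends to select sparser models:
[ ]:
# Comparison fit using the BIC criterion instead of AIC
lassoLarsBIC = LassoLarsIC(criterion='bic')
lassoLarsBIC.fit(X, y)
lassoLarsBIC.alpha_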
[5]:
lassoLarsIC.coef_
[5]:
array([ 0. , -197.75346667, 522.27003779, 297.15393894,
-103.94552857, 0. , -223.92409377, 0. ,
514.7480026 , 54.76900516])
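The zeros in coef_ correspond to features dropped by the L1 penalty. A short sketch to list the indices of the retained features:
[ ]:
import numpy as np

# Indices of the non-zero coefficients; from the array above these are
# features 1, 2, 3, 4, 6, 8 and 9
np.flatnonzero(lassoLarsIC.coef_)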
[6]:
lassoLarsIC.intercept_
[6]:
152.13348416289602
[7]:
lassoLarsIC.alphas_
[7]:
array([2.14804358, 2.01202214, 1.02465091, 0.71509814, 0.29441072,
0.20086946, 0.15602894, 0.04520626, 0.01239262, 0.01151185,
0.00493726, 0.0029648 , 0. ])
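alphas_ contains the regularization values visited along the LARS path. The fitted estimator also stores the information criterion evaluated at each of them in criterion_, and alpha_ is the value that minimizes it; a minimal sketch to verify this:
[ ]:
import numpy as np

# The alpha minimizing the AIC along the path; expected to equal alpha_ above
lassoLarsIC.alphas_[np.argmin(lassoLarsIC.criterion_)]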