pystruct.learners.SubgradientLatentSSVM(model, max_iter=100, C=1.0, verbose=0, momentum=0.0, learning_rate='auto', n_jobs=1, show_loss_every=0, decay_exponent=1, decay_t0=10, break_on_no_constraints=True, logger=None, averaging=None)

Latent Variable Structured SVM solver using subgradient descent.
Implements a margin-rescaled structural SVM objective with an l1 slack penalty. By default, a constant learning rate is used. It is also possible to use an adaptive learning rate computed by AdaGrad.
This class implements online subgradient descent. If n_jobs != 1, small batches of size n_jobs are used to exploit parallel inference. If inference is fast, use n_jobs=1.
Parameters:

    model : StructuredModel
        Object containing the model structure and inference routines.
    max_iter : int, default=100
        Maximum number of passes over the dataset.
    C : float, default=1.0
        Regularization parameter.
    verbose : int, default=0
        Verbosity level.
    learning_rate : float or 'auto', default='auto'
        Learning rate for the subgradient updates; 'auto' chooses a rate automatically.
    momentum : float, default=0.0
        Momentum for the subgradient updates.
    n_jobs : int, default=1
        Number of parallel jobs used for inference.
    show_loss_every : int, default=0
        How often the training loss is computed for monitoring; 0 means never.
    decay_exponent : float, default=1
        Exponent of the learning-rate decay schedule.
    decay_t0 : float, default=10
        Offset of the learning-rate decay schedule.
    break_on_no_constraints : bool, default=True
        Stop early when no margin violations are found.
    logger : logger object, default=None
        Optional pystruct logger for storing intermediate results.
    averaging : string, default=None
        Whether and how to average the weight vector ('linear', 'squared' or None).
Attributes:

    w : nd-array, shape=(model.size_joint_feature,)
        The learned weight vector.
    loss_curve_ : list of float
        Training loss values, recorded every show_loss_every iterations.
    objective_curve_ : list of float
        Objective value after each pass over the dataset.
    timestamps_ : list of int
        Timestamps recorded during training.
Methods

    fit(X, Y[, H_init, warm_start, initialize])
        Learn parameters using subgradient descent.
    get_params([deep])
        Get parameters for this estimator.
    predict(X)
        Predict output on examples in X.
    predict_latent(X)
        Predict latent variables for examples in X.
    score(X, Y)
        Compute score as 1 - loss over whole data set.
    set_params(**params)
        Set the parameters of this estimator.
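The snippet below is a minimal usage sketch, assuming pystruct's LatentGridCRF model is available; the random training data and the hyperparameter values are made up purely for illustration of how the pieces fit together.

```python
# Minimal usage sketch (assumption: pystruct's LatentGridCRF model;
# the random data below is purely illustrative).
import numpy as np
from pystruct.models import LatentGridCRF
from pystruct.learners import SubgradientLatentSSVM

rng = np.random.RandomState(0)
n_features = 2

# Each x is a (height, width, n_features) array of unary features,
# each y is a (height, width) array of integer labels.
X = [rng.normal(size=(8, 8, n_features)) for _ in range(10)]
Y = [(x[:, :, 0] > 0).astype(int) for x in X]

model = LatentGridCRF(n_labels=2, n_features=n_features,
                      n_states_per_label=2)
svm = SubgradientLatentSSVM(model, max_iter=20, C=1.0, learning_rate=0.01)
svm.fit(X, Y)

print(svm.score(X, Y))            # 1 - loss over the training set
H = svm.predict_latent(X)         # latent state assignments
print(len(svm.objective_curve_))  # objective after each pass
```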
__init__(model, max_iter=100, C=1.0, verbose=0, momentum=0.0, learning_rate='auto', n_jobs=1, show_loss_every=0, decay_exponent=1, decay_t0=10, break_on_no_constraints=True, logger=None, averaging=None)

fit(X, Y, H_init=None, warm_start=False, initialize=True)

Learn parameters using subgradient descent.
Parameters:

    X : iterable
        Training instances.
    Y : iterable
        Training labels.
    H_init : iterable, optional
        Initial latent variable assignments.
    warm_start : boolean, default=False
        Whether to continue from the current weights instead of restarting.
    initialize : boolean, default=True
        Whether to initialize the model for the data.
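A small, hedged sketch of continuing a previous fit with warm_start; it reuses the svm, X and Y objects from the usage example above, and leaving initialize at its default while warm starting is an assumption here.

```python
# Continue training the already-fitted learner from its current weights
# (hedged sketch; reuses svm, X, Y from the usage example above).
svm.set_params(max_iter=svm.max_iter + 20)  # allow more passes
svm.fit(X, Y, warm_start=True)              # resume instead of restarting
```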
get_params(deep=True)

Get parameters for this estimator.

Parameters:

    deep : boolean, optional
        If True, return the parameters of this estimator and of any contained sub-estimators.

Returns:

    params : mapping of string to any
        Parameter names mapped to their values.
score(X, Y)

Compute score as 1 - loss over whole data set.

Returns the average accuracy (in terms of model.loss) over X and Y.

Parameters:

    X : iterable
        Evaluation instances.
    Y : iterable
        True labels for X.

Returns:

    score : float
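For reference, a hedged sketch of the quantity score() reports, written as a standalone helper; normalizing the summed per-example loss by model.max_loss is an assumption about how the losses are scaled, and the names below are illustrative rather than library internals.

```python
# Hedged sketch of the score computation: 1 minus the summed per-example
# loss, normalized by the maximal possible loss (assumption: normalization
# via model.max_loss).
import numpy as np

def score_sketch(learner, model, X, Y):
    Y_pred = learner.predict(X)
    losses = [model.loss(y, y_hat) for y, y_hat in zip(Y, Y_pred)]
    max_losses = [model.max_loss(y) for y in Y]
    return 1.0 - np.sum(losses) / float(np.sum(max_losses))
```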
set_params(**params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter>, so that it is possible to update each component of a nested object.

Returns:

    self
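A short, hedged example of the scikit-learn-style parameter interface; the model construction and the parameter values are arbitrary, and the pipeline step name 'ssvm' in the comment is hypothetical.

```python
# Hedged example of get_params / set_params (values are arbitrary;
# assumes pystruct's LatentGridCRF as the underlying model).
from pystruct.models import LatentGridCRF
from pystruct.learners import SubgradientLatentSSVM

crf = LatentGridCRF(n_labels=2, n_features=2, n_states_per_label=2)
svm = SubgradientLatentSSVM(crf, C=1.0)

svm.set_params(C=0.1, max_iter=200)
assert svm.get_params()['C'] == 0.1
# For nested objects (e.g. a pipeline step named 'ssvm'), parameters are
# addressed as <component>__<parameter>, e.g. set_params(ssvm__C=0.1).
```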