com.salesforce.op.stages.impl.classification
Input Features type
Checks the input length
input features
true if the input size is as expected, false otherwise
Check if the stage is serializable
Failure if not serializable
This method is used to make a copy of the instance with new parameters, and is called by several methods in Spark internals. The default implementation will find the constructor and make a copy for any class (as long as all constructor params are vals, which is why type tags are written as implicit vals in base classes).
Note that the convention in Spark is to have the uid be a constructor argument, so that copies will share a uid with the original (developers should follow this convention).
new parameters to add to the instance
a new instance with the same uid
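A minimal sketch of this convention in plain Scala (the class and parameter names here are illustrative stand-ins, not the actual Spark or TransmogrifAI types): because the uid is a constructor val, a constructor-based copy shares the uid with the original.

```scala
// Illustrative sketch only: uid as a constructor val, following the Spark
// convention described above. SimpleStage and maxIter are made-up names.
class SimpleStage(val uid: String, val maxIter: Int) {
  // Copy with new parameters but the same uid, mirroring a default
  // constructor-based copy
  def copy(newMaxIter: Int): SimpleStage = new SimpleStage(uid, newMaxIter)
}
```

Since all constructor params are vals, a reflective default copy can rebuild the instance, and the copy still reports the original's uid.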
Function that fits the binary model
Gets names of parameters that control input columns for Spark stage
Gets an input feature. Note: this method IS NOT safe to use outside the driver, please use the getTransientFeature method instead
array of features
NoSuchElementException
if the features are not set
RuntimeException
in case one of the features is null
Gets the input features. Note: this method IS NOT safe to use outside the driver, please use the getTransientFeatures method instead
array of features
NoSuchElementException
if the features are not set
RuntimeException
in case one of the features is null
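The lookup contract described above can be sketched in plain Scala (Stage and Feature here are simplified stand-ins, not the real TransmogrifAI classes):

```scala
// Simplified stand-ins for illustration; not the actual TransmogrifAI types.
case class Feature(name: String)

class Stage {
  private var inputFeatures: Option[Array[Feature]] = None

  def setInput(features: Feature*): this.type = {
    inputFeatures = Some(features.toArray)
    this
  }

  // Mirrors the documented contract: NoSuchElementException when the
  // features are not set, RuntimeException when one of them is null.
  def getInputFeatures(): Array[Feature] = {
    val fs = inputFeatures.getOrElse(
      throw new NoSuchElementException("Input features are not set"))
    if (fs.contains(null)) throw new RuntimeException("Some input features are null")
    fs
  }
}
```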
Method to access the local version of the stage being wrapped
Option of the MLeap runtime version of the Spark stage after reloading as local
Output features that will be created by this stage
feature of type OutputFeatures
Gets names of parameters that control output columns for Spark stage
Name of output feature (i.e. column created by this stage)
Method to access the Spark stage being wrapped
Option of the Spark ML stage
Gets a save path for wrapped spark stage
Gets an input feature at index i
input index
maybe an input feature
Gets the input features
Function to convert InputFeatures to an Array of FeatureLike
an Array of FeatureLike
Function to be called on getMetadata
Function to be called on setInput
Short unique name of the operation this stage performs
operation name
Function to convert OutputFeatures to an Array of FeatureLike
an Array of FeatureLike
Should output feature be a response? Yes, if any of the input features are.
true if the output feature should be a response
the predictor to wrap
Suggested depth for treeAggregate (greater than or equal to 2). If the dimensions of features or the number of partitions are large, this param could be adjusted to a larger size. Default is 2.
Set the ElasticNet mixing parameter. For alpha = 0, the penalty is an L2 penalty. For alpha = 1, it is an L1 penalty. For alpha in (0,1), the penalty is a combination of L1 and L2. Default is 0.0 which is an L2 penalty.
Note: Fitting under bound constrained optimization only supports L2 regularization, so throws exception if this param is non-zero value.
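As a sketch of how the mixing parameter combines the two penalties, following the standard elastic net formulation that Spark uses (the function name here is illustrative, not part of the API):

```scala
// reg(w) = regParam * ( alpha * ||w||_1 + (1 - alpha) / 2 * ||w||_2^2 )
// alpha = 0 -> pure L2 penalty; alpha = 1 -> pure L1 penalty.
def elasticNetPenalty(weights: Array[Double], regParam: Double, alpha: Double): Double = {
  val l1 = weights.map(math.abs).sum    // ||w||_1
  val l2 = weights.map(w => w * w).sum  // ||w||_2^2
  regParam * (alpha * l1 + (1 - alpha) / 2.0 * l2)
}
```

For weights (3, -4) with regParam 1.0, alpha = 0 gives the pure L2 term 25 / 2 = 12.5, while alpha = 1 gives the pure L1 term 7.0.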
Sets the value of param family. Default is "auto".
Whether to fit an intercept term. Default is true.
Input features that will be used by the stage
feature of type InputFeatures
Sets input features
feature like type
array of input features
this stage
Set the lower bounds on coefficients if fitting under bound constrained optimization.
Set the lower bounds on intercepts if fitting under bound constrained optimization.
Set the maximum number of iterations. Default is 100.
Set the regularization parameter. Default is 0.0.
Sets a save path for wrapped spark stage
Whether to standardize the training features before fitting the model. The coefficients of models will always be returned on the original scale, so it will be transparent for users. Note that with/without standardization, the models should always converge to the same solution when no regularization is applied. In R's GLMNET package, the default behavior is true as well. Default is true.
Set the convergence tolerance of iterations. A smaller value will lead to higher accuracy at the cost of more iterations. Default is 1E-6.
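The role of the tolerance can be illustrated with a toy gradient descent loop (a sketch only, not Spark's actual optimizer): iteration stops once the parameter change drops below tol, or once maxIter is hit.

```scala
// Toy illustration of a convergence-tolerance stopping rule; not Spark's optimizer.
def minimize(grad: Double => Double, init: Double, lr: Double,
             tol: Double = 1e-6, maxIter: Int = 100): (Double, Int) = {
  var x = init
  var iters = 0
  var delta = Double.MaxValue
  while (iters < maxIter && delta > tol) {
    val next = x - lr * grad(x)  // one gradient step
    delta = math.abs(next - x)   // parameter change this iteration
    x = next
    iters += 1
  }
  (x, iters)
}
```

Minimizing f(x) = (x - 2)^2 from x = 10 with learning rate 0.1 converges to roughly 2; a looser tol stops in fewer iterations at lower accuracy.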
Set the upper bounds on coefficients if fitting under bound constrained optimization.
Set the upper bounds on intercepts if fitting under bound constrained optimization.
Sets the value of param weightCol. If this is not set or empty, we treat all instance weights as 1.0. Default is not set, so all instances have weight one.
Stage unique name consisting of the stage operation name and uid
stage name
This function translates the input and output features into Spark schema checks and the changes that will occur on the underlying data frame
schema of the input data frame
a new schema with the output features added
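A simplified sketch of that translation, with Spark's StructType replaced by a toy Schema class (all names here are illustrative, not the real API):

```scala
// Toy stand-ins for Spark's StructField / StructType, for illustration only.
case class Field(name: String, dataType: String)
case class Schema(fields: Vector[Field])

// Checks that every input feature column exists in the incoming schema,
// then appends the output feature's column to produce the new schema.
def transformSchema(schema: Schema, inputs: Seq[String], output: Field): Schema = {
  val missing = inputs.filterNot(n => schema.fields.exists(_.name == n))
  require(missing.isEmpty, s"Input features missing from schema: ${missing.mkString(", ")}")
  Schema(schema.fields :+ output)
}
```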
Type tag of the output
Type tag of the output value
stage uid
Wrapper around the Spark ML logistic regression org.apache.spark.ml.classification.LogisticRegression