public class AFTSurvivalRegression extends Regressor<Vector,AFTSurvivalRegression,AFTSurvivalRegressionModel> implements AFTSurvivalRegressionParams, DefaultParamsWritable, org.apache.spark.internal.Logging
Since 3.1.0, it supports stacking instances into blocks and using GEMV for better performance. If param maxBlockSizeInMB is left at its default of 0.0, a block size of 1.0 MB is chosen.
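The snippet below is a minimal end-to-end sketch, in Java, of fitting this estimator and requesting survival-time quantiles. The class name, the toy rows, and the quantile probabilities are illustrative only, and a local SparkSession is assumed.

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.ml.linalg.VectorUDT;
import org.apache.spark.ml.linalg.Vectors;
import org.apache.spark.ml.regression.AFTSurvivalRegression;
import org.apache.spark.ml.regression.AFTSurvivalRegressionModel;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class AFTSurvivalRegressionSketch {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("AFTSurvivalRegressionSketch").master("local[*]").getOrCreate();

    // Toy survival data: label is the observed time, censor is 1.0 (event observed)
    // or 0.0 (right-censored).
    List<Row> rows = Arrays.asList(
        RowFactory.create(1.218, 1.0, Vectors.dense(1.560, -0.605)),
        RowFactory.create(2.949, 0.0, Vectors.dense(0.346, 2.158)),
        RowFactory.create(3.627, 0.0, Vectors.dense(1.380, 0.231)),
        RowFactory.create(0.273, 1.0, Vectors.dense(0.520, 1.151)),
        RowFactory.create(4.199, 0.0, Vectors.dense(0.795, -0.226)));
    StructType schema = new StructType(new StructField[]{
        new StructField("label", DataTypes.DoubleType, false, Metadata.empty()),
        new StructField("censor", DataTypes.DoubleType, false, Metadata.empty()),
        new StructField("features", new VectorUDT(), false, Metadata.empty())
    });
    Dataset<Row> training = spark.createDataFrame(rows, schema);

    AFTSurvivalRegression aft = new AFTSurvivalRegression()
        .setQuantileProbabilities(new double[]{0.3, 0.6})
        .setQuantilesCol("quantiles");

    AFTSurvivalRegressionModel model = aft.fit(training);
    System.out.println("Coefficients: " + model.coefficients());
    System.out.println("Intercept: " + model.intercept());
    System.out.println("Scale: " + model.scale());
    model.transform(training).show(false);

    spark.stop();
  }
}
```

The censor column encodes whether the event occurred (1.0) or the observation was censored (0.0); setting quantilesCol adds a vector column of predicted quantiles alongside the prediction column.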
| Constructor and Description |
|---|
| AFTSurvivalRegression() |
| AFTSurvivalRegression(String uid) |
| Modifier and Type | Method and Description |
|---|---|
| IntParam | aggregationDepth() Param for suggested depth for treeAggregate (>= 2). |
| Param<String> | censorCol() Param for censor column name. |
| AFTSurvivalRegression | copy(ParamMap extra) Creates a copy of this instance with the same UID and some extra params. |
| BooleanParam | fitIntercept() Param for whether to fit an intercept term. |
| static AFTSurvivalRegression | load(String path) |
| DoubleParam | maxBlockSizeInMB() Param for maximum memory in MB for stacking input data into blocks. |
| IntParam | maxIter() Param for maximum number of iterations (>= 0). |
| DoubleArrayParam | quantileProbabilities() Param for quantile probabilities array. |
| Param<String> | quantilesCol() Param for quantiles column name. |
| static MLReader<T> | read() |
| AFTSurvivalRegression | setAggregationDepth(int value) Suggested depth for treeAggregate (greater than or equal to 2). |
| AFTSurvivalRegression | setCensorCol(String value) |
| AFTSurvivalRegression | setFitIntercept(boolean value) Set whether to fit an intercept term. Default is true. |
| AFTSurvivalRegression | setMaxBlockSizeInMB(double value) Sets the value of param maxBlockSizeInMB. |
| AFTSurvivalRegression | setMaxIter(int value) Set the maximum number of iterations. |
| AFTSurvivalRegression | setQuantileProbabilities(double[] value) |
| AFTSurvivalRegression | setQuantilesCol(String value) |
| AFTSurvivalRegression | setTol(double value) Set the convergence tolerance of iterations. |
| DoubleParam | tol() Param for the convergence tolerance for iterative algorithms (>= 0). |
| StructType | transformSchema(StructType schema) Check transform validity and derive the output schema from the input schema. |
| String | uid() An immutable unique ID for the object and its derivatives. |
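As a rough sketch of how the fluent setters above compose (reusing the imports from the earlier sketch; the column name and numeric values are arbitrary choices, not recommendations):

```java
AFTSurvivalRegression aft = new AFTSurvivalRegression()
    .setCensorCol("censor")
    .setFitIntercept(true)        // documented default: true
    .setMaxIter(100)
    .setTol(1e-6)
    .setAggregationDepth(2)
    .setMaxBlockSizeInMB(0.0);    // documented default: 0.0, i.e. a 1.0 MB block size is chosen

// explainParams() (inherited from Params) prints every param with its doc and current value.
System.out.println(aft.explainParams());
```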
Methods inherited from class org.apache.spark.ml.Predictor: featuresCol, fit, labelCol, predictionCol, setFeaturesCol, setLabelCol, setPredictionCol

Methods inherited from class org.apache.spark.ml.PipelineStage: params

Methods inherited from class Object: equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.spark.ml.regression.AFTSurvivalRegressionParams: getCensorCol, getQuantileProbabilities, getQuantilesCol, hasQuantilesCol, validateAndTransformSchema

Methods inherited from interface org.apache.spark.ml.PredictorParams: extractInstances, extractInstances, validateAndTransformSchema

Methods inherited from interface org.apache.spark.ml.param.shared.HasLabelCol: getLabelCol, labelCol

Methods inherited from interface org.apache.spark.ml.param.shared.HasFeaturesCol: featuresCol, getFeaturesCol

Methods inherited from interface org.apache.spark.ml.param.shared.HasPredictionCol: getPredictionCol, predictionCol

Methods inherited from interface org.apache.spark.ml.param.Params: clear, copyValues, defaultCopy, defaultParamMap, explainParam, explainParams, extractParamMap, extractParamMap, get, getDefault, getOrDefault, getParam, hasDefault, hasParam, isDefined, isSet, paramMap, params, set, set, set, setDefault, setDefault, shouldOwn

Methods inherited from interface org.apache.spark.ml.util.Identifiable: toString

Methods inherited from interface org.apache.spark.ml.param.shared.HasMaxIter: getMaxIter

Methods inherited from interface org.apache.spark.ml.param.shared.HasFitIntercept: getFitIntercept

Methods inherited from interface org.apache.spark.ml.param.shared.HasAggregationDepth: getAggregationDepth

Methods inherited from interface org.apache.spark.ml.param.shared.HasMaxBlockSizeInMB: getMaxBlockSizeInMB

Methods inherited from interface org.apache.spark.internal.Logging: $init$, initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, initLock, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, org$apache$spark$internal$Logging$$log__$eq, org$apache$spark$internal$Logging$$log_, uninitialize

Methods inherited from interface org.apache.spark.ml.util.DefaultParamsWritable: write

Methods inherited from interface org.apache.spark.ml.util.MLWritable: save
public AFTSurvivalRegression(String uid)
public AFTSurvivalRegression()
public static AFTSurvivalRegression load(String path)
public static MLReader<T> read()
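The inherited write/save members together with load/read above give the usual ML persistence round trip for the configured (unfitted) estimator. A brief sketch, with a placeholder path:

```java
// Persist the configured estimator, then load it back on the same or another application.
aft.write().overwrite().save("/tmp/aft-estimator");                       // path is a placeholder
AFTSurvivalRegression restored = AFTSurvivalRegression.load("/tmp/aft-estimator");
```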
public final Param<String> censorCol()
Param for censor column name.
Specified by: censorCol in interface AFTSurvivalRegressionParams
public final DoubleArrayParam quantileProbabilities()
Param for quantile probabilities array.
Specified by: quantileProbabilities in interface AFTSurvivalRegressionParams
public final Param<String> quantilesCol()
Param for quantiles column name.
Specified by: quantilesCol in interface AFTSurvivalRegressionParams
public final DoubleParam maxBlockSizeInMB()
Param for maximum memory in MB for stacking input data into blocks.
Specified by: maxBlockSizeInMB in interface HasMaxBlockSizeInMB
public final IntParam aggregationDepth()
Param for suggested depth for treeAggregate (>= 2).
Specified by: aggregationDepth in interface HasAggregationDepth
public final BooleanParam fitIntercept()
Param for whether to fit an intercept term.
Specified by: fitIntercept in interface HasFitIntercept
public final DoubleParam tol()
Param for the convergence tolerance for iterative algorithms (>= 0).
Specified by: tol in interface HasTol
public final IntParam maxIter()
Param for maximum number of iterations (>= 0).
Specified by: maxIter in interface HasMaxIter
public String uid()
An immutable unique ID for the object and its derivatives.
Specified by: uid in interface Identifiable
public AFTSurvivalRegression setCensorCol(String value)
public AFTSurvivalRegression setQuantileProbabilities(double[] value)
public AFTSurvivalRegression setQuantilesCol(String value)
public AFTSurvivalRegression setFitIntercept(boolean value)
Set whether to fit an intercept term. Default is true.
Parameters: value - (undocumented)

public AFTSurvivalRegression setMaxIter(int value)
Set the maximum number of iterations.
Parameters: value - (undocumented)

public AFTSurvivalRegression setTol(double value)
Set the convergence tolerance of iterations.
Parameters: value - (undocumented)

public AFTSurvivalRegression setAggregationDepth(int value)
Suggested depth for treeAggregate (greater than or equal to 2).
Parameters: value - (undocumented)

public AFTSurvivalRegression setMaxBlockSizeInMB(double value)
Sets the value of param maxBlockSizeInMB. Default is 0.0, in which case a block size of 1.0 MB is chosen.
Parameters: value - (undocumented)
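To make the block-size behaviour concrete, a small sketch (the 4.0 MB figure is just an arbitrary example):

```java
// Leave maxBlockSizeInMB at its 0.0 default and a 1.0 MB block size is chosen automatically.
AFTSurvivalRegression autoBlocks = new AFTSurvivalRegression();

// Or request an explicit block size, in MB, for stacking instances before GEMV.
AFTSurvivalRegression fourMbBlocks = new AFTSurvivalRegression().setMaxBlockSizeInMB(4.0);
System.out.println(fourMbBlocks.getMaxBlockSizeInMB());   // 4.0
```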
public StructType transformSchema(StructType schema)
Description copied from class: PipelineStage
Check transform validity and derive the output schema from the input schema.

We check validity for interactions between parameters during transformSchema and raise an exception if any parameter value is invalid. Parameter value checks which do not depend on other parameters are handled by Param.validate().

Typical implementation should first conduct verification on schema change and parameter validity, including complex parameter interaction checks.
Overrides: transformSchema in class Predictor<Vector,AFTSurvivalRegression,AFTSurvivalRegressionModel>
Parameters: schema - (undocumented)
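For illustration, a sketch of deriving an output schema with transformSchema; the column names follow the estimator's defaults, and the imports from the earlier sketch are assumed:

```java
// Input schema as seen at fit() time: vector features, a numeric label, and a censor column.
StructType input = new StructType(new StructField[]{
    new StructField("features", new VectorUDT(), false, Metadata.empty()),
    new StructField("label", DataTypes.DoubleType, false, Metadata.empty()),
    new StructField("censor", DataTypes.DoubleType, false, Metadata.empty())
});

StructType output = new AFTSurvivalRegression().transformSchema(input);
output.printTreeString();   // the input columns plus a double "prediction" column
```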
public AFTSurvivalRegression copy(ParamMap extra)
Description copied from interface: Params
Creates a copy of this instance with the same UID and some extra params. See defaultCopy().
Specified by: copy in interface Params
Overrides: copy in class Predictor<Vector,AFTSurvivalRegression,AFTSurvivalRegressionModel>
Parameters: extra - (undocumented)
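A quick sketch of copy(ParamMap): the extra params override the copied instance's values while the UID is preserved. This assumes an additional import of org.apache.spark.ml.param.ParamMap; the maxIter values are arbitrary.

```java
AFTSurvivalRegression aft = new AFTSurvivalRegression().setMaxIter(100);

// Copy with an extra override for maxIter; both objects share the same UID.
ParamMap extra = new ParamMap().put(aft.maxIter().w(25));
AFTSurvivalRegression copied = aft.copy(extra);

System.out.println(aft.uid().equals(copied.uid()));   // true
System.out.println(copied.getMaxIter());              // 25
```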