public interface PredictorParams extends Params, HasLabelCol, HasFeaturesCol, HasPredictionCol
Method Summary

Modifier and Type | Method and Description |
---|---|
RDD<org.apache.spark.ml.feature.Instance> | extractInstances(Dataset<?> dataset) - Extracts labelCol, weightCol (if any) and featuresCol from the given dataset and puts them in an RDD with strong types. |
RDD<org.apache.spark.ml.feature.Instance> | extractInstances(Dataset<?> dataset, scala.Function1<org.apache.spark.ml.feature.Instance,scala.runtime.BoxedUnit> validateInstance) - Extracts labelCol, weightCol (if any) and featuresCol from the given dataset and puts them in an RDD with strong types. |
StructType | validateAndTransformSchema(StructType schema, boolean fitting, DataType featuresDataType) - Validates and transforms the input schema with the provided param map. |
Methods inherited from interface org.apache.spark.ml.param.shared.HasLabelCol: getLabelCol, labelCol
Methods inherited from interface org.apache.spark.ml.param.shared.HasFeaturesCol: featuresCol, getFeaturesCol
Methods inherited from interface org.apache.spark.ml.param.shared.HasPredictionCol: getPredictionCol, predictionCol
Methods inherited from interface org.apache.spark.ml.param.Params: clear, copy, copyValues, defaultCopy, defaultParamMap, explainParam, explainParams, extractParamMap, extractParamMap, get, getDefault, getOrDefault, getParam, hasDefault, hasParam, isDefined, isSet, paramMap, params, set, set, set, setDefault, setDefault, shouldOwn
Methods inherited from interface org.apache.spark.ml.util.Identifiable: toString, uid
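The inherited column params are the ones end users normally interact with. As a rough illustration (the column names "target", "vec" and "pred" are hypothetical), a concrete Predictor such as LogisticRegression exposes them through the usual setters and the getters listed above:

```scala
import org.apache.spark.ml.classification.LogisticRegression

// Column names are ordinary shared params; the getters come from the
// inherited Has*Col interfaces listed above.
val lr = new LogisticRegression()
  .setLabelCol("target")       // labelCol
  .setFeaturesCol("vec")       // featuresCol
  .setPredictionCol("pred")    // predictionCol

println(lr.getLabelCol)               // target
println(lr.explainParam(lr.labelCol)) // param doc plus current value
```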
Method Detail

RDD<org.apache.spark.ml.feature.Instance> extractInstances(Dataset<?> dataset)

Extracts labelCol, weightCol (if any) and featuresCol from the given dataset and puts them in an RDD with strong types.

Parameters:
dataset - (undocumented)

RDD<org.apache.spark.ml.feature.Instance> extractInstances(Dataset<?> dataset, scala.Function1<org.apache.spark.ml.feature.Instance,scala.runtime.BoxedUnit> validateInstance)

Extracts labelCol, weightCol (if any) and featuresCol from the given dataset and puts them in an RDD with strong types. Validates each output instance with the given function.

Parameters:
dataset - (undocumented)
validateInstance - (undocumented)
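These overloads are intended for Predictor implementations rather than end users: a train() body typically converts the input Dataset into the strongly typed RDD[Instance] before fitting. A minimal sketch, assuming the fragment lives inside an estimator that mixes in PredictorParams (the trait and Instance are ml-internal, so this will not compile from ordinary user code); the non-negative-label check and the helper name are hypothetical:

```scala
import org.apache.spark.ml.feature.Instance
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Dataset

// Fragment inside an estimator mixing in PredictorParams. Each input row
// becomes an Instance(label, weight, features); the weight is taken from
// weightCol when that param is set on the estimator.
def buildTrainingInstances(dataset: Dataset[_]): RDD[Instance] =
  extractInstances(dataset, (instance: Instance) =>
    // hypothetical validation: reject negative labels before training
    require(instance.label >= 0.0,
      s"Labels must be non-negative but got ${instance.label}"))
```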
StructType validateAndTransformSchema(StructType schema, boolean fitting, DataType featuresDataType)

Validates and transforms the input schema with the provided param map.

Parameters:
schema - input schema
fitting - whether this is in fitting
featuresDataType - SQL DataType for FeaturesType. E.g., VectorUDT for vector features.
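A minimal sketch of how an estimator's transformSchema might delegate to this method, assuming it lives inside a class that mixes in PredictorParams and that the features column holds ML vectors (SQLDataTypes.VectorType is the public handle to the vector DataType):

```scala
import org.apache.spark.ml.linalg.SQLDataTypes
import org.apache.spark.sql.types.StructType

// Fragment inside an estimator mixing in PredictorParams.
// Fitting paths pass fitting = true so the label column is validated as well;
// a prediction-only path would pass fitting = false.
override def transformSchema(schema: StructType): StructType =
  validateAndTransformSchema(schema, fitting = true, SQLDataTypes.VectorType)
```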