FakeRun allows a user to run a custom function under the exact same environment as other PredictionIO workflows.
Workflow parameters.
Batch label of the run.
Verbosity level.
Controls whether trained models are persisted.
Spark properties that will be set in SparkConf.setAll().
Skips all data sanity checks.
Stops workflow after reading from data source.
Stops workflow after data preparation.
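The parameters above can be assembled into a single configuration object. The sketch below assumes a `WorkflowParams` case class whose field names mirror the descriptions above; exact names, types, and defaults may differ between PredictionIO versions.

```scala
import org.apache.predictionio.workflow.WorkflowParams

// Hypothetical configuration mirroring the documented parameters.
val params = WorkflowParams(
  batch = "my-training-run",                        // batch label of the run
  verbose = 2,                                      // verbosity level
  saveModel = true,                                 // persist trained models
  sparkEnv = Map("spark.executor.memory" -> "4g"),  // set via SparkConf.setAll()
  skipSanityCheck = false,                          // keep data sanity checks on
  stopAfterRead = false,                            // continue past data source read
  stopAfterPrepare = false                          // continue past data preparation
)
```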
:: DeveloperApi :: Singleton object that collects anonymous functions to be executed to allow the process to end gracefully.
For example, the Elasticsearch REST storage client maintains an internal connection pool that must be closed to allow the process to exit.
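A minimal sketch of registering such a cleanup function, assuming `CleanupFunctions.add` accepts a `() => Unit` (the storage client here is a hypothetical stand-in, not a real PredictionIO API):

```scala
import org.apache.predictionio.workflow.CleanupFunctions

// Hypothetical client holding a resource (e.g. a connection pool)
// that must be released before the process can exit.
val client = new java.net.Socket()

// Register an anonymous function; all registered functions are
// executed at shutdown so the process can end gracefully.
CleanupFunctions.add(() => client.close())
```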
CoreWorkflow handles PredictionIO metadata and environment variables for training and evaluation.
Collection of reusable workflow related utilities that touch on Apache Spark. They are separated to avoid compilation problems with certain code.
Collection of workflow creation methods.
Collection of reusable workflow related utilities.
FakeRun allows a user to run a custom function under the exact same environment as other PredictionIO workflows. Useful for developing new features. One only needs to extend this trait and supply a function of type SparkContext => Unit. For example, the code below can be run with `pio eval HelloWorld`.
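A minimal sketch of extending the trait, assuming FakeRun exposes a `func` member assignable to a `SparkContext => Unit` (the package path and exact member name may differ between PredictionIO versions):

```scala
import org.apache.predictionio.workflow.FakeRun
import org.apache.spark.SparkContext

// Runs under the same environment as other PredictionIO workflows.
// Launch with: pio eval HelloWorld
object HelloWorld extends FakeRun {
  func = { sc: SparkContext =>
    val rdd = sc.parallelize(1 to 10)
    println(s"Hello, world! Sum of 1..10 = ${rdd.sum()}")
  }
}
```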