Class org.apache.predictionio.e2.engine.PythonAlgorithm

class PythonAlgorithm extends P2LAlgorithm[EmptyPreparedData, PipelineModel, Query, Row]

Linear Supertypes
P2LAlgorithm[EmptyPreparedData, PipelineModel, Query, Row], BaseAlgorithm[EmptyPreparedData, PipelineModel, Query, Row], BaseQuerySerializer, AbstractDoer, Serializable, Serializable, AnyRef, Any
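To illustrate the P2LAlgorithm contract that this class fulfils, here is a minimal, hedged sketch of a parallel-to-local algorithm: training runs on Spark, the resulting model is a plain local object, and prediction is a local call. All `My*` names are invented for illustration and are not part of PredictionIO or this engine.

```scala
// Hypothetical sketch only; MyQuery, MyResult, MyPreparedData and MyModel
// are invented names, not PredictionIO classes.
import org.apache.predictionio.controller.P2LAlgorithm
import org.apache.spark.SparkContext

case class MyQuery(features: Array[Double])
case class MyResult(score: Double)
case class MyPreparedData(rows: Seq[Array[Double]])
class MyModel(val weights: Array[Double]) extends Serializable

class MyAlgorithm
  extends P2LAlgorithm[MyPreparedData, MyModel, MyQuery, MyResult] {

  // Train on the cluster; the resulting model is a local, serializable object.
  def train(sc: SparkContext, pd: MyPreparedData): MyModel =
    new MyModel(pd.rows.transpose.map(col => col.sum / col.size).toArray)

  // Score a single query against the local model.
  def predict(model: MyModel, query: MyQuery): MyResult =
    MyResult(model.weights.zip(query.features).map { case (w, x) => w * x }.sum)
}
```

PythonAlgorithm specializes the same contract with EmptyPreparedData, a Spark ML PipelineModel, and this engine's Query and Row types.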

Instance Constructors

  1. new PythonAlgorithm()

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def batchPredict(m: PipelineModel, qs: RDD[(Long, Query)]): RDD[(Long, Row)]

    This is the default implementation of batch prediction. Override this method for a custom implementation.

    m
    A model
    qs
    An RDD of index-query tuples. The index is used to keep track of predicted results and their corresponding queries.
    returns
    Batch of predicted results
    Definition Classes
    P2LAlgorithm
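The default behavior described above can be sketched roughly as follows. This is an approximation written against the signatures shown on this page, not the actual library source: the local model is closed over and each indexed query is mapped through this algorithm's predict.

```scala
// Approximate sketch of the default batchPredict behavior; assumes the
// engine's Query type is in scope and `predict` is this algorithm's method.
import org.apache.spark.ml.PipelineModel
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row

def batchPredictSketch(m: PipelineModel, qs: RDD[(Long, Query)]): RDD[(Long, Row)] =
  qs.mapValues(q => predict(m, q))
```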
  6. def batchPredictBase(sc: SparkContext, bm: Any, qs: RDD[(Long, Query)]): RDD[(Long, Row)]

    :: DeveloperApi :: Engine developers should not use this directly. This is called by the evaluation workflow to perform batch prediction.

    sc
    Spark context
    bm
    Model
    qs
    Batch of queries
    returns
    Batch of predicted results
    Definition Classes
    P2LAlgorithm → BaseAlgorithm
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. lazy val gsonTypeAdapterFactories: Seq[TypeAdapterFactory]

    :: DeveloperApi :: Serializer for Java query classes using Gson

    Definition Classes
    BaseQuerySerializer
  13. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  14. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  15. def makePersistentModel(sc: SparkContext, modelId: String, algoParams: Params, bm: Any): Any

    :: DeveloperApi :: Engine developers should not use this directly (read on to see how parallel-to-local algorithm models are persisted).

    Parallel-to-local algorithms produce local models. By default, models are serialized and stored automatically. Engine developers can override this behavior by mixing the PersistentModel trait into the model class; PredictionIO will then call PersistentModel.save instead. If save returns true, an org.apache.predictionio.workflow.PersistentModelManifest is returned so that during deployment, PredictionIO uses PersistentModelLoader to retrieve the model. Otherwise, Unit is returned and the model is re-trained on the fly.

    sc
    Spark context
    modelId
    Model ID
    algoParams
    Algorithm parameters that trained this model
    bm
    Model
    returns
    The model itself for automatic persistence, an instance of org.apache.predictionio.workflow.PersistentModelManifest for manual persistence, or Unit for re-training on deployment
    Definition Classes
    P2LAlgorithm → BaseAlgorithm
    Annotations
    @DeveloperApi()
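The persistence mechanism described above can be sketched as follows: a model class mixes in PersistentModel and supplies a companion PersistentModelLoader. The class name and storage path are invented for illustration; only the trait and loader signatures follow the PredictionIO controller API.

```scala
// Hypothetical sketch: opting out of automatic serialization.
// MyModel and the /tmp path are invented; the trait signatures are
// from org.apache.predictionio.controller.
import org.apache.predictionio.controller.{Params, PersistentModel, PersistentModelLoader}
import org.apache.spark.SparkContext

class MyModel(val weights: Array[Double]) extends PersistentModel[Params] {
  // Returning true signals that the model was persisted manually,
  // so a PersistentModelManifest is recorded instead of the model itself.
  def save(id: String, params: Params, sc: SparkContext): Boolean = {
    sc.parallelize(weights).saveAsObjectFile(s"/tmp/$id/model")
    true
  }
}

object MyModel extends PersistentModelLoader[Params, MyModel] {
  // Called during deployment to reload the manually persisted model.
  def apply(id: String, params: Params, sc: Option[SparkContext]): MyModel =
    new MyModel(sc.get.objectFile[Double](s"/tmp/$id/model").collect())
}
```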
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. def predict(model: PipelineModel, query: Query): Row

    Implement this method to produce a prediction from a query and a trained model.

    model
    Trained model produced by train.
    query
    An input query.
    returns
    A prediction.
    Definition Classes
    PythonAlgorithm → P2LAlgorithm
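A hedged sketch of what a predict implementation over a Spark ML PipelineModel might look like; the actual PythonAlgorithm internals may differ. It assumes the engine's Query is a case class that can be lifted into a single-row DataFrame.

```scala
// Illustrative only: lift the query into a one-row DataFrame, run the
// fitted pipeline, and return the first output row as the prediction.
import org.apache.spark.ml.PipelineModel
import org.apache.spark.sql.{Row, SparkSession}

def predictSketch(model: PipelineModel, query: Query): Row = {
  val spark = SparkSession.builder.getOrCreate()
  import spark.implicits._
  val df = Seq(query).toDF()   // assumes an Encoder exists for Query
  model.transform(df).first()
}
```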
  20. def predictBase(bm: Any, q: Query): Row

    Permalink

    :: DeveloperApi :: Engine developers should not use this directly.

    :: DeveloperApi :: Engine developers should not use this directly. Called by serving to perform a single prediction.

    bm

    Model

    q

    Query

    returns

    Predicted result

    Definition Classes
    P2LAlgorithmBaseAlgorithm
  21. def queryClass: Class[Query]

    :: DeveloperApi :: Obtains the type signature of query for this algorithm

    returns
    Type signature of query
    Definition Classes
    BaseAlgorithm
  22. lazy val querySerializer: Formats

    :: DeveloperApi :: Serializer for Scala query classes using org.apache.predictionio.controller.Utils.json4sDefaultFormats

    Definition Classes
    BaseQuerySerializer
  23. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  24. def toString(): String

    Definition Classes
    AnyRef → Any
  25. def train(sc: SparkContext, data: EmptyPreparedData): PipelineModel

    Implement this method to produce a model from prepared data.

    returns
    Trained model.
    Definition Classes
    PythonAlgorithm → P2LAlgorithm
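For illustration, fitting a Spark ML Pipeline is one way to produce the PipelineModel that the P2LAlgorithm contract expects from train. The data and pipeline stages below are invented; PythonAlgorithm itself receives EmptyPreparedData, so its model is obtained elsewhere rather than trained from the prepared data.

```scala
// Hypothetical sketch of producing a PipelineModel with Spark ML.
import org.apache.spark.SparkContext
import org.apache.spark.ml.{Pipeline, PipelineModel}
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.ml.regression.LinearRegression
import org.apache.spark.sql.SparkSession

def trainSketch(sc: SparkContext): PipelineModel = {
  val spark = SparkSession.builder.getOrCreate()
  import spark.implicits._
  // Toy training data: one feature column and a label.
  val df = Seq((1.0, 2.0), (2.0, 4.1), (3.0, 5.9)).toDF("x", "label")
  val assembler = new VectorAssembler()
    .setInputCols(Array("x"))
    .setOutputCol("features")
  val lr = new LinearRegression()
  // Fitting the pipeline yields the PipelineModel to return from train.
  new Pipeline().setStages(Array(assembler, lr)).fit(df)
}
```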
  26. def trainBase(sc: SparkContext, pd: EmptyPreparedData): PipelineModel

    :: DeveloperApi :: Engine developers should not use this directly. This is called by the workflow to train a model.

    sc
    Spark context
    pd
    Prepared data
    returns
    Trained model
    Definition Classes
    P2LAlgorithm → BaseAlgorithm
  27. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from P2LAlgorithm[EmptyPreparedData, PipelineModel, Query, Row]

Inherited from BaseAlgorithm[EmptyPreparedData, PipelineModel, Query, Row]

Inherited from BaseQuerySerializer

Inherited from AbstractDoer

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any