Trait

org.apache.predictionio.data.storage

PEvents

trait PEvents extends Serializable

:: DeveloperApi :: Base trait of a data access object that returns Event-related RDD data structures. Engine developers should use org.apache.predictionio.data.store.PEventStore instead of using this directly.

Annotations
@DeveloperApi()
Linear Supertypes
Serializable, Serializable, AnyRef, Any
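
A minimal sketch of the recommended engine-developer path: reading events through org.apache.predictionio.data.store.PEventStore rather than calling a PEvents implementation directly. The app name "MyApp1", the entity types, and the event names are placeholder assumptions for illustration only.

    import org.apache.predictionio.data.storage.Event
    import org.apache.predictionio.data.store.PEventStore
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    // Read "rate"/"buy" events of users on items through PEventStore,
    // which resolves the underlying PEvents implementation for us.
    def readRateEvents(sc: SparkContext): RDD[Event] = {
      PEventStore.find(
        appName = "MyApp1",                    // placeholder app name
        entityType = Some("user"),
        eventNames = Some(Seq("rate", "buy")),
        targetEntityType = Some(Some("item"))
      )(sc)
    }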

Abstract Value Members

  1. abstract def delete(eventIds: RDD[String], appId: Int, channelId: Option[Int])(sc: SparkContext): Unit

    Annotations
    @DeveloperApi()
  2. abstract def write(events: RDD[Event], appId: Int, channelId: Option[Int])(sc: SparkContext): Unit

    :: DeveloperApi :: Write events to database.

    events: RDD of Event
    appId: the app ID
    channelId: channel ID (default channel if it's None)
    sc: Spark Context

    A usage sketch covering both write overloads follows the write(events, appId) member under Concrete Value Members below.

    Annotations
    @DeveloperApi()
  3. abstract def find(appId: Int, channelId: Option[Int] = None, startTime: Option[DateTime] = None, untilTime: Option[DateTime] = None, entityType: Option[String] = None, entityId: Option[String] = None, eventNames: Option[Seq[String]] = None, targetEntityType: Option[Option[String]] = None, targetEntityId: Option[Option[String]] = None)(sc: SparkContext): RDD[Event]

    :: DeveloperApi :: Read from database and return the events. The deprecation here is intended for engine developers only.

    appId: return events of this app ID
    channelId: return events of this channel ID (default channel if it's None)
    startTime: return events with eventTime >= startTime
    untilTime: return events with eventTime < untilTime
    entityType: return events of this entityType
    entityId: return events of this entityId
    eventNames: return events with any of these event names
    targetEntityType: return events of this targetEntityType:
      • None means no restriction on targetEntityType
      • Some(None) means no targetEntityType for this event
      • Some(Some(x)) means targetEntityType should match x
    targetEntityId: return events of this targetEntityId:
      • None means no restriction on targetEntityId
      • Some(None) means no targetEntityId for this event
      • Some(Some(x)) means targetEntityId should match x
    sc: Spark context
    returns: RDD[Event]

    Annotations
    @deprecated @DeveloperApi()
    Deprecated

    (Since version 0.9.2) Use PEventStore.find() instead.
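
A minimal sketch of the Option[Option[String]] convention for targetEntityType and targetEntityId when calling the deprecated find() directly. The pEvents parameter, the appId value 1, and the entity/event names are placeholder assumptions for illustration only.

    import org.apache.predictionio.data.storage.{Event, PEvents}
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    // Find "view" events whose target entity type is "item";
    // pEvents is whatever PEvents implementation the storage layer provides.
    def findViewEvents(pEvents: PEvents, sc: SparkContext): RDD[Event] = {
      pEvents.find(
        appId = 1,                         // placeholder app ID
        eventNames = Some(Seq("view")),
        // None               -> no restriction on targetEntityType
        // Some(None)         -> only events that have no targetEntityType
        // Some(Some("item")) -> only events whose targetEntityType is "item"
        targetEntityType = Some(Some("item"))
      )(sc)
    }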

Concrete Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. lazy val logger: Logger

    Attributes
    protected
  13. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. final def notify(): Unit

    Definition Classes
    AnyRef
  15. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  16. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  17. def toString(): String

    Definition Classes
    AnyRef → Any
  18. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  20. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  21. def write(events: RDD[Event], appId: Int)(sc: SparkContext): Unit

    :: DeveloperApi :: Write events to database.

    events: RDD of Event
    appId: the app ID
    sc: Spark Context

    Annotations
    @DeveloperApi()
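
A minimal sketch of the write overloads. The two-argument form presumably targets the default channel, matching channelId = None in the three-argument form; the pEvents and events parameters and the appId value 1 are placeholders for illustration.

    import org.apache.predictionio.data.storage.{Event, PEvents}
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    // Persist a batch of events through a PEvents implementation.
    def importEvents(pEvents: PEvents, events: RDD[Event], sc: SparkContext): Unit = {
      // Three-argument form: channelId = None selects the default channel.
      pEvents.write(events, appId = 1, channelId = None)(sc)
      // The shorthand pEvents.write(events, appId = 1)(sc) omits the channel argument.
    }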

Deprecated Value Members

  1. def aggregateProperties(appId: Int, channelId: Option[Int] = None, entityType: String, startTime: Option[DateTime] = None, untilTime: Option[DateTime] = None, required: Option[Seq[String]] = None)(sc: SparkContext): RDD[(String, PropertyMap)]

    Aggregate properties of entities based on these special events: $set, $unset, and $delete. The deprecation here is intended for engine developers only; a usage sketch of the replacement follows this list.

    appId: use events of this app ID
    channelId: use events of this channel ID (default channel if it's None)
    entityType: aggregate properties of the entities of this entityType
    startTime: use events with eventTime >= startTime
    untilTime: use events with eventTime < untilTime
    required: only keep entities with these required properties defined
    sc: Spark context
    returns: RDD[(String, PropertyMap)], an RDD of (entityId, PropertyMap) pairs

    Annotations
    @deprecated
    Deprecated

    (Since version 0.9.2) Use PEventStore.aggregateProperties() instead.

  2. def extractEntityMap[A](appId: Int, entityType: String, startTime: Option[DateTime] = None, untilTime: Option[DateTime] = None, required: Option[Seq[String]] = None)(sc: SparkContext)(extract: (DataMap) ⇒ A)(implicit arg0: ClassTag[A]): EntityMap[A]

    :: Experimental :: Extract EntityMap[A] from events for the entityType. NOTE: the result is a local (non-distributed) EntityMap[A]. A usage sketch follows at the end of this list.

    Annotations
    @deprecated @Experimental()
    Deprecated

    (Since version 0.9.2) Use PEventStore.aggregateProperties() instead.

  3. def getByAppIdAndTimeAndEntity(appId: Int, startTime: Option[DateTime], untilTime: Option[DateTime], entityType: Option[String], entityId: Option[String])(sc: SparkContext): RDD[Event]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.9.2) Use PEventStore.find() instead.
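
A minimal sketch of the replacement suggested for aggregateProperties above: PEventStore.aggregateProperties. The app name "MyApp1", the entity type, and the required property name are placeholder assumptions.

    import org.apache.predictionio.data.storage.PropertyMap
    import org.apache.predictionio.data.store.PEventStore
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    // Fold $set/$unset/$delete events into one PropertyMap per entity,
    // keeping only items that have the "categories" property defined.
    def readItemProperties(sc: SparkContext): RDD[(String, PropertyMap)] = {
      PEventStore.aggregateProperties(
        appName = "MyApp1",                  // placeholder app name
        entityType = "item",
        required = Some(Seq("categories"))   // placeholder required property
      )(sc)
    }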

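A minimal sketch of extractEntityMap, where the trailing extract function maps each entity's DataMap to a value of type A; here it simply collects the property names. That DataMap exposes its fields map, plus the placeholder pEvents parameter and appId value, are assumptions for illustration; as the description above warns, the result is a local, non-distributed EntityMap.

    import org.apache.predictionio.data.storage.{DataMap, PEvents}
    import org.apache.spark.SparkContext

    // Build a local map from item entityId to the set of its property names.
    // The result is an EntityMap[Set[String]], per the signature above.
    def itemPropertyNames(pEvents: PEvents, sc: SparkContext) =
      pEvents.extractEntityMap[Set[String]](
        appId = 1,                           // placeholder app ID
        entityType = "item"
      )(sc) { (dataMap: DataMap) =>
        dataMap.fields.keySet                // assumed: DataMap exposes a fields map
      }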
Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
