Follow the steps below to set up Apache PredictionIO (incubating) and its dependencies. These instructions assume you are in your home directory. Wherever you see `/home/abc`, replace it with your own home directory.
Ensure you have an appropriate Java version installed. For example:
```bash
$ java -version
java version "1.8.0_40"
Java(TM) SE Runtime Environment (build 1.8.0_40-b25)
Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)
```
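If you want to check the installed version programmatically rather than by eye, the version string can be parsed. This is an illustrative sketch (not part of PredictionIO), assuming the `java -version` output format shown above:

```python
import re

def java_major_version(version_output):
    """Extract (major, minor) from the output of `java -version`.

    Note: `java -version` prints to stderr, so capture stderr if you
    run it via subprocess.
    """
    m = re.search(r'version "(\d+)\.(\d+)\.', version_output)
    if not m:
        raise ValueError("could not find a Java version string")
    return (int(m.group(1)), int(m.group(2)))

# Using the sample output shown above:
sample = 'java version "1.8.0_40"'
version = java_major_version(sample)
print(version)             # (1, 8)
print(version >= (1, 8))   # True: meets the Java 8 requirement
```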
Download Apache PredictionIO (incubating)
Download Apache PredictionIO (incubating) and extract it.
```bash
$ cd
$ pwd
/home/abc
$ wget http://download.prediction.io/PredictionIO-0.11.0-incubating.tar.gz
$ tar zxvf PredictionIO-0.11.0-incubating.tar.gz
```
Let us install dependencies inside a subdirectory of the Apache PredictionIO (incubating) installation. By following this convention, you can use PredictionIO's default configuration as is.
```bash
$ mkdir PredictionIO-0.11.0-incubating/vendors
```
Apache Spark is the default processing engine for PredictionIO. Download and extract it.
```bash
$ wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.3-bin-hadoop2.6.tgz
$ tar zxvfC spark-1.6.3-bin-hadoop2.6.tgz PredictionIO-0.11.0-incubating/vendors
```
Elasticsearch can be used as a storage backend for the meta data repository.
```bash
$ wget https://download.elastic.co/elasticsearch/elasticsearch/elasticsearch-1.7.6.tar.gz
$ tar zxvfC elasticsearch-1.7.6.tar.gz PredictionIO-0.11.0-incubating/vendors
```
If you are not using the default setting at `localhost`, you may change the following in `PredictionIO-0.11.0-incubating/conf/pio-env.sh` to fit your setup.
```bash
PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
```
HBase can be used as the backend of the event data repository.
Download HBase from a mirror. Extract HBase by following the example below.
```bash
$ tar zxvfC hbase-1.2.5-bin.tar.gz PredictionIO-0.11.0-incubating/vendors
```
You will need to add at least a minimal configuration to HBase to start it in standalone mode. Details can be found in the HBase documentation. A sample minimal configuration (in `conf/hbase-site.xml`) is shown below.
```xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///home/abc/PredictionIO-0.11.0-incubating/vendors/hbase-1.2.5/data</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/home/abc/PredictionIO-0.11.0-incubating/vendors/hbase-1.2.5/zookeeper</value>
  </property>
</configuration>
```
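If you script your setup, the same configuration can be generated for an arbitrary home directory. This is an illustrative sketch, not part of PredictionIO; `minimal_hbase_site` is a hypothetical helper that renders the two properties shown above:

```python
import os
import xml.etree.ElementTree as ET

def minimal_hbase_site(home):
    """Render the minimal standalone-mode hbase-site.xml shown above
    for the given home directory."""
    hbase = os.path.join(home, "PredictionIO-0.11.0-incubating",
                         "vendors", "hbase-1.2.5")
    props = {
        "hbase.rootdir": "file://" + os.path.join(hbase, "data"),
        "hbase.zookeeper.property.dataDir": os.path.join(hbase, "zookeeper"),
    }
    root = ET.Element("configuration")
    for name, value in props.items():
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    return ET.tostring(root, encoding="unicode")

print(minimal_hbase_site("/home/abc"))
```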
You will also need to edit `PredictionIO-0.11.0-incubating/vendors/hbase-1.2.5/conf/hbase-env.sh` to set `JAVA_HOME` for the cluster. For Mac users, use the following (change 1.8 to 1.7 if you have Java 7 installed):

```bash
export JAVA_HOME=`/usr/libexec/java_home -v 1.8`
```
In addition, you must set the `JAVA_HOME` environment variable, for example by adding an export line to `/home/abc/.bashrc`.
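The exact line depends on where your JDK is installed. As an assumption-laden example (the path below is typical for an OpenJDK 8 package on Debian/Ubuntu; adjust it to your system), the line might look like:

```shell
# Assumed JDK location -- verify yours, e.g. with:
#   readlink -f "$(which java)"   (drop the trailing /bin/java)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```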
Start PredictionIO and Dependent Services
Run `PredictionIO-0.11.0-incubating/bin/pio-start-all` and you should see something similar to the following:
```bash
$ PredictionIO-0.11.0-incubating/bin/pio-start-all
Starting Elasticsearch...
Starting HBase...
starting master, logging to /home/abc/PredictionIO-0.11.0-incubating/vendors/hbase-1.2.5/bin/../logs/hbase-abc-master-yourhost.local.out
Waiting 10 seconds for HBase to fully initialize...
Starting PredictionIO Event Server...
$
```
You may use `jps` to verify that you have everything started:
```bash
$ jps -l
15344 org.apache.hadoop.hbase.master.HMaster
15409 org.apache.predictionio.tools.console.Console
15256 org.elasticsearch.bootstrap.Elasticsearch
15469 sun.tools.jps.Jps
$
```
A running setup will have HBase, Elasticsearch, and the Event Server up and running.
At any time, you can run `PredictionIO-0.11.0-incubating/bin/pio status` to check the status of the dependencies.
Now you have installed everything you need!
You can proceed to Choosing an Engine Template, or continue the QuickStart guide of the Engine template if you have already chosen one.