## 3. Define the Evaluation and EngineParamsGenerator Objects

Create a new file BatchEvaluation.scala. Note that it uses the new BatchPersistableEvaluator. BatchEngineParamsList specifies the engine parameters to evaluate.

Modify the appName parameter in DataSourceParams to match your app name.

```scala
package org.template.recommendation

import org.apache.predictionio.controller.EngineParamsGenerator
import org.apache.predictionio.controller.EngineParams
import org.apache.predictionio.controller.Evaluation

object BatchEvaluation extends Evaluation {
  // Define the Engine and Evaluator used in this Evaluation.
  // Specify the new BatchPersistableEvaluator.
  engineEvaluator = (RecommendationEngine(), new BatchPersistableEvaluator())
}

object BatchEngineParamsList extends EngineParamsGenerator {
  // We are only interested in a single set of engine params.
  engineParamsList = Seq(
    EngineParams(
      dataSourceParams = DataSourceParams(
        appName = "INVALID_APP_NAME",
        evalParams = None),
      algorithmParamsList = Seq(("als", ALSAlgorithmParams(
        rank = 10,
        numIterations = 20,
        lambda = 0.01,
        seed = Some(3L))))))
}
```
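The generator above evaluates only a single set of parameters. If you want to compare several hyperparameter settings, engineParamsList can hold one EngineParams per combination. As a minimal sketch (the object name SweepEngineParamsList and the rank/lambda values below are hypothetical, not part of the template), a for-comprehension can build the cross-product:

```scala
object SweepEngineParamsList extends EngineParamsGenerator {
  // Hypothetical sweep: one EngineParams per (rank, lambda) combination,
  // i.e. 3 x 2 = 6 parameter sets in total.
  engineParamsList = for {
    rank <- Seq(5, 10, 20)
    lambda <- Seq(0.01, 0.1)
  } yield EngineParams(
    dataSourceParams = DataSourceParams(
      appName = "INVALID_APP_NAME",
      evalParams = None),
    algorithmParamsList = Seq(("als", ALSAlgorithmParams(
      rank = rank,
      numIterations = 20,
      lambda = lambda,
      seed = Some(3L)))))
}
```

Keep in mind that BatchPersistableEvaluator writes to a fixed output directory, so with multiple parameter sets later runs may overwrite earlier results; this sketch is only meant to show the shape of the API.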

## 4. Build and Run

Run the following command to build the engine:

```
$ pio build
```

After the build succeeds, you should see the following output:

```
[INFO] [Console$] Your engine is ready for training.
```

To run the BatchEvaluation with BatchEngineParamsList, run the following command:

```
$ pio eval org.template.recommendation.BatchEvaluation org.template.recommendation.BatchEngineParamsList
```

You should see the following output:

```
[INFO] [BatchPersistableEvaluator] Writing result to disk
[INFO] [BatchPersistableEvaluator] Result can be found in batch_result
[INFO] [CoreWorkflow$] Updating evaluation instance with result: org.template.recommendation.BatchPersistableEvaluatorResult@2f886889
[INFO] [CoreWorkflow$] runEvaluation completed
```

You should find the batch queries and the predicted results in the output directory batch_result/.
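To take a quick look at the results, you can list the directory and print the first few lines. Spark typically writes the output as plain-text part files (the part-00000 name below is an assumption; the actual file names depend on the number of partitions):

```shell
# List the evaluation output files written by the eval run.
ls batch_result/

# Print the first few query/prediction lines (file name is an assumption).
head batch_result/part-00000
```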