Run


Set Environment Variables

Set ANALYTICS_ZOO_HOME and SPARK_HOME:

# If you downloaded a pre-built Analytics Zoo package:
export SPARK_HOME=the folder where you extracted the Spark package
export ANALYTICS_ZOO_HOME=the folder where you extracted the Analytics Zoo package

# If you built Analytics Zoo from source:
export SPARK_HOME=the folder where you extracted the Spark package
export ANALYTICS_ZOO_HOME=the dist folder generated by the build process, under the top level of the source folder

Use Interactive Spark Shell

You can easily try out Analytics Zoo using the Spark interactive shell. Run the command below to start a Spark shell with Analytics Zoo support:

${ANALYTICS_ZOO_HOME}/bin/spark-shell-with-zoo.sh --master local[*]

You will see a welcome message that looks like the following:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Spark context available as sc.
scala>

Now you can play with the Analytics Zoo APIs. For instance, to load a pre-trained object detection model, try the code below:

scala> import com.intel.analytics.zoo.models.image.objectdetection.ObjectDetector
import com.intel.analytics.zoo.models.image.objectdetection.ObjectDetector

scala> val model = ObjectDetector.loadModel[Float]("path_to_model")

Run as a Spark Program

You can run an Analytics Zoo program, e.g., the Object Detection example, as a standard Spark program (in either local or cluster mode) as follows:

  1. Download the pre-trained model from here.
  2. Prepare the images to predict on.
  3. Run the following command:
  # Spark local mode
  spark-submit --master local[core_number] --class com.intel.analytics.zoo.examples.objectdetection.Predict \
  dist/lib/analytics-zoo-VERSION-jar-with-dependencies.jar \
  --image path_to_your_images --output path_to_output --model path_to_model

  # Spark standalone mode
  spark-submit --master spark://... --executor-cores cores_per_executor \
  --total-executor-cores total_cores_for_the_job \
  --class com.intel.analytics.zoo.examples.objectdetection.Predict \
  dist/lib/analytics-zoo-VERSION-jar-with-dependencies.jar \
  --image path_to_your_images --output path_to_output --model path_to_model

  # Spark yarn client mode
  spark-submit --master yarn --deploy-mode client \
  --executor-cores cores_per_executor \
  --num-executors executors_number \
  --class com.intel.analytics.zoo.examples.objectdetection.Predict \
  dist/lib/analytics-zoo-VERSION-jar-with-dependencies.jar \
  --image path_to_your_images --output path_to_output --model path_to_model

  # Spark yarn cluster mode
  spark-submit --master yarn --deploy-mode cluster \
  --executor-cores cores_per_executor \
  --num-executors executors_number \
  --class com.intel.analytics.zoo.examples.objectdetection.Predict \
  dist/lib/analytics-zoo-VERSION-jar-with-dependencies.jar \
  --image path_to_your_images --output path_to_output --model path_to_model

If you run your own program, remember to create and initialize the SparkContext through NNContext before calling any other Analytics Zoo APIs, as shown below.

import org.apache.spark.SparkConf
import com.intel.analytics.zoo.common.NNContext

// NNContext creates a SparkContext with the Spark configurations
// required by Analytics Zoo applied on top of your own SparkConf
val conf = new SparkConf().setAppName("My Analytics Zoo Program")
val sc = NNContext.initNNContext(conf)
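Putting the pieces together, a minimal standalone program might look like the sketch below. The object name and all paths are illustrative placeholders, and the model/image calls follow the pattern of the Object Detection example above:

```scala
import org.apache.spark.SparkConf
import com.intel.analytics.zoo.common.NNContext
import com.intel.analytics.zoo.feature.image.ImageSet
import com.intel.analytics.zoo.models.image.objectdetection.ObjectDetector

object MyZooApp {
  def main(args: Array[String]): Unit = {
    // Initialize the SparkContext through NNContext *before* any other
    // Analytics Zoo call, so the required Spark configurations are applied
    val conf = new SparkConf().setAppName("MyZooApp")
    val sc = NNContext.initNNContext(conf)

    // From here on, Analytics Zoo APIs can be used as usual
    val model = ObjectDetector.loadModel[Float]("path_to_model")
    val images = ImageSet.read("path_to_your_images", sc)
    val output = model.predictImageSet(images)

    sc.stop()
  }
}
```

This program can then be packaged into a jar and launched with one of the spark-submit commands shown above.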