Download a pre-built library

You can download the Analytics Zoo release and nightly builds from the Release Page.

Currently, Analytics Zoo releases are hosted on Maven Central; here is an example of adding the Analytics Zoo dependency to your own project:
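A minimal pom.xml dependency sketch (the com.intel.analytics.zoo group id is assumed here, matching the SBT coordinates below; pick the spark_* artifact suffix that matches your Spark version):

```xml
<!-- Sketch: group id assumed; choose the spark_* suffix for your Spark version -->
<dependency>
    <groupId>com.intel.analytics.zoo</groupId>
    <artifactId>analytics-zoo-bigdl_0.12.2-spark_2.4.3</artifactId>
    <version>${ANALYTICS_ZOO_VERSION}</version>
</dependency>
```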


You can find the latest ANALYTICS_ZOO_VERSION on Maven Central.

SBT developers can use

libraryDependencies += "com.intel.analytics.zoo" % "analytics-zoo-bigdl_0.12.2-[spark_2.1.1|spark_2.2.0|spark_2.3.1|spark_2.4.3|spark_3.0.0]" % "${ANALYTICS_ZOO_VERSION}"


Currently, the Analytics Zoo development version is hosted on Sonatype.

To link your application with the latest Analytics Zoo development version, add the same dependencies as when linking with Analytics Zoo releases, but set ${ANALYTICS_ZOO_VERSION} to the latest snapshot version, and add the repository below to your pom.xml:

    <repository>
        <id>sonatype</id>
        <name>sonatype repository</name>
        <url>https://oss.sonatype.org/content/repositories/snapshots</url>
    </repository>

SBT developers can use

resolvers += "ossrh repository" at "https://oss.sonatype.org/content/repositories/snapshots"

Download Analytics Zoo Source

Analytics Zoo source code is available at GitHub.

$ git clone https://github.com/intel-analytics/analytics-zoo.git

By default, git clone will download the development version of Analytics Zoo. If you want a release version, use git checkout to switch to the corresponding release tag.
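The checkout step can be sketched as follows; this is a self-contained demo using a throwaway local repository, and the tag name v0.11.0 is only an example (list the real release tags with git tag):

```shell
# Sketch: pinning a clone to a release tag with git checkout.
# A throwaway repo stands in for the real clone; v0.11.0 is an example tag.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "init"
git tag v0.11.0            # stand-in for a real release tag
git checkout -q v0.11.0    # detached HEAD at the release
git describe --tags        # shows which tag you are on
```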

Setup Build Environment

The following instructions are aligned with master code.

Maven 3 is needed to build Analytics Zoo; you can download it from the Maven website.

After installing Maven 3, please set the environment variable MAVEN_OPTS as follows:

$ export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"

When compiling with Java 7, you need to add the option "-XX:MaxPermSize=1G".

It is highly recommended that you build Analytics Zoo using the make-dist.sh script, which sets the MAVEN_OPTS variable for you.

Once downloaded, you can build Analytics Zoo with the following commands:

$ bash make-dist.sh

After that, you can find a dist folder, which contains all the files needed to run an Analytics Zoo program.

The instructions above will build Analytics Zoo with Spark 2.1.0. It is highly recommended to use Java 8 when running with Spark 2.x; otherwise you may observe very poor performance.

Build with Spark version

By default, make-dist.sh uses Spark 2.1.0. To override this, for example to build analytics-zoo with Spark 2.2.0, you can use bash make-dist.sh -Dspark.version=2.2.0 -Dbigdl.artifactId=bigdl_SPARK_2.2.
Additionally, we provide a profile to build with Spark 2.4: use bash make-dist.sh -P spark_2.4+.

Build with Maven

To build Analytics Zoo directly using Maven, run the command below:

$ mvn clean package -DskipTests

After that, you can find the jar packages in PATH_TO_ANALYTICS_ZOO/zoo/target/, where PATH_TO_ANALYTICS_ZOO is the path to your Analytics Zoo directory.

Note that the instructions above will build Analytics Zoo with Spark 2.1.0 for Linux. Similarly, you can customize the Spark version as above, for example:

$ mvn clean package -DskipTests -Dspark.version=2.2.0 -Dbigdl.artifactId=bigdl_SPARK_2.2

Build with JDK 11

It is recommended to download Oracle JDK 11, which avoids possible incompatibilities with Maven plugins. If you are building from the command line, update PATH and make sure your JAVA_HOME environment variable points to Java 11. If you are building from an IDE, make sure it runs Maven with your current JDK.
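The environment setup can be sketched as follows; the install path below is only an example, so substitute the location of your own JDK 11:

```shell
# Sketch: point the shell at a JDK 11 install before building.
# /usr/lib/jvm/jdk-11 is an example path; use your JDK 11 location.
export JAVA_HOME=/usr/lib/jvm/jdk-11
export PATH="$JAVA_HOME/bin:$PATH"
echo "$JAVA_HOME"
```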

JDK 11 supports only certain Scala versions; see the Scala version compatibility description for details. Analytics Zoo supports Spark 3 with Scala 2.12; you can use -P spark_3.x to select Spark 3 and Scala 2.12. Additionally, the build uses Java 8 by default. To compile with Java 11, specify the build options -Djava.version=11 -Djavac.version=11. You can build with make-dist.sh or Maven using the following commands.

Build with make-dist.sh:

$ bash make-dist.sh -P spark_3.x -Djava.version=11 -Djavac.version=11

Or build with Maven:

$ mvn clean package -DskipTests -P spark_3.x -Djava.version=11 -Djavac.version=11

Setup IDE

We set the scope of Spark-related libraries to provided in pom.xml. The reason is that we don't want to package Spark-related jars, which would make the Analytics Zoo jar huge; and since Analytics Zoo is generally invoked via spark-submit, these dependencies are provided by Spark at run time.
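As an illustration, a provided-scope Spark dependency in pom.xml looks like this (the artifact coordinates and property name below are indicative, not copied from the Analytics Zoo pom):

```xml
<!-- Illustrative only: a provided-scope dependency is compiled against
     but not packaged, since spark-submit supplies it at run time -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
```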

This causes a problem in an IDE: when you run applications, they will throw NoClassDefFoundError, because provided-scope libraries are not on the runtime classpath.

You can easily change these scopes with the all-in-one profile.