neo4j-contrib / neo4j-spark-connector   5.0.3


Neo4j Connector for Apache Spark, which provides bi-directional read/write access to Neo4j from Spark, using the Spark DataSource APIs


Neo4j Connector for Apache Spark

This repository contains the Neo4j Connector for Apache Spark.


The Neo4j Connector for Apache Spark is Apache 2 licensed.


The documentation for the Neo4j Connector for Apache Spark lives in this repository.

Building for Spark 3

You can build for Spark 3.x with both Scala 2.12 and Scala 2.13:

./maven-release.sh package 2.12
./maven-release.sh package 2.13

These commands generate the corresponding targets:

  • spark-3/target/neo4j-connector-apache-spark_2.12-<version>_for_spark_3.jar
  • spark-3/target/neo4j-connector-apache-spark_2.13-<version>_for_spark_3.jar
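The jar names above follow a fixed pattern: the Scala binary version, the connector version, and the Spark line are all encoded in the file name. A small Scala sketch of that pattern (the `jarName` helper is illustrative only, not part of the build):

```scala
// Sketch of the jar naming scheme used by the build targets above.
// `jarName` is an illustrative helper, not something the build exposes.
def jarName(scalaBinary: String, version: String): String =
  s"neo4j-connector-apache-spark_${scalaBinary}-${version}_for_spark_3.jar"

println(jarName("2.12", "5.0.3")) // neo4j-connector-apache-spark_2.12-5.0.3_for_spark_3.jar
```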

Integration with Apache Spark Applications

spark-shell, pyspark, or spark-submit

Pass the packaged jar directly:

$SPARK_HOME/bin/spark-shell --jars neo4j-connector-apache-spark_2.12-<version>_for_spark_3.jar

Alternatively, resolve the connector from Maven Central with --packages:

$SPARK_HOME/bin/spark-shell --packages org.neo4j:neo4j-connector-apache-spark_2.12:<version>_for_spark_3
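Once the connector is on the classpath, it is addressed through the Spark DataSource API. A minimal Scala sketch of a read and a write, assuming a Neo4j instance at neo4j://localhost:7687 with placeholder credentials (option names follow the connector's documentation; the URL, credentials, and label are assumptions to adapt to your deployment):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object Neo4jConnectorSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("neo4j-connector-sketch")
      .getOrCreate()

    // Read all nodes labelled :Person into a DataFrame.
    // URL and credentials are placeholders for your own deployment.
    val people = spark.read
      .format("org.neo4j.spark.DataSource")
      .option("url", "neo4j://localhost:7687")
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password")
      .option("labels", "Person")
      .load()

    // Write the DataFrame back, appending new :Person nodes.
    people.write
      .format("org.neo4j.spark.DataSource")
      .mode(SaveMode.Append)
      .option("url", "neo4j://localhost:7687")
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password")
      .option("labels", ":Person")
      .save()

    spark.stop()
  }
}
```

Running this sketch requires a live Spark environment and a reachable Neo4j instance; it is a starting point, not a definitive integration.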


If you use the sbt-spark-package plugin, in your sbt build file, add:

resolvers += "Spark Packages Repo" at ""
libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.12" % "<version>_for_spark_3"


In your pom.xml, add:

  <dependencies>
    <!-- list of dependencies -->
    <dependency>
      <groupId>org.neo4j</groupId>
      <artifactId>neo4j-connector-apache-spark_2.12</artifactId>
      <!-- replace [version] with the connector version, e.g. 5.0.3 -->
      <version>[version]_for_spark_3</version>
    </dependency>
  </dependencies>

For more information about the available versions, see the connector's listing on Maven Central.