# Neo4j Connector for Apache Spark

This repository contains the Neo4j Connector for Apache Spark.

## License

The Neo4j Connector for Apache Spark is Apache 2.0 licensed.

## Documentation

The documentation for the Neo4j Connector for Apache Spark lives in the https://github.com/neo4j/docs-spark repository.

## Building for Spark 3

You can build for Spark 3.x with both Scala 2.12 and Scala 2.13:

```
./maven-release.sh package 2.12
./maven-release.sh package 2.13
```

These commands generate the corresponding targets:

* `spark-3/target/neo4j-connector-apache-spark_2.12-[version]_for_spark_3.jar`
* `spark-3/target/neo4j-connector-apache-spark_2.13-[version]_for_spark_3.jar`

## Integration with Apache Spark Applications

**spark-shell, pyspark, or spark-submit**

`$SPARK_HOME/bin/spark-shell --jars neo4j-connector-apache-spark_2.12-[version]_for_spark_3.jar`

`$SPARK_HOME/bin/spark-shell --packages org.neo4j:neo4j-connector-apache-spark_2.12:[version]_for_spark_3`

**sbt**

If you use the [sbt-spark-package plugin](https://github.com/databricks/sbt-spark-package), add the following to your sbt build file:

```scala
resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.12" % "[version]_for_spark_3"
```

**maven**

In your pom.xml, add:

```xml
<dependency>
  <groupId>org.neo4j</groupId>
  <artifactId>neo4j-connector-apache-spark_2.12</artifactId>
  <version>[version]_for_spark_3</version>
</dependency>
```

For more information about the available versions, visit https://neo4j.com/developer/spark/overview/#_compatibility
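
**Example (Scala)**

As a quick illustration of how the connector is used once it is on the classpath, here is a minimal Scala sketch that reads nodes into a DataFrame via the connector's `org.neo4j.spark.DataSource`. The URL, credentials, and `Person` label below are placeholder assumptions for a local Neo4j instance with basic authentication; substitute your own deployment details.

```scala
import org.apache.spark.sql.SparkSession

object Neo4jConnectorExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("neo4j-connector-example")
      .getOrCreate()

    // Read all nodes with the :Person label into a DataFrame.
    // url, username, and password are placeholders for your own deployment.
    val people = spark.read
      .format("org.neo4j.spark.DataSource")
      .option("url", "bolt://localhost:7687")
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password")
      .option("labels", "Person")
      .load()

    people.show()

    spark.stop()
  }
}
```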