MongoDB Spark Connector
The official MongoDB Spark Connector.
The connector is published on Spark packages, the community index of third-party packages for Apache Spark. The binaries and dependency information for Maven, SBT, Ivy, and others can also be found on Maven Central.
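As an illustration, an SBT dependency declaration typically looks like the following. The group and artifact coordinates below are a sketch; check Maven Central for the exact artifact name and the version matching your Spark and Scala versions:

```scala
// build.sbt fragment — verify the coordinates and version on Maven Central
// for your Spark/Scala combination before using them.
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "<version>"
```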
Support / Feedback
For issues with, questions about, or feedback for the MongoDB Spark Connector, please look into our support channels. Please do not email any of the developers directly with issues or questions - you're more likely to get an answer on the mongodb-user discussion forum.
At a minimum, please include in your description the exact version of the driver that you are using. If you are having connectivity issues, it's often also useful to paste in the Spark configuration. You should also check your application logs for any connectivity-related exceptions and post those as well.
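When sharing your Spark configuration, the connection-related settings are usually the relevant part. A minimal sketch of how the MongoDB URIs are typically supplied is shown below; the exact property keys vary between connector versions, so treat the keys here as assumptions and verify them against the documentation for your version:

```scala
// Sketch only: property keys such as "spark.mongodb.input.uri" are assumed
// and differ between connector versions — check the docs for your release.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local")
  .appName("MongoSparkSample")
  .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.coll")  // assumed key
  .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.coll") // assumed key
  .getOrCreate()
```

Redact any credentials from the URIs before posting them to a public forum.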
Bugs / Feature Requests
Think you’ve found a bug? Want to see a new feature in the Spark driver? Please open a case in our issue management tool, JIRA:
- Create an account and login.
- Navigate to the SPARK project.
- Click Create Issue - Please provide as much information as possible about the issue type and how to reproduce it.
Bug reports in JIRA for the driver and the Core Server (i.e. SERVER) project are public.
If you’ve identified a security vulnerability in a driver or any other MongoDB project, please report it according to the instructions here.
The MongoDB Spark Connector does not follow semantic versioning; instead, the connector's version corresponds to the version of Spark it supports.
Major changes, such as new APIs or updates to the underlying Java driver to support new features, may occur between point releases. See the changelog for information about changes between releases.
Note: The following instructions are intended for internal use.
Please see the downloading instructions for information on getting and using the MongoDB Spark Connector.
To build the driver:
$ git clone https://github.com/mongodb/mongo-spark.git
$ cd mongo-spark
$ ./sbt check
To publish the signed jars, first commit and tag all changes to publish, then run:
$ ./sbt +publishArchives
To publish to Spark Packages:
$ ./sbt +spPublish
See the sbt-spark-package plugin for more information.
Note: If uploading fails, run ./sbt +spDist and manually upload the archives to Spark Packages.
Maintainers
- Ross Lawley email@example.com
Additional contributors can be found here.