Release Notes
MongoDB Connector for Spark 3.0.0
Released on August 17, 2020
Support for Spark 3.0.0.
Updated mongodb-driver-sync version to 4.0.4.
Updated dependency to mongodb-driver-sync.
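As a quick usage sketch for the 3.0.0 line (a sketch only: the URI and namespace are placeholders, and it assumes the connector jar built for Scala 2.12 is on the classpath, for example via --packages org.mongodb.spark:mongo-spark-connector_2.12:3.0.0):

```scala
import org.apache.spark.sql.SparkSession

// Placeholder connection URI; database "test", collection "myCollection".
val spark = SparkSession.builder()
  .appName("mongo-spark-3.0.0-example")
  .master("local[*]")
  .config("spark.mongodb.input.uri", "mongodb://localhost/test.myCollection")
  .getOrCreate()

// The "mongo" short-form format name continues to work with the 3.0.0 connector.
val df = spark.read.format("mongo").load()
df.printSchema()
```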
MongoDB Connector for Spark 2.4.2
Released on June 10, 2020
Updated MongoDB Java Driver to 3.12.5.
Don't use Java SPI for the data source internally.
Fixed a BsonOrdering bug for strings of different lengths.
MongoDB Connector for Spark 2.4.1
Released on June 6, 2019
Ensures nullable fields or container types accept null values.
Added the ReadConfig.batchSize property. For more information, see Input Configuration.
Renamed the system property spark.mongodb.keep_alive_ms to mongodb.keep_alive_ms.
Added MongoDriverInformation to the default MongoClient.
Updated to the latest Java driver (3.10.+).
Updated PartitionerHelper.matchQuery to no longer include $ne/$exists checks.
Added logging support for partitioners and their queries.
Added the WriteConfig.extendedBsonTypes setting so users can disable extended BSON types when writing. For more information, see Output Configuration.
Added Java SPI support so the short form spark.read.format("mongo") can be used in place of spark.read.format("com.mongodb.spark.sql") and spark.read.format("com.mongodb.spark.sql.DefaultSource"). The read and write sketches after this list illustrate the short form together with the new batchSize and extendedBsonTypes options.
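A minimal read sketch for the 2.4.1 additions, assuming the 2.4.1 connector is on the classpath (so the "mongo" short form is registered via Java SPI); the URI and batch size are placeholder values:

```scala
import org.apache.spark.sql.SparkSession

// The renamed keep-alive setting is a JVM system property, e.g. passed to the
// driver as -Dmongodb.keep_alive_ms=5000 (placeholder value).
val spark = SparkSession.builder()
  .appName("mongo-spark-2.4.1-read")
  .config("spark.mongodb.input.uri", "mongodb://localhost/test.myCollection")
  .getOrCreate()

// "mongo" is the Java SPI short form for com.mongodb.spark.sql.DefaultSource.
val df = spark.read
  .format("mongo")
  .option("batchSize", "10000") // ReadConfig.batchSize: cursor batch size (placeholder)
  .load()
df.printSchema()
```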
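And a corresponding write sketch for WriteConfig.extendedBsonTypes, continuing from the read sketch above (the target URI, save mode, and option value are placeholders):

```scala
// Write df back with extended BSON types disabled (extendedBsonTypes defaults to true).
df.write
  .format("mongo")
  .mode("append")
  .option("uri", "mongodb://localhost/test.outCollection") // placeholder target namespace
  .option("extendedBsonTypes", "false")
  .save()
```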
MongoDB Connector for Spark 2.4.0
Released on December 7, 2018
Support for Spark 2.4.0. Updated Spark dependency to 2.4.0.
Ensure WriteConfig.ordered is applied to write operations (see the sketch after this list).
Updated Mongo Java Driver to 3.9.0.
Added Scala 2.12 support.
Fixed MongoSpark.toDF() to use the provided MongoConnector.
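A minimal sketch of overriding WriteConfig.ordered when saving, assuming a SparkSession named spark with spark.mongodb.output.uri configured and a DataFrame df to write; the collection name and option values are placeholders:

```scala
import com.mongodb.spark.MongoSpark
import com.mongodb.spark.config.WriteConfig

// Override the default WriteConfig from the SparkContext: write to a placeholder
// collection with ordered bulk writes disabled.
val writeConfig = WriteConfig(
  Map("collection" -> "targetCollection", "ordered" -> "false"),
  Some(WriteConfig(spark.sparkContext)))

// Save the DataFrame using the explicit WriteConfig.
MongoSpark.save(df, writeConfig)
```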