
Batch Mode

In batch mode, you can use the Spark Dataset and DataFrame APIs to process data at a specified time interval.

The following sections show you how to use the Spark Connector to read data from and write data to MongoDB in batch mode:

  • Read from MongoDB in Batch Mode

  • Write to MongoDB in Batch Mode
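Before the detailed read and write sections, the basic flow can be sketched in PySpark. This is a hypothetical example, not the official sample: it assumes Spark Connector v10.x (which registers the `mongodb` data source format and the `spark.mongodb.read.connection.uri` / `spark.mongodb.write.connection.uri` settings), and the URI, database, and collection names are placeholders.

```python
# Minimal batch read/write sketch using the MongoDB Spark Connector (assumes v10.x).
# The connection URI and the database/collection names below are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mongodb-batch-example")
    .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017")
    .config("spark.mongodb.write.connection.uri", "mongodb://localhost:27017")
    .getOrCreate()
)

# Read an entire collection into a DataFrame in a single batch.
df = (
    spark.read
    .format("mongodb")
    .option("database", "people")
    .option("collection", "contacts")
    .load()
)

# Process the data with the DataFrame API, then write the result
# back to another collection in a single batch.
(
    df.write
    .format("mongodb")
    .mode("append")
    .option("database", "people")
    .option("collection", "contacts_copy")
    .save()
)
```

Running this sketch requires a Spark installation with the MongoDB Spark Connector package on the classpath and a reachable MongoDB deployment; the read and write sections linked above cover the available options in detail.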

Tip

Apache Spark Documentation

To learn more about using Spark to process batches of data, see the Spark Programming Guide.
