New: We’ve started inviting developers into the Atlas Stream Processing private preview. Request access today!

PREVIEW

Atlas Stream Processing

Transform how you build event-driven applications by continuously processing streams of data with a familiar developer experience.
Request Private Preview
Atlas Stream Processing Explained in 3 minutes
Learn how Atlas Stream Processing combines the document model, flexible schemas, and a rich aggregation language to provide a new level of power and convenience when building applications that require processing complex event data at scale.

Stream processing like never before

When working with streaming data, schema management is critical to data correctness and developer productivity. MongoDB’s document model and familiar aggregation framework give developers powerful capabilities and productivity gains you won't find elsewhere in stream processing.

Unifying data in motion and data at rest

For the first time, developers can use one platform – across API, query language, and data model – to continuously process streaming data alongside the critical application data stored in their database.

Fully managed in Atlas

Atlas Stream Processing builds on our robust and integrated developer data platform. With just a few API calls and lines of code, a developer can stand up a stream processor, database, and API serving layer — all fully managed on any of the major cloud providers.

Capabilities

Continuous processing

Build aggregation pipelines to continuously query, analyze, and react to streaming data without the delays inherent to batch processing.


Continuous validation

Perform continuous schema validation to check that events are properly formed before processing, to detect message corruption, and to catch late-arriving data that has missed its processing window.


Continuous merge

Continuously materialize views into Atlas database collections or streaming systems like Apache Kafka to maintain fresh analytical views of data that support decision-making and action.
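As a sketch of what continuous merge can look like, the pipeline below reads events from a Kafka topic, aggregates them over tumbling windows, and continuously upserts the results into an Atlas collection. The connection, database, and collection names ("kafkaConnection", "atlasCluster", "analytics", "hourly_totals") and the event fields are illustrative assumptions, not part of the product documentation.

```javascript
// Hypothetical stream processor: Kafka source -> windowed aggregation -> $merge sink.
// All connection names, topics, and field names are placeholders.
const mergePipeline = [
  // Read events from a Kafka topic registered in the connection registry
  { $source: { connectionName: "kafkaConnection", topic: "sales" } },
  // Aggregate totals over fixed, non-overlapping one-hour windows
  {
    $tumblingWindow: {
      interval: { size: 1, unit: "hour" },
      pipeline: [
        { $group: { _id: "$storeId", total: { $sum: "$amount" } } },
      ],
    },
  },
  // Continuously upsert each window's results into an Atlas collection
  {
    $merge: {
      into: {
        connectionName: "atlasCluster",
        db: "analytics",
        coll: "hourly_totals",
      },
      whenMatched: "replace",
      whenNotMatched: "insert",
    },
  },
];
```

Because the merge runs continuously, downstream dashboards can query `hourly_totals` as an always-fresh materialized view rather than waiting on a batch job.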


Atlas Stream Processing

How does it unify the experience of working with data in motion and data at rest?
Atlas Stream Processing diagram
Event-driven applications
Paving the path to a responsive and reactive real-time business
Download the Whitepaper

Check out the simplicity and power of Atlas Stream Processing

Use Atlas Stream Processing to easily process and validate complex event data, merging it for use exactly where you need it.
View Documentation
Querying Apache Kafka data streams
Atlas Stream Processing makes querying data from Apache Kafka as easy as querying MongoDB. Just define a source, desired aggregation stages, and a sink to quickly process your Apache Kafka data streams.
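A minimal sketch of that source–stages–sink shape, using familiar aggregation operators; the connection name "kafkaConnection", the topics, and the event fields are assumptions for illustration:

```javascript
// Hypothetical Kafka-to-Kafka stream processor definition.
const kafkaPipeline = [
  // Source: a Kafka topic from the connection registry
  { $source: { connectionName: "kafkaConnection", topic: "orders" } },
  // Familiar aggregation stages: filter and reshape each event
  { $match: { status: "shipped" } },
  { $project: { orderId: 1, customerId: 1, shippedAt: 1 } },
  // Sink: emit the processed events to another Kafka topic
  { $emit: { connectionName: "kafkaConnection", topic: "shipped_orders" } },
];
```

In a stream processing instance you would register a pipeline like this as a named processor and start it; the exact registration call may differ in the preview, so treat this as the shape of the definition rather than a verbatim recipe.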
Advanced analytics with windowing functions
Window operators in Atlas Stream Processing allow you to analyze and process specific, fixed-sized windows of data within a continuous data stream, making it easy to discover patterns and trends.
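For example, a tumbling window stage might compute per-minute statistics over a continuous sensor stream. The sensor field names below are illustrative assumptions:

```javascript
// Hypothetical $tumblingWindow stage: fixed-size, non-overlapping
// one-minute windows, each processed with an inner aggregation pipeline.
const windowStage = {
  $tumblingWindow: {
    interval: { size: 1, unit: "minute" },
    pipeline: [
      {
        $group: {
          _id: "$sensorId",                      // one result per sensor per window
          avgTemp: { $avg: "$temperature" },
          maxTemp: { $max: "$temperature" },
        },
      },
    ],
  },
};
```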
Schema validation of complex events
Continuous validation is essential for ensuring events are properly formed before processing, detecting message corruption, and catching late-arriving data that has missed a processing window.
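A sketch of what a validation stage can look like, using standard MongoDB `$jsonSchema` syntax; the event fields and the routing of failed documents to a dead letter queue are assumptions of this sketch, so check the preview documentation for the exact options:

```javascript
// Hypothetical $validate stage: check event shape before processing,
// and route malformed events aside instead of dropping the stream.
const validateStage = {
  $validate: {
    validator: {
      $jsonSchema: {
        required: ["deviceId", "timestamp", "reading"],
        properties: {
          reading: { bsonType: "double" },  // reject non-numeric readings
        },
      },
    },
    validationAction: "dlq",  // assumed: send failures to a dead letter queue
  },
};
```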

FAQ

Want to learn more about stream processing?
What is streaming data?
Streaming data is generated continuously from a wide range of sources. IoT sensors, microservices, and mobile devices are all common sources of high-volume data streams. The continuous nature of streaming data, as well as its immutability, distinguishes it from static data at rest in a database.
What is stream processing?
Stream processing is the continuous ingestion and transformation of event data from an event messaging platform (like Apache Kafka) to perform various functions. This could mean creating simple filters to remove unneeded data, performing aggregations to count or sum data as needed, creating stateful windows, and more. Stream processing can be a differentiating characteristic in event-driven applications, allowing for more reactive, responsive customer experiences.
How is event streaming different from stream processing?

Streaming data lives inside of event streaming platforms (like Apache Kafka), and these systems are essentially an immutable distributed log. Event data is published and consumed from event streaming platforms using APIs.

Developers need to use a stream processor to perform more advanced processing, such as stateful aggregations, window operations, mutations, and creating materialized views. These are similar to the operations one does when running queries on a database, except that stream processing continuously queries an endless stream of data. This area of streaming is more nascent; however, technologies such as Apache Flink and Spark Streaming are quickly gaining traction.

Stream processing is the area where Atlas Stream Processing focuses. MongoDB is providing developers with a better way to process streams for use in their applications, leveraging the aggregation framework.

Why did MongoDB build Atlas Stream Processing?
Stream processing is an increasingly critical component to building responsive, event-driven applications. By adding stream processing functionality as a native capability in Atlas, we can help more developers build innovative applications leveraging our multi-cloud developer data platform, MongoDB Atlas.
How do I get access to the private preview?
You can request access to the private preview of Atlas Stream Processing from this page, and our team will be in touch once access is available.
How is stream processing different from batch processing?

Stream processing handles data continuously, as it is produced. In the context of building event-driven applications, stream processing enables reactive and compelling experiences like real-time notifications, personalization, route planning, or predictive maintenance.

Batch processing does not work on continuously produced data. Instead, batch processing works by gathering data over a specified period of time and then processing that static data as needed. An example of batch processing is a retail business collecting sales at the close of business each day for reporting purposes and/or updating inventory levels.

Request private preview today

Once the private preview of Atlas Stream Processing is available, our team will be in touch.
Request Private Preview