
ATLAS

Atlas Stream Processing

Simplify integrating MongoDB with Apache Kafka to build event-driven applications.

Get Started Now

A data model built for streaming data

Schema management is critical to data correctness and developer productivity when working with streaming data. The document model gives developers a flexible, natural data model for building apps with real-time data.


A unified developer experience

Developers can use one platform—across API, query language, and data model—to continuously process streaming data from Apache Kafka alongside the critical application data stored in their databases.

Fully managed in Atlas

With a few lines of code, developers can quickly integrate streaming data from Apache Kafka with their database to build reactive, responsive applications—all fully managed with Atlas.
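
To illustrate, here is a minimal sketch of what that can look like from mongosh connected to an Atlas Stream Processing instance; the connection names, database, and collection below are hypothetical, not a prescribed setup:

```javascript
// A hedged sketch, assuming a stream processing instance with a Kafka
// connection "kafkaProd" and an Atlas cluster connection "atlasProd"
// already registered in the Connection Registry (names are illustrative).
sp.createStreamProcessor("orderSync", [
  // Read events continuously from a Kafka topic.
  { $source: { connectionName: "kafkaProd", topic: "orders" } },
  // Write each event into an Atlas collection.
  { $merge: { into: { connectionName: "atlasProd", db: "sales", coll: "orders" } } }
]);

sp.orderSync.start();   // begin continuous processing, managed by Atlas
sp.orderSync.stats();   // inspect throughput and state at any time
```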


Native stream processing in MongoDB Atlas

Use Atlas Stream Processing to easily process and validate complex event data, merging it exactly where you need to use it.

Integrate with Apache Kafka data streams

Atlas Stream Processing makes querying data from Apache Kafka as easy as querying a MongoDB database. A stream processor is made up of a source stage, any number of processing stages, and a sink stage.
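
As a hedged example, the sketch below shows those three parts in a single pipeline; the connection names, topic, and fields are assumptions:

```javascript
// Illustrative pipeline showing the three parts of a stream processor.
const pipeline = [
  // 1. Source stage: read a continuous stream of events from Kafka.
  { $source: { connectionName: "kafkaProd", topic: "clickstream" } },

  // 2. Processing stages: any number of familiar Query API operators.
  { $match: { "event.type": "purchase" } },
  { $project: { userId: 1, amount: 1, ts: 1 } },

  // 3. Sink stage: emit results to Kafka, or $merge them into Atlas.
  { $emit: { connectionName: "kafkaProd", topic: "purchases.filtered" } }
];

sp.createStreamProcessor("purchaseFilter", pipeline);
sp.purchaseFilter.start();
```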

Read the documentation
MongoDB Query API

Perform continuous analytics using window functions

Window operators in Atlas Stream Processing allow you to analyze and process specific, fixed-size windows of data within a continuous data stream, making it easy to discover patterns and trends in near real-time.
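
For example, a sketch of a tumbling window (one kind of fixed-size window) that aggregates events into one-minute buckets, assuming the same hypothetical connections as above:

```javascript
// A sketch of continuous analytics over fixed-size (tumbling) windows.
sp.createStreamProcessor("salesPerMinute", [
  { $source: { connectionName: "kafkaProd", topic: "orders" } },
  {
    $tumblingWindow: {
      // Group events into consecutive, non-overlapping 1-minute windows.
      interval: { size: NumberInt(1), unit: "minute" },
      pipeline: [
        // Inside each window, use ordinary aggregation stages.
        { $group: { _id: "$item", totalQty: { $sum: "$quantity" }, orders: { $sum: 1 } } }
      ]
    }
  },
  { $merge: { into: { connectionName: "atlasProd", db: "analytics", coll: "salesPerMinute" } } }
]);
```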

Read the documentation
MongoDB Query API

Validate schema on complex events

In Atlas Stream Processing, developers can perform continuous schema validation to check that events are properly formed before processing, detect potential message corruption, and handle late-arriving data.
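
Below is a hedged sketch of continuous validation with the $validate stage and a dead letter queue; the schema, connections, and DLQ collection are illustrative assumptions:

```javascript
// A sketch of continuous schema validation with $validate.
sp.createStreamProcessor("validatedOrders", [
  { $source: { connectionName: "kafkaProd", topic: "orders" } },
  {
    $validate: {
      // Only documents matching this $jsonSchema continue down the pipeline.
      validator: {
        $jsonSchema: {
          required: ["orderId", "items", "ts"],
          properties: {
            orderId: { bsonType: "string" },
            items: { bsonType: "array" }
          }
        }
      },
      // Send malformed or corrupted events to a dead letter queue
      // instead of failing the processor.
      validationAction: "dlq"
    }
  },
  { $merge: { into: { connectionName: "atlasProd", db: "sales", coll: "orders" } } }
],
// DLQ target configured in the processor options (illustrative).
{ dlq: { connectionName: "atlasProd", db: "sales", coll: "orders_dlq" } });
```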

Read the documentation
MongoDB Query API

Atlas Stream Processing customer successes

View all customers
CONTINUOUS INSIGHTS
"At Acoustic, our key focus is to empower brands with behavioral insights that enable them to create engaging, personalized customer experiences. With Atlas Stream Processing, our engineers can leverage the skills they already have from working with data in Atlas to process new data continuously, ensuring our customers have access to real-time customer insights."
John Riewerts
EVP of Engineering, Acoustic
Learn More

Learning hub

Find white papers, tutorials, and videos about how to handle streaming data.

Stream processing use cases

View all use cases

FAQ

Ready to get started?

Check out a tutorial to get started creating a stream processor today.
Get Started Now
Register now
GET STARTED TODAY
  • Easily integrate Kafka & MongoDB
  • Process data continuously
  • Native MongoDB experience
  • Available globally