A new generation of technologies is needed to consume and exploit today's real-time, fast-moving data sources. Apache Kafka, originally developed at LinkedIn, has emerged as one of these key new technologies. This paper explores the use cases and architecture for Kafka, and how it integrates with MongoDB to build sophisticated, data-driven applications that take advantage of these new data sources.
Download the white paper to learn:
- What data streaming is and where it fits into modern data architectures
- How Kafka works, what it delivers, and where it's used
- Implementation recommendations & limitations
- What alternatives exist and which technologies complement Kafka
- How to operationalize the Data Lake with MongoDB & Kafka
- How MongoDB integrates with Kafka – both as a producer and a consumer of event data (see the sketch after this list)
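To make the last point concrete, below is a minimal sketch of MongoDB acting as a consumer (sink) of Kafka event data, using the plain Java Kafka client and the MongoDB Java driver. The broker address, the `clickstream` topic, the `streaming.events` collection, and the assumption that messages carry JSON strings are all illustrative choices, not part of the paper; in practice the Kafka Connect based MongoDB connector is the usual way to wire this up.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.bson.Document;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;

public class KafkaToMongoSink {
    public static void main(String[] args) {
        // Hypothetical connection details for the example -- replace with your own.
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "mongo-sink-demo");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             MongoClient mongoClient = MongoClients.create("mongodb://localhost:27017")) {

            MongoCollection<Document> events =
                    mongoClient.getDatabase("streaming").getCollection("events");

            // Subscribe to a hypothetical event topic.
            consumer.subscribe(Collections.singletonList("clickstream"));

            while (true) {
                // Poll Kafka for the next batch of event records.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Assume each message value is a JSON document; persist it in MongoDB
                    // along with its Kafka provenance for traceability.
                    Document doc = Document.parse(record.value())
                            .append("kafkaTopic", record.topic())
                            .append("kafkaOffset", record.offset());
                    events.insertOne(doc);
                }
            }
        }
    }
}
```

The reverse direction (MongoDB as a producer) follows the same pattern: read changes from the database, for example via a change stream, and publish each one to a Kafka topic with a `KafkaProducer`.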