I'm using the MongoDB Spark Connector in Databricks to stream data from MongoDB. My writeStream
works fine, but when I try readStream I get the following error:
java.lang.ClassNotFoundException: true
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
My query:
query = (spark.readStream
    .format("mongodb")
    .option("spark.mongodb.connection.uri", mongo_uri)
    .option("spark.mongodb.database", "db")
    .option("spark.mongodb.collection", "collection")
    .option("spark.mongodb.change.stream.publish.full.document.only", "true")
    .load())
Configuration:
Spark 3.5, Scala 2.12, using the MongoDB Spark Connector mongo-spark-connector_2.12, version 10.4.0.
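For reference, this is how I attach the connector to the job. The Maven coordinate below is what I believe the Scala 2.12 / 10.4.0 build of the connector resolves to; the script name is just a placeholder for my streaming job.

```shell
# Assumed artifact coordinate for the connector version mentioned above
# (org.mongodb.spark group, Scala 2.12 build, version 10.4.0).
# my_stream_job.py is a placeholder for the actual job script.
spark-submit \
  --packages org.mongodb.spark:mongo-spark-connector_2.12:10.4.0 \
  my_stream_job.py
```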