I'm using the MongoDB Spark Connector in Databricks to stream data from MongoDB. My writeStream
works fine, but when I try to readStream I get the following error:
java.lang.ClassNotFoundException: true
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
My query:
query = (spark.readStream
    .format("mongodb")
    .option("spark.mongodb.connection.uri", mongo_uri)
    .option("spark.mongodb.database", "db")
    .option("spark.mongodb.collection", "collection")
    .option("spark.mongodb.change.stream.publish.full.document.only", "true")
    .load())
Configuration:
Spark 3.5, Scala 2.12, MongoDB Spark Connector mongo-spark-connector_2.12:10.4.0