
Hi, I am pulling data from a MongoDB collection into a Kafka topic using the MongoDB source connector. However, the schema that gets registered in my Schema Registry has no namespace and its name is "default", which causes issues in my Java consumer. How can I get the connector to register the schemas properly?
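For reference, the value schema that ends up in the registry currently looks roughly like this (the field names below are just placeholders standing in for my actual document fields):

{
  "type": "record",
  "name": "default",
  "fields": [
    { "name": "name", "type": ["null", "string"], "default": null },
    { "name": "age", "type": ["null", "int"], "default": null }
  ]
}

whereas my Java consumer expects a fully qualified record name (something like com.example.MyDocument, that name is just an example), and the bare "default" name with no namespace is what seems to trip it up.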

Here is what my configuration looks like:

publish.full.document.only.tombstone.on.delete: true
# Publish Full Document Only
publish.full.document.only: true
copy.existing: "true"
copy.existing.pipeline: "[{\"$match\": {}}]"
copy.existing.max.threads: "1"
key.converter: "org.apache.kafka.connect.storage.StringConverter"
key.converter.schemas.enable: "false"
key.converter.schema.registry.url: "http://schema-registry-external:8081"
key.converter.charset: "UTF-8"
value.converter.schemas.enable: "true"
value.converter: "io.confluent.connect.avro.AvroConverter"
value.converter.schema.registry.url: "http://schema-registry-external:8081"
value.converter.charset: "UTF-8"
key.converter.key.subject.name.strategy: "io.confluent.kafka.serializers.subject.TopicNameStrategy"
value.converter.value.subject.name.strategy: "io.confluent.kafka.serializers.subject.TopicNameStrategy"
# This ensures the Avro schema is generated with proper field types
output.format.value: schema
# Ensure numbers and dates are correctly formatted in JSON
output.format.key: json
# This infers native Kafka types (int, long, timestamp) rather than MongoDB's extended JSON.
output.schema.infer.value: true
output.schema.infer.key: true
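One thing I have been considering, based on my reading of the connector docs (so this may well be the wrong approach), is declaring the value schema explicitly via output.schema.value so it gets a proper name and namespace. Something roughly like this, where the namespace com.example, the record name, and the field list are placeholders for my real documents:

output.schema.value: |
  {
    "type": "record",
    "name": "MyDocument",
    "namespace": "com.example",
    "fields": [
      { "name": "name", "type": "string" },
      { "name": "age", "type": "int" }
    ]
  }

Is that the intended way to control the registered name and namespace, or is there a setting that makes the connector infer a sensible name on its own?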

I am also facing issues with exporting the message key as Avro. I want the key to be built from the properties inside "_id", e.g. "_id": {"name": "test", "age": 12} should become {"name": "test", "age": 12} as the message key, with a proper Avro schema registered for it. So far I have only managed to get it as a JSON string.
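My guess is that I need to switch the key to the Avro converter and schema output as well, and then pull the fields out of _id with something like an ExtractField transform. A rough sketch of what I have in mind (I have not gotten this working, and the transform name is just an example):

key.converter: "io.confluent.connect.avro.AvroConverter"
key.converter.schema.registry.url: "http://schema-registry-external:8081"
output.format.key: schema
# promote the _id sub-document (name/age) to be the whole message key
transforms: extractIdFromKey
transforms.extractIdFromKey.type: org.apache.kafka.connect.transforms.ExtractField$Key
transforms.extractIdFromKey.field: _id

Is that the right direction, or is there a cleaner way to get a structured Avro key out of the _id sub-document?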