

Welcome to the new MongoDB Feedback Portal!


Feedback

Atlas Archive on Azure requires India Region due to Digital Personal Data Protection Rules 2025

With the enforcement of the DPDP (Digital Personal Data Protection Rules 2025) regulations effective November 1, 2025, org...
Rankesh Kumar 4 months ago in Online Archive 6 Will Not Implement

Online Archive support for the UAE (me-central-1) region

The business needs to store data within the UAE for compliance reasons. ...
Buzzin Buzzin 3 months ago in Online Archive 1 Future Consideration

Move Online Archive data to Glacier after a specific retention period

Move data from S3 to Glacier storage after a specific time, which costs much less than S3. ...
Ashish Shukla 4 months ago in Online Archive 1 Will Not Implement
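The transition this item asks for maps naturally onto an S3 lifecycle rule. A minimal sketch, assuming Atlas were to apply such a rule to the Online Archive bucket — the prefix, retention period, and storage class below are illustrative assumptions, not actual Online Archive settings:

```python
# Hypothetical sketch: an S3 lifecycle rule of the shape the request
# describes. The prefix and day count are assumptions.

def glacier_transition_rule(prefix: str, days_until_glacier: int) -> dict:
    """Build an S3 lifecycle rule that moves objects under `prefix`
    to the GLACIER storage class after `days_until_glacier` days."""
    return {
        "ID": f"archive-to-glacier-{prefix.strip('/')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": days_until_glacier, "StorageClass": "GLACIER"}
        ],
    }

rule = glacier_transition_rule("online-archive/", 90)
```

Such a rule would be one entry in the `Rules` list of an S3 bucket lifecycle configuration; the point of the request is that Atlas, which owns the archive bucket, would have to expose this as a setting.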

Add X509 as an Authentication Method

The current authentication methods lack the ability to control password complexity requirements (SCRAM-SHA-*) or require exposing on-premise LDAP servers to the public internet. The addition of X509 is worthwhile given that MongoDB Atlas alre...
Andrew Raczka almost 5 years ago in Kafka Connector / Spark Connector 0 Submitted

CFLE support for Kafka connector

Using MongoDB CFLE, data needs to be passed downstream via the Kafka connector and consumed by a downstream data lake for further processing. When data is pushed into Kafka it remains encrypted, and the same encrypted data will land in the ...
Guest about 3 years ago in Kafka Connector 0 Submitted

Retry/reconnect mechanism for MongoDB Source Connectors on MongoTimeoutException

The MongoDB Kafka Source and Sink connectors for data streaming work seamlessly until an error occurs in the Kafka Source connectors. When an error occurs, the connectors do not recover from the timeout exceptions and ...
Guest over 2 years ago in Kafka Connector 0 Submitted
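For context, the Kafka Connect framework already exposes generic error-retry settings; the ask here is that the MongoDB Source Connector honor a comparable retry/reconnect policy when a `MongoTimeoutException` occurs. A sketch of a connector config using only the standard Connect properties (which today apply mainly to sink-side conversion errors, not source timeouts):

```python
# Sketch of the kind of configuration the request implies. Only the
# standard Kafka Connect error-handling properties are shown; the URI
# is a placeholder, and whether these settings would cover source-side
# MongoTimeoutException is exactly what the request asks for.

connector_config = {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "mongodb://localhost:27017",
    # Standard Kafka Connect error handling:
    "errors.tolerance": "all",
    "errors.retry.timeout": "300000",       # keep retrying for 5 minutes
    "errors.retry.delay.max.ms": "60000",   # back off up to 1 minute
}
```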

Ignore heartbeats-mongodb topic by default

As per KAFKA-208, SMTs can't be applied to the heartbeats-mongodb topic. Users should not have to configure each connector to ignore this topic. Please either ignore this topic by default or provide a command-line switch so it can be ignored.
Guest over 4 years ago in Kafka Connector 0 Submitted
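The per-connector workaround this item wants to eliminate can be sketched with a negative-lookahead topic pattern, of the kind a sink's `topics.regex` accepts — a hypothetical illustration, not a documented default:

```python
import re

# Today each sink connector must individually exclude the heartbeat
# topic, e.g. with a negative-lookahead topics.regex. This sketch just
# demonstrates the pattern's effect; the request is for the connector
# to ignore heartbeats-mongodb by default instead.
TOPICS_REGEX = r"^(?!heartbeats-mongodb$).*"

def topic_included(topic: str) -> bool:
    """True if the topic would be consumed under TOPICS_REGEX."""
    return re.fullmatch(TOPICS_REGEX, topic) is not None
```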

Built-in Partial Update $push Strategy for MongoDB Kafka Connector

We would like the official MongoDB Kafka Connector to natively support partial updates that append ($push) records into an array in an upsert scenario: specifically, a built-in WriteModelStrategy to handle use cases where: We match on a configura...
Guest about 1 year ago in Kafka Connector 0 Submitted
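A minimal sketch of the write model the requested strategy would issue — match on a configurable key field, `$push` the record into an array field, upsert if nothing matches. The field names and the dict shape are illustrative assumptions (the real connector strategy would be a Java `WriteModelStrategy` producing an `UpdateOneModel`):

```python
# Hypothetical sketch of the requested built-in strategy's output,
# expressed as the filter/update/options an UpdateOne would carry.
# match_field and array_field stand in for configurable settings.

def push_upsert_model(record: dict, match_field: str, array_field: str) -> dict:
    """Return the pieces of an upserting $push update for `record`."""
    return {
        "filter": {match_field: record[match_field]},
        "update": {"$push": {array_field: record}},
        "options": {"upsert": True},
    }

model = push_upsert_model({"orderId": 42, "item": "book"}, "orderId", "events")
```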

Get schema validation "feedback" in Kafka Mongo Sink Connector

Objective: We want to be able to validate that data matches certain requirements. We would like to perform this data validation by adding a JSON schema in Mongo (as described here: https://docs.mongodb.com/manual/core/schema-validat...
Guest over 5 years ago in Kafka Connector 1 Submitted
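The server-side validator the item refers to is a `$jsonSchema` document attached to a collection; the request is for the sink connector to surface the server's validation errors as feedback. A minimal example validator (field names are illustrative assumptions):

```python
# A minimal $jsonSchema validator of the kind MongoDB schema
# validation uses. It would be passed as the `validator` option when
# creating or modifying a collection; the fields here are made up.

validator = {
    "$jsonSchema": {
        "bsonType": "object",
        "required": ["email", "age"],
        "properties": {
            "email": {"bsonType": "string"},
            "age": {"bsonType": "int", "minimum": 0},
        },
    }
}
```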

MongoDB Sink Connector CDC default handler

I would like to have a default CDC handler that can process data produced by the MongoDB Source Connector without Debezium: https://docs.mongodb.com/kafka-connector/master/kafka-sink-cdc#cdc-handler-configuration
Guest over 5 years ago in Kafka Connector 2 Submitted
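What such a default handler would do can be sketched as a mapping from MongoDB change-stream events (the format the Source Connector emits) to sink-side writes, with no Debezium envelope in between. The event field names (`operationType`, `documentKey`, `fullDocument`) follow the change-stream format; the handler logic itself is an assumption, not the connector's actual implementation:

```python
# Hedged sketch of a default CDC handler: translate a change-stream
# event into the write a sink would apply. Logic is illustrative.

def handle_change_event(event: dict) -> dict:
    """Map a change-stream event to an upsert or delete description."""
    op = event["operationType"]
    key = event["documentKey"]
    if op in ("insert", "replace", "update"):
        return {"op": "upsert", "filter": key,
                "document": event.get("fullDocument")}
    if op == "delete":
        return {"op": "delete", "filter": key}
    raise ValueError(f"unsupported operationType: {op}")

result = handle_change_event({
    "operationType": "insert",
    "documentKey": {"_id": 1},
    "fullDocument": {"_id": 1, "name": "Ada"},
})
```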