Connectors (BI, Kafka, Spark)
-
Provide an alternative sampling technique for views in MongoBI Connector
As stated in the docs (https://docs.mongodb.com/manual/reference/operator/aggregation/sample/#behavior), the $sample stage will perform a full COLLSCAN if certain criteria are not met. The problem is that no view (https://docs.mongodb.com/manual/core/views/) will ever meet them, because for a view $sample can never be the first aggregation stage. My proposal is to implement an additional, $skip + $limit based sampling technique (a sketch follows below).
2 votes -
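A minimal sketch of the $skip + $limit sampling proposed above, assuming pymongo; the connection string, database, view name, and sample size are illustrative, and this only demonstrates the requested alternative, not anything the BI Connector does today.

    import random
    from pymongo import MongoClient

    # A read-only view; $sample's optimized COLLSCAN path is unavailable here.
    view = MongoClient("mongodb://localhost:27017")["test"]["my_view"]

    sample_size = 100
    doc_count = view.count_documents({})
    skip = random.randint(0, max(doc_count - sample_size, 0))

    # $skip and $limit do not have to be the first stage of the pipeline,
    # so this sampling also works when the pipeline runs through a view.
    sample_docs = list(view.aggregate([{"$skip": skip}, {"$limit": sample_size}]))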
Re-sample only part of the MongoBI Connector schema
It'd be great if we could run "FLUSH SAMPLE collectionName". It's rarely the case that a lot of collections have changed at once and such a "light re-sample" might be a good alternative.
2 votes -
Implement error handling in MongoBI Connector
Right now, if the $sample query (used to create the DRDL schema) fails, it'll re-run the query again and again, forever. The only place where the error is accessible is the "Profiler" tab of a given node. In our case - and I believe it's a common setup - that was a secondary node, whose "Profiler" is far less accessible than the primary node's. We learned it the hard way, as our cluster was silently maxing out one CPU core for over a week. After a lot of debugging, it turned out that one of our views has…
2 votes -
Enable Write Permission In BI
Hopefully the BI Connector will support write permissions, because a lot of systems, especially Windows-based ones, use ODBC.
6 votes -
BI Connector Atlas - View current schema creation/update status
In Atlas there is no way to see the BI Connector logs, so the request is to be able to see the current schema creation/update status or, even better, to see the log itself.
6 votes -
MongoDB Sink Connector CDC default handler
I would like to have a default CDC handler that can process data produced by the MongoDB Source Connector without Debezium (a configuration sketch follows below): https://docs.mongodb.com/kafka-connector/master/kafka-sink-cdc#cdc-handler-configuration
3 votes -
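A sketch of how the requested configuration might look, posted to the Kafka Connect REST API with Python's requests library. The handler class name is hypothetical (it illustrates the feature being asked for); the other property names are the sink connector's standard settings, and the topic, URI, database, and collection names are illustrative.

    import json
    import requests

    sink_config = {
        "name": "mongo-sink-cdc",
        "config": {
            "connector.class": "com.mongodb.kafka.connect.sink.MongoSinkConnector",
            "topics": "source.db.collection",
            "connection.uri": "mongodb://localhost:27017",
            "database": "target_db",
            "collection": "target_coll",
            # Hypothetical default handler for raw change stream events
            # emitted by the MongoDB Source Connector (no Debezium involved).
            "change.data.capture.handler":
                "com.mongodb.kafka.connect.sink.cdc.mongodb.ChangeStreamHandler",
        },
    }

    requests.post("http://localhost:8083/connectors",
                  headers={"Content-Type": "application/json"},
                  data=json.dumps(sink_config))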
Multiple clusters in a single mongosqld configuration.
Is supporting multiple clusters in a single mongosqld configuration still a limitation, or is it supported now?
3 votes -
Get schema validation "feedback" in Kafka Mongo Sink Connector
Objective:
We want to be able to validate that data matches some requirements. We would like to perform this data validation by adding a JSON schema in MongoDB (as described here: https://docs.mongodb.com/manual/core/schema-validation/).
The problem is that the current implementation of the MongoDB Kafka Sink connector does not implement the elements required to benefit from the features brought by this KIP: https://cwiki.apache.org/confluence/display/KAFKA/KIP-610%3A+Error+Reporting+in+Sink+Connectors
So if we define such validation in MongoDB and a message has a value that does not match the definition, it would not go to the dead letter queue, and the…
3 votes -
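A minimal sketch of the server-side schema validation referenced above, assuming pymongo; the database, collection name, and schema fields are illustrative. A sink-connector write that violates this validator is rejected by MongoDB, and the request is for that rejection to be reportable to the dead letter queue per KIP-610.

    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017")["target_db"]

    # Create a collection whose documents must satisfy a $jsonSchema validator.
    db.create_collection(
        "orders",
        validator={
            "$jsonSchema": {
                "bsonType": "object",
                "required": ["orderId", "amount"],
                "properties": {
                    "orderId": {"bsonType": "string"},
                    "amount": {"bsonType": "double", "minimum": 0},
                },
            }
        },
    )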
Notification and alerts when BI Connector fails
When a BI Connector fails, no alerts are sent to admins; the only indication is the notification on the page saying that it will "restart in 5 minutes".
6 votes -
Make BI Connector for Atlas pricing clear upfront
From this page https://docs.atlas.mongodb.com/bi-connection/ it appears that the BI Connector for Atlas is available if I have an M10 or larger cluster. I upgraded to an M10 cluster to get the BI connector only to discover that if I enable the BI connector, it charges me additional costs. I could not find these additional BI connector prices detailed anywhere on your website except for in my account after I upgraded to the M10 cluster. And even these pricing details are not clear:
BI Connector pricing for M10: $1.47/day for sustained monthly usage, or $3.84/day, up to a $45.00/month maximum.
What is…
1 vote -
Allow separate whitelist for Atlas BI Connector
Currently, whitelisting is only possible at the Project level. We would like to allow whitelisting of the BI connector instance separately from the project.
The users/IPs that connect to Mongo are completely different for the BI Connector vs. the actual DB.
8 votes -
Deploy MongoDB BI Connector product using the MongoDB Kubernetes Operator
We would like to have the ability to deploy and run the MongoDB BI Connector as a container under the MongoDB Kubernetes Operator. Currently there is no support for such deployments.
2 votes -
Dropbox
Lead with the drop that will very allow. Permit to be inside and all the time useful !!
stay tuned.
1 vote -
Kafka source connector once only semantics
Added as a support case here: https://support.mongodb.com/case/00634630
When using the connector as a Source, i.e. we capture change streams from the source MongoDB and stream them to a Kafka endpoint.
Imagine these are updates on financial transactions in MongoDB and they are NOT tolerant to
1) missed data and
2) duplicated data
in that order. So, we need to make sure that the change stream events we are observing (matching) on are delivered once and exactly once to the Kafka pipeline. (Blog on the same topic: https://www.confluent.io/blog/exactly-once-semantics-are-possible-heres-how-apache-kafka-does-it/.) If exactly-once semantics are enabled, it makes commits transactional by default.
…
4 votes
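A minimal sketch of the exactly-once producer semantics described in the linked Confluent blog post, using the confluent-kafka Python client. This only illustrates the delivery guarantee being requested for the source connector; it is not the connector's own configuration, and the broker address, topic, transactional id, and payload are all illustrative.

    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "localhost:9092",
        "enable.idempotence": True,            # retries cannot introduce duplicates
        "transactional.id": "mongo-cdc-demo",  # enables transactional commits
    })

    producer.init_transactions()
    producer.begin_transaction()
    # In the connector scenario each change stream event would be produced
    # inside a transaction together with its resume-token/offset commit.
    producer.produce("financial.transactions", key="txn-1", value=b'{"amount": 42}')
    producer.commit_transaction()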