Connectors (BI, Kafka, Spark)

41 results found

  1. Kafka source connector once only semantics

Added as a support case here: https://support.mongodb.com/case/00634630

    This concerns using the connector as a Source, i.e., capturing change streams from the source MongoDB deployment and streaming them to a Kafka endpoint.

    Imagine these are updates on financial transactions in MongoDB and they are NOT tolerant to
    1) missed data and
    2) duplicated data,
    in that order.

    So, we need to make sure that the change streams we are observing (matching) on are delivered exactly once to the Kafka pipeline. (Blog on the same: https://www.confluent.io/blog/exactly-once-semantics-are-possible-heres-how-apache-kafka-does-it/). If exactly-once semantics are enabled, offset commits become transactional by default.
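    As a rough sketch of what this could look like: Kafka Connect 3.3+ (KIP-618) lets a source connector request exactly-once delivery through a worker setting plus two connector settings. The MongoDB property names below are the documented source connector properties; the exactly-once keys are standard Kafka Connect settings but only take effect if the connector version implements KIP-618 support, so treat this as an assumption, not a confirmed configuration. The database, collection, and topic names are placeholders.

    ```properties
    # Worker config (distributed mode) -- assumes Kafka Connect 3.3+ (KIP-618)
    exactly.once.source.support=enabled

    # Connector config
    connector.class=com.mongodb.kafka.connect.MongoSourceConnector
    connection.uri=mongodb://localhost:27017
    database=finance
    collection=transactions
    topic.prefix=cdc
    # Only observe (match) the change-stream events we care about
    pipeline=[{"$match": {"operationType": "update"}}]
    # Ask the worker to run this connector with transactional offset commits;
    # these two keys assume the connector implements KIP-618 support
    exactly.once.support=required
    transaction.boundary=poll
    ```

    With `transaction.boundary=poll`, each batch returned by the connector is produced and its source offsets committed in one Kafka transaction, which is what closes the window for both missed and duplicated records on worker restart.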

    4 votes
