
Atlas

Share your idea. To help us prioritize, please include the following information:

  1. A brief description of what you are looking to do
  2. How you think this will help
  3. Why this matters to you


62 results found

  1. Send Atlas logs to S3

    I would like to automatically send my cluster logs to my S3 bucket. I could then use Atlas Data Lake to query them and Charts to create visualizations on them, or inspect them with other tools.
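    In the meantime, this is roughly the glue we have to script ourselves. A minimal sketch, assuming the existing Atlas Admin API log-download endpoint, the requests library, and boto3; the project ID, hostname, API keys, and bucket name are placeholders:

```python
# Sketch: pull one cluster host's mongod log from the Atlas Admin API and copy
# it to S3 so Atlas Data Lake / Charts can pick it up. All IDs are placeholders.
import boto3
import requests
from requests.auth import HTTPDigestAuth

GROUP_ID = "<project-id>"
HOSTNAME = "<cluster-hostname>"          # one mongod/mongos host in the cluster
PUBLIC_KEY, PRIVATE_KEY = "<public-key>", "<private-key>"
BUCKET = "<my-log-bucket>"

# Download the compressed mongod log for this host.
url = (f"https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP_ID}"
       f"/clusters/{HOSTNAME}/logs/mongodb.gz")
resp = requests.get(url,
                    auth=HTTPDigestAuth(PUBLIC_KEY, PRIVATE_KEY),
                    headers={"Accept": "application/gzip"},
                    timeout=60)
resp.raise_for_status()

with open("mongodb.gz", "wb") as f:
    f.write(resp.content)

# Upload next to the rest of our data so Data Lake / Charts can query it.
boto3.client("s3").upload_file("mongodb.gz", BUCKET, f"atlas-logs/{HOSTNAME}/mongodb.gz")
```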

    68 votes

  2. Propagation of NodeType via Prometheus metrics

    Context: we rely on Prometheus metrics for various systems in the company. Until now our cluster setup was fairly standard: replica sets and sharded clusters with electable nodes only, i.e. three nodes per replica set or shard, numbered <clustername>-<shardnumber>-<(00|01|02)>. Now we have a fourth, analytics node in each cluster, which by convention seems to get the next incremented number, 03.

    The thing is, we cannot rely on convention; we need a way to distinguish the node types from the Prometheus metrics (or at least from the Atlas API)…
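    To illustrate what we are stuck with today, here is a minimal sketch of the convention-based labelling we want to stop relying on. The node_type_from_hostname helper and the hostname pattern are hypothetical; they only encode the "-03 means analytics" assumption described above:

```python
# Sketch of the fragile, convention-based workaround: guess the node type from
# the hostname index. This is exactly what we would like to replace with an
# explicit node type exposed in the Prometheus metrics or the Atlas API.
import re

def node_type_from_hostname(hostname: str, electable_per_shard: int = 3) -> str:
    """Hypothetical helper: classify a host like <clustername>-<shard>-03."""
    match = re.search(r"-(\d{2})$", hostname.split(".")[0])
    if not match:
        return "unknown"
    index = int(match.group(1))
    return "electable" if index < electable_per_shard else "analytics"

print(node_type_from_hostname("mycluster-shard-00-01.example.mongodb.net"))  # electable
print(node_type_from_hostname("mycluster-shard-00-03.example.mongodb.net"))  # analytics
```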

    27 votes

    1 comment  ·  Integrations
  3. Implement conditional DNS forwarders in Atlas to support internal LDAP servers

    When a customer wants to use their private LDAP server with Atlas, they currently need to either expose its name via public DNS or use an internal CA and an IP address. For some customers these scenarios are suboptimal, since they want to keep their infrastructure details private, even at the DNS layer.

    The proposed solution is to use conditional DNS forwarders in all cloud providers supported by Atlas, so that requests to resolve a private DNS zone (specified by the customer) are forwarded to the listed DNS servers across the VPC peering connection, while all other (public) DNS…
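    For illustration only, this is the kind of primitive the proposal builds on, shown from the customer side on AWS with boto3 (a Route 53 Resolver forwarding rule). The zone name, target IPs, and endpoint ID are placeholders, and the real implementation would have to live on the Atlas side of the peering connection:

```python
# Sketch: forward DNS queries for a customer-specified private zone to internal
# DNS servers instead of public DNS. Atlas would need an equivalent rule on its
# side of the VPC peering; every identifier below is a placeholder.
import uuid
import boto3

resolver = boto3.client("route53resolver")
resolver.create_resolver_rule(
    CreatorRequestId=str(uuid.uuid4()),
    Name="forward-internal-ldap-zone",
    RuleType="FORWARD",
    DomainName="ldap.internal.example.com",        # private zone specified by the customer
    TargetIps=[{"Ip": "10.0.0.53", "Port": 53}],   # internal DNS servers
    ResolverEndpointId="<outbound-resolver-endpoint-id>",
)
```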

    27 votes

    1 comment  ·  Integrations
  4. Atlas API to update Project Timezone

    At the moment, the Project Timezone setting can only be changed via the Atlas UI. We are automating project and cluster creation and setup for our application, and we need an API to update the Project Timezone setting just like other project settings. This is important for our organization so that all projects keep the same settings without manual intervention.
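    To make the ask concrete, this is the shape of the call we would like to script. The /settings/timezone path and the payload below are hypothetical and do not exist in the Atlas Admin API today, which is exactly the gap this idea is about:

```python
# Hypothetical request: update a project's timezone the same way other project
# settings are updated. Neither this path nor this payload exist today.
import requests
from requests.auth import HTTPDigestAuth

GROUP_ID = "<project-id>"
url = f"https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP_ID}/settings/timezone"
resp = requests.patch(url,
                      auth=HTTPDigestAuth("<public-key>", "<private-key>"),
                      json={"timezone": "Europe/Berlin"},
                      timeout=30)
resp.raise_for_status()
```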

    20 votes

  5. Support Dynatrace integration for monitoring with Atlas

    This is a request to support an integration with Dynatrace for monitoring MongoDB deployments in Atlas.

    18 votes

  6. Access Total Memory from Mongo clusters

    Could the MongoDB metrics available in Datadog be expanded to include total memory? When setting up thresholds to track the resident memory of each client, it would be ideal to standardize these thresholds across clients. Doing so would require displaying resident memory as a percentage, which in turn requires the total memory for each client. Please let me know if total memory can be made available as a metric in the Datadog integration, or how I can find it if it already exists or can be created.

    15 votes

  7. Enable Private Link for Azure Data Factory

    Private Link for Azure Data Factory is not currently supported. This appears to be the most secure way for ADF to connect to Atlas / MongoDB, and support for this feature would be ideal for our use-case.

    14 votes

  8. Send alerts to Azure Log Analytics

    Similar to the third-party integrations already in Atlas, it would be helpful to have an integration with Azure Log Analytics for alerts and metrics, instead of writing our own using the Atlas API.

    13 votes

  9. Custom metric option on Datadog

    Currently, there is no way to use custom metrics with the Atlas Datadog integration.

    We would like to use this Datadog functionality with Atlas: https://docs.datadoghq.com/integrations/guide/mongo-custom-query-collection

    12 votes

  10. Map IDP groups to Atlas teams

    At the moment, Atlas does not support mapping IDP groups to existing Atlas teams. We would like the integration to support this. For example:

    okta group "devops" --- mapped to ---> Atlas team "devops"

    Each time the customer adds a user to this IDP group, the user would be given the proper permissions in Atlas.
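    A minimal sketch of the sync we currently have to run ourselves, assuming the public Okta API and the existing Atlas teams endpoints; the group, team, and org IDs are placeholders, and pagination and error handling are omitted:

```python
# Sketch: mirror the members of an Okta group into an existing Atlas team.
# All identifiers are placeholders.
import requests
from requests.auth import HTTPDigestAuth
from urllib.parse import quote

OKTA_ORG = "https://example.okta.com"
OKTA_GROUP_ID = "<okta-group-id>"                  # the "devops" group
OKTA_TOKEN = "<okta-api-token>"
ATLAS_ORG_ID, TEAM_ID = "<org-id>", "<team-id>"    # the Atlas "devops" team
ATLAS = "https://cloud.mongodb.com/api/atlas/v1.0"
atlas_auth = HTTPDigestAuth("<public-key>", "<private-key>")

# 1. Members of the Okta group.
members = requests.get(f"{OKTA_ORG}/api/v1/groups/{OKTA_GROUP_ID}/users",
                       headers={"Authorization": f"SSWS {OKTA_TOKEN}"},
                       timeout=30).json()

# 2. Look up each member's Atlas user by email and add them to the team.
for member in members:
    email = member["profile"]["email"]
    atlas_user = requests.get(f"{ATLAS}/users/byName/{quote(email)}",
                              auth=atlas_auth, timeout=30).json()
    requests.post(f"{ATLAS}/orgs/{ATLAS_ORG_ID}/teams/{TEAM_ID}/users",
                  auth=atlas_auth, json=[{"id": atlas_user["id"]}], timeout=30)
```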

    11 votes

  11. Add Datadog integration to Atlas Serverless

    Datadog (and other monitoring tools) integrates well with Atlas; it would be awesome to bring this integration to Atlas Serverless. Without it, we can't monitor Atlas Serverless with our standard toolset. In short: please bring third-party monitoring integrations to Atlas Serverless!

    11 votes

    1 comment  ·  Integrations
  12. Parity of metrics between Atlas UI and Prometheus integration

    It would be great if any metrics available in the Atlas UI were also available via the Prometheus integration, especially since the metrics in the Atlas UI get coarse-grained after a while.

    For example, in a recent incident it took us a while to do some preliminary investigations, and we wanted to confirm by looking at historical values for the metric "Operation Execution Time".

    Unfortunately, the chart in the Atlas UI was already too coarse-grained, and the metric also does not seem to be exposed via the Prometheus integration.

    9 votes

    1 comment  ·  Integrations
  13. KMIP-based Encryption at Rest

    Problem statement:
    As of now (2021-03-09), Atlas does not support KMIP-based Encryption at Rest. The currently supported encryption key sources are:
    * AWS Key Management Service (AWS KMS)
    * Azure Key Vault
    * Google Cloud Key Management Service (Google Cloud KMS)

    Why is this a problem? Customers who use KMIP to source their encryption keys cannot use the Encryption at Rest feature of Atlas (https://docs.atlas.mongodb.com/security-kms-encryption/).

    Proposal:
    * Add KMIP support to the Encryption at Rest feature of Atlas.

    9 votes

  14. Support for push-based logging integration with Google Cloud Storage

    As of now, MongoDB Atlas supports log export exclusively to AWS S3. In addition to the existing S3 bucket integration, it would be good to have the ability to push logs to Google Cloud Storage for organisations that depend on GCP for their cloud infrastructure.

    8 votes

  15. SIEM integration between Splunk and MongoDB

    I'd like a formal integration between Splunk and MongoDB for the SIEM use case.

    8 votes

  16. 8 votes

  17. Database user credentials rotation with AWS Secrets Manager

    1. Similar to the HashiCorp Vault integration, you should offer the option to use AWS Secrets Manager in the same way.
    2. Rotating database user credentials is a common best practice, and many "AWS shops" choose AWS-native solutions over third-party ones.
    3. We use Secrets Manager, so this would save us from having to write something of our own (a rough sketch of that glue code follows below).
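    A rough sketch of that glue, assuming boto3's Secrets Manager client and the existing Atlas database-users endpoint; the secret name, project ID, and username are placeholders:

```python
# Sketch: rotate an Atlas database user's password and store the new credentials
# in AWS Secrets Manager. All identifiers are placeholders.
import json
import boto3
import requests
from requests.auth import HTTPDigestAuth

GROUP_ID, DB_USER = "<project-id>", "app-user"
SECRET_NAME = "atlas/app-user"                     # hypothetical secret name
sm = boto3.client("secretsmanager")

# 1. Generate a fresh password.
new_password = sm.get_random_password(PasswordLength=32,
                                      ExcludePunctuation=True)["RandomPassword"]

# 2. Update the Atlas database user with the new password.
resp = requests.patch(
    f"https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP_ID}/databaseUsers/admin/{DB_USER}",
    auth=HTTPDigestAuth("<public-key>", "<private-key>"),
    json={"password": new_password},
    timeout=30,
)
resp.raise_for_status()

# 3. Store the new credentials as the current secret version.
sm.put_secret_value(
    SecretId=SECRET_NAME,
    SecretString=json.dumps({"username": DB_USER, "password": new_password}),
)
```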
    7 votes

  18. SIEM

    Add audit log integration with enterprise SIEMs

    6 votes

    1 comment  ·  Integrations
  19. Control KMIP Key Rotation Timing

    We need the ability to change the automatic key rotation time for our cluster. Currently the automatic rotation is set at 90 days but we cannot specify the time at which the rotation should occur. We want to avoid automatic key rotation activity during business hours.

    6 votes

  20. Allow customers to specify the number of service attachments for PSC

    To connect applications to MongoDB Atlas clusters/projects via Google Private Service Connect, the documentation says we need to reserve 50 IP addresses in our subnet:

    https://www.mongodb.com/docs/atlas/security-private-endpoint/

    Each private endpoint in Google Cloud reserves an IP address within your Google Cloud VPC and forwards traffic from the endpoints' IP addresses to the service attachments. You must create an equal number of private endpoints to the number of service attachments. The number of service attachments defaults to 50.

    We would like the ability not to have to reserve 50 IP addresses per project, as we have limited internal subnets. We would…

    6 votes

    1 comment  ·  Integrations