Fulton
My feedback
14 results found
-
141 votes
Fulton supported this idea ·
-
72 votes
Thank you for your feedback. This work is already planned. To avoid a false impression from the request title, I would like to clarify that:
1) MongoDB will first support Workforce Identity Federation for human users to access databases. This will allow you to SSO into the database not with GCP IAM but with an Identity Provider that supports OpenID Connect, such as Google Cloud Identity, Okta, Ping, etc.
2) Then, MongoDB will support Workload Identity Federation, which will allow your applications to access the database using GCP Service Accounts.
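A minimal sketch of what (1) could look like from a script, assuming the workforce flow surfaces through PyMongo's existing MONGODB-OIDC mechanism (PyMongo 4.7+); the cluster hostname is a placeholder and the token-fetching callback is a deliberately stubbed-out assumption (a real workforce login would go through the IdP's interactive flow):

```python
# Hedged sketch only: assumes Atlas workforce SSO is exposed via the driver's
# MONGODB-OIDC mechanism. Hostname and token source are placeholders.
import os

from pymongo import MongoClient
from pymongo.auth_oidc import OIDCCallback, OIDCCallbackContext, OIDCCallbackResult


class IdPLoginCallback(OIDCCallback):
    """Return an OIDC access token issued by the Identity Provider
    (Okta, Ping, Google Cloud Identity, ...)."""

    def fetch(self, context: OIDCCallbackContext) -> OIDCCallbackResult:
        # Placeholder: a real workforce flow would run an interactive browser
        # login against the IdP; here the token is read from the environment
        # so the sketch stays self-contained.
        return OIDCCallbackResult(access_token=os.environ["IDP_ACCESS_TOKEN"])


client = MongoClient(
    "mongodb+srv://cluster0.example.mongodb.net/",  # placeholder cluster
    authMechanism="MONGODB-OIDC",
    authMechanismProperties={"OIDC_CALLBACK": IdPLoginCallback()},
)
client.admin.command("ping")
```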
Fulton commented
The current ordering priority is not ideal.
Generally we do not want individual users accessing the database, so why prioritize Workforce Identity Federation for accessing clusters via SSO? If a user needs to access a cluster directly (in an emergency), that happens maybe once every few years.
The highest imperative is for applications to access databases securely. Applications have the highest access and security needs, so prioritizing Workload Identity Federation (IAM) is the most valuable. IAM access makes it easier for developers to build new applications that work with the database (instead of touching the database directly).
Why is Atlas choosing to support Workforce Identity Federation first?
Fulton commented
It would be very nice to have this so we can use Workload Identity in GCP GKE clusters and eliminate yet another credential to distribute.
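For context, a hypothetical sketch of what this could look like from a pod on GKE, assuming the eventual Atlas workload federation rides on PyMongo's MONGODB-OIDC mechanism with its built-in "gcp" environment (which pulls a token for the pod's Google service account from the metadata server that GKE Workload Identity exposes); the hostname and TOKEN_RESOURCE audience are placeholders:

```python
# Hedged sketch: no username/password is mounted into the pod; the driver asks
# the GKE/GCE metadata server for an identity token and presents it to Atlas.
from pymongo import MongoClient

client = MongoClient(
    "mongodb+srv://cluster0.example.mongodb.net/",   # placeholder cluster
    authMechanism="MONGODB-OIDC",
    authMechanismProperties={
        "ENVIRONMENT": "gcp",
        # Audience the Atlas side would be configured to accept - an assumption.
        "TOKEN_RESOURCE": "https://cluster0.example.mongodb.net",
    },
)
client.admin.command("ping")
```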
Fulton supported this idea ·
-
40 votes
Fulton commented
VPC Native as well.
We would need Private Link, VPC Native, and public discovery endpoints. Or maybe some sort of scrape parameter to add to the request...
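As a concrete illustration of the "scrape parameter" idea: a hedged sketch against the discovery endpoint used by the existing Atlas Prometheus integration, where the networkType query parameter is purely hypothetical and the project ID and credentials are placeholders:

```python
# Hypothetical: select which address family the HTTP service-discovery payload
# returns. Only the discovery URL shape comes from the existing Prometheus
# integration docs; the networkType parameter does not exist today.
import requests

GROUP_ID = "<atlas-project-id>"
DISCOVERY_URL = f"https://cloud.mongodb.com/prometheus/v1.0/groups/{GROUP_ID}/discovery"

resp = requests.get(
    DISCOVERY_URL,
    params={"networkType": "PRIVATE_LINK"},  # hypothetical: PRIVATE_LINK | VPC_NATIVE | PUBLIC
    auth=("<prometheus-user>", "<prometheus-password>"),  # integration credentials
    timeout=10,
)
resp.raise_for_status()

# Standard Prometheus HTTP SD payload: [{"targets": [...], "labels": {...}}, ...]
for group in resp.json():
    print(group["targets"], group.get("labels", {}))
```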
Fulton supported this idea ·
-
47 votes
Fulton commented
Priorities in terms of logs:
1. Database Instance Logs
2. Audit Logs
3. Activity logs
In the end we _need_ to be able to collect any and all data logged in order to automate managing Atlas on a large scale.
Fulton commented
I would request that it support GCP Cloud Storage as well, and object storage APIs in general.
I would hope this feature would supersede https://feedback.mongodb.com/forums/924145-atlas/suggestions/39104293-ability-to-stream-logs-to-cloudwatch-logs-or-datad.
If you can get logs into an S3 or Cloud Storage bucket, there are a million ways to reliably ship them from there to a million other tools.
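To illustrate the pattern (purely a sketch; the bucket name and prefix are made up, and it assumes Atlas could land its log files in a GCS bucket):

```python
# Minimal "land it in a bucket, ship it anywhere" poller. The downstream
# shipper is left as a stub; bucket and prefix are invented.
from google.cloud import storage

client = storage.Client()
BUCKET = "my-org-atlas-logs"        # assumption: bucket Atlas exports into
PREFIX = "cluster0/mongodb/"        # assumption: per-cluster log prefix

for blob in client.list_blobs(BUCKET, prefix=PREFIX):
    data = blob.download_as_bytes()
    # Hand the raw log file to whatever router you already run
    # (Fluent Bit, Vector, a custom shipper, ...).
    print(f"would ship {blob.name} ({len(data)} bytes)")
```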
Fulton supported this idea ·
-
3 votes
Fulton commented
Another approach is to just tack a last-access date onto the user document, but access happens so frequently that it might not perform well.
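For illustration, a sketch of how that write-amplification concern could be throttled (the collection and field names are invented, and this is not how Atlas tracks users today):

```python
# Only write lastAccess when the stored value is missing or older than the
# throttle interval, so frequent access does not become a write per login.
from datetime import datetime, timedelta, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # placeholder deployment
users = client["admin_metadata"]["database_users"]  # invented collection

def record_access(username: str, min_interval: timedelta = timedelta(hours=1)) -> None:
    now = datetime.now(timezone.utc)
    users.update_one(
        {
            "username": username,
            "$or": [
                {"lastAccess": {"$exists": False}},
                {"lastAccess": {"$lt": now - min_interval}},
            ],
        },
        {"$set": {"lastAccess": now}},
    )

record_access("app-service-account")
```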
Fulton shared this idea ·
-
9 votes
Fulton supported this idea ·
-
333 votes
Fulton supported this idea ·
-
50 votes
Fulton supported this idea ·
-
39 votes
Fulton supported this idea ·
-
5 votes
Fulton supported this idea ·
-
13 votes
Fulton supported this idea ·
-
36 votes
Fulton supported this idea ·
-
73 votes
Fulton supported this idea ·
-
63 votes
Fulton supported this idea ·
We believe shipping logs to S3-compatible object storage such as AWS S3 or GCP Cloud Storage is the highest priority and drives the most value. Many log-routing frameworks support pulling logs from S3 or GCP Cloud Storage, so you could cover more customer needs this way.
We would like to see database logs (all of them) prioritized first so that our teams can easily view database status across all clusters and projects using our log provider (which is not Datadog/CloudWatch/GCP).
The Atlas team should also consider https://feedback.mongodb.com/forums/924145-atlas/suggestions/43971369-send-atlas-logs-to-s3 when thinking about this functionality. Shipping to S3/GCP Cloud Storage would also let big data frameworks such as GCP BigQuery or AWS data lake tooling consume the logs, thereby serving the needs of internal data analysts as well as developers who need to view logs.
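As a sketch of the analyst side of that argument (the bucket, dataset, and table names are placeholders, and it assumes the exported log files are newline-delimited JSON, as mongod's structured logs have been since 4.4):

```python
# Once the logs sit in a GCS bucket, pulling them into BigQuery is a routine
# load job; everything below uses placeholder names.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer a schema from the log lines
)

load_job = client.load_table_from_uri(
    "gs://my-org-atlas-logs/cluster0/mongodb/*.json",   # placeholder bucket/prefix
    "my-project.atlas_logs.mongod",                     # placeholder table
    job_config=job_config,
)
load_job.result()  # wait for the load to finish
```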