Fulton Byrne
My feedback
36 results found
-
27 votes
Fulton Byrne supported this idea ·
-
5 votes
Fulton Byrne supported this idea ·
-
6 votes
Fulton Byrne supported this idea · -
246 votes
Fulton Byrne commented: We believe shipping logs to S3-compatible object storage APIs such as AWS S3 or GCP Cloud Storage is the highest priority and would drive the most value. Many log-routing frameworks support pulling logs from S3 or GCS, so you could cover more customer needs this way.
We would like to see database logs (all of them) prioritized first so that our teams can easily view database status across all clusters and projects using our log provider (it's not Datadog/CloudWatch/GCP).
The Atlas team should also consider https://feedback.mongodb.com/forums/924145-atlas/suggestions/43971369-send-atlas-logs-to-s3 when thinking about this functionality. Shipping to S3/GCP Cloud Storage would also allow big-data frameworks such as GCP BigQuery or an AWS data lake to consume the logs, serving the needs of internal data analysts as well as developers who need to view logs.
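As a sketch of why object storage unlocks downstream routing, here is a minimal, hypothetical key layout such a shipping feature might use. The bucket name, prefix scheme, and router call are illustrative assumptions, not an actual Atlas design:

```python
from datetime import date

def log_prefix(project: str, cluster: str, day: date) -> str:
    """Hypothetical object-storage key prefix for one cluster's daily logs."""
    return f"atlas-logs/{project}/{cluster}/{day:%Y/%m/%d}/"

# A log router (Fluent Bit, Vector, etc.) could then poll such a prefix,
# e.g. with boto3:
#   s3.list_objects_v2(Bucket="log-archive", Prefix=log_prefix(...))
print(log_prefix("my-project", "cluster0", date(2024, 1, 15)))
# atlas-logs/my-project/cluster0/2024/01/15/
```

Any consumer that can list a prefix and download objects can fan logs out from there, which is the point of the request.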
Fulton Byrne supported this idea · -
81 votes
Fulton Byrne commented: VPC Native as well.
We would need Private Link, VPC Native, and public discovery endpoints. Or maybe some sort of scrape parameter to add to the request...
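For illustration, a Prometheus scrape job against such an endpoint might look like the following; the job name, hostname, port, and metrics path are placeholders, since no such Atlas endpoint is described here:

```yaml
scrape_configs:
  - job_name: "atlas-cluster0"   # illustrative job name
    scheme: https
    metrics_path: /metrics       # assumed path
    static_configs:
      - targets: ["cluster0-pl-0.example.mongodb.net:27018"]  # placeholder Private Link host
```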
Fulton Byrne supported this idea · -
65 votes
Fulton Byrne commented: Priorities in terms of logs:
1. Database Instance Logs
2. Audit Logs
3. Activity logs
In the end we _need_ to be able to collect any and all logged data in order to automate managing Atlas at scale.
Fulton Byrne commented: I would request that it support GCP Cloud Storage as well, and object storage APIs in general.
I would hope this feature would supersede https://feedback.mongodb.com/forums/924145-atlas/suggestions/39104293-ability-to-stream-logs-to-cloudwatch-logs-or-datad.
If you can get logs into an S3 or Cloud Storage bucket, there are a million ways to reliably ship them from there to a million other tools.
Fulton Byrne supported this idea · -
5 votes
Fulton Byrne commented: Another approach is to just tack a last-access date onto the user document, but access happens so frequently that this might not perform well.
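A minimal sketch of that approach, assuming a pymongo-style collection (the collection and field names are hypothetical). The update is built as plain documents so the write can use the server's clock:

```python
def last_access_update(username: str):
    """Build filter/update docs that stamp a last-access time on a user document."""
    filter_doc = {"username": username}
    # $currentDate writes the server's current time, avoiding client clock skew.
    update_doc = {"$currentDate": {"lastAccess": True}}
    return filter_doc, update_doc

# With pymongo this would be applied on every access, which is the
# performance concern noted above:
#   users.update_one(*last_access_update("alice"))
```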
Fulton Byrne shared this idea · -
17 votes
Fulton Byrne supported this idea ·
-
444 votes
Fulton Byrne supported this idea ·
-
60 votes
Fulton Byrne supported this idea ·
-
42 votes
Fulton Byrne supported this idea ·
-
5 votes
Fulton Byrne supported this idea ·
-
18 votes
Fulton Byrne supported this idea ·
-
47 votes
Fulton Byrne supported this idea ·
-
82 votes
Fulton Byrne supported this idea ·
-
70 votes
Fulton Byrne supported this idea ·
For example, Wazuh (https://documentation.wazuh.com/current/user-manual/capabilities/log-data-collection/index.html), which can monitor files or accept logs over syslog.
Perhaps the Object Storage shipping feature is sufficient if the SIEM can receive from object storage.
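As a sketch of the syslog path, a shipper could forward each log line as an RFC 3164-style UDP datagram to the SIEM's listener. The facility, port, and message fields below are illustrative assumptions, not a Wazuh-specific format:

```python
import socket

def syslog_line(facility: int, severity: int, host: str, app: str, msg: str) -> bytes:
    """Build an RFC 3164-style message: <PRI>HOST APP: MSG, where PRI = facility*8 + severity."""
    pri = facility * 8 + severity
    return f"<{pri}>{host} {app}: {msg}".encode()

def ship(line: bytes, collector: str = "127.0.0.1", port: int = 514) -> None:
    """Fire-and-forget UDP send to a syslog listener (e.g. a Wazuh manager)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(line, (collector, port))

# ship(syslog_line(1, 6, "atlas-host", "mongod", "connection accepted"))
```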