Ability to stream logs to CloudWatch Logs or Datadog
There's currently no way to stream logs from MongoDB Atlas. I should be able to stream logs to Datadog, CloudWatch, or a similar service!
-
Roger commented
Splunk connectivity for compliance review is a critical part of our production operations, so this functionality is essential for us.
-
Shirish commented
We need this to monitor production instances.
-
Tony commented
Since we host a lot of production infrastructure in AWS, many of our tech support teams rely heavily on AWS CloudWatch. It would be great if MongoDB Atlas could build an AWS integration so that some or all of the KPIs available in the Atlas UI could be ingested into and/or presented in a CloudWatch dashboard, particularly for Atlas clusters hosted in AWS.
-
Dritero commented
The idea is to connect a MongoDB Atlas organization with a security monitoring (SIEM) solution like Wazuh, where we can set triggers and monitor logs in real time for security enhancement, monitoring, etc.
-
Marius Daniel commented
It's possible to push audit logs from Atlas to AWS S3.
Please implement the same for GCP block storage.
Ideally, it would be possible to stream all audit logs (Atlas, project, RS) to GCP block storage.
-
Naveenkumar commented
We need to integrate mongod logs with Exabeam to view the audit logs, and we need these logs for PCI compliance.
-
Alex commented
How about streaming to Logstash and Kibana? Logs from multiple nodes could be combined into a single portal, and users could apply whatever filters they need to search.
-
AdminSalman (Admin, MongoDB) commented
We have introduced a capability to push cluster logs to a customer-owned S3 bucket. This push happens every five minutes. It is not the same as streaming, but it is a step in that direction.
https://www.mongodb.com/docs/atlas/push-logs/
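As a rough sketch of consuming those pushed files, here is a minimal Python example using boto3. The bucket name and prefix are hypothetical placeholders for whatever you configure in the push-based log export settings, and it assumes the pushed files are gzip-compressed, like the logs served by the download API:

```python
import gzip
import boto3

# Hypothetical placeholders: use the bucket/prefix you configured
# for Atlas push-based log export.
BUCKET = "my-atlas-logs"
PREFIX = "atlas/cluster0/"

s3 = boto3.client("s3")

def iter_pushed_log_lines():
    """Yield (object key, log line) pairs for every pushed log file."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
            # Assumes gzip compression, as with Atlas's downloadable logs.
            for line in gzip.decompress(body).splitlines():
                yield obj["Key"], line.decode("utf-8", errors="replace")

for key, line in iter_pushed_log_lines():
    print(key, line)
```
-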
Jai Kumar commented
The ability to stream logs to CloudWatch Logs would eliminate a lot of complexity and dependencies.
-
Léo commented
At least document a workaround: either feed the logs to Datadog via the API, or push them to S3 and then to Datadog. A minimal sketch of the first option follows.
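This is a sketch, not an official integration: it uses the documented downloadHostLogs endpoint and Datadog's v2 HTTP log intake, and the project ID, hostname, and keys are all placeholders you would need to fill in:

```python
import gzip
import requests
from requests.auth import HTTPDigestAuth

# Placeholders: your project (group) ID, one node's hostname, and API keys.
GROUP_ID = "<project-id>"
HOSTNAME = "<cluster0-shard-00-00.example.mongodb.net>"
ATLAS_AUTH = HTTPDigestAuth("<atlas-public-key>", "<atlas-private-key>")
DD_API_KEY = "<datadog-api-key>"

# 1. Download one node's compressed mongodb log via the Atlas API.
resp = requests.get(
    f"https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP_ID}"
    f"/clusters/{HOSTNAME}/logs/mongodb.gz",
    auth=ATLAS_AUTH,
    headers={"Accept": "application/gzip"},
)
resp.raise_for_status()
lines = gzip.decompress(resp.content).decode("utf-8", errors="replace").splitlines()

# 2. Forward the lines in batches to Datadog's v2 log intake.
for i in range(0, len(lines), 500):
    batch = [
        {"message": line, "ddsource": "mongodb-atlas", "hostname": HOSTNAME}
        for line in lines[i : i + 500]
    ]
    requests.post(
        "https://http-intake.logs.datadoghq.com/api/v2/logs",
        headers={"DD-API-KEY": DD_API_KEY},
        json=batch,
    ).raise_for_status()
```

You would still have to run this per node and deduplicate across polling windows, which is exactly the complexity native streaming would remove.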
-
Kalyanram Gangumolu commented
Why can't we send audit logs to an Azure Log Analytics workspace when we are using Azure private endpoints to connect to Atlas?
-
Kho Ho commented
We are standardizing on Datadog and would like to have logs in a centralized location. It would be great to have logs streamed to Datadog.
-
Raffaele Marcello commented
Yes, I confirm the need to stream logs to the cloud provider's streaming service (for example, Azure Event Hubs).
-
Tom commented
Any movement on this one that was raised back in 2019?
-
Michael commented
Our requirement is to feed these audit logs into a central system, which will help us report on elevated or unauthorized changes to the database services.
Currently, the only way to pull these logs through the API is to download each file from each node within a cluster as a .gz file. (https://www.mongodb.com/docs/atlas/reference/api-resources-spec/#tag/Monitoring-and-Logs/operation/downloadHostLogs)
What we'd like to request is a way to pull the project (or cluster) audit logs that works like the Events API endpoint: a paged list of events that have occurred in a project. (https://www.mongodb.com/docs/atlas/reference/api-resources-spec/#tag/Events/operation/listProjectEvents)
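For comparison, here is a minimal sketch of paging through the existing Events API that this request uses as a model; the project ID and keys are placeholders:

```python
import requests
from requests.auth import HTTPDigestAuth

GROUP_ID = "<project-id>"  # placeholder
AUTH = HTTPDigestAuth("<atlas-public-key>", "<atlas-private-key>")

def iter_project_events():
    """Page through a project's events, 500 at a time."""
    page = 1
    while True:
        resp = requests.get(
            f"https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP_ID}/events",
            params={"pageNum": page, "itemsPerPage": 500},
            auth=AUTH,
        )
        resp.raise_for_status()
        results = resp.json().get("results", [])
        if not results:
            break
        yield from results
        page += 1

for event in iter_project_events():
    print(event.get("created"), event.get("eventTypeName"))
```

An equivalent paged endpoint for audit log entries would remove the per-node .gz downloads entirely.
-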
AYMERIC commented
Logs, and the activity feed too (e.g., alerts, delivered as Datadog events)!
-
Fulton Byrne commented
We believe shipping logs to S3-compatible object storage APIs, such as AWS S3 or GCP Cloud Storage, is the highest priority and would drive the most value. Many log routing frameworks support pulling logs from S3 or GCP, so this would cover more customer needs.
We would like to see database logs (all of them) prioritized first, so that our teams can easily view database status across all clusters and projects using our log provider (it's not Datadog/CloudWatch/GCP).
The Atlas team should also consider https://feedback.mongodb.com/forums/924145-atlas/suggestions/43971369-send-atlas-logs-to-s3 when thinking about this functionality. Shipping to S3/GCP Storage would also let big data frameworks such as GCP BigQuery or AWS data lake tooling consume the logs, serving the needs of internal data analysts as well as developers who need to view logs.
-
Oliver commented
We really need this. It would be ideal if we could integrate with Splunk or Datadog to get database logs, audit logs, and Atlas activity logs.
-
Christian commented
This would be very nice to have; we are in some ways flying blind without it. In particular, I would like to be able to stream slow query logs to Datadog, analyze them with Datadog's tooling, and easily share them with other engineers.
-
Veronika commented
I agree, streaming the logs to Splunk for further security analysis would be great.