Ability to stream logs to Cloudwatch Logs or Datadog
There's currently no way to stream logs from MongoDB Atlas. I should be able to stream logs to Datadog, CloudWatch, or a similar service!
-
Alex commented
How about streaming to Logstash and Kibana? Logs from multiple nodes could also be combined into a single portal, and users could apply whatever filters they need to search.
-
AdminSalman (Admin, MongoDB) commented
We have introduced the capability to push cluster logs to a customer-owned S3 bucket. This push happens every five minutes. This is not the same as streaming, but it is a step in that direction.
https://www.mongodb.com/docs/atlas/push-logs/
-
Jai Kumar commented
The ability to stream logs to CloudWatch Logs would eliminate a lot of complexity and dependencies.
-
Léo Ferlin-Sutton commented
At least document a workaround: either feed the logs to Datadog via the API, or stream them to S3 and then into Datadog.
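For example, something along these lines could work as a polling workaround (a rough sketch only: the endpoint path, log file name, and Datadog intake fields here are my assumptions based on the public docs, not a verified recipe — check the Atlas Administration API and Datadog Logs API references before using):

```python
"""Sketch: pull a host's log file from the MongoDB Atlas Administration API
(HTTP digest auth) and forward the lines to Datadog's v2 log intake.
All paths, log names, and field values are assumptions to illustrate the idea."""
import gzip
import json
import urllib.request

ATLAS_BASE = "https://cloud.mongodb.com/api/atlas/v2"


def atlas_log_url(group_id: str, hostname: str, log_name: str) -> str:
    # log_name is assumed to be something like "mongodb.gz" or
    # "mongodb-audit-log.gz" -- confirm against the Atlas docs.
    return f"{ATLAS_BASE}/groups/{group_id}/clusters/{hostname}/logs/{log_name}"


def fetch_logs(url: str, public_key: str, private_key: str) -> bytes:
    # Atlas programmatic API keys authenticate with HTTP digest auth.
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, public_key, private_key)
    opener = urllib.request.build_opener(urllib.request.HTTPDigestAuthHandler(mgr))
    with opener.open(url) as resp:
        return gzip.decompress(resp.read())  # logs are served gzip-compressed


def ship_to_datadog(lines, dd_api_key: str) -> None:
    # Datadog v2 log intake; "ddsource"/"service" values are illustrative.
    payload = json.dumps(
        [{"message": ln, "ddsource": "mongodb-atlas", "service": "atlas-logs"}
         for ln in lines]
    ).encode()
    req = urllib.request.Request(
        "https://http-intake.logs.datadoghq.com/api/v2/logs",
        data=payload,
        headers={"Content-Type": "application/json", "DD-API-KEY": dd_api_key},
    )
    urllib.request.urlopen(req)


# Usage (run on a schedule, e.g. every 5 minutes, with real credentials):
# url = atlas_log_url("GROUP_ID", "cluster0-shard-00-00.example.net", "mongodb.gz")
# raw = fetch_logs(url, "PUBLIC_KEY", "PRIVATE_KEY")
# ship_to_datadog(raw.decode().splitlines(), "DATADOG_API_KEY")
```

Since the S3 push only happens every five minutes, a cron-style loop at the same interval is about the best freshness you can get this way.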
-
Kho Ho commented
We are standardizing on Datadog and would like to have logs in a centralized location. It would be great to have logs streamed to Datadog.
-
Raffaele Marcello commented
Yes, I confirm the need to stream logs to the cloud provider's streaming service (for example, Azure Event Hubs).
-
Tom commented
Any movement on this one that was raised back in 2019?
-
AYMERIC commented
Logs, and the activity feed too (e.g. alerts, as Datadog events)!
-
Fulton Byrne commented
We believe shipping logs to S3-compatible object storage APIs such as AWS S3 or GCP Cloud Storage is the highest priority and would drive the most value. Many log routing frameworks support pulling logs from S3 or GCP Cloud Storage, so you could cover more customer needs this way.
We would like to see database logs (all of them) prioritized first, so that our teams can easily view database status across all clusters and projects using our log provider (it's not Datadog/CloudWatch/GCP).
The Atlas team should also consider https://feedback.mongodb.com/forums/924145-atlas/suggestions/43971369-send-atlas-logs-to-s3 when thinking about this functionality. Shipping to S3/GCP Storage would also let Big Data frameworks such as GCP BigQuery or AWS data lakes consume the logs, servicing the needs of internal data analysts as well as developers who need to view logs.
-
Oliver commented
We really need this. It would be ideal if we could integrate with Splunk or Datadog to get database logs, audit logs, and Atlas activity logs.
-
Christian commented
This would be very nice to have; we are in some ways flying blind without it. In particular, I would like to be able to stream slow query logs to Datadog, analyze them with Datadog's tooling, and easily share them with other engineers.
-
Veronika commented
I agree; streaming the logs to Splunk for further security analysis would be great.
-
Dale commented
The ability to consolidate logs out of MongoDB's managed cloud (Atlas) into the customer's cloud solution should be part of Atlas's base functionality.
-
Adnan commented
Looking forward to this integration to centralize all metrics in CloudWatch.
-
Syed Turab commented
We need support to ship logs and metrics into CloudWatch.
-
Valer commented
+1 for sure. We're spending significant time writing our own shipper that polls the logs API (which only updates every 5 minutes). If Atlas could provide an endpoint with an Elasticsearch or Splunk interface, that would be much appreciated.
We're shipping all the logs, server and audit, for all of our clusters.
-
Thales commented
+1
This feature would be a lifesaver for us. We do a lot of monitoring/statistics using information from logs, and we had to implement a job that gathers them every 5 minutes. The time to download and process them means our analysis is always running behind.
-
Vedant commented
+1 for streaming logs to Datadog. Having DB logs and slow query logs available on Datadog would make custom dashboards and debugging so much easier
-
Jason commented
Audit logs
-
Nathan commented
I'm using Datadog and would love to be able to stream logs there!