Mongo Spark Connector: Option to Refresh the Inferred Schema
This is with regard to the ticket we raised: https://support.mongodb.com/case/01352011
In the current Spark connector, to have the schema inferred automatically we set the option "stream.publish.full.document.only" to "true". Once this is configured, no explicit schema needs to be passed: the connector infers the schema from the first document it streams and then applies that same schema to all future documents coming from that collection (a minimal sketch follows).
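For reference, here is a minimal PySpark sketch of the current behaviour, assuming Spark Connector 10.x is on the classpath; the connection URI, database, collection, and app names are placeholders, and the option key is written with the full prefix used in the v10 connector documentation:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mongo-stream-schema-inference")  # hypothetical app name
    .getOrCreate()
)

# With full-document-only publishing enabled, no explicit schema is passed;
# the connector infers one from the first document it streams and then
# applies that same schema to every subsequent document from the collection.
stream = (
    spark.readStream
    .format("mongodb")
    .option("spark.mongodb.connection.uri", "mongodb://localhost:27017")  # placeholder
    .option("spark.mongodb.database", "mydb")                             # placeholder
    .option("spark.mongodb.collection", "events")                         # placeholder
    .option("spark.mongodb.change.stream.publish.full.document.only", "true")
    .load()
)

# Write to the console sink just to observe the inferred columns.
query = (
    stream.writeStream
    .format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```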
But the issue is that when new fields are added to the source collection, the stream does not pick up the change: it keeps applying the old inferred schema instead of inferring the new fields.
The connector should either be designed to always use the latest schema, or at the very least expose a configuration option to refresh the inferred schema from new documents.
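Until such an option exists, one possible workaround is sketched below (same assumptions and placeholder names as above): re-infer the schema with a batch read of the collection, then restart the stream with that schema passed explicitly. This approximates a manual "refresh", at the cost of restarting the stream.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mongo-schema-refresh").getOrCreate()

# Batch-read the collection so the connector samples it and infers the
# *current* schema, which will include any newly added fields.
latest_schema = (
    spark.read
    .format("mongodb")
    .option("spark.mongodb.connection.uri", "mongodb://localhost:27017")  # placeholder
    .option("spark.mongodb.database", "mydb")                             # placeholder
    .option("spark.mongodb.collection", "events")                         # placeholder
    .load()
    .schema
)

# Restart the stream with the freshly inferred schema instead of relying on
# the connector's one-time inference from the first streamed document.
stream = (
    spark.readStream
    .format("mongodb")
    .option("spark.mongodb.connection.uri", "mongodb://localhost:27017")
    .option("spark.mongodb.database", "mydb")
    .option("spark.mongodb.collection", "events")
    .option("spark.mongodb.change.stream.publish.full.document.only", "true")
    .schema(latest_schema)
    .load()
)
```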