External Dependencies Should Be Per Function
The 10 MB compressed size constraint is very limiting. Plus, dependencies are not siloed per function. Either uncap the dependency constraint, or allow dependencies to be uploaded per function with a higher compressed size limit (preferably at least 100 MB, to support large dependencies such as those used for image processing).
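For a sense of scale, here is a minimal sketch of a per-function dependency manifest for an image-processing workload. The package name and version are just an illustration I picked, not something tied to the platform:

```json
{
  "name": "thumbnail-function-deps",
  "private": true,
  "dependencies": {
    "sharp": "^0.33.0"
  }
}
```

A package like `sharp` pulls in prebuilt native libvips binaries, so its compressed install can approach the current 10 MB budget on its own, before any other dependency is added.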
I think the reasoning was "functions are meant to be short-running operations," but that is a huge assumption. Why not offer options for more powerful compute that responds to database change events? Sure, sometimes you only need to update something else in the database, and that may be sufficient for some workloads, but sometimes you need to do complex processing as well.
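As a concrete illustration, here is a minimal sketch of a database-trigger function doing heavier work, assuming the usual Realm trigger shape where the trigger passes a change event and `context.services` exposes the linked cluster. The database/collection names and the `expensiveTransform` helper are placeholders I made up:

```javascript
// Stand-in for the heavier processing the post argues functions should support
// (image resizing, scoring, multi-document aggregation, etc.).
async function expensiveTransform(doc) {
  return { wordCount: String(doc.description || "").split(/\s+/).length };
}

exports = async function (changeEvent) {
  // Only react to newly inserted documents.
  if (changeEvent.operationType !== "insert") {
    return;
  }

  const result = await expensiveTransform(changeEvent.fullDocument);

  const processed = context.services
    .get("mongodb-atlas")            // linked data source (default name)
    .db("mydb")                      // placeholder database
    .collection("processed");        // placeholder collection

  // Write the derived result back, keyed by the source document's _id.
  await processed.updateOne(
    { _id: changeEvent.documentKey._id },
    { $set: { result, processedAt: new Date() } },
    { upsert: true }
  );
};
```

The trigger wiring is the easy part; the pain point is that anything inside `expensiveTransform` that needs a large or native dependency runs into the shared 10 MB limit.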
Also, the entire implementation of Realm Functions seems (to me) like the wrong approach: shared dependencies, dependencies that simply don't work, an environment with unsupported Node.js/V8 APIs, and no way to export modules in the global environment. Why not let users specify a Dockerfile and just run containers that respond to change events? Why are we reinventing the wheel?
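To make that concrete, here is a rough sketch of what the user-supplied alternative could look like: a Dockerfile plus a small worker that consumes a MongoDB change stream. Every name here (`worker.js`, `mydb`, `uploads`, the `MONGODB_URI` environment variable) is an assumption for illustration, not anything the platform offers today.

```dockerfile
# Hypothetical image the platform would build and run in response to change events.
FROM node:18-slim
WORKDIR /app
COPY package*.json ./
RUN npm install --omit=dev
COPY . .
CMD ["node", "worker.js"]
```

```javascript
// worker.js — reacts to change events via a change stream instead of a managed
// Realm Function, so any dependency installed in the image is available.
const { MongoClient } = require("mongodb");

async function main() {
  const client = new MongoClient(process.env.MONGODB_URI); // assumed to be injected
  await client.connect();

  const uploads = client.db("mydb").collection("uploads"); // placeholder names

  // watch() delivers insert/update/delete events, much like a database trigger.
  const stream = uploads.watch([{ $match: { operationType: "insert" } }]);

  for await (const change of stream) {
    // Arbitrarily heavy processing can run here, with no bundling constraints.
    console.log("processing", change.documentKey._id);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The platform would still own the trigger wiring, scaling, and restarts; the user would only own the image contents.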
Google Cloud Functions, Azure Functions, AWS Lambda, and Cloudflare Workers all work this way. The first three also have integrations that invoke them in response to specific database events.