The 16 MB document size restriction of MongoDB makes me wonder how any dev uses it to build monolithic apps.

  • 4
    Never used it. Your post adds reason to keep it that way.
  • 1
    ...split it into chunks?
  • 7
    also: IMHO, storing a 16MB-object inside of a database instead of the filesystem is far beyond the point where i'd seriously question my own sanity.
  • 0
    @tosensei that's true actually
  • 0
    @tosense If you store it in a filesystem then how would you perform queries on it?
  • 0
    @Sid2006 The first question is, what type of document are you storing that takes up 16mb as an entry in your database?

    E.g. is it an image? Store the filesystem path to the image in your database and retrieve it programmatically when needed.
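    A minimal sketch of that pattern, with a plain dict standing in for the MongoDB document (all names here are hypothetical):

```python
from pathlib import Path

# Hypothetical MongoDB document: the image itself lives on disk; the
# document only stores its relative path, well under the 16 MB limit.
doc = {
    "_id": "product-42",
    "name": "Blue Widget",
    "image_path": "images/blue-widget.png",
}

MEDIA_ROOT = Path("/var/app/media")  # assumed filesystem root for uploads

def image_file(doc: dict) -> Path:
    """Resolve the stored relative path against the media root."""
    return MEDIA_ROOT / doc["image_path"]

# Read the bytes only when actually needed:
# data = image_file(doc).read_bytes()
```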
  • 0
    Though I would recommend storing a relative path inside the document instead.

    Configure an env variable, e.g. attachment_storage_path.

    Each doc then has an attachment_path attribute.

    The combination of attachment_storage_path and attachment_path gives the full URL.

    For specific file storage drivers, use a URI protocol.

    E.g. s3://.../ as attachment_storage_path.

    I mentioned GridFS before...

    A fair warning: storing data like this inside any database can be a good idea, if planned wisely.

    If it is not planned wisely, or if there is a lack of experience, I'd really recommend against it.

    There are... lots of caveats involved.

    E.g. resource usage, backups, and so on.

    Storing the data inside might have a few pros, but the con is that you end up with a resource-hogging, disk-guzzling maintenance monster.

    Storing it outside the database makes it far more manageable.
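    The attachment_storage_path + attachment_path combination described above can be sketched like this (env var name, bucket, and document are all illustrative):

```python
import os

# Illustrative value; in practice this comes from your deployment config.
os.environ["ATTACHMENT_STORAGE_PATH"] = "s3://my-bucket/attachments/"

def full_attachment_url(doc: dict) -> str:
    """Join the configured storage root with the document's relative path."""
    root = os.environ["ATTACHMENT_STORAGE_PATH"].rstrip("/")
    return f"{root}/{doc['attachment_path'].lstrip('/')}"

doc = {"_id": "invoice-7", "attachment_path": "2024/invoice-7.pdf"}
print(full_attachment_url(doc))  # s3://my-bucket/attachments/2024/invoice-7.pdf
```

    Switching storage backends (local disk, S3, ...) then only means changing the env variable, not every document.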
  • 0
    @Sid2006 using Hitachi HCP or similar object storage integration (https://knowledge.hitachivantara.com/...)
  • 0
    @tosensei well, I only did it in an academia setting... but in my last job I did see, umm... some rather large objects in the Mighty Dolphin Base.
  • 0
    @PotatoCookie Let's say I am building a Shopify Integration. A store can have 10 million products and every user in that store can add products into their import list.

    Each user can add all 10 million products associated with their import list.

    How do you deal with a scenario like that?

    Products usually have a description attribute which can have a LongText type.
  • 0
    The same principle would apply: store a reference to the item document in the user document (e.g., sku) and only fully retrieve it when it's needed.

    At that point you're abusing MongoDB though (hierarchical vs relational) and you should consider reaching for an SQL solution instead.

    A software setup where defining relationships is key (here, coupling users to wishlisted items, to purchases, and probably to a few other lists or datasets) naturally leans towards SQL (relational). Unless there is a very detailed reason MongoDB is required, an SQL solution will most likely serve you best here.
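    The reference-instead-of-embed idea can be sketched with plain dicts standing in for the two collections (field names are hypothetical):

```python
# Dicts standing in for a products collection indexed by sku.
products_by_sku = {
    "SKU-1": {"sku": "SKU-1", "title": "Widget", "description": "very long text..."},
    "SKU-2": {"sku": "SKU-2", "title": "Gadget", "description": "very long text..."},
}

# The user document stores only references, not full product documents,
# so it stays tiny no matter how long the descriptions get.
user = {"_id": "user-1", "import_list": ["SKU-1", "SKU-2"]}

def resolve_import_list(user: dict) -> list[dict]:
    """Fetch the full product docs only when they're actually needed."""
    return [products_by_sku[sku] for sku in user["import_list"]]
```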
  • 0
    @Sid2006 Another way is accepting the data duplication and limiting the amount of items the user can add to their import list.

    You can safely assume that a user that adds all 10 million products to their account is not properly using your platform.
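    Such a limit can be enforced on write; a sketch, with a purely hypothetical threshold:

```python
MAX_IMPORT_LIST = 10_000  # hypothetical cap per user

def add_to_import_list(user: dict, sku: str) -> bool:
    """Append a product reference only while the list is under the cap."""
    items = user.setdefault("import_list", [])
    if len(items) >= MAX_IMPORT_LIST:
        return False  # reject: past the sane-usage threshold
    items.append(sku)
    return True
```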
  • 0

    "You can safely assume that a user that adds all 10 million products to their account is not properly using your platform"

    But that's the thing, I am seeing it from a scalability perspective. The system SHOULD be able to handle it, because there could be 1 million customers adding 2 million products each.
  • 0
    @Sid2006 The limit option. 2 million products is absurd.

    And at that, you'd be amazed how much 16MB can fit if it's just text. It is a hard limit for the documents, not the database. So each user document can go up to 16MB, which is plenty.

    The solution for your scalability proposal is using a database system that is designed for this use case.
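    To put "how much 16MB can fit if it's just text" in numbers (the per-entry size is an assumption, not a measurement):

```python
DOC_LIMIT = 16 * 1024 * 1024  # MongoDB's 16 MiB per-document limit, in bytes
ENTRY_SIZE = 64               # assumed bytes per stored reference (sku + overhead)

# Number of such references that fit in a single user document:
print(DOC_LIMIT // ENTRY_SIZE)  # 262144
```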
  • 0
    @Sid2006 what queries would you perform on a 16mb blob?
  • 1
    16 MB is HUGE.

    Just for reference: all Harry Potter books fit on a floppy disk (1.44 MB, compressed).