71

We have an API which returns 600 MB of JSON.

Because the client "wants to see everything first and then apply filters, just like Excel".

FML
Edit: and ofc their laptops with a Core i3 and 4 GB of RAM can't even process that.

Comments
  • 11
    😆 Even to me as a non-dev, this sounds so stupid that it hurts.

    Convince the client to use GraphQL, then he can select the data beforehand and get exactly what he needs (rough sketch below).
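
    A minimal sketch of that GraphQL idea in TypeScript; the endpoint URL, query shape, and field names are all made up for illustration:

    ```typescript
    // Hypothetical GraphQL request: the client names exactly the fields it wants,
    // so the server never ships the rest of the 600 MB.
    const query = `
      query ProjectOverview($limit: Int!) {
        projects(limit: $limit) {
          id
          name
          tasks { id title status }
        }
      }
    `;

    async function fetchProjects(): Promise<unknown> {
      const res = await fetch("https://example.com/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ query, variables: { limit: 50 } }),
      });
      return (await res.json()).data;
    }
    ```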
  • 3
    Not gzipping either?
  • 3
    @asgs it's GZipped. But still.

    AND all of that data is present on a WEB page! Even my Core i7 (granted, a bit dated, only an i7-3770K) has trouble processing this web page.
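
    On the gzip point: repetitive JSON compresses well, but the browser still has to inflate and parse the whole thing. A quick way to see what the ratio actually is, assuming you have a sample payload dumped to disk (Node's built-in zlib; the file name is hypothetical):

    ```typescript
    // Sketch: measure how much gzip buys you on a sample of the real payload.
    import { readFileSync } from "node:fs";
    import { gzipSync } from "node:zlib";

    const raw = readFileSync("payload.json"); // hypothetical sample dump
    const zipped = gzipSync(raw);

    console.log(`raw:     ${(raw.length / 1024 / 1024).toFixed(1)} MB`);
    console.log(`gzipped: ${(zipped.length / 1024 / 1024).toFixed(1)} MB`);
    console.log(`ratio:   ${((zipped.length / raw.length) * 100).toFixed(1)} %`);
    ```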
  • 1
    @heyheni it's in a web page context (to make things even worse)

    We already have a filtering system. They can pinpoint exactly what they are looking for.
  • 6
    I'm currently limited to an 8 Mbit/s link; speed testers show about 130 kB/s on average, so it's possibly slightly more

    so this website would currently take about 78.8 minutes to load on my link

    which ofc would time out way before that

    why did you make this?!
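
    For reference, that 78.8 minutes is just payload size divided by measured throughput; a back-of-the-envelope check:

    ```typescript
    // Back-of-the-envelope transfer time at the throughput quoted above.
    const payloadMB = 600;       // uncompressed JSON payload
    const throughputKBps = 130;  // measured ~130 kB/s on the 8 Mbit/s link

    const seconds = (payloadMB * 1024) / throughputKBps;
    console.log(`${(seconds / 60).toFixed(1)} minutes`); // ≈ 78.8 minutes
    ```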
  • 4
    @Hazarth because "We need to make this client happy and we don't care that they don't use the product properly" :(

    Trust me, nothing will be more fun for me than deleting the whole API
  • 1
    @AtuM Nope, just one of the very big accounting firms :) Can't give the country/name for obvious reasons tho. (I can say it's not the USA)
  • 1
    Couldn't you compress it, then decompress when it gets to the FE?
  • 10
    ...si-s-s-s-six hundred megs of plaintext JSON.
    I think I'm gonna get the strong Vodka for this one
  • 3
    @drewbie he said they use gzip, but even with gzip 600 MB of JSON just doesn't turn into anything reasonable
  • 0
    @Hazarth big rip. Holy.
  • 7
    @drewbie it is compressed with gzip, but that doesn't change the fact that the web browser still needs to decompress it and process 600 MB of JSON lol
  • 4
    @Ranchonyx Vodka + beer + red bull. The only way I can work on it
  • 7
    UPDATE! I managed to bring it down to "only" 347 MB!
  • 1
    @NoToJavaScript And let's not forget parsing and decoding it...
  • 2
    600 MB of JSON data? What data is this big?
  • 4
    @kateliuyi Imagine 300 employees working on a total of 8000 projects. Each project has an average of 8 tasks.

    Now imagine it on a web page.
  • 3
    600 MB is too much for text data.
  • 5
    Shit in -> shit out.
    Why even bother making an API? Just send them the whole database if they like to work "just like with Excel".
  • 1
    Completely ridiculous of course. But did he say he wanted to see everything at the same time? 🤔

    Yes, it could still be very challenging to solve, but that would make it less impossible, right? 😅

    But it doesn't sound like the client gave you a whole lot of time to build this properly..
  • 2
    Not sure if this is an option, but there are some "binary JSON" formats out there (rough sketch below)
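
    To illustrate the binary-format idea, here is a sketch using MessagePack via the @msgpack/msgpack npm package (just one of several such formats; the data shape is invented, and as the next comment notes, browser/tooling support has to be checked first):

    ```typescript
    // Sketch: encode/decode with MessagePack instead of text JSON.
    import { encode, decode } from "@msgpack/msgpack";

    const data = {
      projects: [{ id: 1, name: "Audit 2021", tasks: [{ id: 10, title: "Review" }] }],
    };

    const packed: Uint8Array = encode(data); // compact binary representation
    const restored = decode(packed);         // back to a plain JS object

    console.log(packed.byteLength, JSON.stringify(data).length);
    console.log(restored);
    ```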
  • 2
    @foox the thing is... it would need to be supported by ALL major browsers before putting it in production. That's not the case.

    Believe me, I would love to use that!
  • 2
    +1 for stupidity of decisions!
    Also I would ask the sysadmin to limit the connection speed for that specific endpoint!
  • 0
    @maladiec naaaan, too much work, that.

    It would require:
    +1 Azure service for load balancing
    +1 fixed endpoint (you pay for that)
    +1 App Service firewall
    +1 access rule
    +1 "fail" rule (if speed > XX, send email/page, whatever)

    Basically the 10 Gbit link on our end still holds. We'll just work with them to see how we can fucking NOT send 347 MB (now down from 600).
  • 1
    @NoToJavaScript Again, how did you get it down to 347, mate?
  • 2
    @hjk101 Well, I made property names shorter lol. Not like 1-letter short, but very short.
    There is a way to optimize it more, but it would require sending arrays without property names, so it's annoying to work with on the front-end side (see the sketch below)
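
    The "arrays without property names" trick mentioned above looks roughly like this: ship the key list once and each record as a positional tuple, then rebuild objects on the front end (field names here are invented):

    ```typescript
    // Sketch: send column names once and each row as a positional array,
    // then rebuild objects client-side.
    type Packed = { keys: string[]; rows: unknown[][] };

    function pack(items: Record<string, unknown>[], keys: string[]): Packed {
      return { keys, rows: items.map((item) => keys.map((k) => item[k])) };
    }

    function unpack({ keys, rows }: Packed): Record<string, unknown>[] {
      return rows.map((row) =>
        Object.fromEntries(keys.map((k, i) => [k, row[i]] as [string, unknown]))
      );
    }

    // Example: property names travel once instead of once per record.
    const packed = pack(
      [{ id: 1, name: "Task A", done: false }, { id: 2, name: "Task B", done: true }],
      ["id", "name", "done"]
    );
    console.log(JSON.stringify(packed));
    console.log(unpack(packed));
    ```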
  • 2
    Let me guess. To update / patch a single value, you have to upload all those MBs again? I hope your client is on a limited internet plan with a fair use policy.
  • 1
    @NoToJavaScript that's minification for ya. Still insane. I would not optimize it for them. If they want a shitty solution they can get it. Sure it has to run on all browsers, can't run on any normal device though...
  • 2
    So pagination and datatables are completely out of the picture?

    If they are not, I would still press for a better pagination style, like a "load more".
    I'm assuming this data is a super long list, right? And there's no way they can possibly "see everything" without scrolling. So a "load more" button thingy could be added to ease this nightmarish situation (rough sketch below).
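
    A "load more" flow along those lines could look like this on the front end; the /api/projects endpoint and its page/pageSize parameters are hypothetical:

    ```typescript
    // Sketch of a "load more" button: fetch one page at a time and append.
    let page = 0;
    const pageSize = 100;
    const loaded: unknown[] = [];

    async function loadMore(): Promise<void> {
      const res = await fetch(`/api/projects?page=${page}&pageSize=${pageSize}`);
      const items: unknown[] = await res.json();
      loaded.push(...items);
      page += 1;
      // render only the newly appended items here, not the whole list
    }

    document.querySelector("#load-more")?.addEventListener("click", () => {
      void loadMore();
    });
    ```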
  • 0
    What about something like https://npmjs.com/package/... or something equivalent? Then you could display "everything" at once without actually having everything loaded at once
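
    The package name is cut off above, but the general idea behind list-virtualization libraries is to render only the rows that fit in the viewport; a stripped-down version of the index math (row height and overscan values are arbitrary for the example):

    ```typescript
    // Sketch of list windowing: only rows visible in the viewport get rendered.
    const ROW_HEIGHT = 24; // px per row
    const OVERSCAN = 5;    // extra rows above/below to avoid flicker on scroll

    function visibleRange(scrollTop: number, viewportHeight: number, total: number) {
      const first = Math.max(0, Math.floor(scrollTop / ROW_HEIGHT) - OVERSCAN);
      const last = Math.min(
        total - 1,
        Math.ceil((scrollTop + viewportHeight) / ROW_HEIGHT) + OVERSCAN
      );
      return { first, last }; // render only rows[first..last]
    }

    // e.g. 64,000 rows in memory, but only a few dozen in the DOM at any time
    console.log(visibleRange(120000, 800, 64000));
    ```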
  • 0
    @eeee Naaan, updates are async, per element
  • 1
    @v3ctor that's FOR ONE page of data :) total dataset is around 5 pages like that
  • 0
    @AtuM
    Can't. This particular API uses around 15 different tables, multi-threading for aggregation AND some in-proc calculations. It's not a matter of DB access. We get the DB datasets from all of these 15 tables in around 2 seconds.

    Post-processing takes around 15 seconds (now 5 seconds, improvements were made!!).

    I have no idea how much time serialization takes. My next thing to look at :)
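
    The backend here probably isn't Node, but the idea of isolating the serialization cost is the same regardless of stack; an illustrative sketch with a made-up payload roughly shaped like the real one:

    ```typescript
    // Sketch: time the serialization step separately from DB access and post-processing.
    function timeSerialization(payload: unknown): number {
      const start = performance.now();
      JSON.stringify(payload);
      return performance.now() - start;
    }

    // Hypothetical payload: ~8000 projects with ~8 tasks each.
    const payload = Array.from({ length: 8000 }, (_, i) => ({
      projectId: i,
      tasks: Array.from({ length: 8 }, (_, j) => ({ id: j, title: `task ${j}` })),
    }));

    console.log(`serialization took ${timeSerialization(payload).toFixed(1)} ms`);
    ```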
  • 0
    600 MB? That's heavy
  • 2
    Paginate and lazyload maybe?
  • 1
    Just like Excel, they say. So why aren't they asking for an .xlsx export feature then? Then they could download the spreadsheet file and open it in Excel.
  • 0
    @imaji Excel is the best thing for "just like excel" work btw 😂😂
  • 0
    @v3ctor Yeah, even though it's ridiculous, embedding a Google spreadsheet using an iframe is the best option for "like Excel" stuff. 😂
  • 0
    @imaji good thing g-sheets is just Google's excel