8
optimista
57d

At most 12 serverless Next.js pages on Zeit/Vercel now.
It seems everything is shifting to the client.

Comments
  • 3
    Definitely if you run a full js stack.

    We're a blended shop, so our experience is a bit different. I have a ton of serverless, but I still lean toward static payloads for the client and truly render everything on the user's machine. I don't pay for any CPU or memory for those resources and can serve millions of requests for < $50/mo. I'll use Meteor or similar on occasion as an optimization, but backend processes and APIs that pull weight are definitely cheaper and faster on backend languages and platforms, paired with eventually consistent architectures.

    In our testing spikes, running our backend processes under capably written Node implementations took 5x the time even when scaled to 20 workers using pm2. Rendering scenarios aren't really comparable, since the client story for us is O(1) delivery; subsequent requests are client-cached with ETags. Serverless and FaaS are an excellent story for the Node runtime because of its fast startup time, especially as glue code; you just have to remember to test the whole enchilada.

    For reference: our containerized Node instances for backend services scale down to one node at 0.25-core units and 250 MB of RAM, so they're cost-comparable to FaaS in that respect.
  • 1
    @SortOfTested Yes, definitely; that all makes sense. I'm updating my thinking and have been catching up on this over the past few days. Thank you for sharing.

    Somehow I fixated too much on pre-rendering because of SEO and OpenGraph. Surely it's better for performance as well, but that goes into costs, and the cost difference is just incomparable. Those 12 pages are just enough for early prototyping, and then you can scale.
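    The SEO/OpenGraph point above is that crawlers read meta tags from the raw HTML, so the tags have to exist in the static payload at build time rather than be injected client-side. A toy sketch of that idea, with an invented `prerender` helper and made-up page data:

```javascript
// Illustrative only: bake OpenGraph tags into static HTML at build time,
// while the page body itself can still render on the client.
function prerender(page) {
  const meta = [
    `<meta property="og:title" content="${page.title}">`,
    `<meta property="og:description" content="${page.description}">`,
  ].join('\n    ');
  return `<!doctype html>
<html>
  <head>
    ${meta}
  </head>
  <body>${page.body}</body>
</html>`;
}

const html = prerender({
  title: 'Prototype',
  description: 'One of the pre-rendered pages',
  body: '<div id="app"></div>', // everything else still renders on the client
});
console.log(html.includes('og:title')); // true
```

    A crawler that never executes JavaScript still sees the `og:` tags, which is why a handful of pre-rendered pages goes a long way for early prototypes.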