7
zzzachzzz
49d

So I work at a place that primarily builds games for training purposes using Unreal Engine and C++. I've been doing web development there in React and JavaScript. Our client wants us to transition to making the games playable in the browser (partially because of Coronavirus and the need for remote work / training). This is possible with Unreal, since there's a way to export a project to run in the browser (Emscripten compiles the C++ into JavaScript), though that feature was deprecated in a recent release of Unreal.

The package size is ridiculous, though: for a very small proof-of-concept project we created, the game package totalled ~160 MB, and that was for what was effectively a 2D game made with 3D assets. On a good internet connection it takes about a full minute to load the game in the browser.

This is where the rant begins: it seems silly to me to develop browser games this way, and I think we should switch to a browser-specific option like Three.js along with other web tools. My coworkers on the game dev side don't know much at all about web development, and I've been trying to convince them that the package size is way too big. The browser games we'll be developing are on very short development cycles, apparently one per week... which seems like yet another reason Unreal is overkill here and could slow our development speed, even compared to the cost of learning a new JS 3D library. The game devs tend to be very dismissive about the package size, comparing it to the massive binary files they regularly deal with.

Thanks for the read, this was my first post on devRant. What do you guys think?

Comments
  • 1
    Size is definitely the main limiting factor when it comes to WebGL games, though 5G and fiber optics, combined with browser caching, are changing that a bit. My current download speed is about 1 GB per minute, and I don't even have particularly good network infrastructure around here; at work it reaches even higher speeds.

    Perhaps the deciding factor here is how much work would go into maintaining two separate products and keeping them in sync, vs. the work required to optimise a single product so it suits both low-res web deployment and high-res desktop installation.
  • 0
    @hitko Yeah, that's the cost-benefit analysis it comes down to. One thing I haven't tried yet is gzipping the files, so we'll see how much that reduces the size transferred over the network. Whenever I loaded the page with the game, it seemed to re-download the files every time. Is there anything you can elaborate on regarding caching? It seemed like the game files weren't being cached at all.
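
    For reference, here's roughly what I had in mind for serving the files precompressed, as a minimal sketch. It assumes a Node/Express static server sitting in front of the exported build; the folder name and port are made up, and the .gz files would need to be generated ahead of time (e.g. gzip -k on the build output).

        // Hypothetical Express server that serves precompressed (.gz) copies of the game files.
        const express = require('express');
        const fs = require('fs');
        const path = require('path');

        const app = express();
        const BUILD_DIR = path.join(__dirname, 'html5_build'); // made-up folder holding the exported game

        // If the browser accepts gzip and a .gz sibling of the requested file exists,
        // send that instead and label it so the browser decompresses it transparently.
        // (No path sanitising here, it's just a sketch.)
        app.use((req, res, next) => {
          const wantsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');
          const gzPath = path.join(BUILD_DIR, req.path) + '.gz';
          if (wantsGzip && fs.existsSync(gzPath)) {
            res.set('Content-Encoding', 'gzip');
            res.type(path.extname(req.path)); // keep the original MIME type (.js, .wasm, .data, ...)
            return res.sendFile(gzPath);
          }
          next();
        });

        // Fall back to the uncompressed files.
        app.use(express.static(BUILD_DIR));

        app.listen(8080, () => console.log('Serving the game on http://localhost:8080'));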
  • 1
    @zzzachzzz Have a look at Workbox https://developers.google.com/web/...

    It provides functions to hook into specific requests (e.g. when a user opens a page, requests a resource, etc.) and handle caching the response. It can, for example, cache various files in the background and serve them immediately when needed, serve a cached response when the network isn't available, or even fetch a new version in the background while serving the cached version to the user.
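
    A rough sketch of what that looks like in a service worker, using Workbox's routing and strategies modules (the cache name and file extensions are just examples, adjust them to the actual build output):

        // service-worker.js (rough sketch; assumes a bundler that resolves the workbox-* packages)
        import { registerRoute } from 'workbox-routing';
        import { CacheFirst } from 'workbox-strategies';
        import { CacheableResponsePlugin } from 'workbox-cacheable-response';

        // Cache the big game payload after the first download, so repeat visits
        // serve it straight from Cache Storage instead of the network.
        registerRoute(
          ({ url }) => /\.(wasm|data|js)$/.test(url.pathname), // example match for the game files
          new CacheFirst({
            cacheName: 'game-assets', // example cache name
            plugins: [
              // Only cache complete 200 responses (and opaque ones, e.g. from a CDN).
              new CacheableResponsePlugin({ statuses: [0, 200] }),
            ],
          })
        );

    The page then registers it once with navigator.serviceWorker.register('/service-worker.js'), and later visits pull the files from the cache instead of re-downloading them.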