There are roughly fifty million numbers in that set. So at one byte per _digit_, it would end up at fifty megabytes if every prime were a single digit. But every prime except the first four (2, 3, 5, 7) needs more than one digit, so already we're at about one hundred megabytes. You don't need to go far along the set of primes to reach triple digits, and then quadruple. The short primes are a completely negligible fraction of those fifty-million-plus primes, so we're quickly approaching five bytes per number, then six bytes per number, and so on. 500MB is smaller than I'd have guessed.
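The digits-per-prime arithmetic above is easy to sanity-check at a smaller scale. A quick sketch (the bound of 10**6 is my assumption, chosen so it runs in a second; the thread's set is much larger, which only pushes the average up):

```python
# Rough check of the bytes-per-prime estimate: sieve primes below a
# modest bound and count how many bytes a plain decimal dump would
# take at one byte per digit.

def primes_below(n):
    """Sieve of Eratosthenes returning all primes < n."""
    sieve = bytearray([1]) * n
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # cross out multiples of p starting at p*p
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [i for i, is_p in enumerate(sieve) if is_p]

primes = primes_below(10**6)
total_digits = sum(len(str(p)) for p in primes)
print(len(primes), total_digits, total_digits / len(primes))
# 78498 primes, 459970 digits -- already ~5.86 digits per prime on average
```

Even at this small bound the average is near six bytes per number, which is why a 500MB dump of fifty million primes is plausible.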
Pet peeve: people who write megabytes as millibits.
No, those who use MiB I'll just smack over the head.
Well... At the comp sci students' association, we're hosting a GitLab instance for every member. Member space had to be reduced once people started storing ML projects (around 1.5 GB per project) on the servers...
Every time you say something is n megs large, it's an approximation anyway, so if you're trying to be so specific as to say "mebibytes, not megabytes" you're being needlessly pedantic (and thus annoying).
The difference between mb and MB is a factor of eight billion.
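That factor checks out if you read the units literally. A quick sketch, taking "mb" as millibits (10**-3 of a bit) and "MB" as megabytes (10**6 bytes):

```python
# Compare both units in bits to see the claimed factor.
mb_in_bits = 1e-3       # one millibit, read literally
MB_in_bits = 1e6 * 8    # one megabyte = 10**6 bytes = 8 * 10**6 bits

print(MB_in_bits / mb_in_bits)  # 8000000000.0 -- a factor of eight billion
```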
But the question is... how much memory did it take when compiled and run?