98
AlgoRythm
10d

And THAT is how you stuff a 4x8 bitmap into a single int.

Comments
  • 9
    I don't get it, what's the point of converting it into an int?
  • 2
    I don't know that much about binary calculations, but doesn't the transition from 4*8 bits to one binary number require conversion to a string first?
  • 15
    Oh. This is actually pretty cool!
  • 7
    @DevForTheMoney Not necessarily, e.g. in C you could bit-shift them in and out in the order you want
  • 2
    are you hinting some kind of compression algorithm here?
  • 2
    I always sketch stuff like this instead of participating in history class etc. as well
    :D
  • 1
    @mrsulfat
    That is interesting, thanks!
  • 4
    I never looked at storing array elements in this way. This is quite interesting!
  • 9
    This is the way you used to do sprites back in the day, but on the C64, for example, each row corresponded to 3 bytes, using the same scheme.

    @DevForTheMoney No, you just use bitwise operations.
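    A sketch of what "just use bitwise operations" looks like in C (the row values and helper name are my own example, not from the thread): each 8-bit row is shifted into position and ORed in, with no string step anywhere.

    ```c
    #include <stdint.h>

    /* Pack four 8-bit rows into one 32-bit int with shifts and ORs --
       no string conversion involved. Row 0 lands in the low byte. */
    static uint32_t pack_rows(const uint8_t rows[4]) {
        uint32_t packed = 0;
        for (int i = 0; i < 4; i++)
            packed |= (uint32_t)rows[i] << (i * 8);
        return packed;
    }
    ```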
  • 28
    I'm actually a bit shocked that this seems to be something new for many of you. Shouldn't some basics about memory representation and interpretation be common knowledge amongst developers? I'm aware that not all of you have degrees and some are just starting to learn, but still.
  • 2
    But it's only a black/white image so it's not really feasible
  • 5
    @ThermalCube For bitmap fonts it is enough
  • 3
    Finally found it! https://advanced-ict.info/interacti...
    now you can play it online!
  • 1
    but it's 64 bit
  • 10
    @AlgoRythm Efficiency depends on how you intend to use it. If you access it by pixel, the compiler will store each pixel in its own byte, and it takes 8 times the space it does now. You can force it to be stored in a single integer, but then when you access a pixel, the compiler will insert code that extracts it from the byte it is stored in (you can't address less than a byte). So all this method does is trade CPU time for memory consumption; decide wisely.
  • 6
    @JustKidding Yeah, true, but there really isn't any reason today to store characters in a single int anyways (Except for the reason "because I can"), so there isn't really any decision to make - both will perform approximately the same. Just wanted to do some science-y shit while I was bored at work.
  • 3
    actually did something like this with bitmaps that I wanted to store in an AVR chip's flash memory
  • 7
    fun fact, a full 2048 game state fits into a single `long`
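    That works out: a 2048 board is 4x4 = 16 tiles, every tile value is a power of two, so you can store just the exponent (0 for empty). At 4 bits per tile that's 16 * 4 = 64 bits, one `long`. A sketch with my own names, assuming exponents stay below 16:

    ```c
    #include <stdint.h>

    /* 16 tiles x 4 bits = 64 bits. Tile (r, c) occupies the nibble at
       bit 4*(r*4+c), holding log2 of the tile value (0 = empty). */
    static uint64_t board_set(uint64_t b, int r, int c, unsigned exp) {
        int shift = 4 * (r * 4 + c);
        b &= ~((uint64_t)0xF << shift);      /* clear the old nibble   */
        return b | ((uint64_t)exp << shift); /* write the new exponent */
    }

    static unsigned board_get(uint64_t b, int r, int c) {
        return (unsigned)((b >> (4 * (r * 4 + c))) & 0xF);
    }
    ```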
  • 1
    @Milind1997 For bitmasking, for example, so you can make an automatic tile system.
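    Presumably this refers to the classic 4-bit autotile trick; a sketch of it with my own bit assignments: each of a tile's four neighbours contributes one bit, and the resulting 0-15 mask indexes the matching tile variant.

    ```c
    /* Neighbour presence -> bits: N=1, E=2, S=4, W=8.
       The returned 0..15 mask selects one of 16 tile variants. */
    static int autotile_mask(int n, int e, int s, int w) {
        return (n ? 1 : 0) | (e ? 2 : 0) | (s ? 4 : 0) | (w ? 8 : 0);
    }
    ```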
  • 0
    @succcubbus 2048, as in, the mobile puzzle game?
  • 0
    Casio used a similar method to pack 5x7 bitmap characters into ten hex digits on the FX-850P. For each column (there are 5), bit 0 is ignored, bit 1 is the bottommost pixel and bit 7 is the topmost one, take a 2-digit hex of that and concat the whole thing, you've got 10 hex digits you can pass to DEFCHR$ to define *your **own** characters*! Wow!
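    Taking the comment's bit layout at face value (per column: bit 7 = topmost pixel, bit 1 = bottommost, bit 0 unused), a sketch of that column encoding in C; the helper name and the test glyph are my own:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* glyph[row][col]: 7 rows x 5 columns, nonzero = pixel on.
       Encodes one column into a byte: row 0 -> bit 7, row 6 -> bit 1,
       bit 0 stays clear. */
    static uint8_t encode_column(const uint8_t glyph[7][5], int col) {
        uint8_t b = 0;
        for (int row = 0; row < 7; row++)
            if (glyph[row][col])
                b |= (uint8_t)(1u << (7 - row));
        return b;
    }

    int main(void) {
        /* a solid 5x7 block, just to show the shape of the output */
        uint8_t glyph[7][5];
        for (int r = 0; r < 7; r++)
            for (int c = 0; c < 5; c++)
                glyph[r][c] = 1;

        for (int c = 0; c < 5; c++)
            printf("%02X", encode_column(glyph, c)); /* 10 hex digits */
        putchar('\n');
        return 0;
    }
    ```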
  • 5
    There is something I don't quite get: you are storing 4 bytes of data in a 4-byte integer; what's amazing about that? Am I missing a revolutionary breakthrough?
  • 6
    Isn't this the obvious way to store 32 bits?
    Also, how is this excessive?
  • 2
    @deodexed Absolutely not. It's just cool, old-school graphics, which are incredibly uncommon these days. I could have made it an array of BOOL types or something stupid (but easier) like that, but instead opted for this solution, which requires an understanding of bit operations.
  • 4
    @Root Yes! But using bits for graphics is not so obvious anymore. That's why I said "excessive", although, this was an exaggeration.
  • 2
    @xzvf it was the same for sprites in QB45.

    Lots of data statements at the end of the program with a subroutine that loaded them based on an array of offsets.

    Happy days!
  • 5
    @JustKidding
    I think it just shows how far IT has gotten in general. There are now people out there that can make a living off programming while they don't even know how computer memory works, let alone know how to manage it!

    Thanks to people studying the field, others won't have to, which is basically the fundamental reason we're not bashing each other's skulls in with rocks anymore.

    Hurrah civilization?
  • 4
    The amount of people surprised by this is too damn high!
  • 4
    @AlgoRythm What astonishes me the most is the number of people not knowing the basics
  • 0
    @ThermalCube use 4 layers of this and you can get 16 colors.
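    That's the bitplane idea: stack four one-bit layers and read pixel i's 4-bit colour index by taking bit i from each plane. A sketch with my own names, reusing the 4x8-in-a-`uint32_t` layout from the post:

    ```c
    #include <stdint.h>

    /* Four 4x8 one-bit layers ("bitplanes"); the pixel at row r, column c
       gets one bit from each plane, assembled into a 4-bit colour index. */
    static unsigned color_at(const uint32_t plane[4], int r, int c) {
        int bit = r * 4 + c;
        unsigned color = 0;
        for (int p = 0; p < 4; p++)
            color |= ((plane[p] >> bit) & 1u) << p;
        return color; /* 0..15, an index into a 16-entry palette */
    }
    ```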
  • 3
    @DevForTheMoney Seriously, I think that is a really bad development. I'm not saying that we should throw away all the abstraction and go back to using only assembler. It is indeed good that we don't have to do that.

    BUT saying that we don't need to understand the basics anymore is bullshit. When you write e.g. a web app you should have at least a coarse understanding of what happens in your browser and how that affects your system. The fact that people do things without knowing their implications is the reason for many performance and security issues.

    There is a HUGE difference between throwing away deprecated knowledge and throwing away basic knowledge. What has been posted in this thread is not deprecated but basic, and therefore should be known by anyone doing anything serious in this field.
  • 0
    I wonder how many of those who were "shocked that not many people know this" actually understood it