What would happen if every school started teaching binary numbers before the easier decimal number system?

I think it would be challenging initially, but it could have a much greater impact on how we think, and it could open up completely new possibilities for faster algorithms that computers can understand directly.

The reason people hate the binary system is that they've made the decimal system a lifelong habit, which makes them reluctant to learn binary in any real depth later on.

Just a thought. But I really believe that if I had learned the binary system before the decimal system, my brain would see things in a totally different way than it does now.

It sounds a little geeky yet thoughtful

  • 1
    Cool idea admittedly
  • 1
    Speaking of numerical systems, duodecimal (base 12) would be more useful in everyday life, but yeah, it could be nice
  • 0
    Hexadecimal would be much more efficient.
  • 3
How about teaching the basics of a base-x system in the first place? Why do we even separate them? Can't we all just settle on one? All that stuff.

Doesn't matter what you start with. If it's binary, most people will be better at binary than at base 10.
  • 1
    Just learn all of them from complex to simple numeric systems (in that order).
  • 2
Did you ever wonder why the base-10 system is more popular than any other? Because it maps onto your ten fingers.

Just ask a child how old he/she is, and some very young ones will probably use their fingers to count. Learning in the early stages of childhood often involves counting tangible things.

    But I would also think that introducing other bases in earlier stages would be helpful for later.
  • 1
Yes, agreed.
But instead of counting up to 10 using 10 fingers, representing 1024 values using just 10 binary digits would be much more powerful once I got good at it.
Initially it would be hard, but if you have no other choice you have to stick with it, and eventually you'll end up criticizing the decimal system itself when comparing it with binary.. XD
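    The ten-fingers-as-ten-bits idea above can be sketched in a few lines of JavaScript (the function name is made up for illustration):

    ```javascript
    // Treat ten fingers as ten binary digits: each finger is one bit,
    // so ten fingers can show any count from 0 up to 2^10 - 1 = 1023.
    function fingersFor(n) {
      if (n < 0 || n > 1023) throw new RangeError("needs 0..1023");
      // One character per finger; which finger is the high bit is arbitrary.
      return n.toString(2).padStart(10, "0");
    }

    console.log(fingersFor(5));    // "0000000101" – two fingers raised
    console.log(fingersFor(1023)); // "1111111111" – all ten fingers raised
    ```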
  • 2
    While the idea seems good, there are some pretty serious drawbacks.

For one, can you tell me off the top of your head how to write 3758696 in binary? Or what it looks like if you subtract 100 from it? The issue here is that in order to work with real-world numbers, you have to start summing powers of 2 to find the right binary representation. Instead of counting up, you go straight to 1...10...100...1000 when writing. It is not obvious how to represent 20 when you know how to represent 10, and it's harder to count on your fingers and in your head, because instead of working with round thousands of units, you end up with 512 + 64 + 1024.
And all that wouldn't be a problem if our society were shaped around binary numbers, but it's not – everything from counting on fingers to money, speed, and mass is measured in the decimal system.
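    For what it's worth, a machine does the conversion above mechanically – JavaScript's built-in `toString(radix)` and `parseInt` handle it, which only underlines that it's tedious for a human:

    ```javascript
    // Number#toString(radix) converts to any base from 2 to 36,
    // and parseInt(str, radix) converts back.
    const n = 3758696;
    console.log(n.toString(2));          // "1110010101101001101000"
    console.log((n - 100).toString(2));  // "1110010101101000000100"
    console.log(parseInt("1110010101101001101000", 2)); // 3758696
    ```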

Oh yeah, and good luck getting accurate fractional values in binary compared to decimal. Go ahead and ask JavaScript.
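    The classic example this alludes to: 0.1 and 0.2 have no exact finite binary expansion, so IEEE-754 doubles store the nearest representable values.

    ```javascript
    // Neither 0.1 nor 0.2 is exactly representable in binary floating point,
    // so the sum carries a tiny rounding error.
    console.log(0.1 + 0.2);          // 0.30000000000000004
    console.log(0.1 + 0.2 === 0.3);  // false
    // Common workaround: compare against a tolerance instead of with ===
    console.log(Math.abs(0.1 + 0.2 - 0.3) < Number.EPSILON); // true
    ```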
  • 1
    @ArcaneEye I don't see a problem with floating points when you don't use a computer.

    I mean, seriously, the decimal system itself doesn't support floating points if we're being honest.

    12.34 could be written as roughly 1100.0101011 in binary (the fractional part of 0.34 never terminates in base 2). The dot is just a utility, a hacky workaround to create a floating point in the decimal system itself.

    You could easily apply this hack to any other numeric system and you'll notice that all of them end up supporting floating points.

    The only problem is the computational representation of floating points, not a calculation a human has to do by hand.

    We simply don't have this barrier.
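    The "dot works in any base" claim can be demonstrated with the standard multiply-by-2 digit algorithm for the fractional part (a hypothetical helper, just a sketch):

    ```javascript
    // Expand a fraction 0 <= x < 1 into binary digits after the radix point:
    // repeatedly double x and take the integer part as the next bit.
    function fractionBits(x, maxBits) {
      let bits = "";
      for (let i = 0; i < maxBits && x > 0; i++) {
        x *= 2;
        bits += Math.floor(x); // 0 or 1
        x -= Math.floor(x);    // keep only the remaining fraction
      }
      return bits;
    }

    console.log(fractionBits(0.25, 8)); // "01" – 0.25 is exactly 0.01 in binary
    console.log(fractionBits(0.34, 8)); // "01010111" – 0.34 never terminates
    ```

    So 12.34 comes out as 1100.01010111... – the same positional trick as the decimal dot, just in base 2.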
  • 1
    They kind of have, with Common Core, to a degree. It has the goal of teaching number theory as a fundamental element of instruction, rather than banal rote memorization. Kids ideally leave primary school with an intuitive understanding of how bases work.

    The primary problem they encounter is that teaching doesn't pay a decent wage, so you end up with a lot of dumb instructors cut from the same cloth.
  • 0
    @SortOfTested common core is shit. Kids are leaving HS 2 to 3 years behind and are weaker in the sciences. It has failed and needs to die in a fire.
  • 0
    Yeah, we're talking common core math, which is quite good when taught correctly. We will never have good science education until the politicians, religiocapitalists and textbook lobbies get out of it.
  • 1
    @PublicByte It's not about how you shorthand decimals to read; it's about doing math with them and representing numbers correctly. A lower base just means an innately larger margin of error.

    Not to mention, why would you write a decimal value like 12.34 with a binary fractional part at all – you're literally using two bases for everything then, and what happens when you multiply that by another number and the fractional digits carry into the whole part? Base two is actually super simple for multiplication, but not if you mess with the notation like that, and would you just arbitrarily truncate the fractional part after a handful of bits? How does that make sense?

    While I applaud learning how different bases work, there are multiple reasons why we work with base 10 on a daily basis.
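    The "base two is super simple for multiplication" point above is the classic shift-and-add scheme; a minimal sketch (the function name is made up):

    ```javascript
    // Shift-and-add multiplication: for each set bit in b, add a copy of a
    // shifted left by that bit's position. This is why binary multiplication
    // reduces to shifts and additions.
    function shiftAddMultiply(a, b) {
      let product = 0;
      while (b > 0) {
        if (b & 1) product += a; // low bit of b is set: add current shifted a
        a <<= 1;                 // shift a left (multiply by 2)
        b >>= 1;                 // consume one bit of b
      }
      return product;
    }

    console.log(shiftAddMultiply(12, 10)); // 120
    ```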
  • 0
    Binary isn't used in simple, boring everyday life, so no. That idea is pointless – unless we're talking about university.
  • 0
    @ArcaneEye ok, good point. Didn't notice that before.
  • 0
    What about base pi?

    Rational numbers are so overrated.