Wait... Do people really think C is an accurate way to learn how a computer works... Please tell me it isn't so

  • 26
    Well, it comes closer than any other high-level language.
    But it's still a considerably large abstraction over ASM.
  • 36
    pfft. Of course not. Everyone knows python is how computers work. You just import CPU and you're good to go
  • 18
    @M3m35terJ05h Python? HTML is how computers work
  • 16
    C does more or less directly translate into assembly... IF you use a terrible compiler

    The danger in this mindset is that people will think a lot of things make sense in C which are undefined behavior in the language.

    For example, dividing by zero is undefined behavior, and obviously your processor will raise some arithmetic exception. What’s less obvious is that if you do that, the compiler is free to consider the divisor to be non-zero. It’s also free to reorder inconsequential operations, since the divide is now deemed side-effect-free. Therefore, if you write this code:

    x = 2 / y;
    if( y != 0 ) {
        printf("hello");
    }

    It’s very possible that if y == 0, what this code will do is print hello and then raise an FP exception. True story!

    So the danger is believing the computer does exactly what you write.

    Other examples include any other undefined behavior: believing things happen in the order you write them, believing that memory is TSO (total store order), believing that every variable exists in memory, not understanding why the volatile keyword matters, or believing that a pointer implies there is an address somewhere that can be dereferenced. Not always the case~ compilers are free of a lot of the restrictions that people mistakenly think exist.

    I’ve read a lot of illegal C++ that the author thought was perfectly legal. When writing C, you cannot think in terms of machine code, because if you do you will make mistakes.
  • 11
    With the way CPUs are designed these days, I'm pretty sure only a couple of Intel and AMD employees know how computers work. What with their pipelining and SMT and prefetching and speculative execution
  • 3
    @M3m35terJ05h it’s complex, but not as complex as you think~
  • 0
    @FrodoSwaggins in your example, I don't get why the compiler is free to assume that the divisor is not 0. I get the reordering part, but why when dividing by some variable the compiler would be like "oh this person knows division by 0 is invalid, let me assume that the divisor variable is not 0".
  • 1
    @sSam that’s exactly what the compiler does. I have debugged this type of problem many times. The key to this is that division by zero is UNDEFINED BEHAVIOR. That means the language and what it does is not defined when that happens. If you trigger undefined behavior in your program, you have no guarantees about what happens.

    The language is free to do as it chooses with undefined behavior. The division statement is only defined for non-zero divisors. Since you did not check whether the divisor was zero before dividing, the code that follows is also only defined for non-zero divisors. Therefore the contents of the if statement can be hoisted.

    I assure you compilers do this. I have worked on compilers, all of which do this, and I have debugged people's code whose authors claimed they had found a compiler bug when their code did this. There is specific verbiage in the C specification that describes this.

    This is why it is dangerous to assume the compiler generates code that does exactly what you write. Just because there is an if statement does not mean a branch was generated if the compiler can prove that the branch is always taken. And if you have undefined behavior, the constraints for solving the branch pattern may not be what you think they are. You must NEVER trigger undefined behavior if you want the compiled binary to do what you think it does.
  • 0
    @sSam another way of wording what you said is: “this person divided by a variable. If the variable is zero, the behavior is undefined. Therefore, generate code for the defined case only, and apply that constraint (the variable is non-zero) both here and in everything after.”
  • 0
    I want to know why you don’t think C is a great language to start learning programming. I really want to hear, because my view is that programming and software engineering are being taught the wrong way... there’s this huge push for high-level languages and tools that make things easy.. but then these kids have an extremely weak foundation... I am actually working on a very large free video series to teach programming correctly, putting my 15 years of experience, as well as my colleagues’ experience, into it. The key is a solid foundation of the basics and working up from there.. just like a video game: it teaches you the extreme basics first and works up from there; it doesn’t just drop you off at level 50 and say well, this is how to do stuff from here, forget about levels 1-50... we’re just gonna hide that behind this tool.
    It’s going to be very in-depth and free. Many hundreds of hours will be going into this; it’s our way of giving back to the industry and community, as we believe many newer folks we have noticed through interviewing have a large lack of understanding of certain topics we view as basics and fundamentals.. AND oddly, so many students go to these higher-level languages for reasons we don’t understand; anyone with business knowledge knows lower-level work is a massive billion-dollar market with a huge void of people to fill the positions... I guess people think high level is more powerful, more access to RAM and all that, capabilities are endless.. I argue the person isn’t experienced enough to write code that effectively uses that much power. Start small, work up.. and everyone in the industry will improve things...
    And I’m not advocating only low level; I think it’s all important, but you must start somewhere, and principles of design, and understanding how things are now, where they came from, and why, are explained through low-level concepts broken down to simple things...