
Okay... I need to confess.
I actually like the idea of counting arrays from 1, like some languages do.
It makes code cleaner.
Think about it.
You would never need to subtract 1 from count/size/length, or add 1 for things like the month in JavaScript, because the first item would be at index 1, and many, many errors simply wouldn't happen because we wouldn't have to force our minds to think another way. I learned counting from 1 right after I learned to walk, so it's the most natural thing to do. Just because the software/hardware below our language works that way doesn't mean we can't abstract this behavior away. What's your opinion about this? Am I wrong?
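
To make this concrete, here is a tiny TypeScript sketch of the two traps I mean (the variable names are mine, purely for illustration): grabbing the last element of an array and reading a JavaScript month.

    // Last element: you have to remember the "- 1".
    const scores: number[] = [10, 20, 30];
    const last = scores[scores.length - 1]; // 30

    // Months are 0-based too: December comes back as 11, not 12.
    const december = new Date(2024, 11, 25); // 11 means December in the constructor as well
    const humanMonth = december.getMonth() + 1; // "+ 1" just to display 12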

Comments
  • 3
    We should also change binary from “0” and “1” to “1” and “2”
  • 0
    No, because this is logical.
  • 5
    Something something pointer arithmetic
  • 3
    You are wrong. Array indexes are about location, not counting.
  • 7
    An array is nothing more than a pointer from which you can start reading and writing.
    So pointer + 0 for the first value is the definition of logic.

    Abstracting that behaviour away is bullshit, because you'd have to have these -1s everywhere, in every compiler and such. You create another layer of complication for the sake of something that isn't logical at all. As said, it's about location; coordinate systems don't start at 1 either. (See the address-arithmetic sketch below the comments.)
    Even if it were about counting, 0 would make more sense too, because numbers start at 0.
  • 3
    For high-level languages that math grads and statisticians would use, it's a good idea. For programmers dealing with memory, not a good idea.
  • 1
    I actually don't have any preference. I just follow the language and have never had any problem with it. If it's from 0, then OK; if it's from 1, it's not a big problem.
  • 0
    for (int i = 0; i < wtf.Count; i++)
        print("Yes, you are wrong, because " + wtf[i]);
  • 0
    An array starts at some location in memory. Let's call it x.
    Therefore the first item of the array, being at its start, is at location x+0.

    If the first item were at x+1, your "array start" location (x) would have to falsely point to the memory location one slot before where the array actually is, so x would claim the array starts at a location that doesn't belong to it.
  • 1
    It might be less intuitive, but it makes perfect sense for interacting with a machine. Also, what daintycode said.
  • 1
    Maybe to help you think about it a bit more intuitively, let’s use the example of a clock. We all know that there are 60 minutes in an hour. But the last minute of an hour is labeled as :59, not :60. This is so natural that we don’t even think about it. So there are already scenarios where we count from zero—computers are the same way. Think of arrays like clocks, not grocery lists.
  • 1
    It's not that I have problems with it, and of course pointers are a valid argument against it. I was thinking more about JVM-based languages that don't expose pointers.
    But I still think that instead of handling it everywhere, again and again, it would be better to solve it once at the compiler level (see the wrapper sketch below for roughly what I mean). And I'm not working against it: I go with the language and never do hacks like the mentioned extension of an array just to get this "feature". It's just a preference, nothing that I would enforce.
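
As a rough sketch of the "solve it once" idea from the last reply above: put the 1-based translation in a single wrapper, so the "- 1" lives in exactly one place instead of being scattered through user code. The class name and methods here are hypothetical, just to show the shape of the abstraction (in TypeScript).

    // Hypothetical 1-based wrapper: the index translation happens in exactly one place.
    class OneBasedList<T> {
        private items: T[] = [];

        push(value: T): void {
            this.items.push(value);
        }

        // Valid indices run from 1 to size(), so the last element is simply get(size()).
        get(index: number): T {
            if (index < 1 || index > this.items.length) {
                throw new RangeError(`index ${index} is out of range 1..${this.items.length}`);
            }
            return this.items[index - 1]; // the only "- 1" in the whole program
        }

        size(): number {
            return this.items.length;
        }
    }

    const months = new OneBasedList<string>();
    months.push("January");
    months.push("February");
    console.log(months.get(1));             // "January"
    console.log(months.get(months.size())); // "February", no "length - 1" needed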
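
And for the location argument made in the comments above (pointer + 0, x + 0): a typed-array view in TypeScript makes the same arithmetic visible, because the byte position of element i is the view's base offset plus i times the element size, so element 0 sits exactly at the base. The buffer size and offset below are arbitrary, chosen only for illustration.

    // A raw buffer and a Float64Array view starting 16 bytes into it (numbers chosen arbitrarily).
    const buffer = new ArrayBuffer(64);
    const view = new Float64Array(buffer, 16, 4); // 4 doubles beginning at byte 16

    // Byte position of element i = base offset + i * element size.
    const bytePositionOf = (i: number): number =>
        view.byteOffset + i * Float64Array.BYTES_PER_ELEMENT;

    console.log(bytePositionOf(0)); // 16: "pointer + 0" is the first value
    console.log(bytePositionOf(3)); // 40: last element of the 4-element view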