
Well, if your test fails because it expects 1557525600000 instead of 1557532800000 for a date, it tells you exactly: NOTHING.

Unix timestamps have their place, yet in some cases human readability is a feature. So why the fuck don't you display them in a human-readable format?

Now if instead you saw:

2019-05-10T22:00:00+00:00

vs the expected

2019-05-11T00:00:00+00:00

you'd know right away that the first date is off by two hours because somebody fucked up timezones and wasn't calculating in UTC.
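
For illustration, here's a minimal TypeScript sketch of how that kind of 2-hour slip typically comes about, assuming (my assumption, not stated above) that the test machine runs in a UTC+2 timezone, using the exact numbers from above:

    // Local-time construction: midnight *local*, i.e. 22:00 of the previous day in UTC.
    const local = new Date(2019, 4, 11);          // month is 0-based, so 4 = May
    console.log(local.getTime());                 // 1557525600000 on a UTC+2 machine
    console.log(local.toISOString());             // 2019-05-10T22:00:00.000Z

    // UTC construction: midnight UTC, which is what the test expected.
    const utc = new Date(Date.UTC(2019, 4, 11));
    console.log(utc.getTime());                   // 1557532800000
    console.log(utc.toISOString());               // 2019-05-11T00:00:00.000Z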

So even if you want your code to rely on timestamps, at least visualize your failures in a human-readable way. (In most cases I'd argue that keeping dates as an ISO string would be JUST FUCKING FINE performance-wise.)
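
A minimal sketch of that, assuming a Jest-style expect API (swap in whatever assertion library you actually use; somethingUnderTest is just a stand-in for the code being tested): compare ISO strings instead of raw epoch milliseconds, so the failure output is readable.

    declare function somethingUnderTest(): number;   // stand-in for the code under test

    const expectedMs = 1557532800000;
    const actualMs = somethingUnderTest();

    // Raw numbers in the diff ("expected 1557532800000, received 1557525600000") tell you nothing.
    // ISO strings in the diff make the 2-hour timezone slip visible at a glance.
    expect(new Date(actualMs).toISOString())
      .toBe(new Date(expectedMs).toISOString());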

Why make me parse numbers? Show me the meaningful data.

Timestamps are for computers, dates are for humans.

Comments
  • That's why I avoid naked unix timestamps, but also any string representation. I always use some kind of wrapper around them; usually there is a good library to handle that. (Rough sketch of the idea below, after the comments.)
  • This would make an excellent joke:

    What do two computers that like each other go on?
    Timestamps, it's their idea of dates.
  • @c3ypt1c that's just great! I have to tell that one to my colleagues! 😂😂😂
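
Regarding the wrapper idea in the first comment: a rough TypeScript sketch of what such a wrapper could look like (a hypothetical Instant class, not any particular library; real libraries such as Luxon or js-joda cover this ground properly).

    // Keeps the epoch milliseconds inside, but prints itself as ISO-8601,
    // so logs and test failure output stay human-readable.
    class Instant {
      private constructor(private readonly epochMs: number) {}

      static fromEpochMs(ms: number): Instant {
        return new Instant(ms);
      }

      toEpochMs(): number {
        return this.epochMs;
      }

      equals(other: Instant): boolean {
        return this.epochMs === other.epochMs;    // comparison stays a cheap number check
      }

      toString(): string {
        return new Date(this.epochMs).toISOString();
      }
    }

    const deadline = Instant.fromEpochMs(1557532800000);
    console.log(`deadline: ${deadline}`);         // deadline: 2019-05-11T00:00:00.000Z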