
I actually learnt this last year, but here it goes in case someone else steps into this shit.

We're a remote team, so every other colleague of mine had some kind of OS X device, while I was working on an Ubuntu machine.

Turns out we were testing some Ruby Time objects at nanosecond precision (I think that's the language default, since no further precision was specified), and all the tests were green on everyone's machine except mine. I always got some kind of inconsistency between the times.
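If you want to see what precision your own machine actually gives you, here's a minimal check (just inspecting `Time#nsec`, nothing project-specific; the exact digits will vary run to run):

```ruby
# Quick check of the clock resolution Ruby exposes on this machine.
t = Time.now
puts t.nsec              # e.g. 123456789 on Linux, 123456000 on OS X
puts t.nsec % 1000 == 0  # true means you effectively only get microseconds
```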

After more than a few hours of debugging and beating our heads against any hard enough surface, we discovered this: Ruby's Time precision really is up to nanoseconds on Linux (but just microseconds on OS X), yet when we stored a value into PostgreSQL (whose timestamp precision is microseconds) and retrieved it back, its precision had already been cut down; hence, when compared with a value that never went through the database, there was a difference. THIS JUST DOES NOT HAPPEN ON OS X.
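You can reproduce the mismatch without even touching a database. This is only a sketch of what the round trip does: `Time.at(sec, usec)` mimics a value coming back from a microsecond timestamp column (the real round trip may round instead of truncate, but the comparison fails either way):

```ruby
# Simulating the PostgreSQL round trip: timestamp columns store
# microseconds, so anything below that is lost on the way back.
original = Time.now                              # nanosecond precision on Linux
from_db  = Time.at(original.to_i, original.usec) # seconds + microseconds only

puts original == from_db  # false on Linux, true on OS X (nothing extra to lose)
puts original - from_db   # the leftover sub-microsecond fraction, in seconds
```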

We ended up cutting everything down to microseconds before comparing. You know, the production application runs on Ubuntu too. Fuck this shit.
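For reference, this is roughly the shape of the workaround. The names (`happened_at`, `expected_time`) are made up for the example, and if you're in RSpec you could just as well compare `to_f` values with `be_within(1e-6)`:

```ruby
# A sketch of "relying on microseconds": normalise both sides to the
# precision PostgreSQL actually keeps before comparing them.
def at_usec_precision(t)
  Time.at(t.to_i, t.usec)  # keep seconds + microseconds, drop nanoseconds
end

expected_time = Time.now
happened_at   = Time.at(expected_time.to_i, expected_time.usec) # what the DB hands back

puts happened_at == expected_time                                        # false on Linux
puts at_usec_precision(happened_at) == at_usec_precision(expected_time)  # true everywhere
```

On newer Rubies, `expected_time.round(6)` (or `floor(6)`, if I remember right) does the same normalisation in one call.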

Hope it helps :)

P.S.: I'm talking about default configs; if anyone knows another workaround, or why this is the case, please share.
