Leap seconds are an abomination
With news making the rounds that we may soon get our first negative leap second, I thought I’d bring up the concept of leap seconds generally: what problem they attempt to solve, and whether they solve it well. (Obviously not well, imo.)
In my experience (I’ve written and worked with time series libraries for ML, prediction and such), the only advantage / capability leap seconds provide is that whenever the number of seconds since 1/1/1970 is a multiple of 24×3600, you land on the first second of a day. That’s it. That’s its only feature: if you want the actual date you must still consult a calendar library, which maintains lookup tables for leap days, leap years and such. (I have no gripe with leap days and years, btw.)
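To make that concrete, here’s a minimal Python sketch (the timestamp is arbitrary): because Unix time pretends every day is exactly 86400 seconds long, a simple modulo gets you the second-of-day, but the actual date still needs the calendar library.

```python
from datetime import datetime, timezone

# Unix time ignores leap seconds, so every day is exactly 24*3600 = 86400
# "seconds" long and day boundaries always fall on multiples of 86400.
ts = 1_700_000_000                     # an arbitrary Unix timestamp
seconds_into_day = ts % 86400          # no calendar data needed for this

# But mapping the timestamp to an actual calendar date still requires a
# date library with its leap-day / leap-year tables:
print(seconds_into_day, datetime.fromtimestamp(ts, tz=timezone.utc).date())
```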
But since we all have to use a date library anyway, why not incorporate the adjustment into when a calendar day’s first second begins, rather than into the measure of time itself? This all seems antiquated, a relic of simpler times: a kind of backdoor software update to the implicit “date libraries” every app used. Besides, we work in UTC milli- and microseconds nowadays.
I think the basic idea of a software update (here, the defined leap second) adjusting systemwide calendar day and hour definitions is right. It’s just that implementing it via leap seconds, while expedient, is wrong. Since it’s still an update no matter what, how about we implement it another way: let the standards bodies occasionally readjust the second at which a day begins (as when 1/1/70 was defined to be zero), down to milliseconds if need be, instead of mucking around with the “absolute” measure of time, and we’ll update our date/calendar libraries (which we have to do anyway, whether we use leap seconds or not).
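Here’s a rough sketch of what I mean, with a hypothetical table name and made-up adjustment values: the timestamp line stays uniform and untouched, and the date library consults a published table to decide where each calendar day begins.

```python
# A rough sketch of the proposal; table name and values are made up.
# Instead of inserting leap seconds into the timeline, a standards body
# publishes occasional adjustments to where "midnight" falls, and only
# the date library applies them; timestamps themselves stay uniform.

DAY_MS = 24 * 3600 * 1000

# (unix seconds the rule takes effect, ms by which day-start is shifted)
DAY_START_ADJUSTMENTS = [
    (0,             0),      # 1/1/1970 00:00:00 defined as zero
    (1_500_000_000, 1_000),  # made-up: days start 1000 ms later from here
]

def start_of_day_ms(ts_ms: int) -> int:
    """Start of the calendar day containing ts_ms, adjustments applied."""
    offset = 0
    for effective_from_s, shift_ms in DAY_START_ADJUSTMENTS:
        if ts_ms >= effective_from_s * 1000:
            offset = shift_ms
    # Day boundaries are nominal multiples of 86400 s, nudged by the offset.
    return (ts_ms - offset) // DAY_MS * DAY_MS + offset
```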
PS: a new project I’m working on breaks if system time ever decreases, so I’ll now have to wrap system time in a custom adapter that compensates for future negative leap seconds.
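Something like the sketch below, with a hypothetical NonDecreasingClock wrapper (not my actual code): clamp the reported time to a high-water mark so it can never run backward, whatever the system clock does.

```python
import time

class NonDecreasingClock:
    """Wrap a wall-clock source so the reported time never goes backward.

    If the underlying clock steps back (say, around a leap-second or NTP
    adjustment), keep returning the high-water mark until it catches up.
    """

    def __init__(self, source=time.time):
        self._source = source
        self._high_water = float("-inf")

    def now(self) -> float:
        t = self._source()
        if t > self._high_water:
            self._high_water = t
        return self._high_water
```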
submitted by /u/gnahraf