Alt text:
It’s not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TCG) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn’t know and the Devil isn’t telling.]
We use datediff in sql and let God handle the rest.
“Oh but they’re in different time zones” “Oh did you account for whether one is in daylight savings and the other isn’t” “Aren’t some of these dates stored in UTC and some local?”
Are all problems I do not care about.
This is why we should just move to a universal time zone and stop with the daylight savings.
We have that, it’s called Unix time, and the only thing it doesn’t account for is time dilation due to relativity.
If your system hasn’t been upgraded to 64-bit types by 2038, you’d deserve your overflow bug
Let’s just make it 128-bit so it’s not our problem anymore.
Hell, let’s make it 256-bit because it sounds like AES-256.
64 bits is already enough not to overflow for 292 billion years. That’s 21 times longer than the estimated age of the universe.
If you want one-second resolution, sure. If you want nanoseconds a 64-bit signed integer only gets you 292 years. With 128-bit integers you can get a range of over 5 billion years at zeptosecond (10^-21 second) resolution, which should be good enough for anyone. Because who doesn’t need to precisely distinguish times one zeptosecond apart five billion years from now‽
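The 292-year figure checks out; a quick arithmetic sketch in Java (the class and method names are just for illustration):

```java
public class NanoRange {
    // Years reachable by a signed 64-bit nanosecond counter, counting up from zero.
    static double maxYears() {
        double seconds = (double) Long.MAX_VALUE / 1_000_000_000.0; // ~9.22e9 s
        return seconds / 86_400 / 365.25;                           // ~292 years
    }

    public static void main(String[] args) {
        System.out.printf("%.1f years%n", maxYears()); // roughly 292.3
    }
}
```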
If you run a realistic physical simulation of a star, and you include every subatomic particle in it, you’re going to have to use very small time increments. Computers can’t handle anywhere near that many particles yet, but mark my words, physicists of the future are going to want to run this simulation as soon as we have the computers to do it. Also, the simulation should predict events billions of years in the future, so you may need to build a new time tracking system to handle that.
With a 128-bit integer you can represent 340 undecillion (or sextillion, if you use the long scale) seconds, which is equivalent to 10 nonillion (or quintillion, long scale) years. The universe will long have stopped being able to support life by then because stars stopped forming (enough time would have passed that it could have happened a hundred quadrillion (a hundred thousand billion, long scale) times over, assuming we start counting from the birth of the universe).
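Those magnitudes can be sanity-checked; a sketch with BigInteger, since Java lacks a native 128-bit integer (365.25-day years assumed):

```java
import java.math.BigInteger;

public class WideRange {
    // Years covered by an unsigned 128-bit count of seconds.
    static BigInteger yearsIn128Bits() {
        BigInteger seconds = BigInteger.ONE.shiftLeft(128);      // 2^128 ~ 3.4e38 s
        return seconds.divide(BigInteger.valueOf(31_557_600L));  // seconds per year
    }

    public static void main(String[] args) {
        // 32 digits: on the order of 1e31, i.e. 10 nonillion (short scale) years.
        System.out.println(yearsIn128Bits());
    }
}
```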
Cries in vintage computer collection tears.
You are a monster phoneymouse
I love the word “Epochalypse”, from the wiki page you linked
I thought that’s what datetime was based off of, tbh.
I know, but it’s not standard anywhere in the world.
Give it a few more decades
Swatch’s Internet Beats are making more and more sense every time Daylight Savings forces a timezone change. Why are we still using base 12 for time anyway?
From the wikipedia:
TCB ticks faster than clocks on the surface of the Earth by 1.550505 × 10⁻⁸ (about 490 milliseconds per year)
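The quoted figure is internally consistent; a back-of-the-envelope sketch (365.25-day year assumed):

```java
public class TcbDrift {
    // Milliseconds per year gained by a clock running fast by the quoted rate.
    static double driftMillisPerYear() {
        double rate = 1.550505e-8;               // TCB relative to Earth-surface clocks
        double secondsPerYear = 365.25 * 86_400; // 31,557,600 s
        return rate * secondsPerYear * 1_000.0;  // ~489 ms
    }

    public static void main(String[] args) {
        System.out.printf("%.0f ms/year%n", driftMillisPerYear());
    }
}
```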
It’s amazing that this level of detail is relevant to anything.
Without accounting for this, most people wouldn’t be able to drive anywhere they haven’t been before.
“Wouldn’t be able to” is a bit of a stretch, since Thomas Guides existed long before GPS. But it wouldn’t be as easy as it is now.
Most people couldn’t use a Thomas guide in the mid 90s either.
I was about when I navigated using an AAA TripTik to get my mom and younger brothers through a 1,600-mile road trip. We also had AAA guidebooks for the route, which came in handy when I had to help pick a motel along the way because torrential rain slowed us down. It was a fun game of figuring out how far it was, whether it had enough stars (mom said at least 2, but the more the better), and the best price.
It was the late 80s though, and I didn’t have a game boy, so it was kinda the only form of entertainment.
Thank you, but I gave up halfway through the list.
I got to “The day before Saturday is always Friday” and I was like waaaa?
I thought it was about when the Julian calendar was dropped in favour of the Gregorian, but that’s not it:
Thursday 4 October 1582 was followed by Friday 15 October 1582
Also some of the islands around the International Date Line did switch their stance on which side of the Date Line they are. So… they might have had a day twice or lost a whole day in the process. And maybe, they didn’t change sides only once…
E.g. see here https://youtu.be/cpKuBlvef6A
A great video you linked; the missing Friday is in it at timestamp 22:45:
Thursday the 29th of December 2011 was followed by Saturday the 31st of December 2011 in Samoa
Epoch is your friend, or use UTC. At least that’s my layman reasoning. I have no challenges working with DateTime except when I don’t know the underlying conditions applied from the source code.
This one is good (or evil, depends on how you see it):
Human-readable dates can be specified in universally understood formats such as 05/07/11.
That one’s really good.
Which one is it?
- July 5th 2011
- May 7th 2011
- July 11th 2005
- November 7th 2005
And is it 2011/2005 or rather 1911/1905, 1811/1805,…?
I really wish that list would include some explanations about why each line is a falsehood, and what’s actually true. Particularly the line:
The software will never run on a space ship that is orbiting a black hole.
If the author has proof that some software will run on a space ship that is orbiting a black hole, I’d be really interested in seeing it.
Technically isn’t the Earth itself a sort of space ship which is orbiting (…a star which is orbiting…) the black hole at the center of the Milky Way galaxy? Not really close enough for time dilation to be a factor, but still.
All links to the original article are dead and even archive.org doesn’t have a capture. I guess the argument is along the lines of “it might not be relevant when you’re scripting away some tasks for your small personal projects, but when you’re working on a widely used library or tool, one day it might end up on a space vessel to explore whatever.”
E.g. my personal backup script? Unlikely. The Linux kernel? Somewhat plausible.
It’s a programmer thing. As you’re typing the code, you may suddenly realize that the program needs to assume certain things to work properly. You could assume that time runs at a normal rate as opposed to something completely wild when traveling close to the speed of light or when orbiting a black hole.
In order to keep the already way too messy code reasonably simple, you decide that the program assumes you’re on Earth. You leave a comment in the relevant part of the code saying that this part shouldn’t break as long as you’re not doing anything too extreme.
Well, in a very strict sense one can’t really say “never” (unless you can see into the future), but it’s probably safe to go along with “it’s highly unlikely, and if it does happen I’ll fix it, or I’ll be long dead so I won’t care”.
Does anyone know what is untrue about “Unix time is the number of seconds since Jan 1st 1970.”?
When a leap second happens, Unix time effectively repeats a second, so the count jumps back by one. See the section about leap seconds here: https://en.m.wikipedia.org/wiki/Unix_time
As a side effect, this means some unix timestamps are ambiguous, because the timestamps at the beginning and the end of a leap second are the same.
It might be more accurate to say that Unix time is the number of days since Jan 1st, 1970, scaled by 24×60×60. Though it gets a bit odd around the actual leap second since they aren’t spread over the whole day. (In some ways that would be a more reasonable way to handle it; rather than repeating a second at midnight, just make all the seconds slightly longer that day.)
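That days-times-86400 framing is exactly how java.time behaves; a small sketch (the date is an arbitrary example):

```java
import java.time.Instant;
import java.time.LocalDate;

public class DayCount {
    // Unix time treats every UTC day as exactly 86,400 seconds:
    // timestamp = (days since 1970-01-01) * 86,400 + seconds into the day.
    static long midnightUtc(int year, int month, int day) {
        return LocalDate.of(year, month, day).toEpochDay() * 86_400L;
    }

    public static void main(String[] args) {
        long viaDays = midnightUtc(2024, 1, 1);
        long viaInstant = Instant.parse("2024-01-01T00:00:00Z").getEpochSecond();
        // Both are 1704067200; the leap seconds inserted since 1970 never show up.
        System.out.println(viaDays + " == " + viaInstant);
    }
}
```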
This post made my head hurt
Ah, I’ve gotten to the point where I have to define what “frame” and epoch each time base is in before I’ll touch the representation of time (Unix, Gregorian, etc.). To be honest, I’m probably just scratching the surface of the time problem.
Hell, probably the reason we haven’t seen time travellers is that we suck at tracking time. You’d need to know your time and place to very good precision to travel to a given point, and we can’t say where and when that is with enough accuracy to pick a landing spot. And people don’t want to land in the Earth’s surface, or 10,000 km away from a stable orbit. Maybe some writer can build that out for a time travel book, or discount it for some reason lol
I recall a short story like that where someone died because they time traveled, but didn’t account for position.
Highly recommended
Really good, thanks
Holy crap I wasn’t ready for that. Great rec tho
SPOILER
I want little Emily to change her future. A sequel is needed!
–
(Thanks for sharing, was a good watch.)
Then there’s continental drift, which as Indiana Jones reminded us this past summer, Archimedes didn’t know about when he built his time machine.
Pet peeve: brushing aside the time travel fantasy element, there is not a single shred of evidence of any type of connection between Archimedes and the Antikythera Mechanism.
As if the only person clever enough in Ancient Greece was that one famous dude from Syracuse.
Ionians: “Are we a joke to you?”
Could you ELI5 what frame and epoch are? I don’t get why unix timestamps aren’t an adequate way to store time; they seem pretty easy and intuitive
I just spent two days debugging a reporting endpoint that takes two MM-YYYY parameters and tries to pull info between the first day of the month for param1 and the last day of the month for param2, and I ended up having to set my date boundaries as
LocalDate startDate = LocalDate.of(param1.year, param1.month, 1); // pretty straightforward, right?
// bump month by one, account for rollover, set endDate to the first of that month, then subtract one day
int endMonth = param2.month == 12 ? 1 : param2.month + 1;
LocalDate endDate = LocalDate.of(param2.year, endMonth, 1).minusDays(1);
This is extraordinarily simple for humans to understand intuitively, but to code it requires accounting for a bunch of backward edge/corner case garbage. The answer, of course, is to train humans to think in Unix epoch time.
Using YearMonth.atEndOfMonth would have been the easier choice there, I think
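For comparison, a sketch of that boundary logic with YearMonth (the param names here are made up):

```java
import java.time.LocalDate;
import java.time.YearMonth;

public class ReportRange {
    // First day of the start month through the last day of the end month;
    // December rollover and leap years are the library's problem, not ours.
    static LocalDate start(YearMonth from) { return from.atDay(1); }
    static LocalDate end(YearMonth to) { return to.atEndOfMonth(); }

    public static void main(String[] args) {
        YearMonth param1 = YearMonth.of(2023, 11); // hypothetical MM-YYYY inputs
        YearMonth param2 = YearMonth.of(2023, 12);
        System.out.println(start(param1) + " .. " + end(param2)); // 2023-11-01 .. 2023-12-31
    }
}
```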
holy shit, yeah it would have. tyvm, I’ll be putting in a PR first thing monday!
Would you mind trying to explain (ELI5 style) what you did before and why you are excited for this new method for those of us who dont understand code?
It does, in a way that’s been reviewed, vetted, and tested by a lot of people, the thing that I’m trying to do with code that’s only ever been seen by me and one other guy and has been tested to the best of my ability. Which I hope is quite good, but one person can easily miss edge cases and weird coincidences.
Tried and tested. Now gotta brush up those searching skills and use an LLM to get your work done quicker
So how is the new thing different/better (other than fewer lines, I guess)? If you don’t mind me asking
It’s easier to understand, easier to review for correctness, and less likely to cause problems with additional changes in the future. Even though it sounds counterintuitive, software developers generally try to write as little code as possible. Any code you write is a potential liability that has to be maintained, so if you can instead just call code that others have already written and that has been tested, you’ll want to do that. (Note that “less code” doesn’t mean fewer lines of code, it means less logical complexity, which is often, but not always, also less in terms of characters/lines)
So like my English teacher taught me: Keep It Stupid Simple (though he would say “keep it simple, stupid” to some people in class, I am just realizing now 20+ years later)
it’s simpler and a lot easier for another engineer to look at and understand later, so they can verify that it’s right or change it if it’s wrong or we decide to do something a little bit different. it’s also been reviewed and tested by a lot of people working in a lot of cases that are all a little bit different from one another, so the odds that their code is correct are better than the odds that my code is correct, all other things being equal
To break it down a bit further, the code that was provided is specifically trying to get the last day of a month, which I’ll call Month X since it will vary. The code is doing these things, in this order:
- Get the month after Month X
- If Month X is 12 (aka December) get Month 1 instead (aka January)
- Get the Date that is day 1 of the Month from step 2
- Get the Date that is one day before the Date from step 3
All this to get the last day of the month from Month X. The reason they did it this way is so they didn’t have to say “Is this February? Then get day 28. Is this January/March/etc? then get day 31.” and so on.
The code that the other user provided will instead get the last day of Month X without having to do all those steps. It’s doing something in the background to get the same data, but the coder doesn’t have to worry about exactly how because they can trust it will work as expected.
It ultimately boils down to the user carving out a round piece of wood, fitting it on an axle and bolting it on, only to find that someone already has cheap wheels for sale that are more stable than what they just made.
Thanks!
You’re welcome! A big part of coding is finding how other people solved the problems you’re solving and finding how to incorporate their work into yours.
In the example you gave, wouldn’t the year be off by one when param2.month is 12?

I was transcribing it from memory, and that exact problem cost me like two hours when I was writing it the first time. Well spotted; now write me a unit test for that case.
Y’know, I legitimately said to myself “I bet they were writing that from memory and just forgot the edge case. I wonder if that was a problem when doing it originally?” before I wrote that comment. 😂 Time to get some Spock tests set up!
Unix epoch time in UTC, making sure that your local offset and drift are current at the time of conversion to UTC…
i don’t even care if its wrong, I just want the code to be readable.
You should care if it’s wrong.
at the resolution of clock drift in milliseconds when I’m running reports that are, at most, only specific to the day?
Clock drift? No. Time zones? Probably.
not really time zones either outside the edge case where a data point exists within delta of midnight so that the time zone drift would result in a date change
Time zones change. Relative times without time zones don’t make sense.
Unix epoch time is wrong too, as it doesn’t include leap seconds, meaning your time difference can be off by up to 27 seconds (the number of leap seconds inserted so far).
All dates and times shall be stored and manipulated in Unix time. Only convert to a readable format at the top of the UI, and forget trying to parse user inputs :P that’s just impossible
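Storing nothing but the epoch value and localizing only at render time might look like this (the zone and timestamp are arbitrary examples):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;

public class DisplayOnly {
    // The stored value stays a plain Unix timestamp; formatting happens at the UI edge.
    static String render(long epochSeconds, String zone) {
        return Instant.ofEpochSecond(epochSeconds)
                .atZone(ZoneId.of(zone))
                .format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm"));
    }

    public static void main(String[] args) {
        long stored = 1704067200L; // 2024-01-01T00:00:00Z
        System.out.println(render(stored, "UTC"));              // 2024-01-01 00:00
        System.out.println(render(stored, "America/New_York")); // 2023-12-31 19:00
    }
}
```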
I picture this being read by the Fred Armisen “believe it or not, straight to jail” character
LOL whenever I have to work with DateTime systems that try to account for every possibility (and fail trying) I am reminded that in some disciplines, it’s acceptable to simplify drastically in order to do ‘close enough’ work.
I mean, if spherical cows are a thing because that makes the math of theoretical physics doable, why not relativity-free or just frame-constant date-time measures that are willing to ignore exotic edge cases like non-spherical livestock?
Because “relativity” isn’t even close to your biggest problem with time. The way we communicate time changes historically, unpredictably, without obvious record. The only way to know what time you’re talking about is to know exactly how you got your information. What location, measured at what time relative to recorded changes in the local time zone, with how much drift relative to the last time you synchronized to which ntp server, and so on. These things easily account for hours or days of error, not just nanoseconds.
Not even going to general relativity: as this comic suggests, to experience time slowing down due to gravity, events happen not just at a time but also at a particular location in space, relative to a particular inertial frame. Not specifying that, and not specifying the inertial frame in which the final answer is needed, makes the calculation impossible even in special relativity, without the effects of gravity.
Here’s the shortlist of horrors I’ve had to deal with in my career:
- Mixed US/ROW short date formats - DD/MM/YY, MM/DD/YY
- mixed timezones in the same column
- the wrong timezone (marked as PDT but actually UTC, or sometimes the other way around)
- clock drift
- timezones again…because timezones suck
- historical timezones
- NTP configurations
Things I’ve read about but haven’t needed to deal with personally:
- leap seconds
- clock slew vs skip
- hardware clocks
- PTP
The one thing I really don’t care about is relativity
it’s all relative anyway
t2 - t1 = what even is time anyway???
We can divide by t and then we get 2-1=1. Simples!
deleted by creator
Relevant Tom Scott video: the answer depends on location, times, country, religion, and a bunch of other factors.
I wonder what fraction of xkcd readers have already seen this video… I imagine it is quite high. I know I’ve watched it multiple times. It’s a good one.
Tom Scott and Veritasium.
Scrolled through here just to search for this comment 😂
C++ user with operator overloading: “T2 minus T1.”
Let someone else implement the class. There’s probably a library for it.
Is there a straightforward way to know which overloading will be used or is it just a roulette every time you add a sketchy new library to your build?
That’s fine if T1 and T2 include a datetime with the exact local timezone. A simple timestamp or timestamp + utc offset won’t work.
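A sketch of why the full local zone matters for T2 minus T1, in java.time rather than C++ (dates chosen to straddle a US spring-forward):

```java
import java.time.Duration;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class ZonedDiff {
    // On 2024-03-10, clocks in this zone jump from 02:00 straight to 03:00.
    static Duration elapsed() {
        ZoneId ny = ZoneId.of("America/New_York");
        ZonedDateTime t1 = ZonedDateTime.of(2024, 3, 10, 1, 30, 0, 0, ny);
        ZonedDateTime t2 = ZonedDateTime.of(2024, 3, 10, 3, 30, 0, 0, ny);
        return Duration.between(t1, t2); // PT1H, not the 2 hours the wall clock suggests
    }

    public static void main(String[] args) {
        System.out.println(elapsed());
    }
}
```

A bare timestamp-plus-UTC-offset pair can’t do this, because it doesn’t know the offset changed partway through.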
That’s me in the bottom right corner over there.
Is it you in the bottom right spotlight over there?
Looks like they’ve lost something.
deleted by creator