Looks like someone hasn't had to deal with data from multiple different time zones processed by code written by different developers over several years!
Unix time could be unambiguous, but it is not in practice. If it has been unambiguous for you, congratulations!
Use ISO style dates if you don't want to spend the rest of your life explaining what a date actually means.
> data from multiple different time zones processed by code written by different developers over several years!
This feels strongly like an argument FOR unix timestamps. It's simply a number; the biggest mess-up is seconds vs. millis, and that order-of-magnitude difference is not hard to disambiguate.
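A minimal sketch of that disambiguation, assuming the values are anywhere near present-day timestamps (the class name, the `decode` helper, and the threshold are hypothetical, not anything from a library):

```java
import java.time.Instant;

public class SecondsOrMillis {
    // Hypothetical heuristic: for anything near the present, a value above
    // ~1e11 cannot be seconds (that would land around the year 5138), so it
    // is read as milliseconds instead.
    static Instant decode(long value) {
        return value > 100_000_000_000L
                ? Instant.ofEpochMilli(value)
                : Instant.ofEpochSecond(value);
    }

    public static void main(String[] args) {
        System.out.println(decode(1_700_000_000L));     // seconds -> 2023-11-14T22:13:20Z
        System.out.println(decode(1_700_000_000_000L)); // millis  -> 2023-11-14T22:13:20Z
    }
}
```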
Unix timestamps are, by convention, an offset from Jan 1 1970 00:00. But is that Jan 1 1970 in GMT, UTC, or local time? There is a right answer here! But did all the developers do the right thing? Was the default implementation correct?
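To make that concrete, here is a sketch in Java of the right reading next to one plausible wrong one; the example value and zone are arbitrary, and the "wrong" path just stands in for code that treated the number as counting from a local midnight rather than from the UTC epoch:

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;

public class EpochAmbiguity {
    public static void main(String[] args) {
        long seconds = 1_700_000_000L; // some Unix timestamp read from a document

        // The right thing: Unix time counts seconds since 1970-01-01T00:00:00Z.
        Instant correct = Instant.ofEpochSecond(seconds);

        // A plausible wrong thing: decoding the same number as if it counted
        // from a *local* midnight, which shifts the result by the zone offset.
        ZoneOffset offset = ZoneId.of("America/New_York").getRules().getOffset(correct);
        Instant shifted = LocalDateTime.ofEpochSecond(seconds, 0, ZoneOffset.UTC)
                                       .toInstant(offset);

        System.out.println("decoded per the standard:   " + correct);
        System.out.println("decoded per the bad reading: " + shifted); // off by 5 hours
    }
}
```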
---
What I'm saying is: in your JSON document, you see a number and you rely on a convention to decode that number. The number has no way to tell you what it actually means; the ISO standard does. People can still mess it up, but I feel like the people who use the ISO standard care a lot more than the people who use the Unix standard.
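A small Java sketch of that contrast, with made-up example values: the ISO 8601 string carries its own offset, while the bare number needs an out-of-band convention even to pick seconds versus milliseconds:

```java
import java.time.Instant;
import java.time.OffsetDateTime;

public class SelfDescribing {
    public static void main(String[] args) {
        // The ISO value says what it is: a wall-clock time plus its UTC offset.
        OffsetDateTime parsed = OffsetDateTime.parse("2023-11-14T17:13:20-05:00");
        System.out.println("ISO value as an instant: " + parsed.toInstant());

        // The bare number says nothing: the reader has to guess the convention.
        long raw = 1_700_000_000L;
        System.out.println("if it means seconds: " + Instant.ofEpochSecond(raw));
        System.out.println("if it means millis:  " + Instant.ofEpochMilli(raw));
    }
}
```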
"The right thing" is, for example, simply `System.currentTimeMillis()` in Java. I don't have to think about what 0 means. All I need to know is there's a mapping from an instant to a number.
This representation is arguably closer to the essense of time. Barring relativistic effects, it's just an affine line. Dates and timezones are artificial structure that we put on this line.
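In that spirit, a short Java sketch (the zone names are chosen arbitrarily): one number fixes one point on the line, and the different calendar/timezone renderings are just labels attached to that same point:

```java
import java.time.Instant;
import java.time.ZoneId;

public class OnePointManyLabels {
    public static void main(String[] args) {
        // One point on the time line, captured as a plain number of milliseconds.
        Instant instant = Instant.ofEpochMilli(System.currentTimeMillis());

        // The same point, rendered with different artificial structure on top.
        for (String zone : new String[] {"UTC", "America/New_York", "Asia/Tokyo"}) {
            System.out.println(zone + " -> " + instant.atZone(ZoneId.of(zone)));
        }
    }
}
```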