
Peter Simons wrote:
A bank, for example, can't store its financial data in TAI, because TAI knows only the difference between two points in time. Between now and some day in the future, exactly 'n' seconds will pass. If that day is in the sufficiently distant future, though, your mortgage repayment won't be due on April 1st of some year, but on March 31st, a second before midnight. Therefore these applications really need to store _calendar time_, which would be UTC et al.
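
To make that drift concrete, here is a toy Haskell sketch (all numbers invented for illustration): an instant fixed today as a plain TAI second count reads one second earlier on the UTC clock for every leap second inserted before it, so an offset aimed at midnight on April 1st ends up late on March 31st.

  -- Toy illustration only; it ignores the fact that a UTC day containing
  -- a leap second actually has 86401 seconds.
  secondsPerDay :: Integer
  secondsPerDay = 86400

  -- TAI seconds from now until the repayment instant, computed today on
  -- the assumption that no further leap seconds will be inserted.
  dueInstantTai :: Integer
  dueInstantTai = 30 * 365 * secondsPerDay   -- roughly thirty years, say

  -- UTC reading of that TAI instant once 'leaps' leap seconds have been
  -- inserted in the meantime: each one pushes UTC a second further behind.
  utcSecondsAtDue :: Integer -> Integer
  utcSecondsAtDue leaps = dueInstantTai - leaps

  main :: IO ()
  main = do
    print (dueInstantTai `mod` secondsPerDay == 0)   -- True: lands on a day boundary
    print (utcSecondsAtDue 5 `mod` secondsPerDay)    -- 86395, i.e. 23:59:55 the day before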
But a computer cannot count calendar time unless it knows all leap seconds in advance, forever... So the computer must count in TAI. If you want to set a reminder event in the future (say, an alarm to go off at a specific time), then you must convert the counted time (TAI) to local time (in which the event would logically be stored, for the reasons you gave above)...
You can map TAI to calendar time only if that date is in the past or in the near future. (I think leap seconds are announced only about six months before they occur.) That makes TAI unsuitable as an internal representation for most applications.
So, the computer counts time in TAI, but stores events in the calendar in which the user specified them...
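
Here is a minimal Haskell sketch of that split; the types (UtcTime, LeapTable, Event) are invented for the example and are not from any real library. Events stay in calendar time, and conversion to a TAI count succeeds only while the leap-second table covers the date.

  import Data.Maybe (isJust)

  -- UTC as (whole days since some epoch, seconds within the day).
  data UtcTime = UtcTime { utcDay :: Integer, utcSecond :: Integer }
    deriving (Eq, Ord, Show)

  -- The leap seconds we happen to know about: the table gives the
  -- TAI-UTC offset in force on a day, valid only up to some horizon.
  data LeapTable = LeapTable
    { knownUpTo :: Integer                -- last day the table covers
    , offsetOn  :: Integer -> Integer     -- TAI - UTC on a given day
    }

  -- Events are *stored* in calendar (UTC) time, as the user gave them.
  newtype Event = Event { eventTime :: UtcTime } deriving Show

  -- Converting a stored event to a TAI second count is only possible
  -- while the leap-second table still covers that date.
  utcToTai :: LeapTable -> UtcTime -> Maybe Integer
  utcToTai table (UtcTime day sec)
    | day > knownUpTo table = Nothing     -- beyond the announced horizon
    | otherwise             = Just (day * 86400 + sec + offsetOn table day)

  -- The scheduler keeps events in UTC and turns them into a TAI
  -- deadline only once they fall inside the known horizon.
  readyToSchedule :: LeapTable -> Event -> Bool
  readyToSchedule table = isJust . utcToTai table . eventTime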
Assuming milliseconds (is this reasonable?):
IMHO, a good choice for storing a distance in time is:
type TimeDiff = (Integer, Float)
Floats do not give arbitrary precision... in fact certain times (like 1 ms exactly) may not be representable at all!

Keean.
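
A quick check of that point, assuming the Float component is meant to hold a fraction of a second: 0.001 has no finite binary expansion, so 1 ms is already stored inexactly, and the error shows up as soon as you accumulate. An exact integral count of some fixed unit, or a Rational, avoids this.

  import Data.Ratio ((%))

  main :: IO ()
  main = do
    -- The value actually stored for 0.001 :: Float, as an exact rational;
    -- it is not 1 % 1000.
    print (toRational (0.001 :: Float))
    -- A thousand such "milliseconds" do not sum back to one second.
    print (sum (replicate 1000 (0.001 :: Float)) == 1.0)       -- False
    -- Exact arithmetic with Rational (or an Integer count of, say,
    -- picoseconds) has no such problem.
    print (sum (replicate 1000 (1 % 1000 :: Rational)) == 1)   -- True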