Ned began tying his shoes at 5:23 a.m. He took a 20-minute break for lunch and finished at 3:11 p.m. He spent 25% of his shoe-tying time trying to think of the name of those things at the end of his laces. How much time did Ned spend actually tying his shoes?
I admit it's a stupid example, but the problem is much harder than it needs to be. Why is this problem such a pain? Because the units used to measure time are awkward.
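To see how much fiddling the old units force on us, here's a quick sketch of the arithmetic in Python (assuming "shoe-tying time" means the whole span minus the lunch break, and that the other 75% of it was actual tying):

    from datetime import datetime, timedelta

    start = datetime(2024, 1, 1, 5, 23)    # 5:23 a.m. (the date is arbitrary)
    finish = datetime(2024, 1, 1, 15, 11)  # 3:11 p.m.
    lunch = timedelta(minutes=20)

    at_the_task = (finish - start) - lunch  # 9 hours 28 minutes at the laces
    actually_tying = at_the_task * 0.75     # drop the 25% spent on lace-end names

    print(at_the_task)     # 9:28:00
    print(actually_tying)  # 7:06:00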
There's no need for this. The angelic souls who gave us the metric system stopped short for some reason. Even in our more civilised nations, people still measure time with years, months, days, hours, minutes, and seconds. This is ridiculous. Since years, months, and days are all based on significant physical phenomena, I'll not pick at those. However, hours, minutes, and seconds (which are 1/24, 1/1440, and 1/86400 of a day, respectively) have no place in any clean system of measurements.
To give people an idea of how the chron relates to old units, here are some conversions:
1000 chrons = 1 day
250 chrons = 6 hours
42 chrons ~= 1 hour
10 chrons ~= 15 minutes
1 chron ~= 1.5 minutes (86 seconds)
1 decichron ~= 9 seconds
1 centichron ~= 0.9 seconds
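For anyone who wants to play with the old units, here is a minimal conversion sketch in Python (assuming exactly 1000 chrons per day, so one chron is 86.4 seconds; the function names are mine):

    CHRONS_PER_DAY = 1000
    SECONDS_PER_DAY = 86400
    SECONDS_PER_CHRON = SECONDS_PER_DAY / CHRONS_PER_DAY  # 86.4

    def chrons_to_hms(chrons):
        # Convert a time of day in chrons to (hours, minutes, seconds).
        total_seconds = chrons * SECONDS_PER_CHRON
        hours, rest = divmod(total_seconds, 3600)
        minutes, seconds = divmod(rest, 60)
        return int(hours), int(minutes), seconds

    def hms_to_chrons(hours, minutes=0, seconds=0):
        # Convert conventional hours/minutes/seconds to chrons.
        return (hours * 3600 + minutes * 60 + seconds) / SECONDS_PER_CHRON

    print(chrons_to_hms(250))    # (6, 0, 0.0)  -- 250 chrons is exactly 6 hours
    print(hms_to_chrons(1))      # 41.66...     -- about 42 chrons in an hour
    print(hms_to_chrons(0, 15))  # 10.41...     -- about 10 chrons in 15 minutes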
If timezones are not used (which is an issue I haven't addressed here), then the current time in chrons is almost a simple function of the Modified Julian Date:
chrons ~ non_integral_portion(MJD) * 1000
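Here is a small sketch of that calculation in Python, using the standard relation MJD = Unix seconds / 86400 + 40587 (the system clock follows UTC, so this has exactly the wrinkle described below):

    import time

    def current_chrons():
        # MJD 40587.0 is 1970-01-01 00:00, so the Unix timestamp divided by
        # 86400 plus 40587 gives the current Modified Julian Date.
        mjd = time.time() / 86400 + 40587
        # The fractional part of the MJD, times 1000, is the chron count
        # since midnight (UTC).
        return (mjd % 1) * 1000

    print(round(current_chrons(), 2))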
I say "almost" because MJD is tied to UT, not UTC. If the length of a chron is fixed and each day contains exactly 1000 of them, a chron clock effectively runs on UTC, while MJD follows the Earth's slightly irregular rotation, so the two will usually be slightly off (by less than a second, since leap seconds keep UTC within 0.9 seconds of UT1).
I can't fix my watch, but I do have various clocks and helpful software for my computer.