I blame global warming and CO2
Before somebody else does.
We know the Earth’s rotation is slowing, hence leap seconds, but by how much? Now we have a new number: 1.8 milliseconds per century, averaged over the last 2,720 years. That's the rate at which the solar day has lengthened since 720 BC, according to researchers at Durham University and the UK’s Nautical Almanac Office. The boffins …
"Before somebody else does."
I was going to SPECIFICALLY blame human activity, since that isn't causing 'global climate anything'. It will give lefty/socialist politicians, activists, environmental wackos, and AlGore something to keep the donations flowing with...
Not as silly as you might think. There's a slight (but measurable/calculable) slowdown due to ice melting and moving away from the poles. As it does so, the earth's moment of inertia increases and, by conservation of momentum, we slow down a bit. There's even an annual term caused by the same effect; see the 'variability of the earth's rotation' plot at
You can see the annual cycle superimposed on the longer-term, not really predictable changes over the last few decades. (You can't see the longer-term trend described in the paper; that's only noticeable over a span of centuries/millennia.)
This post has been deleted by its author
Neither can I. That's why we have scientists, who publish their results and their methodology so that other scientists can corroborate or invalidate the figures.
Us plebeians can only take note of the paper's conclusions and wait for the fallout.
so that other scientists can corroborate or invalidate the figures
Yeah. Lots of funding to repeat already-performed experiments, is there? As far as I can see, most science funding goes to prove something the funding body already believes in passionately, and the prospect of getting further funding that might upset the apple cart is non-existent.
When I say "corroborate or invalidate the figures", I mean go over the calculations, the methodology, the data, and find out if there are any mistakes in how the experiment was conducted.
If one can find an error in the calculations, or prove that the initial data set was flawed, or derive a different conclusion from the same data set, then the conclusion can be put in question - which will entail more funding for more experiments.
But if no one can dispute the figures, if no discrepancy can be found in the data, if the methodology is sound, then there's a good chance that experiment is valid and the conclusion is valid.
Until a new discovery puts in question the initial data set or the methodology.
"When looking for a 1.8ms discrepancy, a second is a long time."
True enough. But it's an angular acceleration, not a constant amount. The idea is that a century ago, the earth took 1.8 milliseconds less to complete one rotation. Two centuries ago, that was 3.6 milliseconds, and so on.
Go back (say) a millennium, and you're talking 18 milliseconds per day. On average, with no difference today and 18 milliseconds back then, that's 9 ms/day. A millennium contains about 365,250 days; multiply it out, and you're talking about 55 minutes. Since it's quadratic, two millennia ago means four times that.
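A quick sketch of that back-of-envelope arithmetic, assuming a perfectly constant 1.8 ms/century slowdown (which is only an average, as noted elsewhere in this thread):

```python
# With the day lengthening at a constant 1.8 ms per century, the
# accumulated clock offset after t centuries grows quadratically:
# the average excess day length over the span is half the final excess.

MS_PER_CENTURY = 1.8      # lengthening of the day per century
DAYS_PER_CENTURY = 36525  # about 365.25 days/year

def accumulated_offset_seconds(centuries):
    """Accumulated offset, in seconds, after `centuries` of steady slowing."""
    avg_excess_ms = centuries * MS_PER_CENTURY / 2
    total_days = centuries * DAYS_PER_CENTURY
    return avg_excess_ms * total_days / 1000.0

print(accumulated_offset_seconds(1))        # one century: ~33 seconds
print(accumulated_offset_seconds(10) / 60)  # one millennium: ~55 minutes
print(accumulated_offset_seconds(20) / 60)  # two millennia: four times that
```

The quadratic growth is what makes ancient eclipse records usable: a tiny per-day discrepancy compounds into hours over millennia.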
The records from that far back are sometimes along the lines of 'the sun/moon was eclipsed at sunrise/sunset'. Quite imprecise, but good enough to measure a four-hour effect, especially if you get several such records. Note this plot from the paper referenced in the article:
As you get back to "ancient" events, the total time difference Delta-T gets into the tens of thousands of seconds. The actual eclipses aren't all that close to the plotted parabola, but close enough to be convincing.
> When looking for a 1.8ms discrepancy, a second is a long time.
A margin of error of 1 second/century would completely screw up this entire calculation. What is the probability that eclipse calculations from 720 BC to 1620 AD fell well within the 1 second margin of error?
Also, I didn't realize that atomic time started being calculated in 1620. That's downright awesome.
A margin of error of 1 second/century would completely screw up this entire calculation.
No it wouldn't. If the day is longer by 1.8 ms per century, then that equates to an accumulated time difference of about 33 seconds over that century (the average difference is 0.9 ms per day, so the accumulated difference is 100 × 365.25 × 0.9 ms = 32,872.5 ms)
Over 1000 years the average accumulated change would be nearly an hour (average change over 1000 years = 9 ms per day, accumulated change is therefore 1000 × 365.25 × 9 ms = 3,287,250 ms)
I trust that the boffins have taken into account the change in the Moon's orbital period, which will also have been affected by tidal forces and so alter the time of the eclipses.
> If the day is longer by 1.8 ms per century, then that equates to an accumulated time difference of about 33 seconds over that century
What if a day does not get longer by 1.8 ms per century?
That a day gets longer by 1.8 ms/century was the conclusion of this paper, not the assumption.
Check your logic. You start with assumptions, and then reach a conclusion. Not the other way around, which is what you are doing here.
If the assumptions are incorrect, the conclusion is bound to be incorrect too. Garbage in, garbage out.
One of the assumptions was that eclipse calculations from 2000 years ago were accurate. What if they were inaccurate by a margin of error of 1 second per measurement?
How do you know that eclipse measurements from 2000 years ago were accurate to the second? Were you present when the measurements were taken?
"How do you know that eclipse measurements from 2000 years ago were accurate to the second?"
They weren't. No-one has tried to look at historical data down to the second; they looked at the accumulated difference, which is much larger. Each day since 720 BC has been on average 24 ms longer than it would have been without any slowing, so we're now nearly 7 hours offset from where we would have been.

It's exactly the same as having a clock that runs very slightly slow; you won't be able to see any difference if you put it next to an accurate clock and look at the second hands, but put the same two clocks next to each other a few months later and the slow one will show a completely different time.

And yes, astronomers 2000 years ago were easily able to calculate eclipses with better accuracy than several hours, which is how they can tell how much the Earth has slowed down since then.
Note that this is also the case for leap seconds; they might sound big next to deviations of milliseconds, but again they correct the cumulative drift. There's a nice graph on Wiki that shows this - https://upload.wikimedia.org/wikipedia/commons/5/5b/Deviation_of_day_length_from_SI_day.svg ; just a millisecond or two difference in the length of a day results in 27 seconds difference in what a clock says the time actually is after a few decades of that small difference constantly adding up.
As for the result only being an average, that's true, but not especially relevant. The Earth is really, really big. Anything capable of making a significant step change in its rotation would be utterly catastrophic, and certainly no such thing has happened in the last couple of millennia. While the change has not been exactly a constant 1.8ms every century, it has certainly been a steady, somewhat meandering drift and not a sudden change at some point during the period studied. As that graph shows, there's a 1ms or so seasonal variation, a somewhat larger short term drift, and then the longer term drift which was the subject of this paper; even the largest earthquake on record couldn't cause a sudden 48ms step change.
This post has been deleted by its author
"We've had about 14 centuries since 720BC, meaning the days now are about 25ms longer than they were back in the 8th century. Thus, assuming an constant rate of change, each day since 720BC has been about 13ms longer than it otherwise would have been..."
No, we've had 27 centuries since 720 BC. 27 × 1.8 = 48.6 ms difference in the length of a day between then and now, giving a mean difference of 48.6/2 = 24.3 ms.
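In Python, taking the constant-rate average over the 2,720-year span (≈ 27 centuries) discussed above:

```python
# A constant 1.8 ms/century lengthening since 720 BC; the mean excess
# day length over the span is half the final excess.
centuries = 27
final_excess_ms = centuries * 1.8        # day ~48.6 ms longer now than in 720 BC
mean_excess_ms = final_excess_ms / 2     # ~24.3 ms/day averaged over the span

total_days = centuries * 100 * 365.25
offset_hours = mean_excess_ms * total_days / 1000 / 3600
print(round(offset_hours, 1))            # ~6.7 hours of accumulated offset
```

That recovers the "nearly 7 hours" figure quoted earlier in the thread.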
If the earth's rotation has changed, then the track of the eclipse would be different, so you just have to check where on earth the total eclipse was visible. At the equator the earth moves about 1,000 miles/hour, or roughly 1/3 mile/second. If the track of the eclipse is 10 miles wide along a line of latitude, then the noticeable error induced by a changed rotation rate would be about 30 seconds. If there are observations along the edge of the track it might be possible to narrow this down to the order of a second. That would be a pretty accurate data point over 1000 years.
With a model and curve fitting, the parameters could be pretty accurate.
Of course, the earth is a lousy timekeeper, with several effects causing jumps and speed changes at the milliseconds-per-year level (and the north pole wanders around), so the best that could be expected from these data points would be some sort of an average.
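A sketch of the geometric argument above, using the rounded 1/3 mile/second figure from the comment (the exact equatorial speed is nearer 0.28 mile/second):

```python
# Time for a 10-mile-wide eclipse track to sweep past a fixed point
# on the equator: a rotation error of this size would shift the track
# by its own width, which is what ancient observers could notice.
ground_speed_mi_per_s = 1.0 / 3.0   # rough equatorial ground speed
track_width_mi = 10.0
timing_window_s = track_width_mi / ground_speed_mi_per_s
print(round(timing_window_s))       # ~30 seconds
```

So a single well-placed observation pins the timing to tens of seconds, and several observations along the track edges can tighten that further.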
First of all it was leap years, then leap seconds, now the whole orbit and what about relativity, when the earth speeds as it slingshots round the sun do we have to change our watches relative to a fixed point in space time, and that great big rock in the sky, what's that doing to our timepieces then? Perpetual bloody chronographs my ass.
This post has been deleted by its author
Neglecting the all-but-insurmountable problem of an "average" value presented for what is obviously a dynamic process of deceleration more suited to analysis by means of fiendishly complex integral calculus, this revelation throws the main issue into stark relief: why are we still putting the bloody clocks back and forward when this variation makes a mockery of the process?
Isn't it time to finally rid ourselves of the tyranny of this bucolic clepsydra-bothering in the same way we finally ditched a bunch of ridiculous licensing laws foisted on us by a jumped-up teetotaller during WW1?
To the barricades! Vivre Les Brexieurs! Down with the Clockwatchers of Whitehall! Give us back our lost hours-and-a-bit!
The Earth's rotational energy is 2.137×10^36 erg and is proportional to the square of the angular velocity. A slowdown of 1.8 ms/day per century is a relative decrease of about 2×10^-10 in angular velocity per year, releasing 8.5×10^26 erg/year.
World electricity production from all energy sources was (in 2014) 22,433 TWh, or 8×10^26 erg/year.
Note that the rise in seawater level slows the earth's rotation but, to a first approximation, does not change the rotational energy.
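A sanity check on that comparison in Python, using CGS units; the rotational-energy figure is as quoted above, the rest derived from the 1.8 ms/century rate:

```python
# Energy released by the slowdown: E is proportional to omega squared,
# so the fractional energy loss per year is twice the fractional
# decrease in angular velocity.
SECONDS_PER_DAY = 86400.0
E_ROT_ERG = 2.137e36            # Earth's rotational kinetic energy (quoted above)

# Day lengthens 1.8 ms per century -> fractional slowdown per year
domega_over_omega = 1.8e-3 / SECONDS_PER_DAY / 100   # ~2.1e-10 per year

energy_release = 2 * domega_over_omega * E_ROT_ERG
print(f"{energy_release:.1e} erg/year")              # ~8.9e26 erg/year

# World electricity production, 2014: 22,433 TWh -> erg
world_electricity = 22433e12 * 3600 * 1e7            # Wh -> J -> erg
print(f"{world_electricity:.1e} erg/year")           # ~8.1e26 erg/year
```

Same ballpark as the figures quoted above, so the comparison holds: the slowing Earth sheds rotational energy at roughly the rate the whole world generates electricity.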
Biting the hand that feeds IT © 1998–2021