Given that astronomy is the oldest science and that sunrise and sunset are very basic celestial phenomena, you might have thought this would have been sorted out centuries ago. Not a bit of it!
We live on a rocky ball surrounded with air, which makes it hard to pin down the time of sunrise and sunset with reasonable accuracy.
For example:
- Which part of the sun should be peeking over the horizon?
- Which horizon should you use: a fictitious, idealized flat plane, or one that takes into account local features such as valleys and mountains?
- Finding the Sun: we can only see the sun's disc after its light has travelled through the Earth's atmosphere. The resulting refraction means the sun is not quite where it appears to be, an effect that varies with latitude and meteorological conditions.
So, given all these issues, the best astronomers have come up with is to define sunrise or sunset as:
the moment when the centre of the sun's disc makes an angle of 90.8333° with the zenith, that is, the point directly over the observer's head.
This dispenses with the tricky problem of defining a standard horizon, but why is the angle not exactly 90°?
- First, over the course of a year, the sun's disc appears to cover an average of 0.53333° of the sky, so the angular distance between the centre of the sun and its edge is half this, or 0.26667°.
- Second, the Earth's atmosphere bends the light of the sun by an average of around 0.56666°.
So 0.56666° must be added to the sun's angular radius to make appearances match reality, which gives the 0.8333 part of the final figure.
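The arithmetic behind that figure can be checked in a couple of lines, using the annual averages quoted above:

```python
# Angular quantities in degrees, using the annual averages from the text.
solar_diameter = 0.53333            # average apparent diameter of the sun's disc
solar_radius = solar_diameter / 2   # 0.26667 degrees, centre to edge
refraction = 0.56666                # average atmospheric bending at the horizon

# Zenith angle of the sun's centre at the moment of sunrise/sunset.
zenith_angle = 90 + solar_radius + refraction
print(round(zenith_angle, 4))       # the 90.8333 degrees of the definition
```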
The definition is, then, a bit of a fudge, based on annual averages and ignoring local circumstances. So sunrise and sunset times can't be precisely predicted for any given locality (especially at relatively high latitudes, where the sun crosses the horizon at a shallow angle and small angular errors translate into large timing errors), and the values quoted in newspapers can be out by a minute or more.
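To see how the 90.8333° zenith angle is actually used, here is a minimal sketch of the textbook hour-angle formula for day length. It assumes you already know the observer's latitude and the sun's declination, and it ignores longitude corrections and the equation of time, so it is an illustration of the geometry rather than a working almanac:

```python
import math

def daylight_hours(latitude_deg, declination_deg, zenith_deg=90.8333):
    """Approximate hours of daylight from the standard hour-angle formula:

        cos(H) = (cos(zenith) - sin(lat) * sin(dec)) / (cos(lat) * cos(dec))

    H is the hour angle (degrees) from sunrise to solar noon; daylight
    lasts 2 * H, converted at 15 degrees of hour angle per hour.
    """
    lat = math.radians(latitude_deg)
    dec = math.radians(declination_deg)
    zen = math.radians(zenith_deg)
    cos_h = (math.cos(zen) - math.sin(lat) * math.sin(dec)) / (
        math.cos(lat) * math.cos(dec)
    )
    cos_h = max(-1.0, min(1.0, cos_h))  # clamp for polar day / polar night
    h = math.degrees(math.acos(cos_h))
    return 2 * h / 15.0

# At the equinox (declination ~0) on the equator, the extra 0.8333 degrees
# makes the "day" slightly longer than the naive 12 hours.
print(round(daylight_hours(0.0, 0.0), 2))
```

Note that with a plain 90° zenith the same call returns exactly 12 hours; the few extra minutes come entirely from the disc-radius and refraction allowances described above.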