Given that astronomy is the oldest science and that sunrise and sunset are very basic celestial phenomena, you might have thought this would have been sorted out centuries ago. Not a bit of it!


We live on a rocky ball surrounded by air, which makes it hard to pin down the time of sunrise and sunset with reasonable accuracy.
For example:

  1. Which part of the sun should be peeking over the horizon?
  2. Which horizon should you use: a fictitious, idealized flat plane, or one taking into account local conditions like valleys and mountains?
  3. Finding the Sun: We can only see the sun’s disc after its light has travelled through the Earth’s atmosphere. The resulting refraction means the sun is not quite where it appears to be, an effect which varies with latitude and meteorological conditions, as the sketch below illustrates.
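
To give a feel for the size of that last effect, here is a minimal Python sketch using Bennett’s empirical refraction formula, one common approximation (the pressure and temperature figures are illustrative assumptions, not values from any particular source). In a standard atmosphere it lifts the sun at the horizon by roughly 34 arcminutes, a number that reappears in the definition below, and by noticeably more in cold, high-pressure air.

```python
import math

def refraction_arcmin(apparent_alt_deg: float,
                      pressure_hpa: float = 1010.0,
                      temperature_c: float = 10.0) -> float:
    """Atmospheric refraction in arcminutes, from Bennett's empirical formula,
    scaled for air pressure and temperature in the usual way."""
    r = 1.0 / math.tan(math.radians(apparent_alt_deg + 7.31 / (apparent_alt_deg + 4.4)))
    return r * (pressure_hpa / 1010.0) * (283.0 / (273.0 + temperature_c))

# At the horizon (apparent altitude 0°) in a standard atmosphere the lift is ~34 arcminutes.
print(f"standard atmosphere: {refraction_arcmin(0.0):.1f} arcmin")
# Cold, dense air bends light more, so the sun appears even higher than that.
print(f"cold high-pressure day: {refraction_arcmin(0.0, 1030.0, -15.0):.1f} arcmin")
```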


So, given all these issues, the best that astronomers have come up with is to define sunrise and sunset as:

the moment when the centre of the sun’s disc makes an angle of 90.8333° with the zenith, that is, the point directly over the observer’s head.



This dispenses with the tricky problem of defining a standard horizon, but why is the angle not exactly 90°?



There are two reasons. First, by convention sunrise and sunset refer to the moment the upper edge of the sun’s disc, not its centre, touches the horizon, and the sun’s mean angular radius is about 16 arcminutes, or 0.2666°. Second, atmospheric refraction near the horizon lifts the sun’s apparent position by about 34 arcminutes on average. So, 0.5666° must be added to the angular radius of the sun to ensure appearances match with reality, which gives the 0.8333 part of the final number: 0.2666° + 0.5666° = 0.8333°.
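
To make that concrete, here is a minimal Python sketch that plugs the 90.8333° zenith angle into the standard sunrise equation for the hour angle between local solar noon and sunrise. The declination formula is a rough cosine approximation, and the example latitude and day number are arbitrary choices for illustration, not values taken from an almanac.

```python
import math

# Zenith angle in the standard definition of sunrise/sunset:
# 90° (geometric horizon) + 16' (mean solar radius) + 34' (mean refraction).
ZENITH_DEG = 90.0 + 16.0 / 60.0 + 34.0 / 60.0   # 90.8333...°

def solar_declination_deg(day_of_year: int) -> float:
    """Rough solar declination in degrees -- a simple cosine
    approximation, adequate only for illustration."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def sunrise_hour_angle_deg(latitude_deg: float, day_of_year: int) -> float:
    """Hour angle between local solar noon and sunrise/sunset, in degrees,
    using the 90.8333° zenith definition."""
    lat = math.radians(latitude_deg)
    dec = math.radians(solar_declination_deg(day_of_year))
    zen = math.radians(ZENITH_DEG)
    cos_h = (math.cos(zen) - math.sin(lat) * math.sin(dec)) / (
        math.cos(lat) * math.cos(dec))
    if not -1.0 <= cos_h <= 1.0:
        raise ValueError("sun never crosses that zenith angle (polar day/night)")
    return math.degrees(math.acos(cos_h))

# Example: around the March equinox (day 80) at latitude 52° N.
# Dividing the hour angle by 15° per hour gives the half-length of the day.
h = sunrise_hour_angle_deg(52.0, 80)
print(f"hour angle {h:.2f}°, i.e. sunrise about {h / 15.0:.2f} h before solar noon")
```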

The definition is, then, a bit of a fudge, based on annual averages and ignoring local circumstances. So, sunrise and sunset times can’t be precisely predicted for any given locality (especially at relatively high latitudes, where refraction effects are strong), and the values quoted in newspapers can be out by a minute or more.
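
To see how much the refraction fudge alone can matter, the sketch below compares the hour angle computed with the standard 34-arcminute allowance against one computed with 44 arcminutes, an illustrative figure for unusually cold, dense air, at a mid and a high northern latitude in winter (the declination of -20° is likewise an assumed value). The shift works out to roughly a minute at 40° N and nearer two minutes at 60° N.

```python
import math

def hour_angle_deg(lat_deg: float, dec_deg: float, zenith_deg: float) -> float:
    """Sunrise/sunset hour angle (degrees) for a given zenith-angle definition."""
    lat, dec, zen = (math.radians(x) for x in (lat_deg, dec_deg, zenith_deg))
    cos_h = (math.cos(zen) - math.sin(lat) * math.sin(dec)) / (
        math.cos(lat) * math.cos(dec))
    return math.degrees(math.acos(cos_h))

DEC = -20.0                  # assumed solar declination, typical of northern mid-winter
STD = 90.0 + 50.0 / 60.0     # 16' solar radius + the standard 34' of refraction
BIG = 90.0 + 60.0 / 60.0     # same radius + an assumed 44' of refraction (cold, dense air)

for lat in (40.0, 60.0):
    shift_deg = hour_angle_deg(lat, DEC, BIG) - hour_angle_deg(lat, DEC, STD)
    # Convert the shift from degrees of hour angle to minutes of time (15° = 1 hour).
    print(f"latitude {lat:.0f}° N: sunrise shifts by about {shift_deg / 15.0 * 60.0:.1f} minutes")
```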