The motion of our Galaxy through the Universe is reflected in a systematic shift in the temperature of the cosmic microwave background: because of the Doppler effect, the background temperature is about 0.1 per cent higher in the direction of motion, with a correspondingly lower temperature in the opposite direction. This effect is known as the dipole anisotropy. If our standard cosmological model is correct, a related dipole effect should also be present as an enhancement in the surface density of distant galaxies in the direction of motion. The main obstacle to finding this signal is the uneven distribution of galaxies in the local supercluster, which drowns out the small cosmological signal. Here we report a detection of the expected cosmological dipole anisotropy in the distribution of galaxies. We use a survey of radio galaxies that are mainly located at cosmological distances, so the contamination from nearby clusters is small. When local radio galaxies are removed from the sample, the resulting dipole points in the same direction as the temperature anisotropy of the microwave background and is close to the expected amplitude. The result therefore confirms the standard cosmological interpretation of the microwave background.
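The amplitudes quoted above can be sketched numerically. The following is a minimal illustration, not part of the paper: it assumes the commonly quoted CMB-frame speed of the Sun (v ≈ 370 km/s, a literature value not stated in the abstract) and the standard Ellis & Baldwin (1984) expression for the number-count dipole, d = [2 + x(1 + α)] v/c, with an assumed source-count slope x ≈ 1 and radio spectral index α ≈ 0.75.

```python
# Illustrative estimate of the two dipole amplitudes discussed above.
# Assumed inputs (typical literature values, NOT taken from the abstract):
#   v     ~ 370 km/s  (Sun's speed in the CMB rest frame)
#   x     ~ 1.0       (slope of the radio-source counts)
#   alpha ~ 0.75      (mean radio spectral index)

C_KM_S = 299_792.458  # speed of light in km/s


def cmb_temperature_dipole(v_km_s: float) -> float:
    """Fractional temperature shift toward the apex: Delta T / T ~ v/c."""
    return v_km_s / C_KM_S


def source_count_dipole(v_km_s: float, x: float = 1.0, alpha: float = 0.75) -> float:
    """Ellis & Baldwin (1984) number-count dipole: d = [2 + x(1 + alpha)] v/c."""
    return (2.0 + x * (1.0 + alpha)) * v_km_s / C_KM_S


if __name__ == "__main__":
    v = 370.0  # assumed km/s
    # v/c ~ 1.2e-3, i.e. roughly the "about 0.1 per cent" shift quoted above
    print(f"Temperature dipole Delta T / T : {cmb_temperature_dipole(v):.2%}")
    # the count dipole is boosted by the factor [2 + x(1 + alpha)] ~ 3.75
    print(f"Radio-source count dipole d    : {source_count_dipole(v):.2%}")
```

The count dipole is a few times larger than the temperature dipole because aberration and the Doppler shifting of sources across the survey's flux limit both enhance the surface density toward the apex, which is why a flux-limited radio survey can reveal the effect at all.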
Nature, Vol. 416, no. 6877 (Mar 2002), pp. 150-152