The Astronomical Magnitude Scale

Definition

  • The Astronomical Magnitude Scale is a logarithmic scale used to measure the brightness of celestial objects, most commonly as seen from Earth.
  • This scale is inverted, meaning that brighter objects have lower magnitudes, while fainter objects have higher magnitudes.

Categories of Magnitude

  • Apparent Magnitude: This measures how bright a celestial object appears from Earth. Because it reflects both the object’s intrinsic luminosity and its distance, it does not by itself tell us how luminous the object really is.
  • Absolute Magnitude: This measures a celestial object’s intrinsic brightness. It standardises measurements by stating how bright the object would appear if it were placed 10 parsecs (approximately 32.6 light years) away from the observer.
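
The two categories are linked by the distance-modulus relation M = m - 5 log10(d / 10 pc). Below is a minimal Python sketch of that conversion, assuming no interstellar extinction; the function name and the Sun example are illustrative choices, not taken from the text above.

    import math

    PARSECS_PER_AU = 1.0 / 206264.8  # 1 parsec is about 206,264.8 astronomical units

    def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
        """Convert apparent magnitude to absolute magnitude for an object
        at the given distance in parsecs (interstellar extinction ignored)."""
        return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

    # Example: the Sun, apparent magnitude about -26.7 at a distance of 1 AU,
    # comes out near +4.9, close to its usually quoted absolute magnitude of +4.8.
    print(round(absolute_magnitude(-26.7, 1.0 * PARSECS_PER_AU), 1))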

Understanding the Scale

  • A difference of 5 magnitudes corresponds to a factor of exactly 100 in brightness: for instance, a star of magnitude 1 is 100 times brighter than a star of magnitude 6, and each single magnitude step is a factor of about 2.512 (the fifth root of 100), as the sketch after this list shows.
  • The Human Eye’s Limitation: Without optical aid, our eyes can only detect objects as faint as about magnitude +6 on a clear, dark night.
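
Because the scale is defined so that 5 magnitudes equal a factor of 100, the brightness ratio between any two objects is 100 raised to one fifth of their magnitude difference (equivalently, about 2.512 per magnitude). A minimal Python sketch, with an illustrative function name:

    def brightness_ratio(mag_faint: float, mag_bright: float) -> float:
        """How many times brighter the lower-magnitude object appears.
        A 5-magnitude difference is a factor of exactly 100."""
        return 100.0 ** ((mag_faint - mag_bright) / 5.0)

    print(brightness_ratio(6.0, 1.0))  # 100.0  -> magnitude 1 vs magnitude 6
    print(brightness_ratio(2.0, 1.0))  # ~2.512 -> one magnitude step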

Notable Magnitudes

  • The Sun: The brightest object in our sky has an apparent magnitude of about -26.7.
  • Vega: This star was originally defined to have magnitude 0, and other objects’ magnitudes were expressed relative to Vega’s apparent brightness.
  • Faintest objects visible with the Hubble Space Telescope: These have apparent magnitudes of roughly +30; the short calculation below shows how wide a brightness range this represents.
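
To get a rough sense of the range these values span, the same ratio rule can be applied to the Sun (-26.7) and Hubble’s faintest detections (about +30), giving a factor of roughly 5 × 10^22. The calculation below is only an illustration using the rounded figures quoted above.

    # Magnitude difference of 56.7 between the Sun (-26.7) and Hubble's
    # faintest detections (~+30), converted to a brightness ratio.
    ratio = 100.0 ** ((30.0 - (-26.7)) / 5.0)
    print(f"{ratio:.1e}")  # ~4.8e+22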

Historical Context

  • The magnitude system originated in ancient Greece, where the astronomer Hipparchus ranked stars into six classes by brightness, the brightest being first magnitude and the faintest visible to the naked eye being sixth, laying the foundation of the magnitude scale.
  • The modern system was refined to be more precise and quantitative: in 1856 Norman Pogson formalised the rule that a difference of 5 magnitudes equals a brightness factor of exactly 100, a logarithmic definition that matches the human eye’s roughly logarithmic response to light.