A site for popularization of Astrometry and Celestial Mechanics.

    © 2004-2014  Istituto Scientia.







[Image: Globular Cluster M3]

[Image: Galaxy M101 in Ursa Major]






Magnitude

The apparent magnitude of a star is a measure of its apparent brightness.

The brightest stars that can be seen with the naked eye are defined as "first magnitude". Slightly fainter stars are called "second magnitude". This scale continues down to "sixth magnitude", the faintest stars we can see with the naked eye under very good conditions (no light pollution, no Moon).

This is how the Greek astronomer Hipparchus (Greek: Hipparchos) ranked stars in the 2nd century B.C. His important star catalog can be considered the first notable work of scientific astrometry. In the 2nd century A.D. the astronomer and astrologer Ptolemy adopted Hipparchus's scale in his own star catalog. However, Ptolemy differentiated between some of the stars that were listed as having the same magnitude (e.g. slightly brighter, slightly fainter).

For the newcomer to astronomy, it can be confusing that (for example) a star of magnitude 2 is fainter (not brighter) than a star of magnitude 1. One would expect the number to increase as the brightness increases. This is not the case with magnitudes: the system works "backwards".

The first important change to the six-magnitude system came with Galileo's telescopic observations. He could see a large number of stars fainter than those that were classed as 6th magnitude, and in 1610 he suggested that the brighter of these newly seen stars should be classed as 7th magnitude.

Since then, the scale has continued to grow. Astronomers discovered that 1st magnitude stars were around 100 times brighter than 6th magnitude stars. This ratio (100) may seem surprisingly large, but it reflects the "logarithmic" response of our eyes (the same is true of our ears: the decibel scale, which measures the power of sounds, is also logarithmic).

So in 1856 the astronomer Norman Pogson proposed a scientific definition of magnitude, based on the real ratio of physical luminosity for a difference of five magnitudes, which he defined as a brightness ratio of exactly 100 to 1. Accordingly, a difference of 1 magnitude equates to a brightness ratio of the 5th root of 100, approximately 2.512. In mathematical terms, magnitude is a logarithmic measure of brightness, with base 2.512.
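Pogson's definition can be turned into a few lines of code. This is a small illustrative sketch (not part of the original site): the function below converts a magnitude difference into the corresponding brightness ratio, 100^(Δm/5).

```python
def brightness_ratio(delta_m):
    """Brightness ratio corresponding to a magnitude difference delta_m,
    using Pogson's definition: 5 magnitudes = a ratio of exactly 100."""
    return 100 ** (delta_m / 5)

# A 1-magnitude difference is the 5th root of 100, about 2.512:
print(round(brightness_ratio(1), 3))  # 2.512
# A 5-magnitude difference is exactly a factor of 100:
print(brightness_ratio(5))            # 100.0
```

Note how the "backwards" convention hides in the usage: the *fainter* star has the *larger* magnitude, and delta_m is the larger minus the smaller.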


Limiting magnitude

The faintest stars that can be seen under given conditions (naked-eye or telescope, light-pollution or not, bright Moon or not, and so on) determine the “limiting magnitude”.

With small binoculars (such as the popular 8x30) a dark sky lets us reach 9th magnitude, and with regular binoculars (such as the popular 10x50) 10th magnitude can be reached. A small telescope (3 inches in diameter) can reach 11m stars, and an average amateur telescope (6 inches in diameter) will show 13m stars (or other 13m astronomical objects). The largest amateur telescopes (such as a 20-inch Dobsonian) allow us to see 16m stars.

The larger your optics, the fainter the stars you can see. Every time the diameter is doubled, about 1.5m are gained.

Photography enables fainter stars to be captured. Earth-bound telescopes can record images approaching 24m, while Hubble Space Telescope pictures can reach 30m.

Each magnitude can be divided into decimals: for example the bright stars Castor and Pollux (in the constellation Gemini) have magnitudes 1.2 and 1.6 respectively.

A number of so-called "first magnitude" stars turned out to be much brighter than others that were also classed as first magnitude. Astronomers, rather than downgrade all the other stars, extended the scale: the star Vega in the constellation Lyra is around 0.00m, and the brightest star in the night sky, Sirius, has a magnitude of -1.46m, so the scale was allowed to run into negative numbers for the brighter celestial bodies. The star closest to a true "first magnitude" is Spica, at 0.98m.

The planets Mars, Jupiter and Venus often appear even brighter than Sirius: Mars and Jupiter can be brighter than –2 and Venus can reach a peak of –4.4. The full Moon reaches an overall magnitude of –12.5 and the Sun of –26.7.

As for photography, it was noticed in the 1800s that some stars that looked equally bright to the eye appeared to be of different brightness on photographic film, and vice versa. Photography turned out to be more sensitive to blue light, and less to red.

Two magnitude terms were therefore employed. Visual magnitude, abbreviated as mv, refers to how the star looks to our eye, while photographic magnitude, abbreviated as mp, refers to the brightness of stars on blue-sensitive black-and-white film. The difference between these magnitudes is termed the colour index, and is a measure of the colour of the star. The more negative the value, the bluer the star.
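The colour index described above is just a subtraction, but a tiny sketch makes the sign convention explicit. The example values here are hypothetical, chosen only to illustrate the direction of the scale.

```python
def colour_index(m_photographic, m_visual):
    """Colour index as defined in the text: photographic magnitude
    minus visual magnitude. More negative means a bluer star."""
    return m_photographic - m_visual

# Hypothetical star: brighter (lower magnitude) on blue-sensitive film
# than to the eye, so the index comes out negative -> a bluish star.
print(colour_index(1.5, 2.0))  # -0.5
```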


Limiting magnitude by the naked-eye

Many people believe they can see millions of stars with the naked eye: this is unrealistic. The faintest magnitude that normal eyes can see in a dark sky (under perfect conditions) is 6.5m. There are about 8000 stars of magnitude 6.5m or brighter. However, at any given time, from a single point on Earth, we can see only half the sky (half the celestial sphere), so we can actually see about 4000 stars.

Unfortunately the "limiting magnitude" is often lower than 6.5m, and this dramatically reduces the number of stars that can be seen. Here are 3 examples of very good sky conditions, as can be found in a mountain or desert area (with no light pollution and no Moon), with limiting magnitudes between 6.5 and 6.0:

Limiting Magnitude    6.5m    6.3m    6.0m
Theoretical number    8000    6000    4800
Actually visible      4000    3000    2400

In such good conditions, the structure of the Milky Way is clearly visible to the naked eye, the nebula M42 in Orion actually looks like a small nebula, and the galaxy M31 in Andromeda actually looks like a small oblong galaxy.

Of course, if conditions are not so good, fewer stars can be seen. Examples:

Rural area (low light-pollution)
Limiting Magnitude (LM) = 5m
Theoretical number of stars visible (naked eye): 1500
Actually visible at a given time: 750
The Milky Way is barely visible; M42 and M31 look like small nebular objects rather than stars.

Sub-urban area (moderate/mild light-pollution)
Limiting Magnitude (LM) = 4m
Theoretical number of stars visible (naked eye): 500
Actually visible at a given time: 250

Urban area (severe light-pollution)
Limiting Magnitude (LM) = 3m
Theoretical number of stars visible (naked eye): around 160
Actually visible at a given time: around 80

Let's sum up in a single table:

Limiting Magnitude    6.5m    6.0m    5.0m    4.0m    3.0m
Theoretical number    8000    4800    1500     500     160
Actually visible      4000    2400     750     250      80

The difference is striking: from a city you are likely to see around 80 stars in the sky, while in a mountain area you may see 4000 stars!

Of course these limiting magnitudes refer to the naked eye. Binoculars and telescopes allow us to see fainter stars, depending on the diameter of their main lens (or of their primary mirror, in the case of "reflecting telescopes"). Small binoculars (such as the popular 8x30) let us gain more than 2 magnitudes, while regular binoculars (such as the popular 10x50) let us gain more than 3 magnitudes.

The gain given by binoculars or a telescope can be calculated from the ratio between the diameter of the main lens (or primary mirror) and the diameter of the eye's pupil: multiply 5 by the decimal logarithm of this ratio and that is the gain in magnitudes. In dark conditions the diameter of the pupil is around 7 millimetres (several minutes are needed before our eyes adapt to the dark and our pupils enlarge). So, if the diameter of our binoculars or telescope is D millimetres, the formula becomes: gain = 5 * log(D/7) magnitudes.
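The formula above is easy to try out in code. Here is a minimal sketch, assuming the 7 mm dark-adapted pupil used in the text; combining the gain with the 6.5m naked-eye limit gives a rough theoretical limiting magnitude for an instrument.

```python
import math

def magnitude_gain(aperture_mm, pupil_mm=7.0):
    """Magnitude gain over the naked eye, as in the text:
    gain = 5 * log10(D / pupil)."""
    return 5 * math.log10(aperture_mm / pupil_mm)

# 50 mm binoculars (e.g. 10x50) gain a bit over 4 magnitudes in theory:
print(round(magnitude_gain(50), 1))           # 4.3
# A 6-inch (150 mm) telescope under a 6.5m sky:
print(round(6.5 + magnitude_gain(150), 1))    # 13.2
```

The 150 mm result of about 13m agrees with the figure quoted earlier for an average 6-inch amateur telescope; real-world limits are usually somewhat lower because of optical losses and sky conditions.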


Absolute magnitude

Apparent brightness is not equal to actual brightness: an extremely bright object may appear quite dim if it is far away (the rate at which apparent brightness decreases as the distance to an object increases is given by the inverse-square law).

The absolute magnitude, M, of a star or galaxy is the apparent magnitude it would have if it were 10 parsecs away (1 parsec is around 30,000,000,000,000 kilometres or 3.26 light-years, so 10 parsecs is about 32.6 light-years).

For comparison, the Sun has an absolute visual magnitude of 4.83 (it also serves as a reference point) and Sirius of 1.4. Absolute magnitudes of stars generally range from -10 to +17, which means that stars can differ enormously in intrinsic brightness. For an object with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in its distance.
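The relation between apparent and absolute magnitude can be written as M = m - 5 * log10(d / 10), with d in parsecs. A short sketch, using Sirius's distance of about 2.64 parsecs:

```python
import math

def absolute_magnitude(apparent_m, distance_pc):
    """Absolute magnitude from apparent magnitude and distance in parsecs:
    M = m - 5 * log10(d / 10)."""
    return apparent_m - 5 * math.log10(distance_pc / 10)

# Sirius: apparent magnitude -1.46 at about 2.64 parsecs
print(round(absolute_magnitude(-1.46, 2.64), 1))  # 1.4
```

This recovers the 1.4 quoted above for Sirius, and the formula also shows the tenfold-distance rule: moving an object from 10 to 100 parsecs lowers the computed M by exactly 5 if the apparent magnitude is held fixed (equivalently, the object appears 5 magnitudes fainter).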


Thanks for your interest!
If you appreciate this site, please bookmark it.
If you own a website, please include a link to www.astrometry.org.


