An explanation of the stellar magnitude scale and how it works:
The scale is logarithmic, which is helpful for dealing with the large
range of brightness values in light, just as decibels are helpful for
the large range of loudness values in sound.
The larger the number, the fainter the object.
If a star is one magnitude brighter than another,
it is 2.512 times brighter. If a star is 5 magnitudes fainter,
it is 100 times fainter, in terms of how much light it gives off.
2.512 is approximately the fifth root of 100.
Here is a more detailed explanation of it:
http://en.wikipedia.org/wiki/Apparent_magnitude
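The rule above (a factor of about 2.512 per magnitude, and exactly 100 per 5 magnitudes) can be sketched in a few lines of Python. The function name here is my own, just for illustration:

```python
def brightness_ratio(mag_fainter, mag_brighter):
    """How many times brighter the second object is than the first.

    Each magnitude step is a factor of 100 ** (1/5), about 2.512.
    """
    return 100 ** ((mag_fainter - mag_brighter) / 5)

# One magnitude of difference is about a 2.512x difference in brightness:
print(round(brightness_ratio(1.0, 0.0), 3))

# Five magnitudes of difference is exactly a 100x difference:
print(brightness_ratio(6.0, 1.0))
```

This also shows why 2.512 is approximately the fifth root of 100: five one-magnitude steps multiply together to give exactly 100.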
The sun is about -26.7, and the full moon is about -12.7.
The full moon is roughly 1/400,000 as bright as the sun.
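As a quick sanity check on that sun/moon comparison, using the commonly quoted magnitudes of about -26.7 for the sun and -12.7 for the full moon (a 14-magnitude difference):

```python
sun_mag = -26.7
moon_mag = -12.7

# 14 magnitudes of difference = 100 ** (14/5), on the order of 400,000
ratio = 100 ** ((moon_mag - sun_mag) / 5)
print(f"The sun is about {ratio:,.0f} times brighter than the full moon")
```

The exact ratio you get depends on which magnitude values you plug in, since both objects vary a little, but it comes out in the neighborhood of 400,000.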
Venus is -4.7 to -3.8
Jupiter is -2.9 to -1.6
Mars at opposition is -2.9 to -1.5
Saturn is 0.0 to 1.0
Uranus is 5.7 or so.
Neptune is 7.7
Pluto is 13.5 or fainter.
Sirius (the brightest star) is -1.5
Vega is about 0.0 magnitude, so is Arcturus,
and the faintest stars visible in a dark location
are about 6.5.
Orion's brightest stars are around 0 and 1st magnitude, with some
2nd magnitude and fainter ones also.
The Big Dipper's stars, Polaris, Kochab, and the stars of Cassiopeia
and Andromeda are mostly 2nd magnitude, with a few
3rd magnitude ones also.
In suburban America, the naked-eye limit is usually about 5.0, unless you
are standing near street lights, in which case it's 4 or 3, or worse.
To do any serious wide-field photography of the Milky Way or
faint fuzzies, one needs to get well away from city lights, especially
around urban metro areas. In the Northeast, eastern Connecticut, the
Litchfield Hills, the Quabbin area of Massachusetts, the Catskill Mountains,
the Adirondacks, and parts of the northern New England mountain areas
are good places to find dark skies.