
Re: [Phys-L] absolute magnitude

On the other hand, 100 parsecs or even a little farther would have been a good choice if the goal were to produce an absolute magnitude scale that started near 0, as the visual magnitude scale does. At 100 parsecs a star like Rigel, one of the most luminous stars in our corner of the Milky Way, would have a magnitude of around -3, a little brighter than Sirius A, the brightest star in the night sky, which lies at a distance of 2.6 parsecs.
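[For anyone who wants to check the arithmetic, the distance modulus relation m = M + 5 log10(d / 10 pc) reproduces the Rigel figure quoted above. This is just a sketch; Rigel's absolute magnitude of roughly -7.8 is an assumed, commonly quoted value, not something stated in the thread.]

```python
import math

def apparent_magnitude(abs_mag, distance_pc):
    """Apparent magnitude of a star of absolute magnitude abs_mag
    seen from distance_pc parsecs: m = M + 5*log10(d / 10 pc)."""
    return abs_mag + 5 * math.log10(distance_pc / 10.0)

# Rigel: absolute magnitude about -7.8 (assumed value for illustration).
# Moving it to 100 pc adds 5 magnitudes, giving about -2.8 ("around -3"),
# slightly brighter than Sirius A's apparent magnitude of about -1.5.
m_rigel_100pc = apparent_magnitude(-7.8, 100.0)
print(round(m_rigel_100pc, 1))  # -2.8
```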

John Mallinckrodt
Cal Poly Pomona

On Sep 15, 2012, at 3:01 PM, John Mallinckrodt wrote:

The sky's a lot darker in Ogden than in Southern California. I suspect it's a pretty rare night when we see many more than 400 stars!

Seriously, though, I still suspect that the choice has to do with wanting the range of absolute magnitudes to line up in some reasonable way with the range of visual magnitudes. Note for instance that the absolute magnitude of the Sun--a pretty average star--is 4.8, which also happens to be a pretty average magnitude for so-called "visible stars." If one were to use, say, 100 parsecs as the standard distance for evaluating absolute magnitudes, then the Sun would (if I've done the calculation correctly) come in at something closer to a not-even-close-to-visible 10.

John Mallinckrodt
Cal Poly Pomona

On Sep 15, 2012, at 9:37 AM, Dan Schroeder wrote:

There isn't really a "typical" distance for the visible stars of the
night sky. Most of them are between 100 and 1000 light-years away,
but about 15% are closer, and about 10% are farther. The distance
distribution is so broad that calling any value "typical" would be
misleading. This is because the intrinsic luminosities of stars vary
by many orders of magnitude, allowing the rare, very luminous stars to
be seen over vast distances, while most of the very closest stars can't be
seen without a telescope. In any case, 10 parsecs (33 light-years) is
closer than all but a couple percent of the naked-eye stars, and
nearly half of the naked-eye stars are more than ten times as far away.

Dan Schroeder

From: John Mallinckrodt <>
Date: September 14, 2012 3:51:38 PM MDT

My guess would be that that is a typical distance for stars that
actually ARE visible (there are ~400 stars within 10 parsecs) so
adopting this standard means that most stars will have *absolute*
magnitudes that line up nicely with familiar *visual* magnitudes.

John Mallinckrodt
Cal Poly Pomona

On Sep 14, 2012, at 1:09 PM, Bill Nettles wrote:

I've done some searching, but obviously not in the right places:

How did astronomers arrive at 10 parsecs as the distance used for
computing absolute magnitudes?

Bill Nettles
Union University

Forum for Physics Educators