There isn't really a "typical" distance for the visible stars of the
night sky. Most of them are between 100 and 1000 light-years away,
but about 15% are closer, and about 10% are farther. The distance
distribution is so broad that calling any value "typical" would be
misleading. This is because the intrinsic luminosities of stars vary
by many orders of magnitude, allowing the rare, very luminous stars to
be seen over vast distances, while most of the very closest stars can't be
seen without a telescope. In any case, 10 parsecs (33 light-years) is
closer than all but a couple percent of the naked-eye stars, and
nearly half of the naked-eye stars are more than ten times as far away.
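(The standard relation at work here is the distance modulus, m - M = 5 log10(d / 10 pc), which defines absolute magnitude M as the apparent magnitude a star would have at 10 parsecs. A minimal Python check, using the textbook value of about +4.83 for the Sun's absolute magnitude:)

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance in
    parsecs, via the distance modulus m - M = 5 log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# At exactly 10 pc, apparent and absolute magnitude coincide:
print(absolute_magnitude(4.83, 10.0))            # 4.83

# The Sun: m = -26.74 at 1 AU, and 1 AU = 1/206265 pc,
# which recovers M ≈ +4.83.
print(absolute_magnitude(-26.74, 1.0 / 206265))
```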
Dan Schroeder
From: John Mallinckrodt <ajm@csupomona.edu>
Date: September 14, 2012 3:51:38 PM MDT
My guess would be that 10 parsecs is a typical distance for stars that
actually ARE visible (there are ~400 stars within 10 parsecs), so
adopting this standard means that most stars will have *absolute*
magnitudes that line up nicely with familiar *visual* magnitudes.
John Mallinckrodt
Cal Poly Pomona
On Sep 14, 2012, at 1:09 PM, Bill Nettles wrote:
I've done some searching, but obviously not in the right places:
How did astronomers arrive at 10 parsecs as the distance used for
computing absolute magnitudes?
Bill Nettles
Union University