From: John Mallinckrodt <ajm@csupomona.edu>
Date: September 14, 2012 3:51:38 PM MDT
My guess would be that 10 parsecs is a typical distance for stars that actually ARE visible (there are ~400 stars within 10 parsecs), so adopting this standard means that most stars will have *absolute* magnitudes that line up nicely with familiar *visual* magnitudes.
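[For readers following along: the relation implicit in this discussion is the standard distance modulus, M = m - 5 log10(d / 10 pc), which makes absolute magnitude equal apparent magnitude exactly when the star is 10 pc away. A quick sketch, using the Sun's well-known values as a check:]

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Distance-modulus relation: M = m - 5*log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# A star placed exactly at 10 pc has M == m by construction.
print(absolute_magnitude(5.0, 10.0))           # -> 5.0

# Sun: apparent magnitude ~ -26.74 at 1 AU (~4.848e-6 pc)
print(round(absolute_magnitude(-26.74, 4.848e-6), 2))  # -> 4.83
```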
John Mallinckrodt
Cal Poly Pomona
On Sep 14, 2012, at 1:09 PM, Bill Nettles wrote:
I've done some searching, but obviously not in the right places:
How did astronomers arrive at 10 parsecs as the distance used for computing absolute magnitudes?
Bill Nettles
Union University