How bright a star appears depends both on how much light the star actually emits and on how far away it is. The stellar magnitude scale can be adjusted to account for distance as follows:
M_2 - M_1 = log(b_1/b_2)
Here, M refers to a star's absolute magnitude, that is, how bright it appears from a standard distance of 10 parsecs (or 32.6 light-years). The absolute magnitude of Sirius is 1.4 and the absolute magnitude of Betelgeuse is -8.1.
a) Which of these two stars is brighter, in absolute terms, and by how much?
OK, so I am not sure what I should do here. What I did try was subbing in 10 for M_2 and 1.4 for M_1, then solving for (b_1/b_2), i.e. b_1/b_2 = 10^(M_2 - M_1). I ended up getting (b_1/b_2) = 10^8.6, which is for Sirius.
Then I did the same for Betelgeuse, subbing in 10 for M_2 and -8.1 for M_1 and solving for (b_1/b_2). I ended up getting (b_1/b_2) = 10^18.1.
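To make sure I didn't fumble the arithmetic, I checked both substitutions with a quick Python sketch (the rearrangement to 10^(M_2 - M_1) and the function name are mine, assuming the formula exactly as quoted above):

```python
# Check both substitutions, using the formula as given:
# M_2 - M_1 = log(b_1/b_2)  =>  b_1/b_2 = 10**(M_2 - M_1)

def brightness_ratio(m2, m1):
    """Brightness ratio b_1/b_2 implied by two magnitudes."""
    return 10 ** (m2 - m1)

print(brightness_ratio(10, 1.4))   # Sirius: 10**8.6, about 3.98e8
print(brightness_ratio(10, -8.1))  # Betelgeuse: 10**18.1, about 1.26e18
```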
Then the question asks: Which of these two stars is brighter, in absolute terms, and by how much?
My answer is that Betelgeuse is brighter than Sirius by 10^18.1 - 10^8.6 = ?
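Evaluating that expression at face value (same assumptions as the sketch above), in case the number itself matters:

```python
# Difference between the two ratios found above
print(10 ** 18.1 - 10 ** 8.6)  # about 1.2589e18; the 10**8.6 term barely changes it
```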
Did I do any of that right? Please provide an explanation. Thank you!