The stellar magnitude scale compares the brightness of stars using the equation m_2 - m_1 = log(b_1/b_2), where m_1 and m_2 are the apparent magnitudes of the two stars being compared (how bright they appear in the sky) and b_1 and b_2 are their brightnesses (how much light they actually emit). This relationship does not factor in how far from Earth the stars are.
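As a quick numerical sanity check, the textbook's simplified relation can be written both ways (a minimal sketch; the function names are my own):

```python
import math

# Textbook's simplified relation: m_2 - m_1 = log10(b_1 / b_2)
def magnitude_difference(b1, b2):
    """m_2 - m_1 for stars with brightnesses b1 and b2."""
    return math.log10(b1 / b2)

def brightness_ratio(m1, m2):
    """b_1 / b_2 for stars with apparent magnitudes m1 and m2
    (the relation solved for the ratio)."""
    return 10 ** (m2 - m1)
```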
a) Sirius is the brightest-appearing star in our sky, with an apparent magnitude of -1.5. How much brighter does Sirius appear than Betelgeuse, whose apparent magnitude is 0.12?
My answer (which is correct):
Sirius appears about 41.69 times brighter than Betelgeuse.
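The arithmetic behind that answer, as a self-contained check (assuming the textbook's simplified relation m_2 - m_1 = log(b_1/b_2)):

```python
m_sirius = -1.5
m_betelgeuse = 0.12
# Solving the relation for the ratio: b_1/b_2 = 10^(m_2 - m_1)
ratio = 10 ** (m_betelgeuse - m_sirius)
print(round(ratio, 2))  # → 41.69
```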
b) The sun appears about 1.3 x 10^10 times as bright in our sky as does Sirius. What is the apparent magnitude of the sun?
I need help on this part.
I substituted m_1 = -1.5 and b_1/b_2 = 1.3 x 10^10,
and then solved for m_2:
m_2 - m_1 = log(1.3 x 10^10)
m_2 - (-1.5) = log(1.3 x 10^10)
m_2 + 1.5 = log(1.3 x 10^10)
m_2 = log(1.3 x 10^10) - 1.5
Therefore the apparent magnitude of the sun is 8.61.
However, the back of the book says that it is -11.61. How do I get that answer, and what did I do wrong?