In astronomy, absolute magnitude (M) is the apparent magnitude, m, an object would have if it were at a distance of 10 parsecs (about 32.6 light-years, or 3.09×10¹⁴ kilometers). In defining absolute magnitude it is necessary to specify the type of electromagnetic radiation being measured. When referring to total energy output, the proper term is bolometric magnitude. The dimmer an object would appear at a distance of 10 parsecs, the higher its absolute magnitude; the lower an object's absolute magnitude, the higher its luminosity. A simple equation relates apparent magnitude to absolute magnitude via the distance, which for stars is obtained from parallax. The Hertzsprung–Russell diagram relates absolute magnitude to luminosity, stellar classification, and surface temperature.
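The relation mentioned above is the distance modulus: M = m − 5 log₁₀(d/10 pc), or equivalently, since the distance in parsecs is the reciprocal of the parallax in arcseconds, M = m + 5(log₁₀ p + 1). A minimal sketch of the conversion (the function name and the quoted parallax for Sirius are illustrative, not from the text above):

```python
import math

def absolute_magnitude(m, parallax_arcsec):
    """Absolute magnitude from apparent magnitude and parallax.

    Distance in parsecs is 1 / parallax (arcseconds), so
    M = m - 5 * log10(d / 10) = m + 5 * (log10(parallax) + 1).
    """
    return m + 5 * (math.log10(parallax_arcsec) + 1)

# Sirius: apparent magnitude about -1.44, parallax about 0.379"
# gives an absolute magnitude near the 1.4 quoted below.
print(round(absolute_magnitude(-1.44, 0.379), 2))
```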
Many stars visible to the naked eye have absolute magnitudes bright enough that, seen from a distance of 10 parsecs, they would cast shadows: Rigel (−7.0), Deneb (−7.2), Naos (−7.3), and Betelgeuse (−5.6). For comparison, Sirius has an absolute magnitude of 1.4 and the Sun has an absolute visual magnitude of 4.83 (4.75 bolometric). Absolute magnitudes of stars generally range from −10 to +17.
For comparison with apparent visual magnitudes, what you actually see when you look up at night: Sirius is −1.4, Venus reaches −4.3 at best, and the full Moon is −12. The last object with an apparent magnitude comparable to the absolute magnitudes of the stars named above was visible as a supernova a thousand years ago; its remnant is the Crab Nebula, M1. Chinese astronomers reported being able to read by it, see their shadows in its light, and observe it clearly in broad daylight.
Confusingly, for comets and asteroids a different definition of absolute magnitude is used, because the above one would be of little use. In this case, the absolute magnitude is defined as the apparent magnitude that the object would have if it were one astronomical unit from both the Sun and the Earth and at a phase angle of zero degrees. This is a physical impossibility, but it is convenient for purposes of calculation.
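Under this convention, usually written H, the apparent magnitude at other geometries follows from the inverse-square dimming with both distances: m = H + 5 log₁₀(d_Sun · d_Earth) − 2.5 log₁₀(q(α)), where distances are in astronomical units and q(α) is a phase correction equal to 1 at zero phase angle. A hedged sketch (the function name is illustrative, and the phase term is left as a plain parameter rather than any particular phase-law model; Ceres's H ≈ 3.34 is a commonly quoted value used only as sample input):

```python
import math

def apparent_magnitude(H, d_sun_au, d_earth_au, phase_integral=1.0):
    """Apparent magnitude of a comet or asteroid from its absolute
    magnitude H.

    H is defined so that at 1 AU from both the Sun and the observer,
    at zero phase angle (phase_integral = 1), the apparent magnitude
    equals H itself.
    """
    return (H
            + 5 * math.log10(d_sun_au * d_earth_au)
            - 2.5 * math.log10(phase_integral))

# At the defining (physically impossible) geometry, m equals H:
print(apparent_magnitude(3.34, 1.0, 1.0))  # prints 3.34
```

The defining geometry cannot occur in nature, but as the text notes it makes the arithmetic convenient: both logarithm terms vanish and m reduces to H.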