General Astronomy/Luminosity

Luminosity is the total amount of energy a star emits each second; it is a measure of the star's intrinsic brightness, independent of its distance from us. In most stars this energy comes from thermonuclear reactions in the core. In general, a star's luminosity depends on its current stage in its evolutionary sequence. Stars radiate across the electromagnetic spectrum, including ultraviolet, visible light, infrared, and radio waves.

The luminosity of a star depends on its temperature and size. If two stars of different sizes both have the same temperature, the larger of the two will have a greater luminosity due to its larger surface area.

The formula for calculating luminosity is the Stefan–Boltzmann law for a spherical blackbody:

L = 4πR²σT⁴

where R is the star's radius, σ is the Stefan–Boltzmann constant (about 5.67 × 10⁻⁸ W m⁻² K⁻⁴), and T is the star's surface temperature in kelvins.
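
As a quick check on this formula, the short Python sketch below evaluates L = 4πR²σT⁴ for the Sun and for a hypothetical star twice the Sun's radius at the same temperature. The constants are standard reference values; the function name is only illustrative.

import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
R_SUN = 6.957e8    # solar radius, m
T_SUN = 5772.0     # solar effective temperature, K

def luminosity(radius_m, temperature_k):
    """Power radiated by a spherical blackbody: L = 4*pi*R^2*sigma*T^4."""
    return 4.0 * math.pi * radius_m ** 2 * SIGMA * temperature_k ** 4

l_sun = luminosity(R_SUN, T_SUN)
print(f"Sun: {l_sun:.2e} W")   # about 3.8e26 W
# Doubling the radius at the same temperature quadruples the surface area,
# and therefore the luminosity, as described above.
print(f"Twice the radius: {luminosity(2 * R_SUN, T_SUN) / l_sun:.1f} x L_sun")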

Another way to express a star's brightness is its magnitude. There are two types of magnitude: apparent and absolute. Apparent magnitude is how bright an object appears when viewed from Earth; absolute magnitude is the apparent magnitude the object would have if it were 10 parsecs away from Earth. The magnitude scale is logarithmic and runs backwards: smaller numbers mean brighter objects.

The Sun has an apparent magnitude of -26.7 and an absolute magnitude of 4.8, while a full moon has an apparent magnitude of about -12. Most visible stars range in apparent magnitude from -1.5 (Sirius) to 4 (Alcor). The dimmest stars just visible to the naked eye are around magnitude 6.5 (about 4 under bright city lights).
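
Apparent and absolute magnitude are connected by the standard distance-modulus relation m − M = 5 log₁₀(d / 10 pc). The Python sketch below uses it to recover the Sun's absolute magnitude from its apparent magnitude and distance; the function name is illustrative, not from the text.

import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Magnitude the object would have at the standard distance of 10 parsecs."""
    return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

# The Sun seen from 1 AU (1 AU = 1/206265 parsec), apparent magnitude -26.7.
sun_distance_pc = 1.0 / 206265.0
print(f"Sun: M = {absolute_magnitude(-26.7, sun_distance_pc):.1f}")  # ~4.9, close to the 4.8 quoted above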

Within each constellation, the brighter stars are labelled with Greek letters (alpha, beta, gamma, delta, epsilon, zeta, and so on), roughly in order of decreasing apparent brightness. These labels are called Bayer designations.

Cepheid variables are pulsating stars whose brightness varies in a regular cycle. The more luminous a Cepheid is, the longer its pulsation period, so measuring the period gives its absolute magnitude, and comparing that with its apparent magnitude gives its distance. Using Cepheid variables in the Andromeda Nebula, astronomers found that it lies roughly 2 million light-years away, far outside our own galaxy.
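
To illustrate the method, the Python sketch below combines one published period-luminosity calibration (M_V ≈ −2.43(log₁₀ P − 1) − 4.05, with P in days, used here purely for illustration) with the distance modulus to estimate the distance of a hypothetical Cepheid. The star's period and apparent magnitude are invented, not taken from the text.

import math

def cepheid_absolute_magnitude(period_days):
    """Illustrative period-luminosity relation: longer period means a more luminous star."""
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_pc(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Invented example: a Cepheid with a 10-day period and apparent magnitude 20.
m_abs = cepheid_absolute_magnitude(10.0)
d = distance_pc(20.0, m_abs)
print(f"M = {m_abs:.2f}, d = {d:.2e} pc = {d * 3.26:.2e} light-years")
# Gives roughly 2 million light-years, comparable to the Andromeda distance quoted above.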

The stars measured by astronomers to date range from a small fraction of the Sun's luminosity up to roughly 500,000 times the luminosity of our Sun.