### How to calculate luminosity in g-band from absolute AB magnitude and luminosity distance?

How can I calculate the (non-bolometric) luminosity $L$ of a galaxy (or a star for that matter) over a given band from its AB apparent magnitude $m_{AB}$ over that band and its luminosity distance $d_L$?

For instance, consider the g-band, which typically has $\lambda_{eff} = 467 \text{ nm}$ and $\Delta \lambda = 100 \text{ nm}$. Given that this galaxy has an apparent AB magnitude of $m_g = 22.5$ and a luminosity distance of $1991 \text{ Mpc}$ (i.e. $z = 0.355$, if you are curious), what is its luminosity? I know I should use the following equation, but I don't know what value to pick for $M_\odot$.

$$L/L_\odot = 10^{0.4(M_\odot - M)}$$

I tried 5.12 from a website which is the value of $M_\odot$ in g-band but it gives me a different answer compared to when I calculate luminosity using flux:

$$L = 4 \pi d_{L}^2 f_\nu \Delta \nu$$

where $f_\nu$ is flux density and $\Delta \nu$ is the frequency width of the band (this is an approximation to the integration over frequency, assuming bands are step functions). So, how can I find luminosity using absolute magnitude?

Please provide your references.

You are confused. $L$ is usually reserved for a luminosity integrated over all wavelengths. This *cannot* be calculated simply from the g-band magnitude and distance without knowing more about the galaxy.

@CarlWitthoft, I get the magnitude wrong by 1.

@RobJeffries, in the question above, $L$ is the luminosity over the g-band. And it doesn't matter if the flux is not constant, because a telescope measures the average flux over a band. So, in this question, $L$ is the luminosity over the g-band; it is not equal to the total luminosity, and I made my notation clear by stating "non-bolometric luminosity". Feel free to edit it to $L_{g\text{-band}}$.

Then what are $L_{\odot}$ and $M_{\odot}$ supposed to be? These have very clear defined meanings, and they are bolometric quantities. You also need to tell us what flux (density) and $\Delta \nu$ you used if we are meant to be looking for errors.

@RobJeffries, $L_\odot$ is a constant equal to $3.848 \times 10^{26}$ W. $M_\odot$, on the other hand, should be band dependent. What's its value then? That's my question! I have stated $\Delta \nu$, and $f$ can be calculated from $m_g$.

Here is my attempt to reconcile your calculations.

If the AB g-band apparent magnitude is 22.5, then the flux density in the g-band is given by

$$f_{\nu} = 10^{(-48.6-22.5)/2.5} = 3.63 \times 10^{-29}\ {\rm erg\ cm}^{-2}\ {\rm s}^{-1}\ {\rm Hz}^{-1}$$

If the distance is 1991 Mpc, then the absolute g-band magnitude is
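As a quick numerical check of this step (a sketch, using the AB zero point of $-48.6$ and the magnitude from the question):

```python
# AB magnitude definition: m_AB = -2.5*log10(f_nu) - 48.6,
# with f_nu in erg cm^-2 s^-1 Hz^-1.
m_g = 22.5
f_nu = 10 ** (-(m_g + 48.6) / 2.5)
print(f"f_nu = {f_nu:.2e} erg/cm^2/s/Hz")  # ~3.63e-29
```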

$$ M_g = m_g - 5\log d + 5 = -19.0$$

The absolute AB magnitude of the Sun *in the g-band* (from your source) is 5.12. The latter tells us that the ratio of the "*g-band* luminosities" of the galaxy and the Sun is given by

$$\frac{L_g}{L_{\odot,g}} = 10^{0.4(M_{{\odot},g} - M_g)} = 4.45\times 10^{9}$$

One cannot say more than this; in particular, one cannot calculate the luminosity of the galaxy without knowing more about its spectrum.
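These two steps can be reproduced numerically (a sketch; the distance, magnitude, and solar g-band magnitude are the values quoted above):

```python
import math

m_g = 22.5        # apparent AB magnitude in g
d_pc = 1991e6     # luminosity distance in parsecs (1991 Mpc)
M_sun_g = 5.12    # absolute AB magnitude of the Sun in g (from the question's source)

# Distance modulus, then the band luminosity ratio from the magnitude difference
M_g = m_g - 5 * math.log10(d_pc) + 5
ratio = 10 ** (0.4 * (M_sun_g - M_g))
print(f"M_g = {M_g:.1f}, L_g/L_sun_g = {ratio:.2e}")  # ~-19.0, ~4.4e9
```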

**Also note that the equation above cannot be used to find the ratio of flux in one band to the bolometric flux, as I think you are trying to do.** To see this, consider that the absolute V-band magnitude and the bolometric magnitude of the Sun are almost the same. This does *not* mean that all the flux from the Sun emerges in the V band!

Your second method requires a figure for $\Delta \nu$, but you haven't said what you used. The g filter has a width of around $\Delta \lambda = 100$ nm. Using $\lambda \nu = c$, we can say

$$ \Delta \nu = |c\Delta \lambda/\lambda^2| = 1.28\times 10^{14}\ {\rm Hz}$$

$$ L_g = 4\pi d^2 f_{\nu} \Delta \nu = 2.2 \times 10^{42}\ {\rm erg\ s^{-1}}$$

The two figures would agree precisely if the "g-band luminosity" of the Sun were $4.9 \times 10^{32}$ erg/s, or in other words if 13% of the solar luminosity emerged in the g-band as defined above. This does not sound unreasonable, but it requires a detailed integration of the solar spectrum over the actual g-band filter profile.
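The flux-based method can be sketched the same way (using the $\Delta \nu$ quoted above and 1 Mpc $\approx 3.086 \times 10^{24}$ cm):

```python
import math

f_nu = 3.63e-29         # erg cm^-2 s^-1 Hz^-1, from the AB magnitude above
d_cm = 1991 * 3.086e24  # luminosity distance in cm
dnu = 1.28e14           # Hz, bandwidth figure quoted above

# Top-hat approximation: L_g = 4*pi*d^2 * f_nu * delta_nu
L_g = 4 * math.pi * d_cm**2 * f_nu * dnu
print(f"L_g = {L_g:.1e} erg/s")  # ~2.2e42
```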

**Additional:** Just to check the numbers above, we can estimate the flux density of the Sun (in wavelength units, above the atmosphere) to be about 1.7 W m$^{-2}$ nm$^{-1}$ at 467 nm (just search for 'solar spectrum' for many examples) at a distance of $1.5 \times 10^{11}$ m. Putting these numbers together, I calculate a "g-band luminosity" for the Sun of

$$ L_{\odot,g}=4\pi d^2 f_{\lambda}\Delta \lambda = 4.8\times 10^{25} \rm W$$
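A sketch of this check (the solar flux density is the rough figure quoted above, so the result is only approximate):

```python
import math

f_lambda = 1.7   # W m^-2 nm^-1 near 467 nm, approximate solar value above the atmosphere
d_m = 1.5e11     # Earth-Sun distance in m
dlam = 100       # g-band width in nm
L_sun = 3.848e26 # bolometric solar luminosity in W

# Same top-hat approximation, this time in wavelength units
L_sun_g = 4 * math.pi * d_m**2 * f_lambda * dlam
print(f"L_sun_g = {L_sun_g:.1e} W, fraction of bolometric = {L_sun_g / L_sun:.3f}")
```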

which is indeed about 13% of the solar bolometric luminosity.

@RobJeffries, 13% is too much. This is supposed to be exact, since it's almost the same calculation done with two different methods. I suspect that $L_g$ via flux is correct, given that we use the luminosity distance (distance adjusted for redshift), but further adjustments are needed to use the magnitude, and we might not be fine using $d_L$ for both methods. I guess what I'm saying is that the problem might be due to redshift, but I have to think about it more.

Also, people don't do $L_g/L_{\odot,g}$; they just do $L_g/L_{\odot}$, where $L_\odot$ is an assumed constant equal to $3.848 \times 10^{26}$ W. The point is having simple units, not necessarily a comparison.

@Miladiouss, on what basis do you say that 13% is too much? The "flux" method is not exact, since you apparently don't know the galaxy's spectrum, and assuming a top-hat filter function is incorrect.

License under CC-BY-SA with attribution

Content dated before 7/24/2021 11:53 AM

Carl Witthoft 4 years ago

How different were your two answers? In addition, 100nm is a pretty wide band to assume uniform flux density.