How would Jupiter's brightness relative to our Sun appear to a remote observer (observing from a distant star)?

  • Suppose that Jupiter is 5 AU from our Sun and that a remote observer somewhere else in our galaxy is looking at our solar system. We assume that Jupiter's radius is $11 \times 6700$ km.



    What would be the brightness (luminosity?) of Jupiter relative to our Sun's?


    You may want to clarify the vantage point of the viewer. The title says a remote galaxy, while the content suggests somewhere else in our galaxy.

    Just edited. I'm assuming that the distance is very, very big, yet the remote viewer can still detect the light.

    The phase of the planet will make a difference. Jupiter will only be fully lit when it's directly *behind* the Sun, at which point it isn't visible at all.

  • ProfRob (correct answer, 8 years ago)

    The absolute visual magnitude of the Sun is about 4.8. (This is the Sun's brightness viewed from 10 parsecs).



    The brightest visual magnitude for Jupiter is -2.7, when it is about 4 au from the Earth. Using the usual formula we can therefore calculate the absolute magnitude for Jupiter as 25.9. (Note that this is way brighter than the faintest objects seen for instance with the Hubble Space Telescope; but as Jupiter would be separated from the Sun by only 0.5 arcseconds, a telescope with even better angular resolution than the HST would be needed to pick it out of the glare from the Sun).
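
    Presumably the "usual formula" referred to here is the distance modulus; converting 4 au to the 10 pc reference distance reproduces the number above:

    $$M_J = m_J - 5\log_{10}\left(\frac{d}{10\ \mathrm{pc}}\right) = -2.7 - 5\log_{10}\left(\frac{4\ \mathrm{au}}{2.06\times10^{6}\ \mathrm{au}}\right) \approx 25.9$$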



    Thus, to first order, Jupiter is 21 visual magnitudes fainter than the Sun (a factor of about 250 million). The situation is slightly worse than this because, as seen from outside the solar system, Jupiter could not be fully illuminated: it must have some angular separation from the Sun to be visible at all. A half-illuminated Jupiter would be at least a magnitude fainter (actually a factor of $\pi$ for a Lambertian reflector).
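
    Spelling out the conversion (standard magnitude arithmetic): a difference of $\Delta m = 25.9 - 4.8 \approx 21$ mag corresponds to a flux ratio of

    $$\frac{F_\odot}{F_J} = 10^{0.4\,\Delta m} \approx 10^{0.4\times 21.1} \approx 2.7\times10^{8},$$

    i.e. roughly 250 million; and the Lambertian half-phase factor of $\pi$ corresponds to $2.5\log_{10}\pi \approx 1.2$ mag, consistent with "at least a magnitude fainter".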



    But all is not lost. If we were to switch to the infrared, the situation is far more favourable. The approximate ratio of Jupiter/Sun brightness improves from about $1.4 \times10^{-9}$ in the visible to about $2.8\times10^{-8}$ at a wavelength of 10 microns (Traub & Oppenheimer 2010). But this is still a 19 magnitude difference.
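
    Converting those contrast ratios back into magnitudes (a standard conversion, shown here for clarity):

    $$\Delta m_{\rm vis} = 2.5\log_{10}\!\left(\frac{1}{1.4\times10^{-9}}\right) \approx 22, \qquad \Delta m_{10\,\mu{\rm m}} = 2.5\log_{10}\!\left(\frac{1}{2.8\times10^{-8}}\right) \approx 19.$$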



    The technology to achieve these sorts of contrasts is not quite there yet, even if we were viewing from nearby stars. An example of what is currently possible can be seen with the discovery of a few-Jupiter-mass planet around the star GJ 504 by Kuzuhara et al. (2013). The star-planet contrast was about 15-16 magnitudes at near-infrared wavelengths of 1-4 microns, with a star-planet separation of 2.5 arcseconds (equivalent to Jupiter as seen from only 2 parsecs).
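
    The quoted equivalence follows from small-angle geometry (taking Jupiter's orbital radius as $\sim 5.2$ au): the angular separation in arcseconds is just the orbital radius in au divided by the distance in parsecs, so

    $$\theta \approx \frac{5.2\ \mathrm{au}}{2\ \mathrm{pc}} \approx 2.6'',$$

    which matches the $2.5''$ separation of the GJ 504 system.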


    I got a bit lost with Traub & Oppenheimer's study. How did they come to the conclusion that Jupiter's brightness relative to our Sun is $1.4\times10^{-9}$? Thanks!

    @vondip It sounds about right. I arrived at $4\times10^{-9}$ using simple considerations. But as I pointed out, Jupiter would be $\sim 1$ mag fainter, because it could not be at "opposition", and this gives $1.6\times10^{-9}$. T+O indeed appear to have calculated a number at "maximum elongation" - i.e. it is half-lit by the Sun from the observer's point of view. For "Lambertian" reflection this appears to reduce the flux by a factor of $\pi$, so it is in excellent agreement with my calculation.
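
    For what it's worth, a minimal sketch of the kind of "simple considerations" that give a number like $4\times10^{-9}$ (assuming the standard reflected-light contrast formula, Jupiter's geometric albedo $p \approx 0.5$, and full illumination):

    $$\frac{F_J}{F_\odot} \approx p\left(\frac{R_J}{a}\right)^{2} \approx 0.5\left(\frac{7.1\times10^{4}\ \mathrm{km}}{7.8\times10^{8}\ \mathrm{km}}\right)^{2} \approx 4\times10^{-9}.$$

    Dividing by $\pi$ for the half-lit Lambertian case then gives $\approx 1.3\times10^{-9}$, in line with the Traub & Oppenheimer figure.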

Licensed under CC BY-SA with attribution


Content dated before 7/24/2021 11:53 AM
