How did astronomers deduce that the Sun was not a ball of fire?

  • It's common knowledge that people used to think the Sun was a ball of fire or molten metal, but when did science start to prove otherwise?



  • HDE 226868 (Correct answer)

    I don't think there was a single moment when the astronomy community conclusively rejected the ball-of-fire hypothesis; astronomers simply accumulated more and more evidence against it. If you want to put a rough date on it, you could put your finger somewhere in the middle of the 19th century, as by then other ideas had taken hold.



    Back in the classical period, Anaxagoras had proposed that the Sun was a heap of molten metal. I don't know whether this was widely accepted by his contemporaries. The idea of the Sun as a ball of metal or fire certainly persisted for some time, though perhaps largely for lack of any better ideas. We didn't even understand oxygen and combustion until the work of Lavoisier and others in the late 18th century, so detailed calculations were presumably out of the question for a millennium or two after Anaxagoras. I don't know when calculations of how long combustion could sustain the Sun were first done, but it appears to have been no more than a few decades after the theory of combustion was developed.



    Why? Well, we can say that by the middle of the 19th century, the predominant explanation for the Sun's luminosity was not the burning of coal but instead gravitational potential energy. By the 1860s, it was widely known that chemical reactions could only power the Sun for a few thousand years. We also now had a potentially viable alternative: a decade earlier, Hermann von Helmholtz had begun exploring the idea that gravitational contraction of some sort, by what we now call the Kelvin-Helmholtz mechanism, was the source of energy, with gravitational potential energy being transformed into heat$^{\dagger}$. Around the same time, Lord Kelvin suggested that meteors falling into the Sun provided the necessary energy, a similar mechanism to Helmholtz's. I believe astronomers continued with the contraction hypothesis through the turn of the century - I've seen an article written around 1900 to that effect.
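    (To make those timescale arguments concrete, here is a rough back-of-the-envelope sketch using modern values for the solar mass, radius and luminosity - $M_\odot \approx 2\times10^{30}\,\mathrm{kg}$, $R_\odot \approx 7\times10^{8}\,\mathrm{m}$, $L_\odot \approx 3.8\times10^{26}\,\mathrm{W}$ - so the specific numbers are illustrative rather than the historical calculations themselves. Chemical burning at roughly the specific energy of coal, $q \approx 3\times10^{7}\,\mathrm{J\,kg^{-1}}$, gives

    $$t_\mathrm{chem} \sim \frac{q M_\odot}{L_\odot} \approx \frac{(3\times10^{7}\,\mathrm{J\,kg^{-1}})(2\times10^{30}\,\mathrm{kg})}{3.8\times10^{26}\,\mathrm{W}} \approx 1.6\times10^{11}\,\mathrm{s} \approx 5000\ \text{years},$$

    while gravitational contraction gives the Kelvin-Helmholtz timescale

    $$t_\mathrm{KH} \sim \frac{G M_\odot^{2}}{R_\odot L_\odot} \approx \frac{(6.7\times10^{-11}\,\mathrm{N\,m^{2}\,kg^{-2}})(2\times10^{30}\,\mathrm{kg})^{2}}{(7\times10^{8}\,\mathrm{m})(3.8\times10^{26}\,\mathrm{W})} \approx 10^{15}\,\mathrm{s} \approx 3\times10^{7}\ \text{years}.$$

    The first estimate is why chemical reactions were ruled out; the second is the tens-of-millions-of-years figure that made contraction look viable to 19th-century physicists.)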



    However, during the early and mid-1900s, quantum theory and nuclear physics were being developed, and the work of Eddington, Bethe and others would lay the groundwork for our current understanding of solar energy production. Previous models (including, finally, Kelvin-Helmholtz contraction) were now known to be insufficient because they allowed the Sun to shine for only thousands or millions of years, while geologists had established that Earth itself was much older than this. Fusion, on the other hand, allows the Sun to survive for billions of years - a timescale that matches up well with the age of the Earth. By then we also knew that hydrogen and helium were the dominant constituents of the Sun and other stars; while Wollaston and Fraunhofer had performed the first solar spectroscopy observations in the early 1800s, the true composition of the Sun was not accepted until more than a century later, when Cecilia Payne made a detailed study of stellar spectral lines.
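    (Again as a rough, illustrative estimate rather than anything historical: fusing hydrogen into helium releases about 0.7% of the rest-mass energy, and only roughly a tenth of the Sun's hydrogen sits in the core where it can burn, so the nuclear timescale comes out around

    $$t_\mathrm{nuc} \sim \frac{0.007 \times 0.1 \times M_\odot c^{2}}{L_\odot} \approx \frac{7\times10^{-4} \times (2\times10^{30}\,\mathrm{kg})(3\times10^{8}\,\mathrm{m\,s^{-1}})^{2}}{3.8\times10^{26}\,\mathrm{W}} \approx 3\times10^{17}\,\mathrm{s} \approx 10^{10}\ \text{years},$$

    comfortably longer than the geologically inferred age of the Earth.)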






    $^{\dagger}$ While this does produce heat in various bodies, including T Tauri stars, it is not significant in most stars beyond that stage.


    “it is not significant in most stars” – well, it is; it's what gets fusion going in the first place! It's just not enough by itself to keep the star from cooling down again quickly.

    @leftaroundabout Fair point! I've edited to change that phrasing.

    "understand oxygen and combustion until the work of **Lavoisier**", +1. (prior to the edit, this answer wasn't in 'human readable text', nor did it mention the person who figured it out)

    @leftaroundabout "Quickly" being relative, of course. It's still enough for hundreds of millions of years, and in the case of objects like white dwarfs, it's enough to keep them shining for longer than the Sun is going to spend on the main sequence. That's part of why we didn't search very hard for a better explanation - there was a long... let's say fight... between geologists, who kept finding more and more evidence that the Earth's age is on the order of billions of years, and physicists, who thought that was preposterous.

    @Luaan A white dwarf cools from a solar luminosity to a tenth of that in less than a few million years. They are "long-lived" at luminosities of $10^{-5}$ to $10^{-6}$ that of the Sun.

    Just to add to the history here: it actually took a long time before people decided that gravitational collapse alone couldn't sustain the Sun, because they had no idea how old it was, and likewise didn't know how old the Earth was (presumably both would be approximately the same age). It took major advances in geology (some by Darwin himself, in his attempt to prove evolution) before people started believing the Earth, and thus the Sun, was older than hundreds of millions of years, and therefore that gravitational collapse could not supply the Sun's entire energy output.

    "detailed calculations were presumably out of the question for a millennium or two". I guess you meant for a century or two unless it's hyperbole.

    @farhanhubble Anaxagoras lived circa 500 BCE, so I did indeed mean "millennium".

Licensed under CC BY-SA with attribution

