Why does it take so long to transmit an image from New Horizons to Earth?
I just got the news that the New Horizons space probe has passed by some remote planet on the edge of the solar system.
I was surprised that the person from NASA said it might take 24 months for us to get the photo of that planet.
The solar system is not that big, right? It is slow because the signal transmission is slow, right? But why is the transmission so slow?
The question refers to "the photo", as if there is only one. *New Horizons* captured many images during the brief flyby, along with a good amount of non-imagery data. Per multiple articles, New Horizons should have captured about 900 high-resolution images, with about 50 gigabits of data captured in total.
*`The solar system is not that big, right?`* The solar system is *huge.* Here's a picture of Earth taken from (approximately) the orbit of Neptune. Can you find us? There we are. According to NASA Earth isn't even a full pixel in the original image, but rather a measly *12%* of one pixel (the color bands, by the by, are lens flares), even on a narrow field of view. The wide angle is worse.
A point that doesn't seem to have been mentioned in any of the answers is that the intensity of the signal goes like the inverse square of the distance. The distance from the earth to Ultima Thule is about $10^4$ times greater than the distance to the moon, for example, so all other things being equal, signals will be weaker by a factor of $10^8$ than in the case of communication with the moon.
@BenCrowell My suspicion, though, is that all else isn't equal. In particular, I would hope that something designed to send information from billions of miles away would use an _extremely_ directional antenna.
@Fattie I'm not terribly familiar with this particular probe's design, but, as someone who designs RF equipment, I'd be shocked if antenna size has anything to do with it. I suspect the antenna size is very close to optimal for the frequency band in use. Optimal antenna length for 8 GHz is quite small. The transmit power available and distance across which the transmission must take place without much error are the much more likely causes of the low bit rate.
@Fattie A quarter wave of 8.4 GHz is just under 9mm (about a third of an inch,) for example. It's low-frequency (i.e. high wavelength) transmitters that require very large antennas.
@reirab - that is fascinating. I deleted my comment since it was poor quality! I guess I meant "it's not powerful enough".
@Chris And the intensity falloff is inverse *square* of the distance.
"The solar system is not that big, right?" It's big enough that *light* takes *7 minutes* to reach us from the Sun on a direct path, and we're one of the *inner* planets. If the flash from a bolt of lightning reached you in 7 minutes, the thunder would take over 11.6 years to reach you. It may be tiny compared to a galaxy or even bigger structures, but on human size scales (which is what you're thinking about if you're talking transmission times, bandwidths, etc.), it's absolutely massive.
In general, the higher gain (more directional) you want an antenna to be, the bigger it gets. The actual radiating/receiving element will be small, but the reflector can be pretty much arbitrarily large.
@Draco18s I think the point of the question is that the probe surely cannot be 2 light years from Earth yet. So it's asking why it will take 2 years to get the data if the probe is less than 2 light years away. An answer to that will have to explain how latency, bandwidth, signal-to-noise ratio, and other factors influence the transmission time.
New Horizons has just passed the Kuiper Belt Object (KBO) 2014 MU69, also known as Ultima Thule. KBOs form a belt of asteroids (the Kuiper Belt) from Neptune's orbit outwards, of which Pluto is the largest member. During the encounter with Ultima Thule, all 7 instruments on New Horizons were gathering data (although not all at the same time), and the total data collected is expected to be about 50 gigabits (compared to 55 gigabits taken during the Pluto encounter in 2015).
Since New Horizons is about another billion miles further out than Pluto was, and 3 more years have elapsed, there is less power for the (tiny) transmitter and the signals are much weaker. The bit rate is about 1000 bits per second, so transmitting the 50 gigabits will take 50e9 bits / 1000 bits per second = 50,000,000 seconds, or about 579 days. Converting (roughly) to months by dividing by 365.25 and multiplying by 12 shows that it will indeed take about 19-20 months to transmit everything back. The first image, at about 300 meters per pixel resolution (so about 100 pixels across the 30 km KBO), should be received on 2019 Jan 1. A second, higher-resolution image with about 300 pixels across the KBO is expected to be downloaded by 2019 Jan 2. There will be a press conference on 2019 Jan 2 when these images are due to be released and shown. (More details on what to expect when are at Emily Lakdawalla's Planetary Society blog entry.)
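The arithmetic above is easy to sanity-check with a few lines, using the approximate figures quoted in this answer (~50 gigabits of data, ~1000 bits/s):

```python
# Back-of-envelope check of the downlink time quoted above.
data_bits = 50e9   # total data collected during the flyby, in bits
bit_rate = 1000    # approximate downlink rate, bits per second

seconds = data_bits / bit_rate
days = seconds / 86400
months = days / 365.25 * 12

print(f"{seconds:.0f} s = {days:.0f} days = {months:.1f} months")
```

This reproduces the ~579 days / ~19 months figure, before allowing for interruptions like solar conjunction.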
After the initial data download, they expect to perform some analysis to see which images have the best data with 2014 MU69 in the frame. Given the uncertainty in the position of 2014 MU69 and the high speed of the encounter, they had to shoot strips of images and not all will contain the target. These data will be prioritized in the downlink so they arrive on the ground first and can be analyzed first.
As mentioned by @luis-g, there is also the solar conjunction, which will cause a 5-day period (according to PI Alan Stern in the 2019 Jan 3 press briefing) when reception of the data won't be possible. We would expect this to re-occur in January 2020, but these approx. 10 days don't make a large difference to the time taken. That time is dominated by the weakness of the received signal (the 15 W transmission falls off by the inverse square law over the ~4 billion miles), by the correspondingly low bit rate required to keep the transmitted data decodable, and by the sheer amount of data to transfer.
Two additional points relating to the initial delay: 1. it's about six light hours away, so there is that minimum delay 2. New Horizons can't point its instruments at the target and its antenna at Earth at the same time, so transmission of the data has to wait until data gathering is done.
I feel this answer could be improved by highlighting that the low power (~200W) available to the craft means it has to operate in the MHz range rather than GHz range (since high frequency requires higher power); and to provide error correction for such a long range, multiple cycles must be high or low so that we can see the difference - resulting in the data transition being pushed down into the KHz range....
@UKMonkey - NASA deep space probes use S band, X band, or Ka band, all of which are in the gigahertz range. New Horizons uses X band for downlink, 8.4 GHz.
@DavidHammen so it is ... Maybe the battery needs to charge between transmissions then ... in either case, this answer fails to explain why the bandwidth is so low. It just says "the bandwidth is low because it's low".
@UKMonkey: The answer does state the reason: *there is less power for the (tiny) transmitter and the signals are much weaker.* You could read up on channel capacity to understand why a signal that is just a bit over the noise floor provides less capacity than a stronger signal.
@UKMonkey - There's no battery that needs charging. (How could it charge? Solar cells are pretty much useless beyond Jupiter, and *New Horizons* is way beyond Jupiter.) *New Horizons* instead has a radioisotope thermoelectric generator (RTG).
@UKMonkey - See The_Sympathizer's answer, which goes into some extra detail regarding why. *New Horizons* uses phase-shift keying modulation. The probe must reduce the data rate as signal to noise ratio decreases to ensure that the ground systems properly detect each phase shift with a sufficiently high probability. While the ground systems can handle small bit error rates, too high an error rate results in loss of frame sync (a frame of data has to be tossed), and an even higher error rate means loss of bit sync (essentially no signal).
GHz, MHz, kHz for a carrier is not that related to power required per unit of bandwidth. @UKMonkey's source of confusion might be processor switching speeds, where higher frequency means more power. Actually, because of diffraction, you could end up with greater radiance (radio power per solid angle) with *way* less power when you use microwave (GHz) bands.
@PedroA 2019... as in yesterday and today... (it's all in the article OP links to).
@SteveLinton I don't think either the six light hour distance or waiting until after the data was gathered adds much to the delay. The transmission time is dominated by `number_of_bits / bit_rate`, and comes out at over a year, so ping times of hours don't make much difference. The fly-by was also much much faster than the transmission time.
@chirlu I know you know this. It's just a comment on your comment lest others miss the point: The signal is always "just over the noise floor". If you have more power, less distance, higher RX or TX antenna gain, ... then you up the data rate until the signal is as low as you can handle wrt noise while attaining the acceptable error rate.
The other answer mentions it, but this gives a bit more theory as to the why.
It's effectively the same reason that your phone or Wi-Fi doesn't work as well and slows down when it is far from the hotspot or cannot get a clear line of access to the cell tower, more commonly known as having "few bars": the signal gets weaker, and as a result the signal-to-noise ratio (SNR) goes down.
This means that the error rate - the failure to successfully transmit a bit and have it received correctly at the other end - goes up, because there is a greater probability that some fluctuation, whether from other sources of radio waves such as stars and astrophysical phenomena, or even thermal fluctuations within the receiving devices themselves, will be taken as representing data.
As a result, to ensure that the bits successfully make it through, they have to be transmitted for a longer time so that they can be more clearly distinguished over that noisy background and won't be spuriously flipped. The poorer the SNR, the longer you need to transmit to make it clear. Another way to say it is that when you have a noisy background, and you turn on the transmitter, it creates a statistical bias in the noise fluctuations as its transmissions become superimposed upon them, e.g. putting a sinusoidal variation on top.
At very low levels, this statistical bias is very small and thus requires a long sampling time to collect enough data to tease it out with high probability. And since, by definition, you don't know what data is coming at you, you want the thing you're trying to tease out to be as predictable as possible over that sampling time: you must send only a single specific signal over that whole interval, not switch between bits, which limits the bit rate to one bit per interval.
A mathematical theorem called the Shannon-Hartley Theorem analyses this precisely and gives the exact bounds on just how fast you can transmit data and still have it reliably heard over a given level of noise relative to the strength of the transmitting signal.
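The theorem says the maximum error-free bit rate is C = B log2(1 + S/N), for a channel of bandwidth B hertz at signal-to-noise ratio S/N. A tiny sketch shows how drastically capacity collapses as the SNR drops (the bandwidth and SNR values here are purely illustrative, not New Horizons' actual link budget):

```python
import math

# Shannon–Hartley channel capacity: C = B * log2(1 + S/N)
def capacity(bandwidth_hz, snr):
    return bandwidth_hz * math.log2(1 + snr)

B = 1e6  # 1 MHz of bandwidth, say (illustrative)
for snr in (1000, 1, 0.001):
    print(f"SNR {snr}: capacity {capacity(B, snr):.0f} bit/s")
```

With a whole megahertz of bandwidth but a signal a thousand times weaker than the noise, the channel can carry only about 1.4 kbit/s - the same order as New Horizons' actual downlink rate.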
For an understanding of the spatial scales involved here, and thus exactly what one is up against: your phone has to deal with a cell tower maybe 10 km away ... but here the probe is over 6000 Gm away (that's 6000 billion meters, and so 600 million times further). Naturally we need a very large antenna, and because of the concerns just mentioned, the transmission rate is limited to, as said, about 1 kbit/s, a full millisecond for every bit transmitted, versus your phone at several Mbit/s or more.
To downlink an uncompressed 8-bit (greyscale) 640x480 picture at that rate of 1 kbit/s takes 640*480*8/1000 ~ 2500 s or 2.5 ks (kiloseconds). A 4K UHD image would take 3840*2160*8/1000 ~ 66 ks to downlink, or the better part of a day (86.4 ks). Compare that to your broadband domestic Internet connection, where streaming 4K video (at up to 60 frames per second, so four million times faster) comes down with ease. (ADD NOTE: as mentioned in the comments, this last comparison may not be entirely accurate, as there is also a significant amount of (lossy) compression on "real" 4K streams, or any Internet video streams for that matter. That is unacceptable for high-fidelity scientific data, which can at best use purely lossless compression so as not to introduce unnecessary errors.
Even with no compression, however, your typically decent 100 Mbit/s Internet connection would still be able to downlink maybe about 1-2 frames of video per second which is still enough to perceive something understandable as motion, albeit greatly slowed and incremental, and far higher than the data rates achieved here of a bit more than one frame per day.)
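The image-downlink arithmetic above can be checked directly (the frame sizes and the ~1 kbit/s rate are the ones used in this answer):

```python
# Downlink time for uncompressed 8-bit greyscale frames at ~1 kbit/s.
bit_rate = 1000  # bits per second

def downlink_seconds(width, height, bits_per_pixel=8):
    return width * height * bits_per_pixel / bit_rate

print(downlink_seconds(640, 480))    # a VGA frame: ~2.5 ks
print(downlink_seconds(3840, 2160))  # a 4K UHD frame: ~66 ks
```

At these rates one 4K frame takes most of a day, which is the "a bit more than one frame per day" figure above.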
This is also one of the reasons that Martian exploration would be significantly aided by, and it has been proposed to use, telepresence robotics controlled from a human base near, but in orbit of, the planet.
ADD: More accurately, the distance to 2014 MU69 is around 6600 Gm.
This is the actual answer to this question. Detail: When SNR goes up, then your signal gets better. You've confused that in the first part of your answer.
@AtmosphericPrisonEscape : Yeah, of course. Thanks for catching that. Fixed.
It might help to add that *New Horizons* uses phase-shift keying to modulate the digital data signal on the carrier wave. For a fixed data rate, the probability that ground systems will incorrectly detect a phase shift increases as the SNR decreases. For a fixed SNR, the probability that the ground systems will incorrectly detect a phase shift decreases as the data rate decreases. The bit error rate needs to be sufficiently low lest the received data be rendered useless, or not received at all due to loss of frame sync or even worse, loss of bit sync.
Comparing to streaming 4K video is sort of misleading, since that's heavily compressed. Well under 20 Mbit/s, not the ≈4 Gbit/s you imply (3840*2160*8*60). (Or more like 18 Gbit/s, since it's often 12-bit color.) Probably worth a quick note about why the data can't use lossy compression (artifacts and such).
On top of the slow data transmission rate (explained in astrosnapper's answer), I think it is worth pointing out that New Horizons will enter solar conjunction next week, meaning that we won't be able to receive any transmissions from it due to the Sun blocking them.
I don't know how many times this will happen over those 24 months, but it is an additional reason for the long(er) wait.
Source: NASA News Conference [42:18]
I guess it will happen once a year, as Earth’s movement around the Sun is the determining factor here.
Just to put some perspective on things:
1. New Horizons is really far away from the Earth.
At the moment of closest approach, New Horizons was over 6,600,000,000 kilometers away from Earth. This is about 6 light-hours. And the spacecraft continues to get farther away by about 14 kilometers per second.
2. Transmissions from farther away are weaker.
The inverse square law states that the intensity of things like radio signals and sources of light (energy per unit of area perpendicular to the source) is inversely proportional to the square of the distance. That means doubling the distance results in us receiving only a quarter of the energy.
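A concrete illustration, using rough Earth distances at the two flybys (approximate values, not precise ephemeris figures): the extra billion or so miles between Pluto and MU69 alone weakens the received signal by roughly a factor of two.

```python
# Inverse-square falloff: received intensity scales as 1/d^2.
d_pluto = 4.8e9   # km from Earth, July 2015 Pluto flyby (approximate)
d_mu69  = 6.6e9   # km from Earth, January 2019 MU69 flyby (approximate)

weakening = (d_mu69 / d_pluto) ** 2
print(f"Signal ~{weakening:.1f}x weaker than at Pluto")
```

That factor of roughly 2 lines up neatly with the downlink rate dropping from about 2,000 bits per second at Pluto to about 1,000 bits per second at MU69.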
3. New Horizons only has so much power to work with.
The spacecraft is powered by a single RTG (radioisotope thermoelectric generator) that contains ~11 kg of Plutonium-238. At launch, this produced 245 watts (at 30 volts of direct current) of power, but due to radioactive decay, this decreased to 200 watts by the time of the July 2015 Pluto flyby, and further to 190 watts by the time of the January 2019 MU69 flyby.
For data transmission, it has a 2.1-meter diameter high-gain dish antenna, a 30-centimeter diameter medium-gain dish antenna, and two broad-beam, low-gain antennas. The high-gain beam is 0.3 degrees wide, and the medium-gain beam is 4 degrees wide (used in situations when the pointing might not be as accurate). New Horizons' radio system is powered by a TWTA (Traveling Wave Tube Amplifier), which consumes 12 watts. (That's about the same as a modern CFL light bulb!)
There are actually two TWTAs for redundancy; one with left-hand circular polarization, and one with right-hand circular polarization. After launch, they figured out a trick to use both TWTAs at the same time, which increased the data transfer rate by 1.9 times. They used this two-TWTA mode to get all the data back from the Pluto flyby more quickly.
4. There's a limit to how sensitive the antennas on Earth can be.
Even though we listen for New Horizons' transmissions using enormous 70-meter dish antennas from the Deep Space Network, there comes a point where it starts getting difficult to discern the signal amongst a sea of white noise and other interference, because the signal is so weak.
Here's the 70-meter dish from Madrid. It's hard to do much better than this.
5. So, the downlink speed has to be restricted because of the very weak signal.
As elaborated upon in The_Sympathizer's answer, the signal-to-noise ratio gets lower when the signal gets fainter, and so you have to transmit data more slowly in order to make sure that the data you receive is correct.
NASA has a neat interactive page that shows what each antenna in the DSN is doing right now. Here's a screenshot from January 3, 2019, 01:11 UTC:
As you can see, the signal that this dish is receiving from New Horizons is only 1.29E-18 W in strength. That's 1.29 attowatts. That's extremely weak.
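A back-of-envelope link budget using the Friis transmission equation reproduces that order of magnitude. The transmit power, dish diameters, frequency, and distance are the ones quoted in this thread; the 60% aperture efficiency is an assumed, typical value for parabolic dishes.

```python
import math

# Rough Friis-equation link budget for the New Horizons X-band downlink.
c = 3.0e8                 # speed of light, m/s
f = 8.4e9                 # X-band downlink frequency, Hz
wavelength = c / f        # ~3.6 cm

def dish_gain(diameter_m, efficiency=0.6):
    # Gain of a parabolic dish: eta * (pi * D / lambda)^2
    return efficiency * (math.pi * diameter_m / wavelength) ** 2

P_tx = 12.0               # W, one TWTA
G_tx = dish_gain(2.1)     # 2.1 m spacecraft high-gain dish
G_rx = dish_gain(70.0)    # 70 m DSN dish
d = 6.6e12                # m, approximate distance to Earth

# Friis: P_rx = P_tx * G_tx * G_rx * (lambda / (4*pi*d))^2
P_rx = P_tx * G_tx * G_rx * (wavelength / (4 * math.pi * d)) ** 2
print(f"Received power ~{P_rx:.2e} W")
```

This crude estimate lands around 1e-18 W, the same order of magnitude as the 1.29 attowatts the DSN page reports.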
So, as a result of the faint signal, it looks like the people at NASA decided to restrict the downlink rate to about 1000 bits per second (125 bytes per second), as an optimal balance between data integrity and downlink speed.
As a point of comparison, the https://google.ca homepage (when you're not logged in) comes out to about 1 MB. So, if you tried to open the Google homepage at the speed of the New Horizons downlink, it'd take over 2 hours for the page to fully load.
6. There is a lot of data.
New Horizons was busy during the flyby. It collected about 50 gigabits of data (about 6 GB). So at 1,000 bits per second, on and off (the solar conjunction that Luis G. pointed out will also briefly delay the data transfer), it'll take about 20 months for the full set of Ultima flyby data to be sent back to Earth.
- During the Pluto flyby in July 2015, downlink speed was at about 2,000 bits per second, and it took about 15 months to download all 55 gigabits (7 GB) of Pluto data.
- During the Jupiter flyby in February 2007, downlink speed was at about 38,000 bits per second.
Further reading: Here's an interesting related question: How to calculate data rate of Voyager 1?