Does simply opening and closing a JPEG file decrease image quality?
I've had quite a few photography classes, read many photography books, and read through many forums. And I can't find a consistent answer to this question. One "camp" says there is a loss of image quality every time you open and close a JPEG file (due to compression). The other camp says there is no loss of image quality unless you actually EDIT the photo, then re-save it.
Does it make a difference if:
- I open the image in a standard image viewer and simply "close" the pic?
- I open the image in Photoshop Elements Editor and close it there?
- If I simply close an image vs. re-saving it?
Can someone give a simple answer on when closing or saving a JPEG causes a decrease in image quality and when it does not?
Opening a JPEG does not 'decompress it' and closing it thus does not 'recompress it' and cause a loss in quality. The compression (and 'damage') is done when the JPEG is originally generated, _not_ when it's opened.
@ElendilTheTall: opening a JPEG image most definitely *does* involve decompressing it, at least if by opening you mean actually displaying it rather than the filesystem operation.
Possible duplicate of What image quality is lost when re-saving a JPEG image in MS Paint?
This is based on a misunderstanding. Loss of quality happens only during the compression that is done when an image is saved as JPEG. But it doesn't matter whether it was edited or not.
So: you will (with some very specific exceptions, see comments) lose quality if you open an image in an image editor and re-save it, even if you didn't make any edits. But if you only open it to display it and then close it instead of saving, then nothing will change.
By the way: this is only for traditional image editing programs like Photoshop. Programs like Lightroom that "develop" RAW files follow a different approach (even when handling JPEG files): they always keep the original image intact and separately save the editing steps that were done, which are applied when exporting the final results. So with such programs, you don't have to worry about losing quality (more than once, that is). But then, you shouldn't be using JPEG source files for them anyway.
"you _will_ lose quality... even if you didn't make any edits." That depends on the application. An application may well know that no changes were made and rewrite the original compressed scheme with no additional quality loss over the original JPEG.
Yeah, or if that program defaults to maximum JPEG quality (read: at least as good as the original quality) and none of the changes have a sufficient impact on the image to cause a net loss of quality through recompression (loading a perfectly compressed JPEG and passing it through a compressor again does not make the quality magically get worse).
@DocMax: theoretically possible, but do you know an application that definitely does that?
@LightnessRacesinOrbit: I'm very certain you're wrong about that. Transforming the data from the frequency domain to the spatial domain and back can add additional errors due to quantization every time. Maybe it will eventually converge towards some steady state, but I doubt it.
@MichaelBorgwardt: I admit I'm committing some sort of hand-wavy conjecture, hoping that convergence is in play here. I don't really see why it can't be, though; it's certainly not entirely infeasible in the very general case of a lossy compression.
@MichaelBorgwardt `jpegtran` can be used to losslessly crop or rotate a JPEG, among other various transformations. I often use it to further reduce file size and/or strip out unwanted fluff such as EXIF data, comments, etc: `jpegtran -optimize -copy none -perfect -v input.jpg > output.jpg `
Some programs don't even let you save (or literally just do nothing) unless the open project is dirty (you made changes). You can always Save As... to make a copy, though.
@LordNeckbeard `jpegtran` is doing something very specifically different from what an image editor does.
@Rawling Why would that distinction matter? Mentioning it was relevant to the discussion.
@MichaelBorgwardt I've done tests with imagemagick, and it actually converges after a handful of saves. There are examples on another question here... I'll find it later....
Here: http://photo.stackexchange.com/a/34192/ converged in 8 or 9 cycles in my examples.
@mattdm: for a counterexample, our own Jeff Atwood got increasingly bad results after 10 cycles: http://blog.codinghorror.com/a-comparison-of-jpeg-compression-levels-and-recompression/ - so I guess it depends on the particular image as well
@LordNeckbeard `jpegtran` doesn't deliberately "know that no changes were made and rewrite the original compressed scheme with no additional quality loss", it just *never uses a lossy compression*.
@DocMax: That would require either that 1) the app stores in memory (or rereads) the compressed stream, or 2) that it does not actually rewrite the JPEG file at all, leaving it intact (including the timestamp, if present). Very improbable scenarios.
I agree that it is rare for an application to hold onto and rewrite the compressed stream as it is. I have seen it once, but that was many years ago when memory was more precious than it is today. My point was only that "_will_ lose quality" implies that quality loss cannot be avoided at all; I would have no issue with "will almost certainly lose quality".
I second that. It is completely possible for a JPEG program to decompress, edit, and save a file (with some restrictions) without further loss of quality. The DCT can be done with enough precision to completely recover the frequency data that was stored in the original JPEG (this is a non-issue), and an encoder can choose to quantize the data however it wants. It can always choose a quantization that doesn't clip precision any more than it already was, if it wants to. Whether there are actual encoders that do this is of course another issue, but it is doable within the spec.
@TimSeguine: "It can always choose a quantization that doesn't clip precision any more than it already was if it wants to" - is that actually possible, given limited precision of your numerical types?
@MichaelBorgwardt yes. If it was possible for the first encoder then it is possible for the second as well.
@TimSeguine - that is based on the assumption that the exact original data was recovered from the first encoding, which given the lossy nature of the encoding means that even under perfect / lab conditions it's unlikely.
@JamesSnell It is possible to compute the DCT and inverse DCT with arbitrarily high precision. That is all that is necessary. Period. That's not particularly difficult even in practice. JPEG is only lossy because of the quantization step. If we leave that off and have a good DCT subroutine, then we can do a lot of things with the data without reducing the quality any further including cropping, 90 degree rotation, and reflection. It's not just a theoretical possibility, software exists which does this. In non laboratory conditions. I am just explaining how it is possible.
@TimSeguine: No. The software you're talking about does not perform a DCT or inverse DCT at all - it merely rearranges the fully compressed data using intimate knowledge about the format and its symmetries.
@MichaelBorgwardt Are you a wizard? Can you read my thoughts? I never mentioned a specific program. Nor do I care how a program I wasn't talking about is implemented.
@TimSeguine: which software *were* you talking about then? Because I very much doubt anyone having the required skills would use such an absurdly wasteful and ineffective method to get the same result.
@MichaelBorgwardt It isn't wasteful if it is more general. The point was (assuming you have a high quality DCT algorithm) you can pretty much do anything to a JPEG without affecting its quality if you skip the quantization step upon resaving it. That is clearly buying you more. You can't do better than anyone else who is viewing the original, but you needn't do worse either. Just be careful and don't quantize.
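To make the point being debated above concrete, here is a toy model in plain Python (not a real codec; it ignores the DCT entirely and looks at a single coefficient). JPEG quantization with step q maps a coefficient c to round(c/q)*q, which is idempotent, so with exact DCT math a second save at the same settings adds no further loss; it is the pixel-domain rounding and clipping between saves that perturbs coefficients and breaks the idempotence in practice:

```python
# Toy model of JPEG quantization (not a real codec).
# Quantization maps a coefficient c to round(c / q) * q.

def quantize(coeff, q):
    """Quantize a single DCT-style coefficient with step size q."""
    return round(coeff / q) * q

q = 16
original = 100.0                 # hypothetical DCT coefficient
first = quantize(original, q)    # lossy: 100 -> 96
second = quantize(first, q)      # idempotent: 96 -> 96
assert first == second           # re-quantizing loses nothing more

# What breaks this in practice: rounding/clipping in the pixel domain
# perturbs the coefficients slightly before the second quantization.
assert quantize(first + 7.9, q) == first  # small perturbation is absorbed...
assert quantize(first + 8.1, q) != first  # ...a larger one lands in the next bin
```

This is only a sketch of the argument, not evidence about any particular encoder; whether real software keeps enough precision between the decode and the re-encode is exactly what the discussion above is about.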
Absolutely not. You need to edit the file and re-save it as a JPEG in order to compound the effects of image compression. Just viewing it has no effect at all — if it did, all of the JPEGs on the web would "wear out" completely in a day or two at most.
FYI - here's an example of the type of guidance that I find confusing (this is from a photoblog website): "What's the downside to shooting .JPEG? It has a compression scheme that causes image degradation every time the file is opened and saved. The degradation is minor. You probably wouldn't notice it at first. But you would see it over time."
@markthomas: the crucial word is *saved* - saving is a completely different thing than closing.
+1 for "wearing out"... Google will need to apply fresh coats of "paint" every couple of seconds on their search results page!
JPEG compression can be described as having two distinct phases: first a lossy phase, then a lossless phase. Understanding the difference between them is important to this question - not so much because it explains what's going on, but because it shows where the common mistakes come from.
Lossy compression happens only when the file is saved. This is the part that causes loss of quality. However, just closing the file is not enough to trigger lossy compression: you have to save it. Some editors may refuse to save JPEG files that haven't been edited, to avoid accidentally triggering lossy recompression, but I don't know off the top of my head whether any editors actually do that.
Lossless compression also happens only when the file is saved. The main difference is that even if it happened when the file was closed without saving, it wouldn't matter, because it's lossless. JPEG uses both techniques together.
Lossless decompression happens whenever the file is opened, but not at any other time. Not when it's closed, and not even when it's saved. As with lossless compression, it wouldn't matter even if it did happen during these times, because it's lossless.
"Lossy decompression" never happens. There's no such thing. There can't be, because the data that got thrown out during the lossy compression phase is gone. If you could somehow reconstruct it, then you'd have a lossless compression algorithm, not a lossy one. I'm only even mentioning the concept because, having mentioned two types of compression, it would look strange if I mentioned one only type of decompression without explaining why.
Note that saving the file triggers both kinds of compression. There's not much of a way around this, unless you know that the image has not been edited, but then there isn't much point to saving it either. Note also that just closing the file without saving does not trigger either phase, not even the "safe" lossless compression. Because of this, just opening and closing the file cannot decrease image quality.
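As a minimal sketch of the two phases (plain Python; a single quantization step stands in for the whole DCT pipeline, and zlib stands in for JPEG's Huffman coding - both are my illustrative substitutions, not the real JPEG algorithms):

```python
import zlib

QUANT_STEP = 16  # stand-in for one entry of a JPEG quantization table

def lossy_phase(samples):
    """Quantization: the only place information is destroyed."""
    return bytes(min(255, round(s / QUANT_STEP) * QUANT_STEP) for s in samples)

def lossless_phase(data):
    """Entropy coding (zlib standing in for Huffman): fully reversible."""
    return zlib.compress(data)

pixels = bytes([12, 100, 200, 255, 7, 33])
saved = lossless_phase(lossy_phase(pixels))

# Opening reverses only the lossless phase; the quantization loss is permanent.
decoded = zlib.decompress(saved)
assert decoded != pixels               # saving was lossy
assert lossy_phase(decoded) == decoded # re-quantizing the decoded data is a no-op here
```

In this idealized model, re-saving the decoded data at the same settings changes nothing; real codecs add rounding in the pixel domain, which is why repeated re-saves can still degrade actual files.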
Re "no point to saving" I've worried about what a program would do if I just edit the metadata (Comment, notes) when the user interface is Open-edit-save and there is no special indication that the image data is copied but not recompressed.
"Lossy decompression" as a concept makes perfect sense, though I'm not aware of anything that implements it. The compressed file contains a certain amount of data; one could imagine a decompression algorithm that extracts only a low-resolution version of that data. Doing so could be much faster than extracting all the available data, and one could use such an algorithm to provide a preview, for example. (For example, JPEG compresses an image as a sequence of 8x8 blocks; you could extract the average colour of each block and render that as a single pixel for a 1/8th-size preview.)
@jdlugosz Programs that only update meta-data in general don't decompress/re-compress the image part of the JPG. They just copy that part as is in the new file. The JPG file-format is constructed such that it is actually easier and less work to do it like that if you only need to update the meta-data. But it all depends on the software. If that is stupid in the way it does things, there is no way around it (besides using other software).
@DavidRicherby There is lots of software that uses the decompression engine for previews and such. Most web-based photo albums do this to generate thumbnails and to send a low-res version to a browser with a small display. In Mac OS X the thumbnail views in the Finder and iPhoto do it. I suspect that thumbnails in Explorer on Windows do it too, but I'm not sure about that one.
Actually, AIUI, both JPEG compression *and* decompression can be (somewhat) lossy. That is to say, you cannot exactly reconstruct the original image from the compressed data, *and* you cannot (always) exactly reconstruct the compressed data from the decompressed image. Mostly, AFAIK, this can happen due to clipping: when mapping the compressed YUV DCT coefficients back to 8-bit RGB pixel colors, some pixels might end up with color values less than 0 or greater than 255, which will be clipped.
@DavidRicherby: From what I understand, each point in a JPEG file is a sum of various values multiplied by different scaling factors; although the value is generally rendered into a value with 8 bits each for red, green, and blue, the values of the points are defined more precisely than that. If one were to JPEG-compress an image that precisely matched the unrounded result of a JPEG decompression, recompressing with the same settings as the earlier image would yield the same result as the earlier compression, but the rounding that occurs in the decompression process breaks that.
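The 1/8-scale preview idea mentioned in the comments above can be sketched in plain Python. This toy version works in the pixel domain on a flat grayscale array; a real JPEG decoder can get the same result almost for free by reading just the DC coefficient of each 8x8 block and skipping the full inverse DCT:

```python
def block_average_preview(pixels, width, height, block=8):
    """Downscale a grayscale image by averaging each block x block tile.
    In a real JPEG decoder, the DC coefficient of each 8x8 block already
    holds (a scaled version of) this average."""
    out_w, out_h = width // block, height // block
    preview = []
    for by in range(out_h):
        row = []
        for bx in range(out_w):
            total = 0
            for y in range(block):
                for x in range(block):
                    total += pixels[(by * block + y) * width + (bx * block + x)]
            row.append(total // (block * block))
        preview.append(row)
    return preview

# 16x8 image: left 8x8 tile all 0, right 8x8 tile all 255 -> 1x2 preview
img = ([0] * 8 + [255] * 8) * 8
print(block_average_preview(img, 16, 8))  # [[0, 255]]
```

This is only an illustration of the concept, not how any particular viewer is implemented.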
Just opening and closing a JPEG file should not trigger a save command (in any program that I know of) and therefore there is no re-compression taking place.
For the times that you actually DO hit "save", what happens depends on what changes you've made and how smart the image program in question is.
The user CutNGlass has already mentioned an example of a smart image program, "Better JPEG", that takes advantage of the fact that JPEG images are made up of lots of independently encoded rectangular blocks of pixels, and only blocks that really NEED to are re-compressed when saving the image. For example, with such a program, you can remove red eyes and when the JPEG image is saved, only the blocks that were affected by the change are re-compressed. http://www.betterjpeg.com/features.htm
Now, this technique to avoid having to re-compress any part of a JPEG image that does not need to be re-compressed is really "old news" (I'm no expert and I've known it for over a decade), so I guess I've taken it somewhat for granted that all the good image handling programs would handle this perfectly by now (which would mean that there would not normally be any re-compression from just opening a JPEG image and pressing "save", because the program would know that there has been no alteration to any blocks, and just leave them untouched), but from looking at this question and its varying answers, I can only gather that this STILL isn't true! *Maybe the programming behind such solutions is more complicated than I believe it to be - otherwise all JPEG-handling programs would have had this years ago!*
Hi new user. Thanks for contributing; the inclusion of a link to betterjpeg is a useful addition. From a developer's perspective the situation is what we'd call an extreme edge case - to be useful it relies on the user making a partial edit AND wanting exactly identical quality output settings AND the source file being one of a few JPEG encoding options. For the amount of work involved there's not enough benefit.
Hi @JamesSnell! First of all, I really do appreciate your feedback and giving me an explanation as to why all image handling programs do not do their all to avoid degradation of JPEGs whenever possible. However, I must say that it is quite subjective whether putting in this programming effort is too much work and whether these use cases represent "extreme edge cases"! It depends on how many image formats the program handles and how common JPEG use is, and it depends on how particular you are about never losing anything that you don't have to lose (needless image quality loss, in this case).
Also, I would say that since JPEG is the worlds #1 (most used) image format, that, in and of itself, makes this something else than an "extreme edge-case" (at least the word "extreme" should be removed). If you also consider the fact that this block-handling technique, if implemented in an image program, can be used for a variety of different situations (partial edits, like red-eye-removing; rotating/flipping the image; cropping), that also indicates something other than an "extreme edge-case." It might not be the MOST requested feature, but at the same time, I'm sure it would be appreciated!
Carl, I would agree with @JamesSnell that this is an edge case - and I've worked on a popular image editor so I have some experience here. When you open a file, it is copied to an in-memory uncompressed version for editing *and the original is immediately discarded*. This saves memory and gives you the flexibility to use any file format as a source. I've seen software that can do a lossless flip or rotate of a JPEG, but it's a very specialized function. The program you link to is the only editor I've ever seen that extends this feature to general editing.
Thanks for weighing in, Mark - and thanks for your insight. I would, however, still not call this "an edge case", since we're talking about very common types of edits using the world's most common image file type... That programmers want to keep it simple and perhaps do not care enough about image degradation for it to feel "worth it" for them to handle this is an explanation to why most image programs don't handle this, but it doesn't make it "an edge case". The JPEG file is normally much smaller than the uncompressed version, so it could be kept in memory and this functionality be added.
You definitely won't lose any quality just by viewing it. But, as pointed out above, you may lose image quality when saving it without making changes if the editor compresses it when it saves the file. For example, say you have a JPEG at no compression:
- You open it in The GIMP, make no changes, and save it
- The GIMP asks you how much compression you want (quality)
- You enter 90% quality (the default)
Do this 20 times, and you'll see a significant decrease in quality, because it has been compressed 20 times. If you save it with no compression (100% quality), you'll see no change.
JPEG _always_ has compression; there's no such thing as "no compression" with JPEG. It is inherently lossy. However, passing a [JPEG-compressed] image through a JPEG compressor at 100% quality _may_ not result in a loss of quality.
Example of quality loss with re-saving at same quality level: http://photo.stackexchange.com/a/34192/
Definitely, like any file, if you don't hit "save" but just close the file, no changes will be made. (Think of it like a Word doc that you just open and close.)
If you do make changes, most programs will give you a notification asking if you want to "save changes"
So the answer is definitely no to your question.
Hope that helps.
- Opening: no loss of quality
- Copying: no loss of quality
- Displaying: no loss of quality
- Saving without edits: is copying, no loss of quality*
- Saving with only metadata edits: no loss of quality*
- Saving with changes to compression quality: loss of quality
- Saving after image data edits: loss of quality
*Dependent on the program; poorly implemented programs may recompress even when not needed, with resultant quality loss
Decoding any digital data is lossless. There is not a single digital format in which mere decoding and display would alter the data.
It's only recompression of the image data that is potentially lossy. Certain editing operations that are actually just metadata edits should not cause any loss of quality, for example EXIF rotation is lossless.
It's possible for a program to generate a file using data copied verbatim from parts of the image which haven't changed, but produce new compressed data for other parts. JPEG Wizard, for example, makes it possible to highlight "important" areas of a JPEG and leave them as is, while aggressively compressing other less-important areas of the image by various amounts.
Copying the _file_ (in Explorer, Finder, etc.) will not lead to quality loss. Opening the image, doing a select all, copy, new image from copy and then save _will_ lead to quality loss.
@supercat - block substitution is only possible for certain jpeg encoding schemes. But the selective quality (as in some areas only) option could make for some interesting space savings without major image degradation on the web.
@JamesSnell: I used JPEG Wizard ages ago; I've not looked to see if/how it's been maintained or updated, but it can allow some great space savings in cases where a picture has a few places where detail is needed and lots of area where it isn't. In some cases, blurring out everything but the primary subjects of a picture may *improve* the picture aesthetically at the same time as it makes the file much smaller.
Simply put: No.
To be specific: when saving a JPEG image you have some losses, as JPEG is a lossy compression format.
The compressed data is encoded using Huffman coding, if I am not mistaken. When an image editor opens an image, it decodes the compressed data so the screen can show what is in it; the file itself is untouched.
But when you make changes and re-save it, the image is recompressed into a new JPEG with more data loss. Software like GIMP asks you how much quality you want, though, so you can choose 100% to keep as much of the existing quality as possible.
Now, opening and closing an image without making any changes never matters, regardless of how it's stored and what data was lost. Opening a file for viewing and then closing it does not make any changes to it, no matter the type (MP3, image, Word document). Since nothing is saved, the quality will always remain the same.
But as previous answers have said, if you are really worried about data loss you can simply use other formats like png or tiff.
Note that you won't retain the existing quality when resaving the photo in GIMP using the 100% quality setting. It only sets certain parameters used by the encoder to the highest possible quality level. Those settings are not lossless.
The simple answer is "That depends."
Does it make a difference if:
I open the image in a standard image viewer and simply "close" the pic?
Should be safe, as a viewer should never be able to change the image.
I open the image in Photoshop Elements Editor and close it there?
Should not change the image.
If I simply close an image vs. re-saving it?
Closing image should not change image. Re-saving image very likely would alter it, depending on the plug-ins you are using.
One reason you will find so many different answers for "when does closing or saving a JPEG causes a decrease in image quality and when it does not?" is that it depends on so many different things, including: the software you are using to edit the image, the plug-ins which are installed on that software, whether your software performs "auto-saves", and on the settings which you use when you save the jpg image! That is actually why I do not edit original files.
I do not use photoshop, but there is a plug-in which is available for it which is supposed to help with the specific issue in question -- avoiding loss of image quality when saving a jpeg: http://www.betterjpeg.com/jpeg-plug-in.htm
Better JPEG Lossless Resave plug-in for Adobe Photoshop and Adobe Photoshop Elements is a tool designed to avoid recompression losses when resaving edited JPEG images in Photoshop. The plug-in takes advantage of the fact that JPEG images consist of a number of small independent blocks and does not recompress unchanged blocks.
Interesting set of answers. But some are still a little misleading. I will try to summarize.
1) Opening a file does not affect it in any way, and neither does closing it, whether in a viewer or an editing program.
You might see the file differently in different programs, but that could be because of how each program interprets some information, like the color mode or color profile. That process is only reading the file.
There is a chance of small changes
2) Doing lossless operations, like rotating an image. Normally the programs just reorder the data of a JPG file, without analyzing and recompressing it. But I wouldn't swear that every program that claims to do this actually does.
Small, unnoticeable changes
3) Opening and saving with the same compression on the same program.
The first compression is done the first time you save a JPG file. If you save the file a second time with the same settings, the original data loss is already done, but small changes can be introduced again. Not to the same extent as the first time, but it can become noticeable after doing this several times. That depends on the program, though.
4) The most obvious is re-saving with a different compression setting.
Not only on whatever quality "scale" the program has, but also the scheme used. This is a little technical, but there are at least two common chroma subsampling schemes, 4:4:4 and 4:2:2.
You could set the "slider" in your program to top quality, but if your program uses 4:2:2 and the original was 4:4:4, you will have a significant data loss.
Here is a short article I wrote some years ago so you can see what this data loss means; it is in Spanish but you can use Google Translate: http://otake.com.mx/Apuntes/PruebasDeCompresion2/1-CompresionJpgProceso.htm
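A toy illustration in plain Python of why going from 4:4:4 to 4:2:2 loses data: 4:2:2 stores only one chroma sample per horizontal pair of pixels, so alternating chroma detail is averaged away (the list below stands in for one row of a single chroma channel; this is a sketch of the idea, not a real codec):

```python
def subsample_422(chroma_row):
    """4:2:2-style horizontal subsampling: one chroma sample per pixel pair."""
    return [(chroma_row[i] + chroma_row[i + 1]) // 2
            for i in range(0, len(chroma_row), 2)]

def upsample_422(subsampled):
    """Reconstruct by duplicating each stored sample."""
    return [s for s in subsampled for _ in range(2)]

row = [10, 200, 10, 200]   # alternating chroma values (worst case)
round_trip = upsample_422(subsample_422(row))
print(round_trip)          # [105, 105, 105, 105]: the detail is gone for good
```

Note that once the detail is gone, repeating the round trip changes nothing more; the damage is done in the first conversion, and no quality slider can bring the 4:4:4 data back.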
A total mess
5) If you open an image and re-save it in a program that has limited capabilities. For example, a viewer might only save RGB files and not work properly with CMYK ones, or perhaps it doesn't understand the embedded color profile. You could totally ruin your image just by saving it.
6) Using a lot of compression. You save it for your website and compress it heavily. Do not delete your originals, please!
Only on the edited part of the image
7) The recompression is normally performed on the whole image, but as I mentioned in point 3, there is not a lot of it if the image has not changed. When you edit an image, this analysis has to be made again on the edited portion.
Remember that edits can be categorized into three groups.
a) Color corrections, contrast, etc.
b) Altering one part of an image (red eyes, removing a person, cleaning unwanted spots)
c) A totally new collage.
So in some cases the image is a totally different one, at least from the analysis and recompression point of view.
In this post: https://photo.stackexchange.com/a/67434/37321 the user mentioned a program that does a very clever analysis of the existing compression and does not recompress it again if it is not necessary.