When you run Google's PageSpeed plugin for Firebug/Firefox on a website it will suggest cases where an image can be losslessly compressed, and provide a link to download this smaller image.
For example:
- Losslessly compressing http://farm3.static.flickr.com/2667/4096993475_80359a672b_s.jpg could save 33.5KiB (85% reduction).
- Losslessly compressing http://farm2.static.flickr.com/1149/5137875594_28d0e287fb_s.jpg could save 18.5KiB (77% reduction).
- Losslessly compressing http://cdn.uservoice.com/images/widgets/en/feedback_tab_white.png could save 262B (11% reduction).
- Losslessly compressing http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.9/themes/base/images/ui-bg_flat_75_ffffff_40x100.png could save 91B (51% reduction).
- Losslessly compressing http://www.gravatar.com/avatar/0b1bccebcd4c3c38cb5be805df5e4d42?s=45&d=mm could save 61B (5% reduction).
This applies across both JPG and PNG filetypes (I haven't tested GIF or others).
Note too the Flickr thumbnails (all those images are 75x75 pixels). Those are some pretty big savings. If this is really so effective, why isn't Yahoo applying it server-side to their entire library and reducing their storage and bandwidth loads?
Even Stackoverflow.com stands to make some very minor savings:
- Losslessly compressing http://sstatic.net/stackoverflow/img/sprites.png?v=3 could save 1.7KiB (10% reduction).
- Losslessly compressing http://sstatic.net/stackoverflow/img/tag-chrome.png could save 11B (1% reduction).
I've seen PageSpeed suggest pretty decent savings on PNG files that I created using Photoshop's 'Save for Web' feature.
So my question is, what changes are they making to the images to reduce them by so much? I'm guessing there are different answers for different filetypes. Is this really lossless for JPGs? And how can they beat Photoshop? Should I be a little suspicious of this?
If you're really interested in the technical details, check out the source code:
For PNG files, they use OptiPNG with a trial-and-error approach:
// we use these four combinations because different images seem to benefit from
// different parameters and this combination of 4 seems to work best for a large
// set of PNGs from the web.
const PngCompressParams kPngCompressionParams[] = {
PngCompressParams(PNG_ALL_FILTERS, Z_DEFAULT_STRATEGY),
PngCompressParams(PNG_ALL_FILTERS, Z_FILTERED),
PngCompressParams(PNG_FILTER_NONE, Z_DEFAULT_STRATEGY),
PngCompressParams(PNG_FILTER_NONE, Z_FILTERED)
};
When all four combinations are applied, the smallest result is kept. Simple as that.
(N.B.: The optipng command line tool does that too if you provide -o2 through -o7.)
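If you want to reproduce those four trials yourself, optipng exposes the same knobs directly. A rough equivalent, assuming optipng's mapping of -f0 to no filtering, -f5 to all filters, -zs0 to Z_DEFAULT_STRATEGY and -zs1 to Z_FILTERED:
optipng -zc9 -zs0,1 -f0,5 image.png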
For JPEG files, they use jpeglib with the following options:
JpegCompressionOptions()
: progressive(false), retain_color_profile(false),
retain_exif_data(false), lossy(false) {}
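Those options amount to a lossless recompression with rebuilt Huffman tables and all metadata dropped. On the command line, jpegtran should get you roughly the same result:
# lossless: rebuild Huffman tables, drop EXIF and color-profile markers
jpegtran -optimize -copy none input.jpg > output.jpg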
Similarly, WebP is compressed using libwebp with these options:
WebpConfiguration()
: lossless(true), quality(100), method(3), target_size(0),
alpha_compression(0), alpha_filtering(1), alpha_quality(100) {}
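For comparison, those settings map fairly closely onto the flags of cwebp, the encoder shipped with libwebp; a sketch:
# lossless WebP, quality 100, method 3
cwebp -lossless -q 100 -m 3 input.png -o output.webp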
There is also image_converter.cc which is used to losslessly convert to the smallest format.
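The keep-the-smallest idea behind it is easy to sketch in bash; this is only an illustration of the principle, not what image_converter.cc literally does (it assumes cwebp and GNU stat are installed):
#!/usr/bin/env bash
# For each PNG, try a lossless WebP and keep whichever file is smaller.
for png in *.png; do
  webp="${png%.png}.webp"
  cwebp -quiet -lossless "$png" -o "$webp"
  if [ "$(stat -c%s "$webp")" -lt "$(stat -c%s "$png")" ]; then
    rm "$png"    # the WebP version won
  else
    rm "$webp"   # the PNG stays
  fi
done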
I use jpegoptim to optimize JPG files and optipng to optimize PNG files.
If you're on bash, the command to losslessly optimize all JPGs in a directory (recursively) is:
find /path/to/jpgs/ -type f -name "*.jpg" -exec jpegoptim --strip-all {} \;
You can add -m[%] to jpegoptim to lossily compress JPG images, for example:
find /path/to/jpgs/ -type f -name "*.jpg" -exec jpegoptim -m70 --strip-all {} \;
To optimize all PNGs in a directory:
find /path/to/pngs/ -type f -name "*.png" -exec optipng -o2 {} \;
-o2 is the default optimization level; you can change this anywhere from -o2 to -o7. Note that a higher optimization level means longer processing time.
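To get a feel for that tradeoff on your own files, optipng's -simulate flag runs the trials without rewriting anything, so you can time the levels side by side:
time optipng -o2 -simulate image.png
time optipng -o7 -simulate image.png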
Take a look at http://code.google.com/speed/page-speed/docs/payload.html#CompressImages which describes some of the techniques/tools.
It's a matter of trading the encoder's CPU time for compression efficiency. Compression is a search for shorter representations, and if you search harder, you'll find shorter ones.
There is also the matter of using image format capabilities to the fullest, e.g. PNG8+alpha instead of PNG24+alpha, optimized Huffman tables in JPEG, etc. (see the pngquant sketch after the list below).
Photoshop doesn't really try hard to do that when saving images for the web, so it's not surprising that any tool beats it.
See
- ImageOptim (lossless) and
- ImageAlpha (lossy) for smaller PNG files (with a high-level description of how it works), and
- JPEGmini/MozJPEG (lossy) for a better JPEG compressor.
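To see the PNG8+alpha point in practice: pngquant, the conversion engine behind ImageAlpha, turns a 24/32-bit PNG into an 8-bit paletted PNG with full alpha. It's lossy, but often dramatically smaller (a sketch, assuming pngquant 2.x):
# lossy: quantize to a 256-color palette while keeping the alpha channel
pngquant --quality 65-90 256 input.png --output output.png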
To Replicate PageSpeed's JPG Compression Results in Windows:
I was able to get exactly the same compression results as PageSpeed using the Windows version of jpegtran which you can get at www.jpegclub.org/jpegtran. I ran the executable using the DOS prompt (use Start > CMD). To get exactly the same file size (down to the byte) as PageSpeed compression, I specified Huffman optimization as follows:
jpegtran -optimize source_filename.jpg output_filename.jpg
For more help on compression options, at the command prompt, just type: jpegtran
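To verify that you really matched PageSpeed's output byte for byte, Windows' built-in fc command does a binary compare (the second filename here is hypothetical; use whatever you saved PageSpeed's optimized copy as):
fc /B output_filename.jpg pagespeed_optimized.jpg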
Or to Use the Auto-generated Images from the PageSpeed tab in Firebug:
I was able to follow Pumbaa80's advice to get access to PageSpeed's optimized files. Hopefully the screenshot in the original answer provides further clarity for the Firefox environment. (I was not able to get access to a local version of these optimized files in Chrome.)
And to Clean up the Messy PageSpeed Filenames using Adobe Bridge & Regular Expressions:
Although PageSpeed in Firefox was able to generate optimized image files for me, it also changed their names, turning simple names like:
nice_picture.jpg
into
nice_picture_fff5e6456e6338ee09457ead96ccb696.jpg
I discovered that this seems to be a common complaint. Since I didn't want to rename all my pictures by hand, I used Adobe Bridge's Rename tool along with a Regular Expression. You could use other rename commands/tools that accept Regular Expressions, but I suspect that Adobe Bridge is readily available for most of us working with PageSpeed issues!
- Start Adobe Bridge
- Select all files (using Control A)
- Select Tools > Batch Rename (or Control Shift R)
- In the Preset field select "String Substitution". The New Filenames fields should now display "String Substitution", followed by "Original Filename"
- Enable the checkbox called "Use Regular Expression"
- In the "Find" field, enter the following Regular Expression (which will select all characters starting at the rightmost underscore separator):
_(?!.*_)(.*)\.jpg$
- In the "Replace with" field, enter:
.jpg
- Optionally, click the Preview button to see the proposed batch renaming results, then close it
- Click the Rename button
Note that after processing, Bridge deselects files that were not affected. If you want to clean all your .png files, you need to reselect all the images and modify the configuration above (using "png" instead of "jpg"). You can also save the configuration as a preset, such as "Clean PageSpeed jpg Images", so that you can clean up filenames quickly in future.
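If you'd rather skip Bridge entirely, the same rightmost-underscore strip is a short bash loop. A sketch, assuming every PageSpeed-renamed file contains at least one underscore and that everything after the last one should go:
# strip the trailing _<hash> that PageSpeed appends; -n refuses to overwrite
for f in *_*.jpg; do
  mv -n "$f" "${f%_*}.jpg"
done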
Configuration Screenshot / Troubleshooting
If you run into trouble, it's possible that some browsers don't display the RegEx above properly (blame my escape characters); the original answer includes a screenshot of the configuration along with these instructions.
In my opinion, the best option out there that effectively handles most image formats in one go is Trimage. It utilizes optipng, pngcrush, advpng and jpegoptim depending on the image format and delivers near-perfect lossless compression.
Using it from the command line is pretty easy:
sudo apt-get install trimage
trimage -d images/*
and voila! :-)
Additionally, you'll find a pretty simple interface for doing it manually as well.
There's a very handy batch script that recursively optimizes images beneath a folder using OptiPNG (from this blog):
FOR /F "tokens=*" %G IN ('dir /s /b *.png') DO optipng -nc -nb -o7 -full %G
ONE LINE!
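One caveat: the single-percent %G form only works when typed directly at the prompt. Inside a .bat file, the percent signs must be doubled:
FOR /F "tokens=*" %%G IN ('dir /s /b *.png') DO optipng -nc -nb -o7 -full %%G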
If you're looking for batch processing, keep in mind that trimage complains if you don't have an X server available. In that case, just write a bash or PHP script to do something like:
<?php
echo "Processing jpegs<br />";
exec("find /home/example/public_html/images/ -type f -name '*.jpg' -exec jpegoptim --strip-all {} \;");
echo "Processing pngs<br />";
exec("find /home/example/public_html/images/ -type f -name '*.png' -exec optipng -o7 {} \;");
?>
Change the options to suit your needs.
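The same thing works as a plain bash script if you'd rather not involve PHP (the paths are the example's; adjust to your own):
#!/usr/bin/env bash
echo "Processing jpegs"
find /home/example/public_html/images/ -type f -name '*.jpg' -exec jpegoptim --strip-all {} \;
echo "Processing pngs"
find /home/example/public_html/images/ -type f -name '*.png' -exec optipng -o7 {} \;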
For Windows there are several drag-and-drop interfaces for easy access.
https://sourceforge.net/projects/nikkhokkho/files/FileOptimizer/
For PNG files I found this one for my enjoyment; apparently it's three different tools wrapped in one GUI. Just drag and drop and it does the work for you.
It takes time though; try compressing a 1MB PNG file. I was amazed how much CPU went into the compression comparison, which has to be what is going on here: it seems the image is compressed a hundred ways and the best one wins :D
Regarding the JPG compression, I too feel it's risky to strip off color profiles and all the extra info. However, since everyone is doing it, it's become the business standard, so I just did it myself :D
I saved 113MB across 5500 files on a WP install today, so it's definitely worth it!
Source: https://stackoverflow.com/questions/5451597/how-does-googles-page-speed-lossless-image-compression-work