Strange block texture background in ImageMagick conversion from PDF to JPG - imagemagick

I'm using this command to convert PDF to JPG:
exec("convert -scale 772x1000 -density 150 -trim \"".$toc_path.$filename."[0]\" -background white -flatten -quality 100 \"".$img_path. "covers/". $img_filename ."\"");
The random fuzzy B&W background gets converted into huge squares:

The trick was to increase the density to 300, and then scale down to whatever size you like, e.g.:
exec("convert -scale 772x1000 -density 300 -trim \"".$toc_path.$filename."[0]\" -background white -flatten -quality 80 \"".$img_path. "covers/". $img_filename ."\"");
# density 75, no scale: 1.3 MB - smaller blocks, looked bad
# density 150, no scale: 2.0 MB - large blocks, looked bad
# density 300, no scale: 2.4 MB - no blocks, looked like the original
# density 300, scaled to 1000 lines: 170 KB - no blocks, looked like the original
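Outside the PHP exec() wrapper, the equivalent plain-shell command would look roughly like this (a sketch; input.pdf and cover.jpg are placeholder names):
convert -density 300 "input.pdf[0]" -trim -background white -flatten -scale 772x1000 -quality 80 cover.jpg
Note that -density has to appear before the input file so the PDF is rasterized at the higher resolution in the first place; -scale then brings the result back down to the target size.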

Related

convert doesn't conserve size when scaling down and up

In order to make applying a blur faster, I'm first scaling my image down and then scaling it back up:
convert - -scale 10% -blur 0x2.5 -resize 1000% RGB:-
This works most of the time but sometimes the output resolution is slightly different from the original input. Is there a way to force the pipeline to be size-preserving?
You should be able to access the original geometry of the image via %G, so you can do:
convert input.jpg -scale 10% -blur 0x2.5 -resize '%G!' RGB:-
If you are using Windows, you probably want "%G!" in double rather than single quotes.
If you are using v7 ImageMagick, replace convert with magick.
I think you are getting errors because if you take 10% of 72 pixels (say), the result is rounded to a whole number of pixels, i.e. 7, and when you scale back up by a factor of 10 you get 70 rather than your initial 72.
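You can see the rounding with a quick test on a generated image (using a v6 convert; with v7 use magick):
convert -size 72x72 xc:gray -scale 10% -resize 1000% info:
The output reports 70x70 rather than 72x72, because 10% of 72 rounds to 7 pixels.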
If you are using Imagemagick 7, you can do the following:
magick input.jpg -set option:wd "%w" -set option:ht "%h" -scale "%[fx:wd/10]x%[fx:ht/10]" -blur 0x2.5 -resize "%[fx:wd]x%[fx:ht]\!" RGB:-
This stores the input width and height, uses the stored values to scale to 1/10 of those dimensions, applies the blur, and then resizes exactly back to the original input size. Note the ! that forces the resize to the exact dimensions.
Or, more simply, without storing the input width and height:
magick lena.jpg -scale "%[fx:w/10]x%[fx:h/10]" -blur 0x2.5 -resize "%[fx:w]x%[fx:h]\!" lena_x.jpg

crop image with imagemagick offset given in percentage

ImageMagick's crop command supports cropping to a percentage of an image, but the offset values must be specified in pixels, e.g.:
convert image.png -crop 50%x+10+20
I want to crop with offset values x and y given as a percentage of the image width and height, respectively. The pixel values can be calculated; for instance, if the image size is 100x200, an offset of 10% would result in 10 and 20 respectively. Is it possible to do this calculation as part of the call to convert? Width and height are available as %w and %h in some places, but this does not work:
convert image.png -crop 50%x+(0.1*%w)+(0.1*%h)
If you're running IM v6 you can use FX expressions with "-set" to set image attributes. By setting the page geometry you can specify the offsets to a calculated percentage and do the crop like this...
convert image.png -set page -%[fx:w*0.1]-%[fx:h*0.1] -crop 50%x+0+0 result.png
That reads the image, sets the geometry for the upper left corner to a location outside the original canvas, and crops to the new top left corner specified by the geometry.
Note the offsets are negative numbers.
Also, if you're doing additional processing in the same command you'll probably want to "+repage" after the crop in order to reset the page geometry to the new WxH+0+0.
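For example, a minimal sketch of a pipeline that keeps processing after the crop (the final resize is only there as an illustration):
convert image.png -set page -%[fx:w*0.1]-%[fx:h*0.1] -crop 50%x+0+0 +repage -resize 200x200 result.png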
Edited to add: You can even include the width and height dimensions for the crop when using "-set page". This command would crop an output of 50% the input width and height, and starting at 10% in from the left and top...
convert image.png \
-set page %[fx:w*0.5]x%[fx:h*0.5]-%[fx:w*0.1]-%[fx:h*0.1] -crop +0+0 result.png
Notice how the crop operation is simply "-crop +0+0" since the dimensions and offsets are in the page geometry.
This method lets you use more complex calculations than just using a percent or number of pixels for the cropped output dimensions.
You cannot do that in ImageMagick 6. But you can do that in ImageMagick 7.
magick image.png -crop "50%x+%[fx:0.1*w]+%[fx:0.1*h]" +repage result.png
In ImageMagick 6, you need to do the computations ahead of the command, store them in a variable and use the variable in the crop command.
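A minimal bash sketch of that approach, reading the dimensions with identify first and reusing them in the crop geometry:
read w h < <(identify -format "%w %h" image.png)
convert image.png -crop "50%x+$((w/10))+$((h/10))" +repage result.png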
However, in ImageMagick 6, you can do the equivalent using -distort with viewport processing as follows:
convert image.png -set option:distort:viewport "%[fx:0.5*w]x%[fx:0.5*h]+%[fx:0.1*w]+%[fx:0.1*h]" -filter point -distort SRT 0 result.png
With v7 ImageMagick, make start image:
magick -size 200x100 gradient: a.jpg
Now crop using lots of calculated widths, heights, offsets:
magick a.jpg -crop "%[fx:w*0.9]x%[fx:h*0.8]+%[fx:w*0.1]+%[fx:h*0.05]" b.png
Check:
identify b.png
b.png PNG 180x80 200x100+20+5 8-bit Gray 256c 408B 0.000u 0:00.000
If you only have v6, use bash and integer arithmetic:
read w h < <(identify -format "%w %h" a.jpg)
convert a.jpg -crop $((w*80/100))x$((h*90/100))+$((w*10/100))+$((h*5/100)) result.png
Check:
identify result.png
result.png PNG 160x90 200x100+20+5 8-bit Gray 256c 412B 0.000u 0:00.000

Converting PDF files at a given density when the page parameter is set - ImageMagick

If the page parameter is set, conversion of PDF files at a given density outputs blank pages.
"convert -units PixelsPerInch -density 300 $myfiles -page A4 -gravity center test.pdf"
If I omit the page parameter from the command, I get appropriate output, but at the default 72 dpi resolution.
Any idea?
A4 page size is 595 x 842 pixels at 72 dpi. So in ImageMagick you could try
convert -units PixelsPerInch -density 300 $myfiles +repage -resize 595x842 test.pdf
That will make an A4 pixel dimension image with 300 dpi. You could also do
convert -units PixelsPerInch -density 300 $myfiles +repage -resize 595x842 -density XX test.pdf
Where XX is the dpi you want when printing the image of that size.
I added +repage to remove any input image virtual canvas, since you did not specify what format images you are using for $myfiles. Without +repage, that could have caused a large bit of white space at the top of your result.
Note it is always best and most helpful to provide your ImageMagick version and platform when asking questions about its use.
While trying to tweak the command, I found that a set density (e.g., -density 300) combined with the page parameter actually sets the density of the -page A4 canvas but not of the converted object on the page, since the set density cannot determine the resolution of the -page A4 canvas it applies to. As a result, the command returns a blurry or blank image on the set page.
However, the extent parameter is what actually produces the appropriate image, as it makes it possible to set the page resolution at a predefined density. The following examples should make it clear.
The resolution of an A4 page at 300 dpi is 2480x3508, so the correct command for a density of 300 dpi is:
"convert -units PixelsPerInch -density 300 $myfiles -gravity center -extent 2480x3508 test.pdf"
The resolution of an A4 page at 72 dpi is 595x842, so the correct command for a density of 72 dpi is:
"convert -units PixelsPerInch -density 72 $myfiles -gravity center -extent 595x842 test.pdf"

Image Magick - trim and repage - loss of quality

When I use this command to trim a PDF file:
convert -fuzz 1% -trim +repage multi0.pdf multi0new.pdf
The result is very disappointing: the trimmed image is more than 10 times smaller than the source.
Is there any way to make a clean image trimming without loss of quality?
(here it is, before and after)
You probably have to use the -density option with a higher value than the default of 72 dpi. Try 300 for a start:
convert -density 300 -fuzz 1% -trim +repage multi0.pdf multi0new.pdf
Note that higher density values will increase the output file size.
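If the higher-density output gets too large, one option (a sketch, not part of the original answer) is to rasterize at 300 dpi so the trim is clean and then resize the result back down:
convert -density 300 multi0.pdf -fuzz 1% -trim +repage -resize 50% multi0new.pdf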

Recommendation for compressing JPG files with ImageMagick

I want to compress a JPG image file with ImageMagick but can't get much difference in size. By default, the output is bigger than the input. I don't know why, but after adding some +profile options and lowering the quality I can get a smaller size, but still similar to the original.
The input image is 255kb, the processed image is 264kb (using +profile to remove profiles and setting quality to 70%). Is there any way to compress that image to 150kb at least? Is that possible? What ImageMagick options can I use?
I always use:
quality of 85
progressive (verified compression gains)
a very tiny Gaussian blur to optimize the size (a radius of 0.05 or 0.5, depending on the quality and size of the picture); this notably reduces the size of the JPEG
strip any comment or EXIF metadata
In ImageMagick this would be:
convert -strip -interlace Plane -gaussian-blur 0.05 -quality 85% source.jpg result.jpg
or in the newer version:
magick source.jpg -strip -interlace Plane -gaussian-blur 0.05 -quality 85% result.jpg
From @Fordi in the comments:
If you dislike blurring, use -sampling-factor 4:2:0 instead. What this does is reduce the chroma channel's resolution to half, without messing with the luminance resolution that your eyes latch onto. If you want better fidelity in the conversion, you can get a slight improvement without an increase in filesize by specifying -define jpeg:dct-method=float - that is, use the more accurate floating point discrete cosine transform, rather than the default fast integer version.
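Putting those two suggestions together with the command above, something along these lines should work (filenames are placeholders):
convert source.jpg -strip -interlace Plane -sampling-factor 4:2:0 -define jpeg:dct-method=float -quality 85% result.jpg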
I'm using the Google Pagespeed Insights image optimization guidelines, and for ImageMagick they recommend the following:
-sampling-factor 4:2:0
-strip
-quality 85 [it can vary, I use range 60-80, lower number here means smaller file]
-interlace
-colorspace RGB
Command in ImageMagick:
convert image.jpg -sampling-factor 4:2:0 -strip -quality 85 -interlace JPEG -colorspace RGB image_converted.jpg
With these options I get up to 40% savings in JPEG size without much visible loss.
Just a note for those using the Imagick class in PHP:
$im->gaussianBlurImage(0.8, 10);     // blur
$im->setImageCompressionQuality(85); // set compression quality to 85
Once I needed to resize photos from a camera for developing:
Original filesize: 2800 kB
Resolution: 3264x2448
Command:
mogrify -quality "97%" -resize 2048x2048 -filter Lanczos -interlace Plane -gaussian-blur 0.05
Result filesize 753 kB
Resolution 2048x2048
and I can't see any changes in full screen on my 1920x1080 monitor. A resolution of 2048 is the best for developing 10 cm photos at a maximum quality of 360 dpi. I don't want to strip the metadata.
Edit: I noticed that I get even better results without blurring. Without blurring, the file size is 50% of the original, but the quality is better (when zooming in).
@JavisPerez -- Is there any way to compress that image to 150kb at least? Is that possible? What ImageMagick options can I use?
See the following links where there is an option in ImageMagick to specify the desired output file size for writing to JPG files.
http://www.imagemagick.org/Usage/formats/#jpg_write
http://www.imagemagick.org/script/command-line-options.php#define
-define jpeg:extent={size}
As of IM v6.5.8-2 you can specify a maximum output filesize for the JPEG image. The size is specified with a suffix. For example "400kb".
convert image.jpg -define jpeg:extent=150kb result.jpg
You will lose some quality by decompressing and recompressing in addition to any loss due to lowering -quality value from the input.
I would add a useful side note and a general suggestion for minimizing JPG and PNG files.
First of all, ImageMagick reads (or rather, guesses) the input JPEG compression level, so if you don't add -quality NN at all, the output should use the same level as the input. This can sometimes be an important feature. Otherwise, the default level is -quality 92 (see www.imagemagick.org).
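You can check the quality level ImageMagick guesses for an existing JPEG with the %Q format escape, e.g.:
identify -format "%Q\n" input.jpg   # prints the estimated quality, e.g. 92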
The suggestion concerns a really awesome free tool, ImageOptim, which also supports batch processing.
You can get smaller JPGs (and PNGs as well, especially after using the free ImageAlpha [no batch processing] or the free Pngyu if you need batch processing).
What's more, these tools are available for Mac and Windows and as command-line tools (I suggest installing with Brew and then searching the Brew formulas).
I added -adaptive-resize 60% to the suggested command, but with -quality 60%.
convert -strip -interlace Plane -gaussian-blur 0.05 -quality 60% -adaptive-resize 60% img_original.jpg img_resize.jpg
These were my results
img_original.jpg = 13,913KB
img_resized.jpg = 845KB
I'm not sure how much that conversion degrades my image, but I honestly didn't think the result looked bad. It was a wide-angle panorama and I wasn't concerned with meticulous detail.
Here's a complete solution for those using Imagick in PHP:
$im = new \Imagick($filePath);
$im->setImageCompression(\Imagick::COMPRESSION_JPEG);
$im->setImageCompressionQuality(85);
$im->stripImage();
$im->setInterlaceScheme(\Imagick::INTERLACE_PLANE);
// Try a radius between 0 and 5. If a radius of 5
// produces pictures that are too blurry, decrease it toward 0 until you
// find a good balance between size and quality.
$im->gaussianBlurImage(0.05, 5);
// Include this part if you also want to specify a maximum size for the images
$size = $im->getImageGeometry();
$maxWidth = 1920;
$maxHeight = 1080;
// ----------
// | |
// ----------
if ($size['width'] >= $size['height']) {
    if ($size['width'] > $maxWidth) {
        $im->resizeImage($maxWidth, 0, \Imagick::FILTER_LANCZOS, 1);
    }
}
// ------
// | |
// | |
// | |
// | |
// ------
else {
    if ($size['height'] > $maxHeight) {
        $im->resizeImage(0, $maxHeight, \Imagick::FILTER_LANCZOS, 1);
    }
}
Did some experimenting myself here, and boy does that Gaussian blur make a nice difference. The final command I used was:
mogrify -sampling-factor 4:2:0 -strip -quality 88 -interlace Plane -define jpeg:dct-method=float -colorspace RGB -gaussian-blur 0.05 *
Without the Gaussian blur at 0.05 it was around 261 KB; with it, it was around 171 KB for the image I was testing on. The visual difference on a 1440p monitor with a large, complex image is not noticeable until you zoom way, way in.
A very old but helpful answer.
I have to say that for serious, large photographs, -gaussian-blur is not acceptable, whatever the compression ratio.
Comparing below: quality 95% with -gaussian-blur 0.05 vs. quality 85% without blurring. Original: 17.5 MB (8 MP with much detail); 95% without blurring: 5 MB; 85% without blurring: 3036 KB; 95% with blurring: 3365 KB.
(Comparison image: blurring vs. compression ratio.)
Maybe a lower blur, like 0.02, would work better.
If the image has big dimensions, it is hard to get good results without resizing; below is a 60 percent resize, which for most purposes doesn't destroy too much of the image.
I use this with good results for gray-scale images (I convert from PNG):
ls ./*.png | xargs -L1 -I {} convert {} -strip -interlace JPEG -sampling-factor 4:2:0 -adaptive-resize 60% -gaussian-blur 0.05 -colorspace Gray -quality 20 {}.jpg
I use this for scanned B&W pages to get them as gray-scale images (the extra arguments clean shadows from previous pages):
ls ./*.png | xargs -L1 -I {} convert {} -strip -interlace JPEG -sampling-factor 4:2:0 -adaptive-resize 60% -gaussian-blur 0.05 -colorspace Gray -quality 20 -density 300 -fill white -fuzz 40% +opaque "#000000" -density 300 {}.jpg
I use this for color images:
ls ./*.png | xargs -L1 -I {} convert {} -strip -interlace JPEG -sampling-factor 4:2:0 -adaptive-resize 60% -gaussian-blur 0.05 -colorspace RGB -quality 20 {}.jpg
