Opposite of ImageMagick montage - imagemagick

I printed a lot of composed images with the ImageMagick montage tool. I scanned them back in, and now I want to recover the individual images. Because of the number of images, I want to do it automatically, so I need the opposite operation of montage (plus some clever splitting to handle scanner positioning errors).
Is this tool available?
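The closest I have found so far is -crop with a tile geometry, which looks like it could be the inverse operation; a sketch of what I mean, with placeholder filenames and geometry (assuming 256x256 cells, or a 4-column by 6-row grid):
# cut into equal 256x256 tiles, numbered left-to-right, top-to-bottom
convert scan.png -crop 256x256 +repage tile_%02d.png
# or divide into a 4x6 grid of equal tiles regardless of exact pixel size
convert scan.png -crop 4x6@ +repage tile_%02d.png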

Related

Does ImageMagick decrease image quality by default?

I use ImageMagick to convert images from one format to another, to convert them into a single PDF, to rotate them according to camera orientation, to trim white space, or sometimes to change resolution.
When doing these tasks, I never want to decrease image quality. I would always prefer to preserve the quality as high as possible.
The description of the -quality parameter confuses me. If I understood its description correctly, ImageMagick always slightly decreases the image quality during conversions.
Is it really so? Does it mean I should use -quality 100 as a safety belt each time I use ImageMagick?
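To illustrate my understanding (placeholder filenames): re-encoding a lossy format like JPEG always recompresses, while lossless formats should be safe.
# re-encoding a JPEG recompresses it; -quality 100 minimizes, but cannot avoid, generation loss
convert photo.jpg -rotate 90 -quality 100 rotated.jpg
# PNG is lossless, so the same operation preserves the pixels exactly
convert photo.png -rotate 90 rotated.png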

What are some examples of ImageMagick commands where an option can be provided between magick and the input filename

For example:
magick -density 100 apple.jpg -resize 100x100 apple-edited.jpg
Here, -density is an option that has to be provided before the input filename.
Can someone explain when such prior options are needed, and provide a few more examples?
PS: I tried looking for these options, but I am not sure whether there is a specific term for them that would help narrow the search results.
ImageMagick is exclusively a raster image processor: it processes bitmap images made up of pixels laid out on a rectangular grid (or raster). It is not a vector image processor like Adobe Illustrator or Inkscape, which deal with shapes, lines, rectangles and Bézier curves described by their vertices or inflection points, not pixels.
When you load a vector image (e.g. an SVG image, or a PDF) into ImageMagick, the very first thing it does is rasterise your image onto a rectangular grid before it can work with the pixels. So, if your image is an SVG or a PDF, you need to set the density, or number of lines in the grid, before you load the image.
I don't know of a reason to do that for a JPEG, like your example. I can't think of any other settings you absolutely need to make prior to loading an image.
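A couple of examples with placeholder filenames:
# rasterise a PDF at 300 dpi instead of the 72 dpi default (must precede the input)
convert -density 300 drawing.pdf page.png
# -size must also come first when generating a canvas, since there is no input file to read
convert -size 640x480 xc:white canvas.png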

Is there a loss in quality while using Imagemagick's "convert -append ..." to stitch

(continuation of title) ... multiple images - from a grid layout - into one large complete image?
The reason I ask is that I am using ImageMagick-7.0.7-26-portable-Q16-x86.zip, downloaded from the Download section of ImageMagick's site.
x86 means it is for a 32-bit OS, Q16 means it can read or write 16-bit images without losing precision, and portable (static) means it is statically compiled (that is of less importance).
I am working with a series of images, each of size 256x256 at 96 dpi and 24-bit depth, except the boundary ones (which can vary in size in one dimension).
If I use the above tool to stitch those images together:
convert -append 6-0-0.jpg 6-1-0.jpg ... 6-_-0.jpg
Is there a loss of quality, considering the images have 24-bit depth and ImageMagick only supports images up to 16-bit depth? Or do I have it wrong? Can someone elaborate on the said "bit depth" to explain the restrictions of the ImageMagick tool?
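If it helps, identify can report the per-channel depth ImageMagick sees for a file (I believe the "24 bit depth" reported by image viewers means 3 channels of 8 bits each, so this should print 8):
identify -format '%z\n' 6-0-0.jpg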

Why am I losing image quality with imagemagick when reducing a very large image to a much smaller image?

I have a bunch of images that I want to convert into a single PDF. The images are primarily images of text (similar to scanned pages of a textbook). The image files are extremely large, and I have no need for the amount of resolution they offer.
So first, as a baseline, I did a simple conversion of 26 of these "pages" to a single PDF, and the total file size was 46 MB for 26 pages. Viewing in page-width mode resulted in a scale of 16% of the original image.
convert *.png kapittel1.pdf
The quality of the PDF pages was perfect; they were just too large. So I figured that since 16% of the image is more than adequate for viewing the entire width of the page on my screen, I could reduce the image sizes to 20% of their original values and still maintain the same image quality. However, the quality of the resulting images is visibly worse than before reducing the size.
convert -resize 20% -quality 100% *.png 20percent.pdf
I believe I'm going to need to start looking into filters, but before I potentially waste my time converting with every filter and comparing the results to find the one I want, is there a better way to just reduce the size, maintain quality, and then convert to PDF? I don't see why I would be losing pixels here.
Edit
I tried with -scale instead of -resize but am really not seeing a difference in the output. It pretty much seems that once I go below 40% I start losing pixel data.
The excellent ImageMagick Examples state that, by default, no image compression is used when creating PDFs, and suggest using Zip (deflate) compression:
convert *.png -compress Zip -quality 100 kapittel1.pdf
If your images are only black and white, you can try the -monochrome option and, optionally, Group4 (fax) compression using -compress Group4.
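For example (a sketch reusing the filenames above):
convert *.png -monochrome -compress Group4 kapittel1.pdf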
OK, I discovered that, after following Shawn Patrick Rice's suggestion for optimizing scanned PDFs and OCR+ClearText, the difference in PDF size between -resize settings of 30-50% was fairly negligible. The primary goal here is to reduce the height of the resulting PDF to under 45 inches, as this is the threshold for Adobe's OCR. I found no benefit in converting each image individually to a PDF and then resizing, or in playing with the plethora of other settings in Adobe. The process below kept (as far as I can tell) all of the image quality and reduced the images to the smallest PDF (at full quality).
My process was as follows:
convert *.png -resize 50% name.pdf
// resize amount dependent on original file dimensions, goal is document height < 45"
Adobe Acrobat => Document Processing => Optimize Scanned PDF (Edit => ClearScan output style) => OK
The resulting PDF is still quite large, but its size after reducing in Adobe goes down considerably (90 MB => 4 MB). If I first resized at 30% there would be noticeable image quality loss, and the additional saving after optimizing would only be around 800 KB for the above file.

Is it possible to tell the quality level of a JPEG?

This is really a two-part question, since I don't fully understand how these things work just yet:
My situation: I'm writing a web app which lets the user upload an image. My app then resizes it to something displayable (e.g. 640x480-ish) and saves the file for later use.
My questions:
1. Given an arbitrary JPEG file, is it possible to tell what the quality level is, so that I can use that same quality when saving the resized image?
2. Does this even matter?? Should I be saving all the images at a decent level (e.g. 75-80), regardless of the original quality?
I'm not so sure about this because, as I figure it (let's take an extreme example), if someone had a 5-megapixel image saved at quality 0, it would be blocky as anything. Reducing the image size to 640x480, the blockiness would be smoothed out and barely noticeable... until I saved it with quality 0 again...
On the other end of the spectrum, if there was an image which was 800x600 with q=0, resizing to 640x480 isn't going to change the fact that it looks like utter crap, so saving with q=80 would be redundant.
Am I even close?
I'm using the GD2 library with PHP, if that is of any use.
You can view the compression level using the identify tool in ImageMagick. Download and installation instructions can be found at the official website.
After you install it, run the following command from the command line:
identify -format '%Q' yourimage.jpg
This will return a value from 0 (low quality, small filesize) to 100 (high quality, large filesize).
JPEG is a lossy format. Every time you save the same image as a JPEG, regardless of quality level, you will reduce the actual image quality. Therefore, even if you did obtain a quality level from the file, you could not maintain that same quality when saving the JPEG again (even at quality=100).
You should save your JPEG at as high a quality as you can afford in terms of file size. Or use a loss-less format such as PNG.
Low-quality JPEG files do not simply become more blocky. Instead, colour depth is reduced and the detail of sections of the image is removed. You can't rely on lower-quality images being blocky and looking OK at smaller sizes.
According to the JFIF spec, the quality number (0-100) is not stored in the image header, although the horizontal and vertical pixel density is stored.
For future visitors: to check the quality of a given JPEG, you can just use ImageMagick's tooling:
$> identify -format '%Q' filename.jpg
92%
The JPEG compression algorithm has several parameters that influence the quality of the resulting image.
One such parameter is the quantization table, which defines how many bits will be used for each coefficient. Different programs use different quantization tables.
Some programs allow the user to set a quality level of 0-100, but there is no common definition of this number. An image saved in Photoshop at 60% quality takes 46 KB, while the same image saved in GIMP takes only 26 KB.
The quantization tables are also different.
There are other parameters as well, such as chroma subsampling and the DCT method.
So you can't describe all of them with a single quality-level number, and you can't compare the quality of JPEG images by a single number. But you can create such a number, as Photoshop or GIMP do, to describe the compromise between size and quality.
More information:
http://patrakov.blogspot.com/2008/12/jpeg-quality-is-meaningless-number.html
Common practice is to resize the image to the appropriate size first and apply JPEG compression after that. In that case, huge and medium-sized originals will end up with the same size and quality.
Here is a formula I've found to work well:
jpg100size = width * height / 1.7 (the size in bytes it should not exceed for 98-100% quality)
jpgXsize = jpg100size * X (X = quality fraction, e.g. 0.65)
So you can use these to estimate, statistically, what quality your JPEG was last saved at. If you want to get it down to, say, 65% quality, and you want to avoid resampling, you should first compare the file size to make sure it is not already too low, and only then reduce the quality.
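As a rough shell sketch of that heuristic (the formula above is this answer's, not an official metric; photo.jpg is a placeholder, and identify and bc are assumed to be installed):
# image dimensions and file size in bytes
w=$(identify -format '%w' photo.jpg)
h=$(identify -format '%h' photo.jpg)
bytes=$(wc -c < photo.jpg)
# jpg100size = width * height / 1.7, per the formula above
full=$(echo "$w * $h / 1.7" | bc)
# estimated fraction of "full quality" the file was last saved at
echo "scale=2; $bytes / $full" | bc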
As there are already two answers using identify, here's one that also outputs the file name (for scanning multiple files at once):
If you wish to have a simple output of filename: quality for use on multiple images, you can use
identify -format '%f: %Q' *
to show the filename + compression of all files within the current directory.
So, there are basically two cases you care about:
If an incoming image has quality set too high, it may take up an inappropriate amount of space. Therefore, you might want, for example, to reduce incoming q=99 to q=85.
If an incoming image has quality set too low, it might be a waste of space to raise its quality. Except that an image that has had a large amount of data discarded won't magically take up more space when the quality is raised; blocky images will compress very nicely even at high quality settings. So, in my opinion, it's perfectly OK to raise incoming q=1 to q=85.
From this I would think simply forcing a decent quality setting is a perfectly acceptable thing to do.
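In shell terms, that policy could be as simple as (a sketch; the filenames and the q=85 target are placeholders):
# resize to fit within 640x480 and force a sane, fixed quality
convert upload.jpg -resize 640x480 -quality 85 resized.jpg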
Every new save of the file will further decrease the overall quality; by using higher quality values you will preserve more of the image, regardless of what the original image quality was.
If you resave a JPEG using the same software that created it originally, using the same settings, you'll find that the damage is minimized - the algorithm will tend to throw out the same information it threw out the first time. I don't think there's any way to know what level was selected just by looking at the file; even if you could, different software almost guarantees different parameters and rounding, making a match almost impossible.
This may be a silly question, but why would you be concerned about micromanaging the quality of the document? I believe if you use ImageMagick to do the conversion, it will manage the quality of the JPEG for you for best effect. http://www.php.net/manual/en/intro.imagick.php
Here are some ways to achieve your (1) and get it right.
There are ways to do this by fitting to the quantization tables. Sherloq - for example - does this:
https://github.com/GuidoBartoli/sherloq
The relevant (python) code is at https://github.com/GuidoBartoli/sherloq/blob/master/gui/quality.py
There is another algorithm written up in https://arxiv.org/abs/1802.00992 - you might consider contacting the author for any code etc.
You can also simulate file_size(image_dimensions,quality_level) and then invert that function/lookup table to get quality_level(image_dimensions,file_size). Hey presto!
Finally, you can adopt a brute-force https://en.wikipedia.org/wiki/Error_level_analysis approach by calculating the difference between the original image and recompressed versions, each saved at a different quality level. The quality level of the original is roughly the one for which the difference is minimized. This seems to work reasonably well (but requires one recompression per candidate quality level).
Most often the quality factor used seems to be 75 or 95, which might help you get to the result faster. Probably no one saves a JPEG at 100, and probably no one usefully saves it at less than 60 either.
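A sketch of that brute-force search using ImageMagick's compare, restricted to the 60-95 range suggested above (original.jpg is a placeholder):
# recompress at each candidate quality and measure the difference to the original;
# the candidate with the smallest RMSE is the likely original quality
for q in $(seq 60 95); do
  convert original.jpg -quality "$q" /tmp/recompressed.jpg
  err=$(compare -metric RMSE original.jpg /tmp/recompressed.jpg null: 2>&1)
  echo "$q $err"
done | sort -n -k2 | head -n1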
I can add other links for this as they become available - please put them in the comments.
If you trust IrfanView's estimation of the JPEG compression level, you can extract that information from the info text file created by the following Windows command line (your path to i_view32.exe might be different):
"C:\Program Files (x86)\IrfanView\i_view32.exe" <image-file> /info=txtfile
The JPEG compression level is recorded in the IPTC data of an image.
Use exiftool (it's free) to get the metadata of an image, then search the returned text for "Photoshop Quality". Or at least put the returned data into a text document and check what is recorded. It may vary depending on the software used to save the image.
"Writer Name : Adobe Photoshop
Reader Name : Adobe Photoshop CS6
Photoshop Quality : 7"
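For example (assuming exiftool is on your PATH; image.jpg is a placeholder):
exiftool image.jpg | grep -i quality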
