How can I apply a threshold effect to a PNG image using PHP and ImageMagick?

I have some greyscale PNG images as well as color images, and I want to apply a threshold effect to them using PHP. I am still looking for a solution; this is what I have so far:

<?php
function thresholdImage($imagePath, $threshold, $channel = \Imagick::CHANNEL_ALL)
{
    $imagick = new \Imagick(realpath($imagePath));

    // thresholdImage() expects an absolute quantum value, so scale the
    // 0..1 threshold by the quantum range.
    $imagick->thresholdImage($threshold * \Imagick::getQuantum(), $channel);

    // Send a Content-Type matching the image's actual format
    // (e.g. image/png for a PNG input, not image/jpg).
    header('Content-Type: image/' . strtolower($imagick->getImageFormat()));
    echo $imagick->getImageBlob();
}
?>

Related

Will PHPSpreadsheet convert xls to xlsx and preserve formatting?

I am in the fact-finding stage of our project spec and have not found a definitive answer as to whether PhpSpreadsheet can convert an .xls to .xlsx and preserve formatting, such as table borders.
From this question, I see that there are separate classes for reading and writing, and for each file format. This example demonstrates use of the different reader/writer classes:
<?php
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\Reader\Xls;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;

$xls_file = "Example.xls";

$reader = new Xls();
$spreadsheet = $reader->load($xls_file);
$loadedSheetNames = $spreadsheet->getSheetNames();

$writer = new Xlsx($spreadsheet);
foreach ($loadedSheetNames as $sheetIndex => $loadedSheetName) {
    $writer->setSheetIndex($sheetIndex);
    $writer->save($loadedSheetName . '.xlsx');
}
However, I have not seen whether the resulting export preserves formatting, specifically border lines. At the moment I am unable to test this myself.
I haven't tested PhpSpreadsheet (due to the Composer requirement), but FWIW PHPExcel (the predecessor of PhpSpreadsheet) does the job quite handily, and yes, it preserves most formatting.
Here is a sample conversion script using PHPExcel:
<?php
error_reporting(E_ALL);
ini_set('display_errors', TRUE);
ini_set('display_startup_errors', TRUE);
define('EOL', (PHP_SAPI == 'cli') ? PHP_EOL : '<br />');
date_default_timezone_set('America/Vancouver');

require_once dirname(__FILE__) . '/PHPExcel/Classes/PHPExcel/IOFactory.php';

$xls_to_convert = 'test.xls';

// Load the .xls and re-save it in the Excel2007 (.xlsx) format.
$objPHPExcel = PHPExcel_IOFactory::load(dirname(__FILE__) . '/' . $xls_to_convert);
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save(str_replace('.xls', '.xlsx', $xls_to_convert));
A cursory comparison of my test .xls spreadsheet against the .xlsx result shows:
- Font sizes, font colors, bold/italic, borders, centering, cell backgrounds/shading/colors, and drop-downs with values are preserved
- Buttons are lost
- Shapes (e.g. lines, arrows, text boxes) are lost
- Charts are lost
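For completeness, the equivalent conversion in PhpSpreadsheet would look roughly like the sketch below, using its IOFactory to autodetect the reader. This is untested (per the Composer caveat above), requires the phpoffice/phpspreadsheet package, and the file names are placeholders:

```php
<?php
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\IOFactory;

// Let IOFactory pick the right reader for the .xls input,
// then re-save the whole workbook in Xlsx format.
$spreadsheet = IOFactory::load('test.xls');
$writer = IOFactory::createWriter($spreadsheet, 'Xlsx');
$writer->save('test.xlsx');
```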

Is ImageMagick the fastest way to overlay images? How do I make it faster, or is there a faster technology I'm unaware of?

I'm in the process of building an image-manipulation Nginx CDN/cache server to overlay millions of unique SVG design files on apparel JPEGs. Similar tutorial here: http://sumitbirla.com/2011/11/how-to-build-a-scalable-caching-resizing-image-server/
I have written a test script here:
<?php
$cmd = "composite GOSHEN.svg blank-tshirt.jpg -geometry 600x700+456+335 JPG:-";
header("Content-type: image/jpeg");
passthru($cmd);
exit();
?>
Here is an example result:
My issue is that ImageMagick is so slow. Besides adding more CPU/memory, are there any tricks to make it faster? Are there any alternative technologies that could overlay images faster?
Any help is much appreciated.
php-vips can be quite a bit quicker than imagick. I made a test program for you:
#!/usr/bin/env php
<?php
require __DIR__ . '/vendor/autoload.php';

use Jcupitt\Vips;

for ($i = 0; $i < 100; $i++) {
    $base = Vips\Image::newFromFile($argv[1], ["access" => "sequential"]);
    $overlay = Vips\Image::newFromFile($argv[2], ["access" => "sequential"]);

    // centre the overlay on the image, but lift it up a bit
    $left = ($base->width - $overlay->width) * 0.5;
    $top = ($base->height - $overlay->height) * 0.45;
    $out = $base->composite2($overlay, "over", ["x" => $left, "y" => $top]);

    // write to stdout with a mime header
    $out->jpegsave_mime();
}
Using the test images from your server:
http://build9.hometownapparel.com/pics/
Then running on my desktop machine (Ubuntu 17.10, a fast i7 CPU) I see:
$ time ./overlay.php blank-tshirt.jpg GOSHEN.svg > /dev/null
real 0m2.488s
user 0m13.446s
sys 0m0.328s
So about 25ms per image. I see this result (taken from the first iteration, obviously):
I tried a looping version of your imagemagick example:
#!/usr/bin/env php
<?php
header("Content-type: image/jpeg");

for ($i = 0; $i < 100; $i++) {
    $cmd = "composite GOSHEN.svg blank-tshirt.jpg -geometry 600x700+456+335 JPG:-";
    passthru($cmd);
}
Running it against IM-6.9.7-4 (the version packaged for Ubuntu) I see:
$ time ./magick.php > /dev/null
real 0m29.084s
user 0m42.289s
sys 0m4.716s
Or 290ms per image. So on this test, php-vips is more than 10x faster. It's a bit unfair: imagick would probably be a little quicker than just shelling out to composite.
There's another benchmark here:
https://github.com/jcupitt/php-vips-bench
On that one, php-vips is about 4x faster than imagick and needs 8x less memory.
Here's the whole thing packaged as a Dockerfile you should be able to run anywhere:
https://github.com/jcupitt/docker-builds/tree/master/php-vips-ubuntu-16.04

ImageMagick with greyscale images and 24-bit depth

I am using the Imagick library to work with ImageMagick in PHP. First I read a (JPEG) image from an external server with:
$img = new Imagick();
$img->readImage($source);
And then upload it to my Amazon S3 bucket with the following code:
$s3 = new AmazonS3();
$s3->create_object(BUCKET, $destination_path, array(
    'body' => $img->getImageBlob(),
    'length' => $img->getImageSize(),
    'acl' => AmazonS3::ACL_PUBLIC,
    'contentType' => 'image/jpeg'
));
Everything seems to be working fine; the files appear in my storage bucket and I can view them in my browser. However, when handling greyscale images, ImageMagick converts the image from 24-bit to 8-bit depth. I would like them to keep their 24-bit depth. How can I achieve this? I've tried the following, without success:
$img->setImageType(imagick::IMGTYPE_TRUECOLOR);
For color images everything works fine: the images keep their 24-bit depth.
Edit:
It seems that ImageMagick changes the image type from 6 (truecolor) to 2 (greyscale). Trying to override this does not work, as tested with the following code:
$img = new Imagick();
$img->readImage($source);
$img->setImageType(6);
echo $img->getImageType();
which outputs 2
Use setType() before you load the image. So:
$img = new Imagick();
$img->setType(Imagick::IMGTYPE_TRUECOLOR); // or setType(6)
$img->readImage($source);
This will output a truecolor image, even if the loaded picture uses only grey colors.

ImageMagick image quality issue using PHP

I have converted a PDF to an image using exec() to run a Linux command like '/usr/bin/convert -density 300x300 file.pdf[0] -resample 160 -resize 512x700! images/sample-0.jpeg 2>&1'. But I have read some articles saying that this is slow compared to the Imagick PHP module, so I tried the following PHP code:
$uploadfile = "file.pdf";
$im = new Imagick($uploadfile . "[0]");
$im->setImageFormat("png");
$im->writeImage("p-10.png");
As you can see, I am not setting the density or resample, and the quality of the image is not good at all. The PHP manual also lists no function to set the density; I also tried setResolution, but it does not work.
Please help if anybody has an idea about this.
Have a nice day.
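For what it's worth, Imagick does expose the density setting: Imagick::setResolution() works, but only when it is called before the PDF is read, since it controls how the vector source gets rasterised (calling it afterwards has no effect, which is likely why it "does not work" above). A sketch along those lines, with placeholder file names, assuming the Imagick extension and a PDF delegate such as Ghostscript are installed:

```php
<?php
$im = new Imagick();

// Equivalent of -density 300x300: must be set *before* readImage(),
// because it determines the rasterisation resolution of the PDF page.
$im->setResolution(300, 300);
$im->readImage('file.pdf[0]');   // first page only

$im->setImageFormat('png');
$im->resizeImage(512, 700, \Imagick::FILTER_LANCZOS, 1);
$im->writeImage('p-10.png');
```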

Google PageSpeed & ImageMagick JPG compression

Given a user-uploaded image, I need to create various thumbnails of it for display on a website. I'm using ImageMagick and trying to make Google PageSpeed happy. Unfortunately, no matter what quality value I specify in the convert command, PageSpeed still suggests compressing the image even further.
Note that http://www.imagemagick.org/script/command-line-options.php?ImageMagick=2khj9jcl1gd12mmiu4lbo9p365#quality mentions:
For the JPEG ... image formats, quality is 1 [provides the] lowest image quality and highest compression ....
I even tested compressing the image with a quality of 1 (it produced an unusable image, though), and PageSpeed still suggests that I can optimize such an image by "losslessly compressing" it. I don't know how to compress an image any further using ImageMagick. Any suggestions?
Here's a quick way to test what I am talking about:
<?php
assert_options(ASSERT_BAIL, TRUE);

// TODO: specify valid image here
$input_filename = 'Dock.jpg';
assert(file_exists($input_filename));

$qualities = array('100', '75', '50', '25', '1');
$geometries = array('100x100', '250x250', '400x400');

foreach ($qualities as $quality) {
    echo("<h1>$quality</h1>");
    foreach ($geometries as $geometry) {
        $output_filename = "$geometry-$quality.jpg";
        $command = "convert -units PixelsPerInch -density 72x72 -quality $quality -resize $geometry $input_filename $output_filename";
        $output = array();
        $return = 0;
        exec($command, $output, $return);
        echo('<img src="' . $output_filename . '" />');
        assert(file_exists($output_filename));
        assert($output === array());
        assert($return === 0);
    }
    echo('<br/>');
}
The JPEG may contain comments, thumbnails or metadata, which can be removed.
Sometimes it is possible to compress JPEG files further while keeping the same quality. This is possible when the program that generated the image did not use the optimal algorithm or parameters during compression. By recompressing the same data, an optimizer may reduce the image size, for example by using image-specific Huffman tables.
You may run jpegtran or jpegoptim on the created file to reduce its size further.
To minimize the image sizes even more, you should remove all metadata. ImageMagick can do this when you add -strip to the command line.
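A possible pipeline combining these suggestions (file names and the quality value are placeholders of mine; it assumes ImageMagick and libjpeg's jpegtran are installed):

```shell
# Resize/re-encode with ImageMagick and drop all metadata (-strip),
# then let jpegtran losslessly optimise the Huffman tables.
convert input.jpg -strip -quality 85 -resize 250x250 resized.jpg
jpegtran -optimize -progressive -copy none resized.jpg > final.jpg
```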
Have you also considered putting your thumbnail images inline as base64-encoded data in your HTML?
This can make your web page load much faster (even though the total size gets a bit larger), because it saves the browser from making a separate request for each image file referenced in the HTML code.
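The size overhead of inlining is predictable: base64 maps every 3 bytes of input to 4 ASCII characters, so the encoded data is about 33% larger than the raw file. A quick stdlib-only check:

```php
<?php
// base64 encodes each 3-byte group as 4 output characters (~33% larger).
$raw = random_bytes(3000);
$encoded = base64_encode($raw);
echo strlen($raw) . " bytes -> " . strlen($encoded) . " characters\n";
// 3000 bytes -> 4000 characters
```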
Your HTML code for such an image would look like this:
<IMG SRC="data:image/png;base64,
iVBORw0KGgoAAAANSUhEUgAAAM4AAABJAQMAAABPZIvnAAAABGdBTUEAALGPC/xh
BQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAA
OpgAABdwnLpRPAAAAAZQTFRFAAAA/wAAG/+NIgAAAAF0Uk5TAEDm2GYAAAABYktH
RACIBR1IAAAACXBIWXMAAABIAAAASABGyWs+AAAB6ElEQVQ4y+3UQY7bIBQG4IeQ
yqYaLhANV+iyi9FwpS69iGyiLuZYpepF6A1YskC8/uCA7SgZtVI3lcoiivkIxu/9
MdH/8U+N6el2pk0oFyibWyr1Q3+PlO2NqJV+/BnRPMjcJ9zrfJ/U+zQ9oAvo+QGF
d+npPqFQn++TXElkrEpEJhAtlTBR6dNHUuzIMhFnEhxAmJDkKxlmt7ATXDDJYcaE
r4Txqtkl42VYSH+t9KrD9b5nxZeog/LWGVHprGInGWVQUTvjDWXca5KdsowqyGSc
DrZRlGlQUl4kQwpUjiSS9gI9VdECZhHFQ2I+UE2CHJQfkNxTNKCl0RkURqlLowJK
1h1p3sjc0CJD39D4BIqD7JvvpH/GAxl2/YSq9mtHSHknga7OKNOHKyEdaFC2Dh1w
9VSJemBeGuHgMuh24EynK03YM1Lr83OjUle38aVSfTblT424rl4LhdglsUag5RB5
uBJSJBIiELSzaAeIN0pUlEeZEMeClC4cBuH6mxOlgPjC3uLproUCWfy58WPN/MZR
86ghc888yNdD0Tj8eAucasl2I5LqX19I7EmEjaYjSb9R/G1SYfQA7ZBuT5H6WwDt
UAfK1BOJmh/eZnKLeKvZ/vA8qonCpj1h6djfbqvW620Tva36++MXUkNDlFREMVkA
AAAldEVYdGRhdGU6Y3JlYXRlADIwMTItMDgtMjJUMDg6Mzc6NDUrMDI6MDBTUnmt
AAAAJXRFWHRkYXRlOm1vZGlmeQAyMDEyLTA4LTIyVDA4OjM3OjQ1KzAyOjAwIg/B
EQAAAA50RVh0bGFiZWwAImdvb2dsZSJdcbX4AAAAAElFTkSuQmCC"
ALT="google" WIDTH=214 HEIGHT=57 VSPACE=5 HSPACE=5 BORDER=0 />
And you would create the base64 encoded image data like this:
base64 -i image.jpg -o image.b64
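If the pages are generated in PHP anyway, the data URI can also be built directly with the standard library; the function name and payload below are illustrative:

```php
<?php
// Wrap raw bytes in a data: URI suitable for an inline <img src="...">.
function dataUri(string $bytes, string $mime): string
{
    return 'data:' . $mime . ';base64,' . base64_encode($bytes);
}

// A real call would pass file_get_contents('thumb.jpg') and 'image/jpeg';
// a tiny text payload keeps the example self-contained.
echo dataUri('hello', 'text/plain') . "\n";   // data:text/plain;base64,aGVsbG8=
```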
Google performs those calculations based on its WebP image format (https://developers.google.com/speed/webp/).
Despite the performance gains, though, WebP is currently supported only by Chrome and Opera (http://caniuse.com/webp).
