The image is not compressed. I'm using the following code:
UIImage *image = [UIImage imageNamed:@"1.png"];
CGSize newSize = CGSizeMake(100.0f, 100.0f);
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
_backgroundImageView.image = newImage;
If you want to compress the image
UIImage *img = [UIImage imageNamed:@"yourimagename"];
NSData *postData = UIImageJPEGRepresentation(img, 1.0);
NSInteger imgSizeBytes = [postData length];
double imgSizeKBytes = ceil((double)imgSizeBytes / 1024);
NSString *strBytes;
if (imgSizeKBytes > 1024) {
    double imgSizeMBytes = (double)imgSizeKBytes / 1024;
    strBytes = [NSString stringWithFormat:@"%.1f MB", imgSizeMBytes];
}
else {
    strBytes = [NSString stringWithFormat:@"%d KB", (int)imgSizeKBytes];
}
imageView.image = [UIImage imageWithData:postData scale:0.1];
You can compress to whatever quality you need using the code below:
NSData *postData = UIImageJPEGRepresentation(img, (double)(100-75)/100);
Above I have used 75; substitute whatever value you need.
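To make the percent-to-quality mapping explicit, here is a minimal hedged sketch of a helper; the name compressedDataForImage:percent: is purely illustrative and not from any answer above.
// Hypothetical helper: maps a "compression percentage" (0-100) onto the
// 0.0-1.0 quality argument that UIImageJPEGRepresentation expects.
+ (NSData *)compressedDataForImage:(UIImage *)image percent:(CGFloat)percent
{
    CGFloat quality = (100.0 - percent) / 100.0; // 75 -> 0.25, matching (100-75)/100 above
    return UIImageJPEGRepresentation(image, quality);
}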
I am converting each PPT slide into an image and creating a PDF from them. The images are created properly at their original sizes. To create the PDF I am using the code below.
CGSize paperSize = CGSizeMake(595.2,841.8);
NSString *strPath = [[documentsPaths objectAtIndex:0] stringByAppendingPathComponent:[NSString stringWithFormat:@"slide%i.png", i]];
NSData *imgData = [[NSData alloc] initWithContentsOfURL:[NSURL fileURLWithPath:strPath]];
UIImage *image = [[UIImage alloc] initWithData:imgData];
CGRect rect = CGRectMake(0, 0,paperSize.width ,paperSize.height);
UIGraphicsBeginPDFPageWithInfo(rect, nil);
[image drawInRect:rect];
UIGraphicsEndPDFContext();
What it actually does is draw the image at the paper size, not at its original size. I want to draw the image at its original size on A4 paper. Any help will be appreciated.
Kriti's suggestion worked for me. Below is working code:
UIGraphicsBeginPDFContextToData(pdfFile, CGRectMake(0, 0, kPaperSizeA4.width, kPaperSizeA4.height), nil);
NSString *strPath = [[[appDelegate getDocumentDirectory] stringByAppendingPathComponent:@"EPC"] stringByAppendingPathComponent:[NSString stringWithFormat:@"slide%i.png", i]];
NSData *imgData = [[NSData alloc] initWithContentsOfURL:[NSURL fileURLWithPath:strPath]];
image = [[UIImage alloc] initWithData:imgData];
rect = CGRectMake((paperSize.width - image.size.width)/2, (paperSize.height - image.size.height)/2 ,image.size.width ,image.size.height);
UIGraphicsBeginPDFPageWithInfo(CGRectMake(0, 0, kPaperSizeA4.width, kPaperSizeA4.height), nil);
[image drawInRect:rect];
UIGraphicsEndPDFContext();
This may well be a duplicate question, but I am really stuck on writing PNG data chunks. Can someone help me with how to get the chunk headers from the data in Objective-C? The relevant information is Section 4.7.2, part c, Textual Information.
Thanks
I think there is no need to care about the PNG data chunks; just try the code below:
//load target image, then draw text on the target image
UIImage *image = [UIImage imageNamed:@"targetImage.png"];
NSString *text = @"Watermark!";
CGPoint startPoint = CGPointMake(20, 30);
UIFont *font = [UIFont boldSystemFontOfSize:12];
UIGraphicsBeginImageContext(image.size);
[image drawInRect:CGRectMake(0,0,image.size.width,image.size.height)];
CGRect rect = CGRectMake(startPoint.x, startPoint.y, image.size.width, image.size.height);
[[UIColor whiteColor] set];
[text drawInRect:CGRectIntegral(rect) withFont:font];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Convert UIImage object into NSData (a wrapper for a stream of bytes) formatted according to PNG spec
NSData *imageData = UIImagePNGRepresentation(newImage);
//Save imageDate to disk as a png file
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString *imageSubdirectory = [documentsDirectory stringByAppendingPathComponent:@"watermarkFolderName"];
NSString *filePath = [imageSubdirectory stringByAppendingPathComponent:@"watermark.png"];
[imageData writeToFile:filePath atomically:YES];
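For completeness, if you really do need to inspect the PNG chunks themselves (the Section 4.7.2 textual chunks the question mentions), here is a hedged sketch that continues from the imageData variable above; it only walks the chunk headers and logs each chunk's type and length, assuming well-formed PNG data.
// Hedged sketch: walk the PNG chunk headers in imageData and log them.
// A PNG file is an 8-byte signature followed by chunks, each laid out as
// 4-byte big-endian length, 4-byte ASCII type, data, 4-byte CRC.
const uint8_t *bytes = (const uint8_t *)imageData.bytes;
NSUInteger offset = 8; // skip the PNG signature
while (offset + 8 <= imageData.length) {
    uint32_t lengthBE;
    memcpy(&lengthBE, bytes + offset, 4);
    uint32_t chunkLength = CFSwapInt32BigToHost(lengthBE);
    NSString *chunkType = [[NSString alloc] initWithBytes:bytes + offset + 4
                                                   length:4
                                                 encoding:NSASCIIStringEncoding];
    NSLog(@"chunk %@, %u bytes", chunkType, (unsigned)chunkLength);
    offset += 8 + chunkLength + 4; // header + data + CRC
}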
I have an app in which I capture an image. It works fine, but I want to reduce the size of the image. I searched the net and found some code, which is shown below.
This is my code, which works fine when the image is captured:
NSString *type = [info objectForKey:UIImagePickerControllerMediaType];
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
NSData *imageData = UIImagePNGRepresentation(pickedImage);
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
I want to reduce the size of the picked image. I found the following code, but I don't understand how to use it: it uses newSize, but where does it take the original image to compress?
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
What you can do is this:
NSString *type = [info objectForKey:UIImagePickerControllerMediaType];
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
CGSize sizeCropped = CGSizeMake(602, 450);//you can adjust any size here...
pickedImage = [yourclassname imageWithImage:pickedImage scaledToSize:sizeCropped];//call below function...
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Let me know whether it works. Happy coding!
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
You can call this function wherever you want the compressed image, using these lines of code:
CGSize firstSize = CGSizeMake(210.0,210.0);
UIImage *compImage=[mainclass imageWithImage:image scaledToSize:firstSize];
If you want to save that new image with its new size, then after your code add the image to the photo library with the line below:
UIImageWriteToSavedPhotosAlbum(UIImage *image, id completionTarget, SEL completionSelector, void *contextInfo);
Example:
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil);
I'm trying to resize an image on a background thread and the app always crashes after a few low memory warnings. How can I rewrite the code below to fix this?
float max = 1024*1024;
NSData *pngData = UIImagePNGRepresentation(setImage);
while ([pngData length] > max) {
    pngData = nil;
    CGSize newSize = CGSizeMake(setImage.size.width*.9, setImage.size.height*.9);
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSLog(@"scale: %f", (1024.0*1024.0)/((float)[pngData length]));
    pngData = UIImagePNGRepresentation(image);
}
NSLog(@"image length: %i", [pngData length]);
[pngData writeToFile:imageLocation atomically:YES];
I have already tried doing this by calculating the scale and replacing the .9 in the code with that scale value:
float scale = (max)/((float)[pngData length]);
CGSize newSize = CGSizeMake(setImage.size.width*scale, setImage.size.height *scale);
This made the image too small.
The end goal is to take an image from the camera and save it to disk. I originally had to resize the image because I was getting a "Low Memory warning" when loading the image.
Your code causes an infinite loop and creates images until you run out of memory. Try something like this to fix the infinite loop:
float max = 1024*1024;
NSData *pngData = UIImagePNGRepresentation(setImage);
CGSize newSize = setImage.size;
while ([pngData length] > max) {
    newSize = CGSizeMake(newSize.width * 0.9, newSize.height * 0.9);
    UIGraphicsBeginImageContext(newSize);
    [setImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSLog(@"scale: %f", (1024.0*1024.0)/((float)[pngData length]));
    pngData = UIImagePNGRepresentation(image);
    image = nil;
}
NSLog(@"image length: %i", [pngData length]);
[pngData writeToFile:imageLocation atomically:YES];
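If you would rather avoid the loop, a hedged alternative (not part of the answer above): the PNG byte count grows roughly with the pixel area, so scaling both dimensions by the square root of the size ratio usually lands near the target in a single pass.
// Hedged sketch: single-pass estimate instead of the 0.9-per-iteration loop.
NSData *pngData = UIImagePNGRepresentation(setImage);
if ([pngData length] > max) {
    CGFloat scale = sqrt(max / (CGFloat)[pngData length]); // bytes ~ area, so sqrt for each side
    CGSize newSize = CGSizeMake(setImage.size.width * scale, setImage.size.height * scale);
    UIGraphicsBeginImageContext(newSize);
    [setImage drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    pngData = UIImagePNGRepresentation(scaledImage);
}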
Is there any reason why you need to do this by hand? If you use a UIImageView and set the image either with initWithImage: or with the image property, and then change the UIImageView's frame, the image will resize accordingly.
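For example, a hedged sketch, assuming you are inside a view controller and reusing the question's setImage variable (the 100x100 frame is arbitrary):
// Let UIImageView do the scaling: assign the image, then size the view.
UIImageView *imageView = [[UIImageView alloc] initWithImage:setImage];
imageView.contentMode = UIViewContentModeScaleAspectFit; // keep the aspect ratio
imageView.frame = CGRectMake(0, 0, 100, 100);            // the view renders the image at this size
[self.view addSubview:imageView];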
I'm currently uploading an image to a server using Imgur on iOS with the following code:
NSData* imageData = UIImagePNGRepresentation(image);
NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* fullPathToFile = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"SBTempImage.png"];
[imageData writeToFile:fullPathToFile atomically:NO];
[uploadRequest setFile:fullPathToFile forKey:@"image"];
The code works fine when run in the simulator and uploading a file from the simulator's photo library because I'm on a fast ethernet connection. However, the same code times out on the iPhone when selecting an image taken with the iPhone. So, I tried it by saving a small image from the web and attempting to upload that, which worked.
This leads me to believe the large images taken by the iPhone are timing out over the somewhat slow 3G network. Is there any way to compress/resize the image from the iPhone before sending it?
This snippet will resize the image:
UIGraphicsBeginImageContext(newSize);
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The variable newSize is a CGSize and can be defined like so:
CGSize newSize = CGSizeMake(100.0f, 100.0f);
A self-contained solution:
- (UIImage *)compressForUpload:(UIImage *)original scale:(CGFloat)scale
{
    // Calculate new size given scale factor.
    CGSize originalSize = original.size;
    CGSize newSize = CGSizeMake(originalSize.width * scale, originalSize.height * scale);

    // Scale the original image to match the new size.
    UIGraphicsBeginImageContext(newSize);
    [original drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *compressedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return compressedImage;
}
Thanks to @Tuan Nguyen.
To complement @Tuan Nguyen, this is maybe the fastest and most elegant way to do that.
As in John Muchow's post at iphonedevelopertips.com, adding a category to UIImage is a very handy way to scale an image very quickly.
Just calling
UIImage *_image = [[[UIImage alloc] initWithData:SOME_NSDATA] scaleToSize:CGSizeMake(640.0,480.0)];
returns a 640x480 representation of your NSData (which could be an online image) without any more lines of code.
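The linked post has the details; as a rough idea only, here is a hedged sketch of what such a scaleToSize: category can look like (not John Muchow's exact code):
// Hedged sketch of a UIImage scaling category, not the linked article's code.
@interface UIImage (Scaling)
- (UIImage *)scaleToSize:(CGSize)newSize;
@end

@implementation UIImage (Scaling)
- (UIImage *)scaleToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [self drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
@end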
Matt Gemmell's MGImageUtilities are very nice, resizing efficiently and with some effort-reducing methods.
In this code, 0.5 means 50% JPEG quality:
UIImage *original = image;
NSData *compressedData = UIImageJPEGRepresentation(original, 0.5f);
Use this simple one-liner: NSData *data = UIImageJPEGRepresentation(chosenImage, 0.2f);
Swift implementation of Zorayr's function (with a bit of a change to include height or width constraints by actual units not scale):
class func compressForUpload(original:UIImage, withHeightLimit heightLimit:CGFloat, andWidthLimit widthLimit:CGFloat) -> UIImage {
    let originalSize = original.size
    var newSize = originalSize
    if originalSize.width > widthLimit && originalSize.width > originalSize.height {
        newSize.width = widthLimit
        newSize.height = originalSize.height*(widthLimit/originalSize.width)
    } else if originalSize.height > heightLimit && originalSize.height > originalSize.width {
        newSize.height = heightLimit
        newSize.width = originalSize.width*(heightLimit/originalSize.height)
    }

    // Scale the original image to match the new size.
    UIGraphicsBeginImageContext(newSize)
    original.drawInRect(CGRectMake(0, 0, newSize.width, newSize.height))
    let compressedImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return compressedImage
}
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

+ (UIImage *)resizeImage:(UIImage *)image toResolution:(int)resolution {
    NSData *imageData = UIImagePNGRepresentation(image);
    CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    CFDictionaryRef options = (__bridge CFDictionaryRef) @{
        (id) kCGImageSourceCreateThumbnailWithTransform : @YES,
        (id) kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id) kCGImageSourceThumbnailMaxPixelSize : @(resolution)
    };
    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(src, 0, options);
    CFRelease(src);
    UIImage *img = [[UIImage alloc] initWithCGImage:thumbnail];
    CGImageRelease(thumbnail); // the UIImage retains the CGImage, so release our reference
    return img;
}
Swift 2.0 version of Jagandeep Singh's method, but you need to convert the data back to an image because NSData? is not converted to UIImage automatically.
let originalImage:UIImage = image
let compressedData = UIImageJPEGRepresentation(originalImage, 0.5)
let compressedImage = UIImage(data: compressedData!)
NSData *data = UIImageJPEGRepresentation(Img.image, 0.2f);
UIImage *image = [UIImage imageNamed:@"image.png"];
NSData *imgData1 = UIImageJPEGRepresentation(image, 1);
NSLog(@"Original --- Size of Image(bytes):%d", [imgData1 length]);
NSData *imgData2 = UIImageJPEGRepresentation(image, 0.5);
NSLog(@"After --- Size of Image(bytes):%d", [imgData2 length]);
image = [UIImage imageWithData:imgData2];
imgTest.image = image;
Try converting to JPEG with a compression quality factor; here I am using 0.5.
In my case:
Original --- Size of Image(bytes): 85KB & After --- Size of Image(bytes): 23KB
- (UIImage *)resizeImage:(UIImage *)originalImage resizeSize:(CGSize)size
{
    CGFloat actualHeight = originalImage.size.height;
    CGFloat actualWidth = originalImage.size.width;
    // if(actualWidth <= size.width && actualHeight <= size.height)
    // {
    //     return originalImage;
    // }
    float oldRatio = actualWidth/actualHeight;
    float newRatio = size.width/size.height;
    if(oldRatio < newRatio)
    {
        oldRatio = size.height/actualHeight;
        actualWidth = oldRatio * actualWidth;
        actualHeight = size.height;
    }
    else
    {
        oldRatio = size.width/actualWidth;
        actualHeight = oldRatio * actualHeight;
        actualWidth = size.width;
    }
    CGRect rect = CGRectMake(0.0, 0.0, actualWidth, actualHeight);
    UIGraphicsBeginImageContext(rect.size);
    [originalImage drawInRect:rect];
    originalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return originalImage;
}
This is how to call the method:
UIImage *compimage = [appdel resizeImage:imagemain resizeSize:CGSizeMake(40, 40)];
It returns the resized image, which you can display anywhere.
You should be able to make a smaller image by doing something like
UIImage *small = [UIImage imageWithCGImage:original.CGImage scale:0.25 orientation:original.imageOrientation];
(for a quarter-size image) then convert the smaller image to a PNG or whatever format you need.
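The conversion step might then look like this (a minimal sketch):
// Encode the smaller UIImage as PNG data for upload.
NSData *smallImageData = UIImagePNGRepresentation(small);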