UIImageJPEGRepresentation returns the wrong size - iOS

I want to compress a UIImage, but I get the wrong size back. I'm fairly sure UIImageJPEGRepresentation is the cause. How can I fix this? (Sorry for my poor English.)
+ (NSData *)compressDataWithImg:(UIImage *)originalImage compression:(CGFloat)compression size:(CGFloat)size {
    // `size` is the target upper bound in kilobytes (compared via data.length / 1024.0).
    NSData *data = [NSData dataWithData:UIImageJPEGRepresentation(originalImage, compression)];
    if ((data.length / 1024.0) > size) {
        compression -= 0.1;
        if (compression > 0) {
            // Still too large: recurse with a lower JPEG quality.
            return [[self class] compressDataWithImg:originalImage compression:compression size:size];
        }
    }
    return data;
}
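A side note on this helper: once compression has stepped down past 0.1 it returns whatever it has, even if the data is still over the limit, because JPEG quality alone cannot shrink a large image arbitrarily. A sketch of a fallback that also halves the pixel dimensions (a hypothetical companion method, not part of the original code):
// Hypothetical fallback: keep downscaling by half until the data fits the KB limit.
+ (NSData *)compressDataWithImg:(UIImage *)image size:(CGFloat)sizeKB {
    NSData *data = [self compressDataWithImg:image compression:0.9 size:sizeKB];
    while ((data.length / 1024.0) > sizeKB && image.size.width > 1) {
        CGSize half = CGSizeMake(image.size.width / 2, image.size.height / 2);
        UIGraphicsBeginImageContext(half);
        [image drawInRect:CGRectMake(0, 0, half.width, half.height)];
        image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        data = [self compressDataWithImg:image compression:0.9 size:sizeKB];
    }
    return data;
}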
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *fullImage = info[UIImagePickerControllerOriginalImage];
    fullImage = [fullImage fixOrientationToUp];
    UIImage *imgData = [fullImage scaleToSizeWithLimit:0];
    NSData *data = [Util compressDataWithImg:imgData compression:0.9 size:256]; // this is right; size: 2M
    UIImage *image = [UIImage imageWithData:data]; // I use this UIImage
    NSData *xxx = [NSData dataWithData:UIImageJPEGRepresentation(image, 1)]; // wrong size: 7~8 MB. Why?
    if (self.imageSelectBlock) {
        self.imageSelectBlock(image);
    }
    [picker dismissViewControllerAnimated:YES completion:NULL];
}
Thanks for helping me.

I found that when I run an image through UIImageJPEGRepresentation and imageWithData again and again, the length of the image data gradually increases.
Test code:
UIImage *image = [UIImage imageNamed:@"test.png"];
NSLog(@"first: %lu", UIImageJPEGRepresentation(image, 1.0).length);
NSLog(@"second: %lu", UIImageJPEGRepresentation([UIImage imageWithData:UIImageJPEGRepresentation(image, 1.0)], 1.0).length);
NSLog(@"third: %lu", UIImageJPEGRepresentation([UIImage imageWithData:UIImageJPEGRepresentation([UIImage imageWithData:UIImageJPEGRepresentation(image, 1.0)], 1.0)], 1.0).length);
Logs:
first: 361586
second: 385696
third: 403978
I think this problem is caused by UIImageJPEGRepresentation; the code above is just a test.
I also found a related question:
When I am using UIImagePNGRepresentation or UIImageJPEGRepresentation for converting UIImage into NSData, the image size is increased too much
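For what it's worth, this growth is expected rather than a bug: JPEG is lossy, so every imageWithData / UIImageJPEGRepresentation round trip decodes the data to a fresh bitmap and re-encodes it, and compressionQuality 1.0 still re-encodes the accumulated artifacts into a larger file. A sketch of the delegate above reworked to reuse the already-compressed bytes instead of encoding a second time:
// Keep the NSData produced by the compression helper; decoding it to a UIImage
// and calling UIImageJPEGRepresentation(image, 1) again creates a new, larger JPEG.
NSData *data = [Util compressDataWithImg:imgData compression:0.9 size:256];
UIImage *image = [UIImage imageWithData:data]; // decode once, for display only
if (self.imageSelectBlock) {
    self.imageSelectBlock(image);
}
// When the bytes are needed later (e.g. for upload), reuse `data` directly
// instead of re-encoding `image`.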

Related

How to add image from iPhone gallery to an array - iOS

As I am very new to iOS development, I am trying to add a UIImage from the image gallery to an array. But when I try to use the array to set the image on a UIImageView, it shows me this error:
[UIImage stringByDeletingPathExtension]: unrecognized selector sent to instance 0x7fefd3ea02f0.
choosen_Imgae = info[UIImagePickerControllerOriginalImage];
// choosen_Imgae = [self resizeImage:choosen_Imgae];
NSLog(@"chosen image %@", choosen_Imgae);
[picker dismissViewControllerAnimated:YES completion:NULL];
NSData *dataImage = UIImagePNGRepresentation(choosen_Imgae);
if (dataImage != nil) {
    [image_Arr insertObject:choosen_Imgae atIndex:indexxpath.row];
    NSLog(@"image array after adding object %@", image_Arr);
}
Brother, you can simply add the image to the array:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *choosen_Imgae = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    [image_Arr addObject:choosen_Imgae];
    [picker dismissViewControllerAnimated:YES completion:nil];
}
If you use NSData in didFinishPickingMediaWithInfo, it is a long process (image -> NSData), and when fetching from the array you have to convert the NSData back into an image. So you can save the image into the array directly.
You also don't need insertObject:atIndex: here; addObject: is enough.
When you fetch the images from the array:
for (int i = 0; i < [image_Arr count]; i++)
{
    UIImage *fetch_Image = [image_Arr objectAtIndex:i];
    imageView.image = fetch_Image;
}
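Equivalently, fast enumeration is the more idiomatic way to walk the array; a minor variation on the loop above:
// Fast enumeration over the stored images. As in the original loop, only the
// last image ends up visible, since each pass overwrites the same imageView.
for (UIImage *fetch_Image in image_Arr) {
    imageView.image = fetch_Image;
}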

Objective-C - Image from camera to set on UIButton

I am trying to set an image from the camera on my button, but it does not accept the cropped image; by default it uses the original width and height captured by the camera, and because of this the image looks shrunk from top to bottom.
When I tap the UIButton the camera opens, and after the image is captured it is shown on this UIButton:
- (IBAction)ProfImageBtnCliked:(id)sender
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    [picker setAllowsEditing:YES];
    picker.delegate = self;
    if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear] || [UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront])
    {
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
        [self presentViewController:picker animated:YES completion:nil];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Alert" message:@"No Camera Available" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:nil];
    NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1.0);
    [ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
    ProfImgButton.userInteractionEnabled = NO;
    ProfImgButton.contentMode = UIViewContentModeScaleAspectFill;
    ProfImgButton.clipsToBounds = YES;
    CGSize constraint = CGSizeMake(200, 200);
    NSData *imgData = UIImageJPEGRepresentation([self imageWithImage:[UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
    NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
    NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    NSLog(@"%@", imageString);
    // assign base64 image to image string
    Base64ImageString = [imageString stringByReplacingOccurrencesOfString:@"+" withString:@"%2B"];
}
// reduce image size
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
You are setting the original image taken from your camera in your code here:
[picker dismissViewControllerAnimated:YES completion:nil];
NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1.0);
[ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
while you are resizing the image below:
CGSize constraint = CGSizeMake(200, 200);
NSData *imgData = UIImageJPEGRepresentation([self imageWithImage:[UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
That data is not used anywhere; you are not setting it on your ProfImgButton.
All you need to do is set the resized image data on your button:
[ProfImgButton setImage:[UIImage imageWithData:imgData] forState:UIControlStateNormal];
Your cropping code is wrong! Write it like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:nil];
    CGSize size = CGSizeMake(200, 200);
    UIImage *originalImage = [info objectForKey:@"UIImagePickerControllerEditedImage"];
    // CROP image code
    UIGraphicsBeginImageContext(size);
    [originalImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // now assign the image to the button
    [ProfImgButton setImage:croppedImage forState:UIControlStateNormal];
    ProfImgButton.userInteractionEnabled = NO;
    ProfImgButton.contentMode = UIViewContentModeScaleAspectFill;
    ProfImgButton.clipsToBounds = YES;
    NSData *imgData = UIImageJPEGRepresentation(croppedImage, 0);
    NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
    NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    NSLog(@"%@", imageString);
    // assign base64 image to image string
    Base64ImageString = [imageString stringByReplacingOccurrencesOfString:@"+" withString:@"%2B"];
}
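One caveat neither answer addresses: drawing into a fixed 200x200 context stretches a non-square photo, which is exactly the "shrunk from top to bottom" symptom the question describes. A sketch of an aspect-fit variant of the same resize helper (my naming, not from either answer):
// Scale so the whole image fits inside the target box while preserving aspect ratio.
- (UIImage *)imageWithImage:(UIImage *)image aspectFitToSize:(CGSize)boxSize
{
    CGFloat scale = MIN(boxSize.width / image.size.width,
                        boxSize.height / image.size.height);
    CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0); // 0.0 = device screen scale
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}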

Sending image in base64 with SOAP in iOS Objective-C

I am a newbie in iOS development. I am trying to send a captured image by encoding it into base64 format using SOAP, but I don't know how to do this.
This is my imagePickerController delegate:
// delegate method for picking images
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info valueForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage])
    {
        UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
        // Save the photo to the library only if it wasn't already saved, i.e. it has just been taken
        if (picker.sourceType == UIImagePickerControllerSourceTypeCamera)
        {
            UIImageWriteToSavedPhotosAlbum(photoTaken, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            NSData *data = [[NSData alloc] initWithData:UIImagePNGRepresentation(photoTaken)];
            base64 = [data base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn];
        }
    }
    // [picker dismissModalViewControllerAnimated:YES];
    [picker dismissViewControllerAnimated:YES completion:NULL];
    [picker release];
}
In didFinishPickingMediaWithInfo:
UIImage *chosenImage = info[UIImagePickerControllerEditedImage];
// encoding image to base64
imgData = [[NSData alloc] initWithData:UIImagePNGRepresentation(chosenImage)];
_base64 = [imgData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
self.tempbase = _base64;
and call tempbase in the SOAP message.
The image size will be huge, which can hurt your app's performance, so decrease the image size first by resizing it:
-(UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newsize
{
    UIGraphicsBeginImageContext(newsize);
    [image drawInRect:CGRectMake(0, 0, newsize.width, newsize.height)];
    UIImage *newImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImg;
}
Now convert this smaller UIImage into NSData:
NSData *imgData = [[NSData alloc] initWithData:UIImagePNGRepresentation(image)];
Then convert the NSData into a base64 string using a third-party library:
base64.h
NSData+Base64.h
NSString *imgString = [imgData base64EncodedString];
imgString = [imgString stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
Now send this string to your services.
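For what it's worth, the third-party category is only needed on iOS 6 and earlier; since iOS 7, NSData can produce the base64 string directly. A sketch of the same step with the built-in API:
// Built-in replacement for the third-party [imgData base64EncodedString] (iOS 7+).
NSString *imgString = [imgData base64EncodedStringWithOptions:0];
// Pass NSDataBase64Encoding64CharacterLineLength instead of 0 if the service
// expects line-wrapped base64.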
Code:
UIGraphicsBeginImageContext(self.drawImage.frame.size);
[self.drawImage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *imageView = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImageJPEGRepresentation(imageView, 1.0f);
[Base64 initialize];
NSString *strEncoded = [Base64 encode:data];
NOTE: drawImage is a UIImageView instance; import the Base64.h class.
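A note on both answers: for a camera photo, UIImagePNGRepresentation usually produces data several times larger than a JPEG of the same frame, and base64 then adds roughly another third. If the service accepts JPEG, encoding with UIImageJPEGRepresentation first keeps the payload much smaller; a quick comparison using the question's photoTaken image:
// Log both encodings of the same image to see the size difference.
NSData *pngData = UIImagePNGRepresentation(photoTaken);
NSData *jpegData = UIImageJPEGRepresentation(photoTaken, 0.8); // 0.8 is an arbitrary quality
NSLog(@"PNG: %lu bytes, JPEG: %lu bytes",
      (unsigned long)pngData.length, (unsigned long)jpegData.length);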

Decode iOS base64 image

I keep running into trouble decoding an encoded image.
I use the following code to encode the image returned from UIImagePickerController:
NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
NSString *encodedString = [imageData2 base64Encoding];
I tried resizing the image to 600x600, and I also tried compressing it.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [self imageWithImage:[info objectForKey:UIImagePickerControllerEditedImage] scaledToSize:CGSizeMake(600, 600)];
    imgPreviewSelected.image = [self imageWithImage:image scaledToSize:CGSizeMake(600, 600)];
    CGFloat compression = 0.9f;
    CGFloat maxCompression = 0.1f;
    int maxFileSize = 250 * 1024;
    NSData *imageData = UIImageJPEGRepresentation(image, compression);
    while ([imageData length] > maxFileSize && compression > maxCompression)
    {
        compression -= 0.1;
        imageData = UIImageJPEGRepresentation(image, compression);
    }
    image = [UIImage imageWithData:imageData];
    imagePickerReturnedImage = image;
    NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
    NSString *encodedString = [imageData2 base64Encoding];
    [utils postData:@"example.com" :[NSString stringWithFormat:@"image=%@", encodedString]];
    [self dismissViewControllerAnimated:YES completion:NULL];
}
When I try to decode the image that was sent to my database, it tells me the file is damaged. I tried decoding it with PHP and with the following website:
http://www.motobit.com/util/base64-decoder-encoder.asp
My MySQL database tells me the image is 1 MB, which I think is pretty large for a 600x600 image. Without compression it was 1.3 MB.
I use this method to scale my image:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
I hope someone can help me out, thanks.
I ended up using another approach without base64, so no answers are required.
Your code looks right.
Have you tried copying your base64 string into that website to decode it?
You can check the image size with:
NSLog(@"dataSize: %@", [NSByteCountFormatter stringFromByteCount:data.length countStyle:NSByteCountFormatterCountStyleFile]);
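One likely cause of the "file is damaged" error, for what it's worth: a base64 string contains '+' and '/', and a raw '+' in a form-encoded POST body is decoded as a space on the server, which corrupts the data. A sketch of percent-encoding the string before building the image=... parameter (postData: is the question's own helper):
// Percent-encode everything except unreserved characters so '+' and '/'
// survive the form-encoded POST body intact.
NSCharacterSet *unreserved = [NSCharacterSet characterSetWithCharactersInString:
    @"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~"];
NSString *safeString = [encodedString stringByAddingPercentEncodingWithAllowedCharacters:unreserved];
[utils postData:@"example.com" :[NSString stringWithFormat:@"image=%@", safeString]];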

UIImageJPEGRepresentation and UIImagePNGRepresentation are both slow

Here I am converting my image to binary data via a category on UIImage that has a static method. My problem is that UIImageJPEGRepresentation and UIImagePNGRepresentation are very slow, up to 6 seconds; I need a 1-second solution. Can anybody help me?
Here I pass my image to the category method until its size is reduced to 10 KB or less.
-(NSData *)imageConvertToBinary:(UIImage *)image {
    NSLog(@"Image Convert ");
    //UIImagePNGRepresentation(image);
    NSData *imageData = UIImageJPEGRepresentation(image, .000032);
    NSLog(@"Image Done ");
    // Change size of image to 10kbs
    int size = imageData.length;
    NSLog(@"SIZE OF IMAGE:First %i ", size);
    NSData *data = UIImageJPEGRepresentation(image, .0032);
    NSLog(@"Start while ");
    int temp = 0;
    while (data.length / 1000 >= 10) {
        image = [UIImage imageWithImage:image andWidth:image.size.width / 2 andHeight:image.size.height / 2];
        data = UIImageJPEGRepresentation(image, .0032);
        temp++;
        NSLog(@"temp %u", temp);
    }
    size = data.length;
    NSLog(@"SIZE OF IMAGE:after %i ", size);
    return data;
}
I also have a category on UIImage:
@implementation UIImage (ImageProcessing)
+ (UIImage *)imageWithImage:(UIImage *)image andWidth:(CGFloat)width andHeight:(CGFloat)height
{
    UIGraphicsBeginImageContext(CGSizeMake(width, height));
    [image drawInRect:CGRectMake(0, 0, width, height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
@end
NSData *data;
must be initialized to something, otherwise the while loop below never executes.
I reduced your code; you were calling UIImageJPEGRepresentation two extra times. Try this:
- (NSData *)imageConvertToBinary:(UIImage *)image {
    // Encode once up front so the loop has data to test (see the note above).
    NSData *data = UIImageJPEGRepresentation(image, .0032);
    NSLog(@"Start while ");
    int temp = 0;
    while (data.length / 1000 >= 10) {
        image = [UIImage imageWithImage:image andWidth:image.size.width / 2 andHeight:image.size.height / 2];
        data = UIImageJPEGRepresentation(image, .0032);
        temp++;
        NSLog(@"temp %u", temp);
    }
    NSLog(@"End while ");
    int size = data.length;
    NSLog(@"SIZE OF IMAGE:after %i ", size);
    return data;
}
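If the encode itself still takes seconds for a full-resolution camera image, a common mitigation (not from the answers above) is to run it off the main thread; the total time does not shrink, but the UI stays responsive. A sketch with GCD:
// Encode on a background queue, then hop back to the main queue with the result.
dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
    NSData *data = [self imageConvertToBinary:image];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Use `data` here, e.g. hand it to the upload code.
        NSLog(@"encoded %lu bytes", (unsigned long)data.length);
    });
});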
