Decode iOS base64 image

I keep running into trouble decoding an encoded image.
I use the following code to encode my image returned from UIImagePickerController:
NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
NSString *encodedString = [imageData2 base64Encoding];
I tried resizing the image to 600x600, and I also tried compressing it.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [self imageWithImage:[info objectForKey:UIImagePickerControllerEditedImage] scaledToSize:CGSizeMake(600, 600)];
    imgPreviewSelected.image = [self imageWithImage:image scaledToSize:CGSizeMake(600, 600)];
    CGFloat compression = 0.9f;
    CGFloat maxCompression = 0.1f;
    int maxFileSize = 250*1024;
    NSData *imageData = UIImageJPEGRepresentation(image, compression);
    while ([imageData length] > maxFileSize && compression > maxCompression)
    {
        compression -= 0.1;
        imageData = UIImageJPEGRepresentation(image, compression);
    }
    image = [UIImage imageWithData:imageData];
    imagePickerReturnedImage = image;
    NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
    NSString *encodedString = [imageData2 base64Encoding];
    [utils postData:@"example.com" : [NSString stringWithFormat:@"image=%@", encodedString]];
    [self dismissViewControllerAnimated:YES completion:NULL];
}
When I try to decode the image that was sent to my database, it tells me that the file is damaged.
I tried decoding it with PHP and the following website:
http://www.motobit.com/util/base64-decoder-encoder.asp
My MySQL database tells me that the image is 1 MB; I think that is pretty large for a 600x600 image. Without compression it was 1.3 MB.
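For scale, note that base64 encodes every 3 input bytes as 4 ASCII characters, inflating data by roughly 33%, and that the code above re-encodes the already-compressed image at quality 1.0, which discards most of the savings from the compression loop. A minimal sketch of the size arithmetic, using imageData2 from the code above:

// Base64 output length: 4 characters per 3 input bytes (rounded up).
NSUInteger base64Length = ((imageData2.length + 2) / 3) * 4;
NSLog(@"JPEG bytes: %lu, base64 characters: %lu",
      (unsigned long)imageData2.length, (unsigned long)base64Length);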
I use this method to scale my image:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
I hope someone can help me out. Thanks.
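A classic cause of "the file is damaged" with this setup: in an application/x-www-form-urlencoded POST body, '+' decodes as a space, so every '+' in the base64 string is corrupted in transit (the stringByReplacingOccurrencesOfString:@"+" withString:@"%2B" workarounds later on this page address the same issue). A minimal sketch of percent-encoding the payload first, assuming the postData helper sends the string as-is:

// Percent-encode everything except alphanumerics so '+' and '/'
// survive form-urlencoded transport.
NSString *safeString = [encodedString stringByAddingPercentEncodingWithAllowedCharacters:
                           [NSCharacterSet alphanumericCharacterSet]];
[utils postData:@"example.com" : [NSString stringWithFormat:@"image=%@", safeString]];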

I ended up using another way without base64, so no answers are required.

Your code looks right.
Have you tried copying your base64 string into that website to decode it?
You can check the image size with:
NSLog(@"dataSize: %@", [NSByteCountFormatter stringFromByteCount:data.length countStyle:NSByteCountFormatterCountStyleFile]);
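As a side note, base64Encoding (used in the question) has been deprecated since iOS 7; the built-in replacement is base64EncodedStringWithOptions:. A minimal sketch using the question's variables:

NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
// Pass 0 for an unbroken single-line base64 string (iOS 7+).
NSString *encodedString = [imageData2 base64EncodedStringWithOptions:0];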

Related

UIImageJPEGRepresentation returns the wrong size

I want to compress a UIImage, but I get the wrong size.
I'm sure UIImageJPEGRepresentation is the cause. How do I fix this?
Sorry for my poor English.
+ (NSData *)compressDataWithImg:(UIImage *)originalImage compression:(CGFloat)compression size:(CGFloat)size {
    // Recursively lower the JPEG quality until the data fits under `size` KB
    // or the quality floor is reached.
    NSData *data = [NSData dataWithData:UIImageJPEGRepresentation(originalImage, compression)];
    if ((data.length / 1024.0) > size) {
        compression = compression - 0.1;
        if (compression > 0) {
            return [[self class] compressDataWithImg:originalImage compression:compression size:size];
        } else {
            return data;
        }
    } else {
        return data;
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *fullImage = info[UIImagePickerControllerOriginalImage];
    fullImage = [fullImage fixOrientationToUp];
    UIImage *imgData = [fullImage scaleToSizeWithLimit:0];
    NSData *data = [Util compressDataWithImg:imgData compression:0.9 size:256]; // this is right: size 2M
    UIImage *image = [UIImage imageWithData:data]; // I use this UIImage
    NSData *xxx = [NSData dataWithData:UIImageJPEGRepresentation(image, 1)]; // wrong size: 7~8M. WHY?...
    if (self.imageSelectBlock) {
        self.imageSelectBlock(image);
    }
    [picker dismissViewControllerAnimated:YES completion:NULL];
}
Thanks for helping me.

I found that when I use UIImageJPEGRepresentation and imageWithData to process an image again and again, the length of the image's data gradually increases.
Test code:
UIImage *image = [UIImage imageNamed:@"test.png"];
NSLog(@"first: %lu", UIImageJPEGRepresentation(image, 1.0).length);
NSLog(@"second: %lu", UIImageJPEGRepresentation([UIImage imageWithData:UIImageJPEGRepresentation(image, 1.0)], 1.0).length);
NSLog(@"third: %lu", UIImageJPEGRepresentation([UIImage imageWithData:UIImageJPEGRepresentation([UIImage imageWithData:UIImageJPEGRepresentation(image, 1.0)], 1.0)], 1.0).length);
Logs:
first: 361586
second: 385696
third: 403978
I think this problem is caused by UIImageJPEGRepresentation; the above is just a test.
I also found a related question:
When I am using UIImagePNGRepresentation or UIImageJPEGRepresentation for converting UIImage into NSData, the image size is increased too much
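This is expected behavior for a lossy codec: every decode/re-encode cycle re-compresses the decoded bitmap, and at quality 1.0 the encoder spends extra bytes preserving the previous pass's artifacts, so the file grows. The practical fix is to keep the compressed NSData as the canonical bytes instead of round-tripping through UIImage. A minimal sketch, assuming the UIImage is only needed for display:

// Compress once and keep the NSData; upload/store these bytes.
NSData *jpegData = UIImageJPEGRepresentation(originalImage, 0.9);
// Decode only for display; never re-encode this UIImage.
UIImage *displayImage = [UIImage imageWithData:jpegData];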

Image becomes blurred when applying CIFilter

I am working on an iOS app that crops a rectangular image from the camera. I am using CIDetector to get the rectangle features and CIFilter to crop the image, but after applying the filter the resulting image quality becomes very poor.
Here is my code below.
I am getting the video capture output from the following delegate method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // Convert into CIImage to find the rect features.
    self.sourceImage = [[CIImage alloc] initWithCGImage:[self imageFromSampleBuffer:sampleBuffer].CGImage options:nil];
}
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciimg = [CIImage imageWithCVPixelBuffer:pb];
    // Render the result.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef ref = [context createCGImage:ciimg fromRect:ciimg.extent];
    UIImage *image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationUp];
    CFRelease(ref);
    return image;
}
I am also running an NSTimer in the background that detects rect features in the captured source image every 0.2 seconds:
- (void)performRectangleDetection:(CIImage *)image {
    if (image == nil)
        return;
    NSArray *rectFeatures = [self.rectangleDetector featuresInImage:image];
    if ([rectFeatures count] > 0) {
        [self capturedImage:image];
    }
}

- (void)capturedImage:(CIImage *)image
{
    NSArray *rectFeatures = [self.rectangleDetector featuresInImage:image];
    CIImage *resultImage = [image copy];
    for (CIRectangleFeature *feature in rectFeatures) {
        resultImage = [image imageByApplyingFilter:@"CIPerspectiveCorrection"
                               withInputParameters:@{@"inputTopLeft": [CIVector vectorWithCGPoint:feature.topLeft],
                                                     @"inputTopRight": [CIVector vectorWithCGPoint:feature.topRight],
                                                     @"inputBottomLeft": [CIVector vectorWithCGPoint:feature.bottomLeft],
                                                     @"inputBottomRight": [CIVector vectorWithCGPoint:feature.bottomRight]}];
    }
    UIImage *capturedImage = [[UIImage alloc] initWithCIImage:resultImage];
    UIImage *finalImage = [self imageWithImage:capturedImage scaledToSize:capturedImage.size];
}
The finalImage is produced by passing the captured image through this method:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
The final image sometimes comes out blurred. Is that because of the filter or because of the camera output image? Please help me solve this.
It is likely that you are not recreating the final image with the correct scale factor.
UIImage *finalImage = [UIImage imageWithCGImage:resultImage
                                          scale:original.scale
                                    orientation:original.imageOrientation];
If this doesn't solve the issue, please provide more code sample from the camera input, and how you converted the final CIImage from the filters into UIImage.
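For reference, another common source of softness is wrapping the filtered CIImage with -initWithCIImage: and then redrawing it; rendering through a CIContext at the image's full extent avoids that. A minimal sketch (variable names follow the question's code; the screen-scale choice is an assumption):

// Render the filtered CIImage into a real bitmap once, at full extent.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:resultImage fromRect:resultImage.extent];
UIImage *finalImage = [UIImage imageWithCGImage:cgImage
                                          scale:[UIScreen mainScreen].scale
                                    orientation:UIImageOrientationUp];
CGImageRelease(cgImage);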
Use the following method to crop the image:
- (UIImage *)cropImage:(UIImage *)image withRect:(CGRect)rect {
    CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, rect);
    UIImage *croppedImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // Create-rule object: release it to avoid a leak.
    return croppedImage;
}
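One caveat with this helper: CGImageCreateWithImageInRect works in pixel coordinates, while UIImage sizes are in points, so on Retina devices a rect expressed in points should be multiplied by the image's scale first. A hedged sketch of that adjustment (method name is illustrative):

// Convert a point-space rect into the pixel space of the backing CGImage.
- (UIImage *)cropImage:(UIImage *)image toPointRect:(CGRect)rect {
    CGFloat scale = image.scale;
    CGRect pixelRect = CGRectMake(rect.origin.x * scale, rect.origin.y * scale,
                                  rect.size.width * scale, rect.size.height * scale);
    CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, pixelRect);
    UIImage *cropped = [UIImage imageWithCGImage:cgImage
                                           scale:scale
                                     orientation:image.imageOrientation];
    CGImageRelease(cgImage);
    return cropped;
}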

Objective-C - Setting an image from the camera on a UIButton

I am trying to set an image from the camera on my button, but it does not accept the cropped image; by default it uses the original width and height of the captured image, which makes the image look shrunken from top to bottom.
When I tap the UIButton the camera opens, and after the image is captured it is shown on this UIButton:
- (IBAction)ProfImageBtnCliked:(id)sender
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    [picker setAllowsEditing:YES];
    picker.delegate = self;
    if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear] || [UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront])
    {
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
        [self presentViewController:picker animated:YES completion:nil];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Alert" message:@"No Camera Available" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:nil];
    NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1.0);
    [ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
    ProfImgButton.userInteractionEnabled = NO;
    ProfImgButton.contentMode = UIViewContentModeScaleAspectFill;
    ProfImgButton.clipsToBounds = YES;
    CGSize constraint = CGSizeMake(200, 200);
    NSData *imgData = UIImageJPEGRepresentation([self imageWithImage:[UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
    NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
    NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    NSLog(@"%@", imageString);
    // Assign the base64 image to the image string.
    Base64ImageString = [imageString stringByReplacingOccurrencesOfString:@"+" withString:@"%2B"];
}
// Reduce the image size.
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
You are setting the original image taken from your camera in your code here:
[picker dismissViewControllerAnimated:YES completion:nil];
NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1.0);
[ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
while you are resizing the image below:
CGSize constraint = CGSizeMake(200, 200);
NSData *imgData = UIImageJPEGRepresentation([self imageWithImage:[UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
This data is not used anywhere; you are not setting it on your ProfImgButton. All you need to do is set the resized image data on your button:
[ProfImgButton setImage:[UIImage imageWithData:imgData] forState:UIControlStateNormal];
Your cropping code is wrong. Write it like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:nil];
    CGSize size = CGSizeMake(200, 200);
    UIImage *originalImage = [info objectForKey:@"UIImagePickerControllerEditedImage"];
    // Crop (scale) the image.
    UIGraphicsBeginImageContext(size);
    [originalImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Now assign the image to the button.
    [ProfImgButton setImage:croppedImage forState:UIControlStateNormal];
    ProfImgButton.userInteractionEnabled = NO;
    ProfImgButton.contentMode = UIViewContentModeScaleAspectFill;
    ProfImgButton.clipsToBounds = YES;
    NSData *imgData = UIImageJPEGRepresentation(croppedImage, 0);
    NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
    NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    NSLog(@"%@", imageString);
    // Assign the base64 image to the image string.
    Base64ImageString = [imageString stringByReplacingOccurrencesOfString:@"+" withString:@"%2B"];
}
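A side note on the snippet above: a compression argument of 0 asks UIImageJPEGRepresentation for maximum compression (lowest quality), which can look rough for a profile photo. A middle value is a common compromise (the 0.7 here is an assumption, not from the original):

// 0.0 = smallest file / lowest quality, 1.0 = largest file / best quality.
NSData *imgData = UIImageJPEGRepresentation(croppedImage, 0.7); // 0.7 is illustrative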

Sending image in base64 with SOAP in iOS Objective-C

I am a newbie in iOS development. I am trying to send a captured image by encoding it into base64 format over SOAP, but I don't know how to do this.
This is my imagePickerController delegate:
// Delegate method for picking images.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info valueForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage])
    {
        UIImage *photoTaken = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
        // Save the photo to the library only if it wasn't already saved, i.e. it has just been taken.
        if (picker.sourceType == UIImagePickerControllerSourceTypeCamera)
        {
            UIImageWriteToSavedPhotosAlbum(photoTaken, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            NSData *data = [[NSData alloc] initWithData:UIImagePNGRepresentation(photoTaken)];
            base64 = [[NSString alloc] init];
            base64 = [data base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithCarriageReturn];
        }
    }
    // [picker dismissModalViewControllerAnimated:YES];
    [picker dismissViewControllerAnimated:YES completion:NULL];
    [picker release];
}
In didFinishPickingMediaWithInfo:
UIImage *chosenImage = info[UIImagePickerControllerEditedImage];
// Encode the image to base64.
imgData = [[NSData alloc] initWithData:UIImagePNGRepresentation(chosenImage)];
_base64 = [[NSString alloc] init];
_base64 = [imgData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
self.tempbase = _base64;
and I reference tempbase in the SOAP message.

The image will be huge, which can hurt your app's performance, so decrease the image size first by resizing it:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newsize
{
    UIGraphicsBeginImageContext(newsize);
    [image drawInRect:CGRectMake(0, 0, newsize.width, newsize.height)];
    UIImage *newImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImg;
}
Now convert this smaller UIImage into NSData:
NSData *imgData=[[NSData alloc] initWithData:UIImagePNGRepresentation(image)];
Then convert the NSData into a base64 string using a third-party library:
base64.h
NSData+Base64.h
NSString *imgString = [imgData base64EncodedString];
imgString = [imgString stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
Now send this string to your services.
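As an aside, since iOS 7 the third-party category is unnecessary: NSData can produce the base64 string directly, as some snippets above already do. A minimal sketch:

// Built-in since iOS 7; no Base64 category required.
NSData *imgData = UIImagePNGRepresentation(image);
NSString *imgString = [imgData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];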
Code:
UIGraphicsBeginImageContext(self.drawImage.frame.size);
[self.drawImage.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *imageView = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData* data = UIImageJPEGRepresentation(imageView, 1.0f);
[Base64 initialize];
NSString *strEncoded = [Base64 encode:data];
NOTE: drawImage is a UIImageView instance; import the Base64.h class.

UIImagePickerController forces images in other views to full size

In one spot of my app, I need to use the camera, so I call up the UIImagePickerController. Unfortunately, once I return from the controller, most of the pictures in the app are full size, no matter what their UIImageView attributes say. The exception appears to be UIImageViews in UITableViewCells. This applies to all views in the app, not just ones that have direct connection to the viewcontroller that called the UIImagePickerController. Once while I was messing around, trying to troubleshoot, the problem seemed to disappear on its own, though I have not been able to replicate that.
The code is as follows.
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIAlertView *errorAlertView = [[UIAlertView alloc] initWithTitle:@"Error"
                                                                 message:@"Device has no camera"
                                                                delegate:nil
                                                       cancelButtonTitle:@"OK"
                                                       otherButtonTitles:nil];
        [errorAlertView show];
        [_app.navController popPage];
    }
    else if (firstTime) {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.delegate = self;
        picker.allowsEditing = YES;
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        firstTime = false;
        [self presentViewController:picker animated:YES completion:NULL];
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *cameraImage = info[UIImagePickerControllerEditedImage];
    [picker dismissViewControllerAnimated:YES completion:NULL];
    NSString *folderName = @"redApp";
    if ([_page hasChild:[RWPAGE FOLDER]]) {
        folderName = [_page getStringFromNode:[RWPAGE FOLDER]];
    }
    NSDate *datetimeNow = [[NSDate alloc] init];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss-SSS"];
    NSString *filename = [NSString stringWithFormat:@"%@.png", [dateFormatter stringFromDate:datetimeNow]];
    NSString *applicationDocumentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *folderPath = [applicationDocumentsDir stringByAppendingPathComponent:folderName];
    NSString *filePath = [folderPath stringByAppendingPathComponent:filename];
    NSError *error = nil;
    if (![[NSFileManager defaultManager] fileExistsAtPath:folderPath isDirectory:nil]) {
        [[NSFileManager defaultManager] createDirectoryAtPath:folderPath withIntermediateDirectories:NO
                                                   attributes:nil error:&error];
    }
    if (error != nil) {
        NSLog(@"Create directory error: %@", error);
    }
    [UIImagePNGRepresentation(cameraImage) writeToFile:filePath options:NSDataWritingAtomic error:&error];
    if (error != nil) {
        NSLog(@"Error in saving image to disk. Error : %@", error);
    }
    RWXmlNode *nextPage = [_xml getPage:[_page getStringFromNode:[RWPAGE CHILD]]];
    nextPage = [nextPage deepClone];
    [nextPage addNodeWithName:[RWPAGE FILEPATH] value:filePath];
    [_app.navController pushViewWithPage:nextPage];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [picker dismissViewControllerAnimated:YES completion:NULL];
    [_app.navController popPage];
}
Edit:
To expand upon the above.
The base of the app is a Custom Container View Controller, acting mostly like a Navigation Controller. When a user navigates to a page (what I call the combination of a view and view controller) it is displayed on the Custom Container view, and the previous page is stored in a stack.
One of my pages calls up a UIImagePicker. Once the image picker has been closed and I return to the app, problems appear across the app when I open new pages. I don't see problems on every page, but they occur on several independent pages. Most pages look completely unaffected, while the problem pages appear not to obey their constraints.
If you want to compress an image, use:
+ (UIImage *)resizeImage:(UIImage *)image newSize:(CGSize)newSize
{
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGImageRef imageRef = image.CGImage;
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Set the quality level to use when rescaling.
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
    CGContextConcatCTM(context, flipVertical);
    // Draw into the context; this scales the image.
    CGContextDrawImage(context, newRect, imageRef);
    // Get the resized image from the context as a UIImage.
    CGImageRef newImageRef = CGBitmapContextCreateImage(context);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    CGImageRelease(newImageRef);
    UIGraphicsEndImageContext();
    return newImage;
}
or
+ (UIImage *)scaleImage:(UIImage *)image toSize:(CGSize)targetSize {
    CGFloat scaleFactor = 1.0;
    if (image.size.width > targetSize.width || image.size.height > targetSize.height) {
        if (!((scaleFactor = (targetSize.width / image.size.width)) > (targetSize.height / image.size.height))) // scale to fit width, or
            scaleFactor = targetSize.height / image.size.height;                                                // scale to fit height.
    }
    UIGraphicsBeginImageContext(targetSize);
    CGRect rect = CGRectMake((targetSize.width - image.size.width * scaleFactor) / 2,
                             (targetSize.height - image.size.height * scaleFactor) / 2,
                             image.size.width * scaleFactor,
                             image.size.height * scaleFactor);
    [image drawInRect:rect];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaledImage;
}
Try it like this:
UIImage *thumbnail = [UIImage imageNamed:@"yourimage.png"];
CGSize itemSize = CGSizeMake(35, 35);
UIGraphicsBeginImageContext(itemSize);
CGRect imageRect = CGRectMake(0.0, 0.0, itemSize.width, itemSize.height);
[thumbnail drawInRect:imageRect];
// This image now contains the compressed thumbnail.
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
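A general caveat that applies to several snippets on this page: UIGraphicsBeginImageContext renders at a fixed scale of 1.0, so its output looks blurry on Retina screens. Passing 0.0 to the WithOptions variant picks up the device's screen scale. A hedged sketch of the Retina-aware version of the thumbnail code above:

// 0.0 as the scale argument uses the main screen's scale,
// keeping the thumbnail sharp on Retina displays.
UIGraphicsBeginImageContextWithOptions(itemSize, NO, 0.0);
[thumbnail drawInRect:CGRectMake(0.0, 0.0, itemSize.width, itemSize.height)];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();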
