Base64 string invalid in iOS

I am converting an image into a base64 string like this:
ALAssetRepresentation *rep = [imageAsset defaultRepresentation];
Byte *buffer = (Byte *)malloc(rep.size);
UIImage *copyOfOriginalImage = [UIImage imageWithCGImage:[rep fullResolutionImage]];
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
imageData = UIImageJPEGRepresentation(copyOfOriginalImage, 1.0);
Then I convert this NSData into a base64 string like this:
strEncoded = [imageData base64EncodedStringWithOptions:NSDataBase64Encoding76CharacterLineLength];
I used an online base64 converter to test my image (the same image in the online tool and in my app).
FROM THE TOOL
9j/4AAQSkZJRgABAQAASABIAAD/4QBYRXhpZgAATU0AKgAAAAgAAgESAAMAAAABAAYAAIdpAAQAAAABAAAAJgAAAAAAA6ABAAMAAAABAAEAAKACAAQAAAABAAAMwKADAAQAAAABAAAJkAAAAAD/7QA4
FROM THE APP
\/9j\/4AAQSkZJRgABAQAASABIAAD\/4QBMRXhpZgAATU0AKgAAAAgAAgESAAMAAAABAAEAAIdpAAQAAAABAAAAJgAAAAAAAqACAAQAAAABAAAMwKADAAQAAAABAAAJkAAAAAD\/
The starting portions are different. The string produced by the online tool is accepted by my server, but the one generated in the app is not.
What is the reason for this?
Please help me.
Thanks
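For reference, a minimal sketch that base64-encodes the asset's raw bytes directly, skipping the JPEG recompression step above (assumes iOS 7+ for base64EncodedStringWithOptions:; variable names are illustrative):
ALAssetRepresentation *rep = [imageAsset defaultRepresentation];
NSMutableData *raw = [NSMutableData dataWithLength:(NSUInteger)rep.size];
NSError *error = nil;
// Read the on-disk file bytes, not a re-encoded UIImage.
NSUInteger read = [rep getBytes:raw.mutableBytes fromOffset:0 length:(NSUInteger)rep.size error:&error];
raw.length = read; // trim in case the read came up short
NSString *strEncoded = [raw base64EncodedStringWithOptions:0]; // single line, no breaks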

Related

How to convert NSData object to CMSampleBuffer

I have two apps: one converts CMSampleBuffer video data to an NSData object and transmits it over the network. The other app receives the data; how can I convert the NSData object back to a CMSampleBuffer?
Here is how I currently convert a CMSampleBuffer to an NSData object:
CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sampleBuffer);
size_t length = CMBlockBufferGetDataLength(blockBufferRef);
// Heap-allocate the copy; video frames can be too large for the stack.
Byte *buffer = (Byte *)malloc(length);
CMBlockBufferCopyDataBytes(blockBufferRef, 0, length, buffer);
NSData *data = [NSData dataWithBytesNoCopy:buffer length:length freeWhenDone:YES];
Or I just need another way to transport the video data?
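For the reverse direction, a minimal sketch under the assumption that the CMFormatDescription and timing info are transported separately (the raw bytes alone do not carry them); the function name is illustrative:
#import <CoreMedia/CoreMedia.h>

CMSampleBufferRef CreateSampleBufferFromData(NSData *data,
                                             CMFormatDescriptionRef format,
                                             CMSampleTimingInfo timing)
{
    CMBlockBufferRef blockBuffer = NULL;
    // Let Core Media allocate a block of the right size, then copy the bytes in.
    OSStatus status = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault,
                                                         NULL, data.length,
                                                         kCFAllocatorDefault,
                                                         NULL, 0, data.length, 0,
                                                         &blockBuffer);
    if (status != kCMBlockBufferNoErr) return NULL;
    CMBlockBufferAssureBlockMemory(blockBuffer);
    CMBlockBufferReplaceDataBytes(data.bytes, blockBuffer, 0, data.length);

    CMSampleBufferRef sampleBuffer = NULL;
    size_t sampleSize = data.length;
    status = CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, true,
                                  NULL, NULL, format,
                                  1, 1, &timing, 1, &sampleSize,
                                  &sampleBuffer);
    CFRelease(blockBuffer);
    return (status == noErr) ? sampleBuffer : NULL;
}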

NSImage and UIImage give different NSData representations

Scenario:
I have an image in the iPhone camera roll. I access it using ALAssetsLibrary and get an ALAsset object. I then get a UIImage and an NSData object from it using something like the following code.
ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref) {
    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    NSData *data = UIImageJPEGRepresentation(largeimage, 1.0f);
}
I then copy the image from the camera roll onto my Mac using Image Capture. In my Mac code I use NSImage to open the copied image and try to get an NSData representation using the following code.
NSImage *image = [[NSImage alloc] initWithContentsOfURL:fileURL];
NSBitmapImageRep *imgRep = [[image representations] objectAtIndex:0];
NSData *data = [imgRep representationUsingType:NSJPEGFileType properties:nil];
Problem:
Unfortunately, the two NSData representations I get are very different. I want to get the same NSData representation in both cases (since it is the same file), so that I can hash the NSData objects and compare the hashes to conclude that the two are (possibly) the same image. Ideally, I would want functions like the following:
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset;
//or
-(NSData *) getDataFromUIImage:(UIImage*)image;
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url;
//or
-(NSData *) getDataFromNSImage:(NSImage*)image;
Such that the NSData* representations I get in OS X and iOS are exactly the same, given that they come from the same source image.
What I have tried:
I have tried playing around with how I get the UIImage object from the ALAsset object, and I have tried UIImagePNGRepresentation (and the corresponding call for getting NSData in OS X). I have also tried different parameters for getting the representation in OS X, but nothing has come through. I have also tried to create a CGImageRef on both platforms, convert that to bitmaps, and read them pixel by pixel, and even those seem to be off (and yes, I do realise that NSBitmapImageRep has a different co-ordinate system).
I did eventually find a way to do what I wanted. The ALAssetRepresentation class's getBytes:fromOffset:length:error: method can be used to get an NSData object that is the same as [NSData dataWithContentsOfURL:fileURL] in OS X. Note that doing this from the UIImage is not possible, since UIImage performs some processing on the image. Here is what the requested functions would look like.
//In iOS
-(NSData *) getDataFromALAsset:(ALAsset *)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc(rep.size);
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
    NSData *assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    return assetData;
}
//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL *)url
{
    return [NSData dataWithContentsOfURL:url];
}
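To compare the two representations, a minimal hashing helper using CommonCrypto (available on both iOS and OS X; the method name is illustrative):
#import <CommonCrypto/CommonDigest.h>

// Hex-encoded SHA-1 digest of an NSData object; identical bytes produce
// identical digests on both platforms.
-(NSString *) sha1HexForData:(NSData *)data
{
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}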

length property returns wrong size (bytes) for an image

I have a .png file in my resources folder (actual size: 411 KB).
When I convert the UIImage to NSData and access the length property, it gives me the wrong value.
Code...
UIImage *image = [UIImage imageNamed:@"sample.png"];
NSData *imgData = [[NSData alloc] initWithData:UIImageJPEGRepresentation(image, 1.0)];
NSUInteger imageSize = imgData.length;
NSLog(@"Image size in KB is %lu", (unsigned long)(imageSize / 1024)); // returns 631 KB
Please let me know if there is another property that helps.
So here is my requirement:
I want to know the size of the image I pick from UIImagePickerController. The size I see in the Finder and the size returned after picking the image from the library are totally different. Is there any other property that can be used instead of length?
You are converting a PNG to a JPEG, so a different file size should be expected.
If you wish to get the file size of the original PNG image, do the following.
NSString *path = [[NSBundle mainBundle] pathForResource:@"sample" ofType:@"png"];
NSData *rawData = [NSData dataWithContentsOfFile:path];
NSLog(@"%lu", (unsigned long)rawData.length);
try this:
// Serialize the data length as a 4-byte little-endian value
// (e.g., as a length prefix for network transfer).
unsigned int len = (unsigned int)[data length];
uint32_t little = (uint32_t)NSSwapHostIntToLittle(len);
NSData *byteData = [NSData dataWithBytes:&little length:4];
When you loaded the image, you decompressed it. When you created imgData, the image was not recompressed with the same algorithm, so there is no reason to expect the two to have the same size.
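For the original requirement (the size of an image picked from the library), a sketch that reads the on-disk byte count through ALAssetsLibrary, assuming the photo comes from the library so UIImagePickerControllerReferenceURL is present in the info dictionary:
#import <AssetsLibrary/AssetsLibrary.h>

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        // size is the byte count of the asset file on disk.
        long long bytes = asset.defaultRepresentation.size;
        NSLog(@"Picked image is %lld bytes (%lld KB)", bytes, bytes / 1024);
    } failureBlock:^(NSError *error) {
        NSLog(@"Could not load asset: %@", error);
    }];
}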

Interpret XMP-Metadata in ALAssetRepresentation

When a user makes some changes (cropping, red-eye removal, ...) to photos in the built-in Photos.app on iOS, the changes are not applied to the fullResolutionImage returned by the corresponding ALAssetRepresentation.
However, the changes are applied to the thumbnail and the fullScreenImage returned by the ALAssetRepresentation.
Furthermore, information about the applied changes can be found in the ALAssetRepresentation's metadata dictionary via the key @"AdjustmentXMP".
I would like to apply these changes to the fullResolutionImage myself to preserve consistency. I've found out that on iOS 6+, CIFilter's filterArrayFromSerializedXMP:inputImageExtent:error: can convert this XMP metadata to an array of CIFilters:
ALAssetRepresentation *rep;
NSString *xmpString = rep.metadata[@"AdjustmentXMP"];
NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];
CIImage *image = [CIImage imageWithCGImage:rep.fullResolutionImage];
NSError *error = nil;
NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                             inputImageExtent:image.extent
                                                        error:&error];
if (error) {
    NSLog(@"Error during CIFilter creation: %@", [error localizedDescription]);
}
CIContext *context = [CIContext contextWithOptions:nil];
for (CIFilter *filter in filterArray) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = [filter outputImage];
}
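Note that the loop above only wires the filters together; to actually obtain a bitmap, render the result through the CIContext, e.g. (a sketch reusing the variables from the snippet above):
CGImageRef result = [context createCGImage:image fromRect:image.extent];
UIImage *adjusted = [UIImage imageWithCGImage:result
                                        scale:(CGFloat)rep.scale
                                  orientation:(UIImageOrientation)rep.orientation];
CGImageRelease(result);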
However, this works only for some filters (cropping, auto-enhance), not for others such as red-eye removal; in those cases the CIFilters have no visible effect. Therefore, my questions:
Is anyone aware of a way to create a red-eye removal CIFilter? (In a way consistent with the Photos.app. The filter with the key kCIImageAutoAdjustRedEye is not enough; e.g., it does not take parameters for the positions of the eyes.)
Is there a possibility to generate and apply these filters under iOS 5?
ALAssetRepresentation *representation = [[self assetAtIndex:index] defaultRepresentation];
// Create a buffer to hold the data for the asset's image.
uint8_t *buffer = (uint8_t *)malloc(representation.size);
// Copy the data from the asset into the buffer.
NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];
if (length == 0)
    return nil;
// Convert the buffer into an NSData object, and free the buffer after.
NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:length freeWhenDone:YES];
// Set up a dictionary with a UTI hint. The UTI hint identifies the type
// of image we are dealing with (that is, a JPEG, PNG, or possibly a RAW file).
NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   (id)[representation UTI], kCGImageSourceTypeIdentifierHint, nil];
// Create a CGImageSource with the NSData. An image source can
// contain any number of thumbnails and full images.
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);
[adata release];
// Get a copy of the image properties from the CGImageSourceRef.
CFDictionaryRef imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
CFNumberRef imageWidth = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);
int w = 0;
int h = 0;
CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
CFNumberGetValue(imageHeight, kCFNumberIntType, &h);
// Clean up memory.
CFRelease(imagePropertiesDictionary);
CFRelease(sourceRef);

How to update EXIF of ALAsset without changing the image?

I use setImageData:metadata:completionBlock: of ALAsset to update the EXIF (metadata) of an asset.
I just want to update the metadata, but this method requires image data as its first parameter. I used the code below to generate that image data, but it modified my image (I checked the file size and file hash).
ALAssetRepresentation *dr = asset.defaultRepresentation;
UIImage *image = [UIImage imageWithCGImage:dr.fullResolutionImage
                                     scale:dr.scale
                               orientation:(UIImageOrientation)dr.orientation];
NSData *data = UIImageJPEGRepresentation(image, 1);
Is there any other method I could use to update just the EXIF of an ALAsset? Or any way to generate the right image data for setImageData:metadata:completionBlock:?
I found a way to generate the image data without re-encoding. Code below:
Byte *buffer = (Byte *)malloc(dr.size);
NSUInteger k = [dr getBytes:buffer fromOffset:0 length:dr.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:k freeWhenDone:YES];
With this data I can use setImageData:metadata:completionBlock: to update only the EXIF of the ALAsset.
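A usage sketch for the write-back (assumes the asset is editable, i.e. it was created by the app; newMetadata is an illustrative dictionary holding the updated EXIF):
if (asset.editable) {
    [asset setImageData:data
               metadata:newMetadata
        completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error) {
                NSLog(@"Metadata update failed: %@", error);
            }
    }];
}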
