How can I save a compressed JPEG from an ALAsset? - ios

I have an ALAsset object retrieved from the ALAssetsLibrary, and I want to extract a compressed JPEG from it in order to send it to a web service.
Any suggestions on where to start?
Edit:
I've found a way to get NSData out of the ALAsset
ALAssetRepresentation *representation = [asset defaultRepresentation];
NSError *err = nil;
Byte *buffer = (Byte *)malloc((size_t)representation.size);
NSUInteger buffered = [representation getBytes:buffer fromOffset:0 length:(NSUInteger)representation.size error:&err];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
but I can't find a way to reduce the size of the image by resizing and compressing it.
My idea was to have something like:
UIImage *myImage = [UIImage imageWithData:data];
//resize image
NSData *compressedData = UIImageJPEGRepresentation(myImage, 0.5);
But, first of all, even without resizing, just using these two lines of code, compressedData ends up bigger than data.
And second, I'm not sure about the best way to resize the UIImage.

You can use
[theAsset thumbnail]
Or:
Compressing alone can result in bigger files after some point; you need to resize the image first, e.g. via:
+ (UIImage *)imageWithCGImage:(CGImageRef)cgImage scale:(CGFloat)scale orientation:(UIImageOrientation)orientation

It's easy to get the CGImage from an ALAssetRepresentation:
ALAssetRepresentation *repr = [asset defaultRepresentation];
// use the asset representation's orientation and scale in order to set the UIImage
// up correctly
UIImage *image = [UIImage imageWithCGImage:[repr fullResolutionImage] scale:[repr scale] orientation:(UIImageOrientation)[repr orientation]];
// then do whatever you want with the UIImage instance
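From there, one way to get a smaller JPEG for upload is to draw the image into a smaller context and then compress it. A minimal sketch, assuming the image variable from the snippet above; the 1024-pixel target and 0.7 quality are arbitrary values to tune for your web service:
// Scale the full-resolution image down before compressing it.
CGFloat maxDimension = 1024.0; // hypothetical target size, tune for your web service
CGFloat ratio = MIN(1.0, MIN(maxDimension / image.size.width, maxDimension / image.size.height)); // never upscale
CGSize targetSize = CGSizeMake(image.size.width * ratio, image.size.height * ratio);
UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
[image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// The JPEG produced from the smaller image should now be considerably smaller than the original data.
NSData *compressedData = UIImageJPEGRepresentation(resizedImage, 0.7);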

Related

Different data size received from asset image in foreground and background in iOS 9.1?

I want to take a backup of images to the cloud, so I calculate the MD5 of an image in iOS. The issue is that the MD5 differs depending on whether it is calculated while the app is in the foreground or in the background. This issue occurs only in iOS 9.1. I use the asset library to fetch the images. The function below is used to get the data (the two data blobs differ from each other depending on whether the application is in the foreground or in the background):
ALAssetRepresentation *assetRep = [asset defaultRepresentation];
CGImageRef imgRef = [assetRep fullScreenImage];
UIImage *img = [UIImage imageWithCGImage:imgRef
                                   scale:1.0f
                             orientation:(UIImageOrientation)assetRep.orientation];
NSData *data = UIImageJPEGRepresentation(img, 0.9); // quality is expected in the 0.0-1.0 range
Thanks in advance....
Have a look at ALAssetRepresentation-MD5, which calculates the MD5 hash from an ALAssetRepresentation without creating a UIImage or using UIImageJPEGRepresentation. I assume that one of these UIKit-related steps is responsible for your issue, even though they should be thread-safe.
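For reference, the core idea looks roughly like this (a sketch hashing the raw asset bytes with CommonCrypto in one go; the library itself may stream the data in chunks, and error handling is omitted):
#import <CommonCrypto/CommonDigest.h>
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSError *error = nil;
// Read the raw asset bytes directly, avoiding the UIKit round trip suspected above.
NSMutableData *rawData = [NSMutableData dataWithLength:(NSUInteger)rep.size];
NSUInteger read = [rep getBytes:rawData.mutableBytes fromOffset:0 length:(NSUInteger)rep.size error:&error];
unsigned char digest[CC_MD5_DIGEST_LENGTH];
CC_MD5(rawData.bytes, (CC_LONG)read, digest);
// Format the digest as a hex string.
NSMutableString *md5 = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
    [md5 appendFormat:@"%02x", digest[i]];
}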

Get thumbnail from ALAssetsRepresentation as NSData

I am writing NSData to a file and saving it in the device's app Documents folder. For that, is it possible to get a thumbnail from an ALAssetRepresentation object in NSData format? If so, any helpful links?
I couldn't find anything similar, other than getting a CGImageRef from the ALAssetRepresentation. I don't want the CGImageRef format, as I would then have to use UIImageJPEGRepresentation or UIImagePNGRepresentation to convert it to NSData.
Try this:
CGImageRef iref = [myasset thumbnail];
if (iref)
{
    UIImage *theThumbnail = [UIImage imageWithCGImage:iref];
    NSData *thumbnailData = UIImagePNGRepresentation(theThumbnail);
}
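Inside that if block you could then write the data into the app's Documents folder, along these lines (a sketch; the file name is arbitrary):
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *filePath = [documentsDir stringByAppendingPathComponent:@"thumbnail.png"]; // hypothetical file name
[thumbnailData writeToFile:filePath atomically:YES];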

UIImageJPEGRepresentation doesn't keep my scale settings

When I use initWithCGImage: with a certain scale and then UIImageJPEGRepresentation to get data from that image, the system doesn't seem to keep my scale setting. Any idea why?
Following is my code and the log I get :
ALAssetRepresentation *rep = [asset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
UIImageOrientation orientation = [self orientationForAsset:asset];
// Scale the image
UIImage *scaledImage = [[UIImage alloc] initWithCGImage:iref scale:2.0 orientation:orientation];
NSLog(@"Scaled image size %@", NSStringFromCGSize(scaledImage.size));
// Get data from image
NSData *scaledImageData = UIImageJPEGRepresentation(scaledImage, 0.8);
// Check the image size of the data
UIImage *buildedImage = [UIImage imageWithData:scaledImageData];
NSLog(@"Data image size of %@", NSStringFromCGSize(buildedImage.size));
Gives this log:
"Scaled image size {1944, 1296}"
"Data image size of {3888, 2592}"
That's really strange because the two images are supposed to be exactly the same.
A JPEG only stores pixel dimensions, not the UIImage scale, so the scale is lost when you round-trip through UIImageJPEGRepresentation. You should use the +[UIImage imageWithData:scale:] method to reapply it when decoding.
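A sketch of what that looks like with the code above (scale 2.0 matching the value used when the image was created):
// Recreate the image from the JPEG data, reapplying the scale explicitly.
UIImage *rebuiltImage = [UIImage imageWithData:scaledImageData scale:2.0];
NSLog(@"Rebuilt image size %@", NSStringFromCGSize(rebuiltImage.size)); // should now log {1944, 1296}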

How to get UIImage(ImageURL) height and width without converting to NSData

In my project I need to show images of different sizes in a zig-zag fashion, so I convert the image URLs coming from the service to NSData and then get the UIImage. My code is:
NSURL *url = [NSURL URLWithString:[[_result objectAtIndex:i] valueForKey:@"PImage"]];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *image = [UIImage imageWithData:data];
So I can get the image size (width and height), but my problem is that I need to create a UIView according to the image size. This code works fine for me, but it takes too much time (almost 25 seconds) to load 8 images; I figured converting the UIImage to NSData is what takes the time. Is there any way to get the image size (width and height) without converting it into NSData?
Thanks for spending time on this.
You can get image properties without actually loading the whole image data from disk by using the ImageIO framework:
@import ImageIO;
...
NSURL *imageURL = … // Init URL somehow
CGImageSourceRef imgSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL, NULL);
NSDictionary *imageProps = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);
NSLog(@"%@", imageProps);
CFRelease(imgSource);
The image width and height will be stored in the dictionary under the PixelHeight and PixelWidth keys (tested with a PNG image; other image formats may use different keys).
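Reading the size out of that dictionary would then look roughly like this (a sketch; kCGImagePropertyPixelWidth / kCGImagePropertyPixelHeight correspond to the PixelWidth / PixelHeight keys mentioned above):
NSNumber *pixelWidth = imageProps[(NSString *)kCGImagePropertyPixelWidth];
NSNumber *pixelHeight = imageProps[(NSString *)kCGImagePropertyPixelHeight];
CGSize imageSize = CGSizeMake(pixelWidth.floatValue, pixelHeight.floatValue);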
Instead of converting the URL to data and then to UIImage, use EGOImageView or AsyncImageView. You can simply pass the URL to them, then set the frame based on the size of the image.

iOS: writing a CGImageRef to disk in png or jpeg

What is the best way to save CGImageRef to a png or jpeg file in iOS?
For a general Cocoa application (not using UIKit), see "Saving CGImageRef to a png file?", but that approach seems too long and clunky.
Using UIKit, the code becomes a simple 3-liner:
UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
NSData *jpgData = UIImageJPEGRepresentation(uiImage, 0.9f);
[jpgData writeToFile:path atomically:NO];
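If you need PNG instead, the same pattern applies with UIImagePNGRepresentation (no quality parameter, since PNG is lossless):
UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
NSData *pngData = UIImagePNGRepresentation(uiImage);
[pngData writeToFile:path atomically:NO];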
