I am getting images from the library as an ALAsset. I get the raw image like this:
ALAssetRepresentation *rep = [asset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
UIImage *im = [UIImage imageWithCGImage:iref];
NSData *data4 = UIImageJPEGRepresentation(im, 0.5);
I can also obtain metadata for it just like this:
NSDictionary *data = [rep metadata];
which is absolutely fine. The problem appears when I want to upload the picture to a server: the picture arrives without the metadata. Is there a way to send it as one piece?
I think you are asking how to send a photo to a web service including its EXIF data: see ALAsset, send a photo to a web service including its exif data.
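UIImageJPEGRepresentation produces bare pixel data, which is why the EXIF disappears. One way to keep it, along the lines of the linked answer, is to re-encode with ImageIO and attach the asset's metadata dictionary yourself. A minimal sketch, reusing the rep and iref from your snippets:
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

NSMutableData *jpegData = [NSMutableData data];
CGImageDestinationRef dest = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)jpegData, kUTTypeJPEG, 1, NULL);
// Attach the original metadata (EXIF, GPS, ...) plus a compression quality.
NSMutableDictionary *properties = [[rep metadata] mutableCopy];
properties[(NSString *)kCGImageDestinationLossyCompressionQuality] = @0.5;
CGImageDestinationAddImage(dest, iref, (__bridge CFDictionaryRef)properties);
CGImageDestinationFinalize(dest);
CFRelease(dest);
// jpegData now holds the JPEG bytes with the metadata embedded, ready to upload.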
I want to take a backup of images in the cloud, so I calculate the MD5 of each image in iOS. The issue is that the MD5 differs depending on whether it is calculated while the app is in the foreground or in the background. This issue occurs only in iOS 9.1. I use the asset library to fetch the images. The code below is used to get the data (the two NSData results differ from each other between foreground and background):
ALAssetRepresentation *assetRep = [asset defaultRepresentation];
CGImageRef imgRef = [assetRep fullScreenImage];
UIImage *img = [UIImage imageWithCGImage:imgRef
scale:1.0f
orientation:(UIImageOrientation)assetRep.orientation];
NSData *data = UIImageJPEGRepresentation(img, 0.9); // compression quality must be in 0.0–1.0; 90 was out of range
Thanks in advance....
Have a look at ALAssetRepresentation-MD5, which calculates the MD5 hash from an ALAssetRepresentation without creating a UIImage or using UIImageJPEGRepresentation. I assume that one of these UIKit-related steps is responsible for your issue, even though they should be thread-safe.
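If you'd rather not depend on that category, the same idea in a minimal sketch: hash the representation's raw bytes with CommonCrypto, so no UIImage or JPEG re-encoding is involved (the helper name is my own):
#import <CommonCrypto/CommonDigest.h>

- (NSString *)md5ForAssetRepresentation:(ALAssetRepresentation *)rep {
    CC_MD5_CTX ctx;
    CC_MD5_Init(&ctx);
    const NSUInteger bufferSize = 64 * 1024;
    uint8_t *buffer = malloc(bufferSize);
    long long offset = 0;
    NSError *error = nil;
    // Stream the asset's raw bytes through the MD5 context.
    while (offset < rep.size) {
        NSUInteger read = [rep getBytes:buffer fromOffset:offset length:bufferSize error:&error];
        if (read == 0) break; // read failure; error describes why
        CC_MD5_Update(&ctx, buffer, (CC_LONG)read);
        offset += read;
    }
    free(buffer);
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5_Final(digest, &ctx);
    NSMutableString *hash = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
        [hash appendFormat:@"%02x", digest[i]];
    }
    return [hash copy];
}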
I am writing NSData to a file and saving it in the device's app Documents folder. For that, is it possible to get the thumbnail from an ALAssetRepresentation object in NSData format? If so, any helpful links for that?
I couldn't find anything similar, other than getting a CGImageRef from the ALAssetRepresentation. I don't want the CGImageRef format, as I would then have to use UIImageJPEGRepresentation or UIImagePNGRepresentation to convert it to NSData.
Try this one:
CGImageRef iref = [myasset thumbnail];
if (iref)
{
    UIImage *theThumbnail = [UIImage imageWithCGImage:iref];
    NSData *thumbnailData = UIImagePNGRepresentation(theThumbnail);
}
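Since you want NSData to write into the Documents folder, a minimal sketch of that last step (the file name is just an illustrative choice):
// Build a path in the app's Documents directory and persist the PNG data.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"thumbnail.png"];
[thumbnailData writeToFile:path atomically:YES];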
When I use initWithCGImage with a certain scale and then UIImageJPEGRepresentation to get data from this image, it seems the system doesn't keep my scale settings. Any idea why?
Following is my code and the log I get:
ALAssetRepresentation *rep = [asset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
UIImageOrientation orientation = [self orientationForAsset:asset];
// Scale the image
UIImage* scaledImage = [[UIImage alloc] initWithCGImage:iref scale:2. orientation:orientation];
NSLog (#"Scaled image size %#", NSStringFromCGSize(scaledImage.size));
// Get data from image
NSData* scaledImageData = UIImageJPEGRepresentation(scaledImage, 0.8);
// Check the image size of the data
UIImage* buildedImage = [UIImage imageWithData:scaledImageData];
NSLog (#"Data image size of %#", NSStringFromCGSize (buildedImage.size));
Gives log :
"Scaled image size {1944, 1296}"
"Data image size of {3888, 2592}"
That's really strange because the two images are supposed to be exactly the same.
You should use the +[UIImage imageWithData:scale:] method. UIImageJPEGRepresentation encodes only pixels; the JPEG data carries no UIKit scale factor, so you have to re-apply the scale when decoding.
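A minimal sketch against the code above:
// Re-apply the scale that the JPEG round-trip discarded.
UIImage *buildedImage = [UIImage imageWithData:scaledImageData scale:2.0];
NSLog(@"Data image size of %@", NSStringFromCGSize(buildedImage.size)); // now logs {1944, 1296}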
I'm developing an iOS app for iPad and I'm using a library called Grabkit in order to get images from different services like Instagram and Flickr, in addition to images from the Camera Roll. The problem is that when the user selects a picture from the roll, I get a URL such as this: assets-library://asset/asset.JPG?id=DCFB9E49-93AA-49E3-89C8-2EE64AE2C4C6&ext=JPG
I've tried several snippets to get the image from this kind of path, but none of them has worked, including the following:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Ask for the "Asset" for the URL. An asset is a representation of an image in the Photos application.
[library assetForURL:originalImage.URL
         resultBlock:^(ALAsset *asset) {
             // Here we have the asset; let's retrieve the image from it
             CGImageRef imgRef = [[asset defaultRepresentation] fullResolutionImage];
             /* Instead of the full-resolution image, you can ask for an image that fits the screen:
             CGImageRef imgRef = [[asset defaultRepresentation] fullScreenImage];
             */
             // From the CGImage, let's build a UIImage
             imatgetemporal = [UIImage imageWithCGImage:imgRef];
         }
        failureBlock:^(NSError *error) {
            // Something wrong happened.
        }];
Is something wrong in my code? Should I try a different approach?
I have an ALAsset object retrieved from ALAssetsLibrary, and I want to extract a compressed JPEG from it in order to send it to a web service.
Any suggestion where to start?
Edit:
I've found a way to get NSData out of the ALAsset
ALAssetRepresentation *representation = [asset defaultRepresentation];
NSError *err = nil;
Byte *buffer = (Byte *)malloc((size_t)representation.size);
// fromOffset takes a long long, not a float
NSUInteger buffered = [representation getBytes:buffer fromOffset:0 length:(NSUInteger)representation.size error:&err];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
but I can't find a way to reduce the size of the image by resizing and compressing it.
My idea was to have something like:
UIImage *myImage = [UIImage imageWithData:data];
//resize image
NSData *compressedData = UIImageJPEGRepresentation(myImage, 0.5);
but, first of all, even without resizing, just using these two lines of code, compressedData is bigger than data.
And second, I'm not sure what's the best way to resize the UIImage.
You can use
[theAsset thumbnail]
Or: compressing might result in bigger files after some point; you need to resize the image first:
+ (UIImage *)imageWithCGImage:(CGImageRef)cgImage scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
It's easy to get the CGImage from an ALAssetRepresentation:
ALAssetRepresentation *repr = [asset defaultRepresentation];
// use the asset representation's orientation and scale in order to set the UIImage up correctly
UIImage *image = [UIImage imageWithCGImage:[repr fullResolutionImage] scale:[repr scale] orientation:(UIImageOrientation)[repr orientation]];
// then do whatever you want with the UIImage instance
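Note that imageWithCGImage:scale:orientation: only changes the image's logical size; the pixel data, and therefore the JPEG, stays as big. If you need the encoded data to actually shrink, redraw into a smaller bitmap first. A minimal sketch, with an illustrative quarter-size target:
// Redraw into a smaller bitmap so the JPEG data actually shrinks.
CGSize targetSize = CGSizeMake(image.size.width / 4.0, image.size.height / 4.0);
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
[image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *compressedData = UIImageJPEGRepresentation(resizedImage, 0.5);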