I am trying to select all full-size images from ALAssetsLibrary and save them to the Documents folder. Here is my code to get the full-size image from ALAssetsLibrary. I get the following error. Is there any other way to get the full-size image from ALAssetsLibrary?
Connection to assetsd was interrupted or assetsd died
- (void)test
{
    for (int i = 0; i < chosenImagesArray.count; i++) {
        NSDictionary *chosenImage = [self.chosenImages objectAtIndex:i];
        ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
        [assetLibrary assetForURL:[chosenImage valueForKey:UIImagePickerControllerReferenceURL]
                      resultBlock:^(ALAsset *asset)
         {
             ALAssetRepresentation *rep = [asset defaultRepresentation];
             unsigned long imageDataSize = (unsigned long)[rep size];
             imageDataBytes = malloc(imageDataSize);
             [rep getBytes:imageDataBytes fromOffset:0 length:imageDataSize error:nil];
             NSData *data = [NSData dataWithBytesNoCopy:imageDataBytes length:imageDataSize freeWhenDone:YES];
             UIImage *myImg = [UIImage imageWithData:data];
             // Crop to standard size
             UIImage *readyToWriteImage = [self croppIngimageByImageName:[self fixrotation:myImg]];
             // Write to the Documents directory
             [self writeChosenImage:readyToWriteImage];
         }
                     failureBlock:^(NSError *err) {
             NSLog(@"Error: %@", [err localizedDescription]);
             return;
         }];
    }
}
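Side note on the code above: one way to avoid the large malloc and the UIImage round-trip is to stream the representation's bytes straight into a file in the Documents directory. This is only a minimal sketch, assuming the crop/rotation steps are not needed for the saved copy; the chunk size, the fallback file name, and the writeAssetToDocuments: helper name are my own and not part of the original code.
// Sketch: copy an ALAsset's full-size data to Documents in 1 MB chunks,
// without ever building a UIImage. `asset` is the ALAsset from the resultBlock above.
- (void)writeAssetToDocuments:(ALAsset *)asset
{
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSString *filePath = [documentsPath stringByAppendingPathComponent:rep.filename ?: @"image.jpg"];

    NSOutputStream *stream = [NSOutputStream outputStreamToFileAtPath:filePath append:NO];
    [stream open];

    static const NSUInteger kChunkSize = 1024 * 1024;   // 1 MB per read (arbitrary)
    uint8_t *buffer = malloc(kChunkSize);
    long long offset = 0;
    long long remaining = rep.size;
    while (remaining > 0) {
        NSUInteger chunk = (NSUInteger)MIN(remaining, (long long)kChunkSize);
        NSError *error = nil;
        NSUInteger read = [rep getBytes:buffer fromOffset:offset length:chunk error:&error];
        if (read == 0) { break; }                        // error or end of data
        [stream write:buffer maxLength:read];
        offset += read;
        remaining -= read;
    }
    free(buffer);
    [stream close];
}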
You can follow the tutorial "How to get metadata of an image": the size is part of the image's metadata, so you can read it from the returned metadata dictionary. Here is the link: http://blog.codecropper.com/2011/05/getting-metadata-from-images-on-ios/
Thanks.
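To illustrate the suggestion above, here is a minimal sketch of reading the pixel dimensions from an ALAssetRepresentation's metadata dictionary (this assumes the asset comes from a resultBlock like the one in the question, and that ImageIO.h is imported for the property keys); the representation also exposes a dimensions property directly.
// Requires #import <ImageIO/ImageIO.h> for the kCGImageProperty* keys.
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSDictionary *metadata = [rep metadata];                            // full image metadata
NSNumber *pixelWidth  = metadata[(__bridge NSString *)kCGImagePropertyPixelWidth];
NSNumber *pixelHeight = metadata[(__bridge NSString *)kCGImagePropertyPixelHeight];
NSLog(@"Image is %@ x %@ pixels", pixelWidth, pixelHeight);

CGSize dimensions = [rep dimensions];                               // same information, without parsing the metadata
NSLog(@"Dimensions: %.0f x %.0f", dimensions.width, dimensions.height);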
I'm trying to get all photos from the photo library along with each image's metadata. It works fine for 10-20 images, but when there are 50+ images it occupies too much memory, which causes the app to crash.
Why do I need all the images in an array?
Answer: to send the images to a server app. I'm using GCDAsyncSocket to send data to the receiver socket/port, and I can't afford the waiting time of requesting images from PHAsset while sending them over the socket/port.
My code:
+ (void)getPhotosDataFromCamera:(void (^)(NSMutableArray *arrImageData))completionHandler
{
    [PhotosManager checkPhotosPermission:^(bool granted)
    {
        if (granted)
        {
            NSMutableArray *arrImageData = [NSMutableArray new];
            NSArray *arrImages = [[NSArray alloc] init];
            PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
            NSLog(@"%d", (int)result.count);
            arrImages = [result copy];
            //--- If no images.
            if (arrImages.count <= 0)
            {
                completionHandler(nil);
                return;
            }
            __block int index = 1;
            __block BOOL isDone = false;
            for (PHAsset *asset in arrImages)
            {
                [PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
                {
                    @autoreleasepool
                    {
                        NSData *imageData = metadata ? [PhotosManager addExif:image metaData:metadata] : UIImageJPEGRepresentation(image, 1.0f);
                        if (imageData != nil)
                        {
                            [arrImageData addObject:imageData];
                            NSLog(@"Adding images: %i", index);
                            //--- Done adding all images.
                            if (index == arrImages.count)
                            {
                                isDone = true;
                                NSLog(@"Done adding all images with info!!");
                                completionHandler(arrImageData);
                            }
                            index++;
                        }
                    }
                }];
            }
        }
        else
        {
            completionHandler(nil);
        }
    }];
}
typedef void (^PHAssetMetadataBlock)(UIImage *image, NSDictionary *metadata);

+ (void)requestMetadata:(PHAsset *)asset withCompletionBlock:(PHAssetMetadataBlock)completionBlock
{
    PHContentEditingInputRequestOptions *editOptions = [[PHContentEditingInputRequestOptions alloc] init];
    editOptions.networkAccessAllowed = YES;
    [asset requestContentEditingInputWithOptions:editOptions completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info)
    {
        CIImage *CGimage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
        UIImage *image = contentEditingInput.displaySizeImage;
        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(image, CGimage.properties);
        });
        CGimage = nil;
        image = nil;
    }];
    editOptions = nil;
    asset = nil;
}
+ (NSData *)addExif:(UIImage *)toImage metaData:(NSDictionary *)container
{
    NSData *imageData = UIImageJPEGRepresentation(toImage, 1.0f);
    // Create an image source from the JPEG data.
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    // This is the type of image (e.g., public.jpeg).
    CFStringRef UTI = CGImageSourceGetType(source);
    // Create a new data object and write the new image into it.
    NSMutableData *dest_data = [[NSMutableData alloc] initWithLength:imageData.length + 2000];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, UTI, 1, NULL);
    if (!destination) {
        NSLog(@"Error: Could not create image destination");
        CFRelease(source);
        return nil;
    }
    // Add the image contained in the image source to the destination, overriding the old metadata with our modified metadata.
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)container);
    BOOL success = CGImageDestinationFinalize(destination);
    if (!success) {
        NSLog(@"Error: Could not create data from image destination");
    }
    CFRelease(destination);
    CFRelease(source);
    imageData = nil;
    source = nil;
    destination = nil;
    return dest_data;
}
Well, it's not a surprise that you end up in this situation, since each of your images consumes memory and you instantiate and keep all of them in memory. This is not really a correct design approach.
In the end it depends on what you want to do with those images.
What I would suggest is that you keep just the array of your PHAsset objects and request each image only on demand.
For example, if you want to show those images in a tableView/collectionView, perform the call to
[PhotosManager requestMetadata:asset withCompletionBlock:^(UIImage *image, NSDictionary *metadata)
directly in the relevant method (see the sketch below). This way you won't drain the device memory.
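A minimal sketch of that on-demand approach for a collection view follows. The PhotoCell class, its imageView property, the "PhotoCell" reuse identifier, and self.fetchResult (the stored PHFetchResult) are placeholders of my own; only the PHImageManager call itself is the point.
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    // Hypothetical cell class; self.fetchResult holds the PHFetchResult instead of an image array.
    PhotoCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell" forIndexPath:indexPath];
    PHAsset *asset = self.fetchResult[indexPath.item];

    PHImageRequestOptions *options = [PHImageRequestOptions new];
    options.networkAccessAllowed = YES;
    options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;

    // Ask only for the size we are going to draw, not the full-resolution image.
    CGSize targetSize = CGSizeMake(200.0, 200.0);
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:targetSize
                                               contentMode:PHImageContentModeAspectFill
                                                   options:options
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        cell.imageView.image = result;
    }];
    return cell;
}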
There simply is not enough memory on the phone to load all of the images in the photo library into memory at the same time.
If you want to display the images, then only fetch the images that you need for immediate display. For the rest, keep just the PHAsset. Make sure to discard the images when you don't need them any more.
If you need thumbnails, then fetch only the thumbnails that you need.
If you want to do something with all of the images - like add a watermark to them or process them in some way - then process each image one at a time in a queue, as sketched below.
I cannot advise further, as your question doesn't state why you need all of the images.
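As a sketch of the "one at a time" idea: walk the fetch result by index on a background queue, request each asset's data synchronously, hand it off, and let an autorelease pool drain before touching the next asset. The processAssetsOneByOne: helper name and the processBlock parameter are hypothetical; only one image's data needs to be alive at any moment.
// Hypothetical helper: processBlock is whatever you do with each image's data (e.g. write it to the socket).
+ (void)processAssetsOneByOne:(PHFetchResult *)assets withBlock:(void (^)(NSData *imageData))processBlock
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        PHImageRequestOptions *options = [PHImageRequestOptions new];
        options.synchronous = YES;                 // deliver the data inside this loop iteration
        options.networkAccessAllowed = YES;

        for (NSUInteger i = 0; i < assets.count; i++) {
            @autoreleasepool {
                PHAsset *asset = assets[i];
                [[PHImageManager defaultManager] requestImageDataForAsset:asset
                                                                  options:options
                                                            resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
                    if (imageData) {
                        processBlock(imageData);   // only one image's data is held at a time
                    }
                }];
            }
        }
    });
}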
I am trying to create animated GIF images, for which I pass an array of images.
Let's say I have a 4-second video and I extract about 120 frames from it. Regardless of the resulting GIF size, I create a GIF from all those 120 frames. The problem is, when I open the GIF on the iPhone (by attaching it to MailViewComposer or iMessage) it runs fine, but if I email it or import it to a computer, it runs too fast. Can anyone suggest what is wrong here?
I am using HJImagesToGIF for GIF creation. The dictionary for the GIF properties is as below:
NSDictionary *prep = [NSDictionary dictionaryWithObject:[NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:0.03f]
                                                                                    forKey:(NSString *)kCGImagePropertyGIFDelayTime]
                                                 forKey:(NSString *)kCGImagePropertyGIFDictionary];
NSDictionary *fileProperties = @{
    (__bridge id)kCGImagePropertyGIFDictionary: @{
        (__bridge id)kCGImagePropertyGIFLoopCount: @0, // 0 means loop forever
    }
};
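As an aside on the per-frame dictionary: ImageIO also defines kCGImagePropertyGIFUnclampedDelayTime alongside kCGImagePropertyGIFDelayTime. A variant of the frame properties that sets both keys (keeping the 0.03 s value from above) is sketched here purely for reference; whether a given viewer honors the unclamped value is viewer-dependent.
// Sketch only: per-frame properties carrying the delay under both ImageIO keys.
NSDictionary *frameProperties = @{
    (__bridge id)kCGImagePropertyGIFDictionary: @{
        (__bridge id)kCGImagePropertyGIFDelayTime: @0.03f,          // delay as most decoders read it
        (__bridge id)kCGImagePropertyGIFUnclampedDelayTime: @0.03f  // raw value, for decoders that honor it
    }
};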
Creation of the GIF:
CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:path];
CGImageDestinationRef dst = CGImageDestinationCreateWithURL(url, kUTTypeGIF, [images count], nil);
CGImageDestinationSetProperties(dst, (__bridge CFDictionaryRef)fileProperties);
for (int i = 0; i < [images count]; i++)
{
    // Load anImage from the array.
    UIImage *anImage = [images objectAtIndex:i];
    CGImageDestinationAddImage(dst, anImage.CGImage, (__bridge CFDictionaryRef)(prep));
}
bool fileSave = CGImageDestinationFinalize(dst);
CFRelease(dst);
if (fileSave) {
    NSLog(@"animated GIF file created at %@", path);
} else {
    NSLog(@"error: no animated GIF file created at %@", path);
}
To save the GIF, I’m using:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:tempPath]];
data = [NSData dataWithContentsOfFile:tempPath];
[library writeImageDataToSavedPhotosAlbum:data metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Error Saving GIF to Photo Album: %@", error);
    } else {
        // TODO: success handling
        NSLog(@"GIF Saved to %@", assetURL);
    }
}];
Thanks everyone.
I'm trying to save an ultra HD image from the gallery into the Documents directory. My app is crashing due to memory pressure. How can I save an image directly from an ALAsset to the Documents folder without going through a UIImage?
If you have problems with memory, I suggest you first read this: How can I release memory of UIImages no longer used.
If you decide that you still need to copy the data without going through UIImage, you can try the following:
ALAsset *result; // do not forget to initialize it
ALAssetRepresentation *rawImage = [result defaultRepresentation];
uint8_t *buffer = malloc(rawImage.size);
[rawImage getBytes:buffer fromOffset:0 length:rawImage.size error:NULL];
NSData *d = [NSData dataWithBytes:buffer length:rawImage.size];
[d writeToFile:@"your_file_path_here" atomically:YES];
free(buffer);
UPDATE:
The following code avoids the extra copy and could be more efficient:
long long sizeOfRawDataInBytes = rawImage.size;
// Note: initWithLength: (not initWithCapacity:) so the data's length actually covers the bytes written below.
NSMutableData *rawData = [[NSMutableData alloc] initWithLength:(NSUInteger)sizeOfRawDataInBytes];
void *bufferPointer = [rawData mutableBytes];
NSError *error = nil;
[rawImage getBytes:bufferPointer fromOffset:0 length:sizeOfRawDataInBytes error:&error];
if (error) {
    NSLog(@"Getting bytes failed with error: %@", error);
}
else {
    [rawData writeToFile:@"your_file_path_here" atomically:YES];
}
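The @"your_file_path_here" placeholder would typically be a path inside the app's Documents directory. A minimal sketch of building such a path (the fallback file name is arbitrary and not part of the original answer):
// Hypothetical example: derive a destination path in Documents for the copied asset.
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *destinationPath = [documentsDirectory stringByAppendingPathComponent:rawImage.filename ?: @"copied_image.jpg"];
[rawData writeToFile:destinationPath atomically:YES];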
With the code below, I can extract metadata from an image (pre-added to my project) and render the info as text. This is exactly what I want to do. The SYMetadata is created by pointing to an image via a URL, with initWithAbsolutePathURL. I want to do the same thing with a UIImage, or perhaps with the image that is being loaded into the UIImage. How do I get the URL to the image that the picker selects? Or how do I create an "asset" from this incoming image?
The documentation describes initWithAsset, but I have not figured out how to use it yet, or whether it is the right way to go for my purpose. Any help greatly appreciated.
NSURL *imageURL = [[NSBundle mainBundle] URLForResource:@"someImage" withExtension:@"jpg"];
SYMetadata *metadata = [[SYMetadata alloc] initWithAbsolutePathURL:imageURL];
[textView setText:[[metadata allMetadatas] description]];
Note: I tried adding an NSURL like this, imageURL = [info valueForKey:@"UIImagePickerControllerReferenceURL"];, in the "pickerDidFinish" method, but the metadata is null after I pass this URL to the code above.
If you are using the imagePickerController, the delegate method will give you what you need:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    if ([[info allKeys] containsObject:UIImagePickerControllerReferenceURL]) {
        // You will get this key if your image comes from a library.
        [self setMetaDataFromAssetLibrary:info];
    } else if ([[info allKeys] containsObject:UIImagePickerControllerMediaMetadata]) {
        // If the image comes from the camera, you get the metadata in its own key.
        self.rawMetaData = [self metaDataFromCamera:info];
    }
}
From the Asset Library - bear in mind that this lookup takes time to complete and has an asynchronous completion block, so you might want to add a completion flag to ensure you don't access the property before it has been updated (a completion-block variant is sketched after the code below).
- (void)setMetaDataFromAssetLibrary:(NSDictionary *)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 self.rawMetaData = asset.defaultRepresentation.metadata;
             }
            failureBlock:^(NSError *error) {
                 NSLog(@"error %@", error);
             }];
}
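Regarding the completion-flag caveat above, one alternative is to hand the metadata back through a completion block instead of setting a property, so callers cannot read it too early. The method name below is hypothetical; it is just a reshuffling of the code above.
// Hypothetical variant: deliver the metadata via a block instead of self.rawMetaData.
- (void)metaDataFromAssetLibrary:(NSDictionary *)info completion:(void (^)(NSDictionary *metadata, NSError *error))completion
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 completion(asset.defaultRepresentation.metadata, nil);
             }
            failureBlock:^(NSError *error) {
                 completion(nil, error);
             }];
}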
From Camera:
- (NSDictionary *)metaDataFromCamera:(NSDictionary *)info
{
    NSDictionary *imageMetadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
    return imageMetadata;
}
Here is how to get metadata from a UIImage:
- (NSDictionary *)metaDataFromImage:(UIImage *)image
{
    NSData *jpegData = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0)];
    return [self metaDataFromData:jpegData];
}
But take care - a UIImage may already be stripped of much of the original metadata, so you are better off getting the metadata from the NSData that was used to create the UIImage:
- (NSDictionary *)metaDataFromData:(NSData *)data
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    CFDictionaryRef imageMetaData = CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    CFRelease(source);
    // Transfer ownership of the copied dictionary to ARC so it is not leaked.
    return (__bridge_transfer NSDictionary *)imageMetaData;
}
If you have an ALAsset (in my sample, _detailItem), you can get its metadata this way:
NSDictionary *myMetadata = [[_detailItem defaultRepresentation] metadata];
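The returned dictionary uses the standard ImageIO property keys, so, as a small illustration (assuming you only need the EXIF block), the EXIF sub-dictionary can be pulled out like this:
NSDictionary *exif = myMetadata[(__bridge NSString *)kCGImagePropertyExifDictionary];
NSLog(@"Exposure time: %@", exif[(__bridge NSString *)kCGImagePropertyExifExposureTime]);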
I'm working on an application in which I'm loading images from the photo library.
I'm using the following code to bind the image to an image view.
- (void)loadImage:(UIImageView *)imgView FileName:(NSString *)fileName
{
    typedef void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *asset);
    typedef void (^ALAssetsLibraryAccessFailureBlock)(NSError *error);

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        UIImage *lImage;
        if (iref)
        {
            lImage = [UIImage imageWithCGImage:iref scale:[rep scale] orientation:(UIImageOrientation)[rep orientation]];
        }
        else
        {
            lImage = [UIImage imageNamed:@"Nofile.png"];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            [imgView setImage:lImage];
        });
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        UIImage *images = [UIImage imageNamed:@"Nofile.png"];
        dispatch_async(dispatch_get_main_queue(), ^{
            [imgView setImage:images];
        });
    };

    NSURL *asseturl = [NSURL URLWithString:fileName];
    ALAssetsLibrary *asset = [[ALAssetsLibrary alloc] init];
    [asset assetForURL:asseturl
           resultBlock:resultblock
          failureBlock:failureblock];
}
But when I try to run it, an error appears and the application sometimes crashes.
The error printed on the console is:
*** ERROR: FigCreateCGImageFromJPEG returned -12910. 423114 bytes. We will fall back to software decode.
Received memory warning.
My photo library contains high-resolution images, with sizes between 10 and 30 MB.
Finally I fixed the issue.
I think the problem is with fetching the full-resolution image.
Instead of:
CGImageRef iref = [rep fullResolutionImage];
I used:
CGImageRef iref = [myasset aspectRatioThumbnail];
And everything worked fine: no error in the console and no crash, but the quality/resolution of the image is reduced.
I had a similar error:
*** ERROR: FigCreateCGImageFromJPEG returned -12909. 0 bytes. We will fall back to software decode.
The app crashed on this call:
CGImageRef originalImage = [representation fullResolutionImage];
I fixed it by replacing it with:
CGImageRef originalImage = [representation fullScreenImage];
[UIImage imageWithCGImage:]
imageWithCGImage: seems to run into memory problems with large images. What about creating the image with alloc/init instead?
lImage = [[[UIImage alloc] initWithCGImage:iref scale:[rep scale] orientation:(UIImageOrientation)[rep orientation]] autorelease];