How to get panoramas from ALAsset objects - iOS

We have about 2k objects which are instances of the ALAsset class, and we need to know which of them are panoramic images.
We have tried getting a CGImageRef from the ALAsset instance and checking the width/height ratio.
ALAsset *alasset = ...
CGImageRef thumbnail = alasset.thumbnail; // returns a square thumbnail, not suitable for me
CGImageRef aspectThumbnail = alasset.aspectRatioThumbnail; // returns an aspect-ratio thumbnail, but very slowly
This isn't suitable for us because it works slowly across many files.
We have also tried getting the metadata from the defaultRepresentation and checking the image's EXIF, but that is slow too.
NSDictionary *dictionary = [[alasset defaultRepresentation] metadata]; // also very slow
Is there any way to make it better?
Thanks

Finally, I've found this solution for ALAsset:
ALAssetsLibrary *assetsLibrary = ...;
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
static NSString * const kAssetQueueName = ...;
static NSUInteger const kAssetConcurrentOperationCount = ...; // I use 5
queue.maxConcurrentOperationCount = kAssetConcurrentOperationCount;
queue.name = kAssetQueueName;
dispatch_async(dispatch_get_main_queue(), ^{
    [assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupAll usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        /* You must check that the group is not nil. */
        if (!group)
            return;
        /* Then select the group in which to search for panoramas: it's "Saved Photos" on the
           iPhone Simulator and "Camera Roll" on an iPhone. This applies only to iOS 7 and earlier. */
        static NSString * const kAssetGroupName = ...;
        if ([[group valueForProperty:ALAssetsGroupPropertyName] isEqualToString:kAssetGroupName]) {
            [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop) {
                if (!asset)
                    return;
                [queue addOperationWithBlock:^{
                    // I use @autoreleasepool to release memory promptly once the panorama's asset URL is found.
                    @autoreleasepool {
                        ALAssetRepresentation *defaultRepresentation = asset.defaultRepresentation;
                        if ([defaultRepresentation.UTI isEqualToString:@"public.jpeg"]) {
                            NSDictionary *metadata = defaultRepresentation.metadata;
                            if (!metadata)
                                return;
                            if (metadata[@"PixelWidth"] && metadata[@"PixelHeight"]) {
                                NSInteger pixelWidth = [metadata[@"PixelWidth"] integerValue];
                                NSInteger pixelHeight = [metadata[@"PixelHeight"] integerValue];
                                static NSUInteger const kSidesRelationshipConstant = ...; // I use 2
                                static NSUInteger const kMinimalPanoramaHeight = ...; // I use 600
                                if (pixelHeight >= kMinimalPanoramaHeight &&
                                    pixelWidth / pixelHeight >= kSidesRelationshipConstant) {
                                    /* So, that is a panorama. */
                                }
                            }
                        }
                    }
                }];
            }];
        }
    } failureBlock:^(NSError *error) {
        // Some failure handling, you know.
    }];
});
That's it. I don't think this is the best solution, but as of today I haven't found a better one.
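The detection criterion above reduces to two numeric checks. As a minimal plain-C sketch of that predicate (the constants 2 and 600 are the values the answer says it uses; the helper name is illustrative):

```c
#include <stdbool.h>

// Heuristic from the answer: an image counts as a panorama when it is
// at least kMinimalPanoramaHeight pixels tall and at least
// kSidesRelationshipConstant times wider than it is tall.
static const long kMinimalPanoramaHeight = 600;     // the answer uses 600
static const double kSidesRelationshipConstant = 2; // the answer uses 2

bool is_panorama(long pixelWidth, long pixelHeight)
{
    if (pixelHeight < kMinimalPanoramaHeight)
        return false;
    return (double)pixelWidth / (double)pixelHeight >= kSidesRelationshipConstant;
}
```

Note that for an integer threshold of 2, the integer division in the Objective-C snippet and the floating-point ratio here agree; the double cast just makes the intent explicit.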

Related

Determine image MB size from PHAsset

I want to determine the size of the image accessed through a PHAsset, so that I know how much space it occupies on the device. Which of these methods does this?
var imageSize = Float(imageData.length)
var image = UIImage(data: imageData)
var jpegSize = UIImageJPEGRepresentation(image, 1)
var pngSize = UIImagePNGRepresentation(image)
var pixelsMultiplied = asset.pixelHeight * asset.pixelWidth
println("regular data: \(imageSize)\nJPEG Size: \(jpegSize.length)\nPNG Size: \(pngSize.length)\nPixel multiplied: \(pixelsMultiplied)")
Results in:
regular data: 1576960.0
JPEG Size: 4604156
PNG Size: 14005689
Pixel multiplied: 7990272
Which one of these values actually represents the amount it occupies on the device?
After emailing the picture to myself and checking its size on the system, it turns out approach one (the raw NSData length) is the closest to the actual size.
To get the size of a PHAsset (Image type), I used the following method:
var asset = self.fetchResults[index] as PHAsset
self.imageManager.requestImageDataForAsset(asset, options: nil) { (data: NSData!, string: String!, orientation: UIImageOrientation, object: [NSObject : AnyObject]!) -> Void in
    // transform into image
    var image = UIImage(data: data)
    // get byte size of image
    var imageSize = Float(data.length)
    // transform into megabytes
    imageSize = imageSize / (1024 * 1024)
}
Command+I on my MacBook shows the image size as 1,575,062 bytes.
imageSize in my program reports 1,576,960 bytes.
I tested with five other images and the two sizes reported were just as close.
The NSData approach becomes precarious when the data is prohibitively large. You can use the following as an alternative:
[[PHImageManager defaultManager] requestAVAssetForVideo:self.phAsset options:nil resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    CGFloat rawSize = 0;
    if ([asset isKindOfClass:[AVURLAsset class]])
    {
        AVURLAsset *URLAsset = (AVURLAsset *)asset;
        NSNumber *size;
        [URLAsset.URL getResourceValue:&size forKey:NSURLFileSizeKey error:nil];
        rawSize = [size floatValue] / (1024.0 * 1024.0);
    }
    else if ([asset isKindOfClass:[AVComposition class]])
    {
        // Asset is an AVComposition (e.g. a slo-mo video)
        float estimatedSize = 0.0;
        NSArray *tracks = [asset tracks];
        for (AVAssetTrack *track in tracks)
        {
            float rate = [track estimatedDataRate] / 8.0f; // convert bits per second to bytes per second
            float seconds = CMTimeGetSeconds([track timeRange].duration);
            estimatedSize += seconds * rate;
        }
        rawSize = estimatedSize;
    }
    if (completionBlock)
    {
        NSError *error = info[PHImageErrorKey];
        completionBlock(rawSize, error);
    }
}];
Or for ALAssets, something like this:
[[[ALAssetsLibrary alloc] init] assetForURL:asset.URL resultBlock:^(ALAsset *asset) {
    long long sizeBytes = [[asset defaultRepresentation] size];
    if (completionBlock)
    {
        completionBlock(sizeBytes, nil);
    }
} failureBlock:^(NSError *error) {
    if (completionBlock)
    {
        completionBlock(0, error);
    }
}];
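The AVComposition branch above estimates file size as each track's duration multiplied by its data rate, converted from bits to bytes per second. A plain-C sketch of just that arithmetic (function and parameter names are illustrative):

```c
// Estimate a composition's size in bytes by summing, over its tracks,
// duration (seconds) * data rate (bytes/second). estimatedDataRate is
// reported in bits per second, hence the division by 8.
double estimated_size_bytes(const double *bitrates_bps,
                            const double *durations_sec,
                            int track_count)
{
    double total = 0.0;
    for (int i = 0; i < track_count; i++) {
        double bytes_per_second = bitrates_bps[i] / 8.0;
        total += durations_sec[i] * bytes_per_second;
    }
    return total;
}
```

For example, a single 10-second track at 8 Mbit/s comes out to about 10 MB, which is only an estimate; the real file adds container overhead.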

Get photos only taken by the iPhone camera using ALAssetsLibrary

I am using ALAssetsLibrary to access the camera roll, but it is returning all images, such as WhatsApp images, Facebook images, etc.
My code looks like this:
[_library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (group) {
        [group setAssetsFilter:[ALAssetsFilter allPhotos]];
        [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop) {
            if (asset) {
                // Getting photos here
            }
        }];
    }
} failureBlock:^(NSError *error) {
    NSLog(@"Failed.");
}];
Is there any way to get only camera-captured photos using ALAssetsLibrary?
The only difference between camera-captured photos and WhatsApp images is the EXIF data.
You can read it out with
ALAssetRepresentation *representation = [asset defaultRepresentation];
NSDictionary *meta = [representation metadata];
or in Swift:
var representation = asset.defaultRepresentation()
var meta = representation.metadata()
This returns the following:
"{TIFF}" = {
    DateTime = "2014:04:01 20:33:59";
    Make = Apple;
    Model = "iPhone 5";
    Orientation = 3;
    ResolutionUnit = 2;
    Software = "7.1";
    XResolution = 72;
    YResolution = 72;
};
PixelWidth = 3264;
So you can check whether Make is Apple; for WhatsApp images it is empty:
if ([metaData[@"{TIFF}"][@"Make"] isEqualToString:@"Apple"])
or in Swift:
if metaData["{TIFF}"]?["Make"] as? String == "Apple"
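Stripped of the dictionary plumbing, the test is just a string comparison against "Apple", where a missing Make field means "not camera-captured". A plain-C sketch (the helper name is illustrative):

```c
#include <stdbool.h>
#include <string.h>

// Returns true when the TIFF "Make" field indicates an Apple camera.
// WhatsApp and similar re-encoded images typically carry no Make at all,
// so a NULL or empty string counts as "not camera-captured".
bool is_camera_photo(const char *tiff_make)
{
    return tiff_make != NULL && strcmp(tiff_make, "Apple") == 0;
}
```

Keep in mind this heuristic also excludes photos taken with non-Apple cameras and imported to the device.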

Memory leaks when Retrieving photos from Photo Library in ios

In my app I want to retrieve photos and videos from the Photo Library and then save them into my app's Documents directory.
Here is my code:
- (UIImage *)getImageFromAsset:(ALAsset *)asset type:(NSInteger)nType
{
    ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
    CGImageRef imageReference = [assetRepresentation fullResolutionImage];
    CGFloat imageScale = [assetRepresentation scale];
    UIImageOrientation imageOrientation = (UIImageOrientation)[assetRepresentation orientation];
    UIImage *iImage = [[UIImage alloc] initWithCGImage:imageReference scale:imageScale orientation:imageOrientation];
    return iImage;
}

- (UIImage *)getImageAtIndex:(NSInteger)nIndex type:(NSInteger)nType
{
    return [self getImageFromAsset:(ALAsset *)_assetPhotos[nIndex] type:nType];
}

......

for (NSIndexPath *index in _dSelected) {
    DLog(@"the selected index is %@", index);
    image = nil;
    image = [ASSETHELPER getImageAtIndex:index.row type:ASSET_PHOTO_FULL_RESOLUTION];
    NSString *name = [ASSETHELPER getImageNameAtIndex:index.row];
    NSString *filepath = [files stringByAppendingPathComponent:name];
    NSString *aliapath = [alias stringByAppendingPathComponent:name];
    aliapath = [aliapath stringByAppendingString:THUMBNAIL];
    DLog(@"the files is %@ the alias is %@", filepath, aliapath);
    image = nil;
}
If I retrieve just 20 or 30 photos it works fine, but if I retrieve too many (maybe 50), the app terminates due to memory pressure. I set the image to nil after each one, so I thought iOS would reclaim the memory after each loop iteration. Why does the memory leak happen?
ARC still needs to manage your memory. As long as you are inside your for loop, ARC never gets a chance to release it. You need to put the body of your loop inside an autorelease pool.
for (NSIndexPath *index in _dSelected) {
    @autoreleasepool {
        DLog(@"the selected index is %@", index);
        image = nil;
        image = [ASSETHELPER getImageAtIndex:index.row type:ASSET_PHOTO_FULL_RESOLUTION];
        NSString *name = [ASSETHELPER getImageNameAtIndex:index.row];
        NSString *filepath = [files stringByAppendingPathComponent:name];
        NSString *aliapath = [alias stringByAppendingPathComponent:name];
        aliapath = [aliapath stringByAppendingString:THUMBNAIL];
        DLog(@"the files is %@ the alias is %@", filepath, aliapath);
        image = nil;
    }
}
This lets the memory be freed as you process each image.

Failure to save image with custom Metadata

I'm trying to create a custom image gallery within my iOS app. I would like to enable the user to save certain metadata with the image so that it can be pulled up in the app later with the attached information.
First, when the user takes a picture, the app saves the image into a custom album for the app:
UITextField *nameField = [alertView textFieldAtIndex:0];
NSMutableDictionary *metaData = [[NSMutableDictionary alloc] init];
[metaData setObject:currentEvent forKey:kMetaDataEventKey];
[metaData setObject:[AppDelegate getActivePerson].name forKey:kMetaDataPersonKey];
[metaData setObject:nameField.text forKey:kMetaDataNameKey];
NSLog(@"Saving image with metadata: %@", metaData);
NSMutableDictionary *realMetaData = [[NSMutableDictionary alloc] init];
[realMetaData setObject:metaData forKey:kCGImagePropertyTIFFDictionary];
[library saveImage:imageToSave toAlbum:albumName metadata:realMetaData withCompletionBlock:^(NSError *error) {
    if (error != nil)
    {
        NSLog(@"Error saving picture? %@", error);
    }
    [self.tableView reloadData];
}];
Upon saving I get the following log message:
Saving image with metadata: {
Event = t;
Person = "George James";
PictureName = tt;
}
Then, when I attempt to retrieve the images later, I use this function:
- (void)loadAssets
{
    self.assets = [NSMutableArray arrayWithCapacity:album.numberOfAssets];
    [album enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
        if (result != nil)
        {
            NSDictionary *metaData = result.defaultRepresentation.metadata;
            NSLog(@"Retrieved image metadata: %@", metaData);
        }
        else
        {
            [self.tableView reloadData];
        }
    }];
}
}
But the log indicates that it did not successfully save the meta data associated with the image:
Retrieved image metadata: {
    ColorModel = RGB;
    DPIHeight = 72;
    DPIWidth = 72;
    Depth = 8;
    Orientation = 1;
    PixelHeight = 720;
    PixelWidth = 960;
    "{Exif}" = {
        ColorSpace = 1;
        ComponentsConfiguration = (1, 2, 3, 0);
        ExifVersion = (2, 2, 1);
        FlashPixVersion = (1, 0);
        PixelXDimension = 960;
        PixelYDimension = 720;
        SceneCaptureType = 0;
    };
    "{TIFF}" = {
        Orientation = 1;
        ResolutionUnit = 2;
        XResolution = 72;
        YResolution = 72;
        "_YCbCrPositioning" = 1;
    };
}
library is an ALAssetsLibrary instance, and the saveImage:toAlbum: method is from this blog post, only slightly modified so that I can save metadata:
- (void)saveImage:(UIImage *)image toAlbum:(NSString *)albumName metadata:(NSDictionary *)metadata withCompletionBlock:(SaveImageCompletion)completionBlock
{
    // write the image data to the assets library (camera roll)
    [self writeImageToSavedPhotosAlbum:image.CGImage
                              metadata:metadata
                       completionBlock:^(NSURL *assetURL, NSError *error) {
                           // error handling
                           if (error != nil) {
                               completionBlock(error);
                               return;
                           }
                           // add the asset to the custom photo album
                           [self addAssetURL:assetURL
                                     toAlbum:albumName
                         withCompletionBlock:completionBlock];
                       }];
}
The image is coming from a UIImagePickerController that uses the camera. The picture is successfully being saved to the correct album, just missing the metadata.
Am I doing something wrong in the save/load process? Am I actually not allowed to save custom meta data to an image?
I did some testing, and from what I can tell the short answer is: you can't do that. The metadata has to conform to specific EXIF/TIFF metadata keys. You could look up the available TIFF metadata keys and see if there are any values you want to set or overwrite. You could try, for example, using kCGImagePropertyTIFFImageDescription to store your data.
NSMutableDictionary *tiffDictionary = [[NSMutableDictionary alloc] init];
NSMutableDictionary *myMetadata = [[NSMutableDictionary alloc] init];
[tiffDictionary setObject:@"My Metadata" forKey:(NSString *)kCGImagePropertyTIFFImageDescription];
[myMetadata setObject:tiffDictionary forKey:(NSString *)kCGImagePropertyTIFFDictionary];
... and save myMetadata with the image.
For other keys, see this:
http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Reference/CGImageProperties_Reference/Reference/reference.html
Otherwise, what I would do is create an NSDictionary that uses an image's unique identifier as a key, and store the metadata object as the value. Save/Load this NSDictionary whenever you save/load an image.

How to retrieve storage info in iOS

I would like to retrieve the storage information programmatically: capacity, available storage, total number of apps, videos, pics, etc. Thanks in advance.
Try these. Not guaranteed to work on non-jailbroken devices though.
- (NSNumber *)totalDiskSpace
{
    NSDictionary *fattributes = [[NSFileManager defaultManager] attributesOfFileSystemForPath:NSHomeDirectory() error:nil];
    return [fattributes objectForKey:NSFileSystemSize];
}

- (NSNumber *)freeDiskSpace
{
    NSDictionary *fattributes = [[NSFileManager defaultManager] attributesOfFileSystemForPath:NSHomeDirectory() error:nil];
    return [fattributes objectForKey:NSFileSystemFreeSize];
}
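The two methods above read NSFileSystemSize and NSFileSystemFreeSize via NSFileManager. The same kind of numbers come from the underlying POSIX statvfs call; as a plain-C sketch (the function name is illustrative, and the Objective-C code queries NSHomeDirectory() rather than "/"):

```c
#include <sys/statvfs.h>

// Returns 0 on success, filling in total and free byte counts for the
// filesystem containing `path`. Sizes are block counts times the
// fundamental block size; f_bavail is the space available to
// unprivileged callers.
int disk_space(const char *path,
               unsigned long long *total_bytes,
               unsigned long long *free_bytes)
{
    struct statvfs s;
    if (statvfs(path, &s) != 0)
        return -1;
    *total_bytes = (unsigned long long)s.f_blocks * s.f_frsize;
    *free_bytes  = (unsigned long long)s.f_bavail * s.f_frsize;
    return 0;
}
```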
To count the number of files in a directory (including its subdirectories), I've used this (which isn't the most efficient way):
- (NSString *)numberOfSongs
{
    NSString *musicPath = @"/var/mobile/Media/iTunes_Control/Music/";
    NSArray *dirs = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:musicPath error:nil];
    NSArray *subs = [[NSFileManager defaultManager] subpathsOfDirectoryAtPath:musicPath error:nil];
    int subT = (int)[subs count];
    int dirT = (int)[dirs count];
    int totalFiles = subT - dirT;
    return [NSString stringWithFormat:@"%i", totalFiles];
}
Looks like WrightsCS answered disk space question.
If you want the number of images, check out ALAssetsLibrary in AssetsLibrary.framework (you'll have to add this framework in the "Link Binary With Libraries" section of the target settings), and then:
#import <AssetsLibrary/AssetsLibrary.h>

// get the image assets
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
NSAssert(library, @"Unable to open ALAssetsLibrary");
NSUInteger __block images = 0;
[library enumerateGroupsWithTypes:ALAssetsGroupAll
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
                           NSLog(@"%@", group);
                           images += group.numberOfAssets;
                           dispatch_async(dispatch_get_main_queue(), ^{
                               // update my UI with the number of images
                           });
                       }
                     failureBlock:^(NSError *err) {
                         NSLog(@"err=%@", err);
                     }];
If you want to access the iTunes library on non-jailbroken devices, check out the iPod Library Access Programming Guide, which shows you how to use MPMediaQuery (remember to include the MediaPlayer.framework in your project), and then:
#import <MediaPlayer/MediaPlayer.h>

MPMediaQuery *everything = [[MPMediaQuery alloc] init];
NSAssert(everything, @"Unable to open MPMediaQuery");
iTunesMediaCount = [[everything items] count];
I don't know if there's a published API for getting the number of apps. There are solutions for jailbroken devices, but I don't know about the rest of us.
You didn't ask about this, but if you want available RAM (not flash storage, but memory available for apps), you can get it via:
#import <mach/mach.h>
#import <mach/mach_host.h>

- (void)determineMemoryUsage
{
    mach_port_t host_port;
    mach_msg_type_number_t host_size;
    vm_size_t pagesize;
    host_port = mach_host_self();
    host_size = sizeof(vm_statistics_data_t) / sizeof(integer_t);
    host_page_size(host_port, &pagesize);
    vm_statistics_data_t vm_stat;
    if (host_statistics(host_port, HOST_VM_INFO, (host_info_t)&vm_stat, &host_size) != KERN_SUCCESS)
        NSLog(@"Failed to fetch vm statistics");
    /* Stats in bytes */
    natural_t mem_used = (vm_stat.active_count +
                          vm_stat.inactive_count +
                          vm_stat.wire_count) * pagesize;
    natural_t mem_free = vm_stat.free_count * pagesize;
    natural_t mem_total = mem_used + mem_free;
    // do whatever you want with mem_used, mem_free, and mem_total
}