I have seen other apps do it where you can import the last photo from the Photos app for quick use, but as far as I know, I only know how to get an image, not the last (most recent) one. Can anyone show me how to get the last image?
This code snippet will get the latest image from the camera roll (iOS 7 and below):
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    // Within the group enumeration block, filter to enumerate just photos.
    [group setAssetsFilter:[ALAssetsFilter allPhotos]];
    // Chooses the photo at the last index
    [group enumerateAssetsWithOptions:NSEnumerationReverse usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
        // The end of the enumeration is signaled by asset == nil.
        if (alAsset) {
            ALAssetRepresentation *representation = [alAsset defaultRepresentation];
            UIImage *latestPhoto = [UIImage imageWithCGImage:[representation fullScreenImage]];
            // Stop the enumerations
            *stop = YES; *innerStop = YES;
            // Do something interesting with the AV asset.
            [self sendTweet:latestPhoto];
        }
    }];
} failureBlock: ^(NSError *error) {
    // Typically you should handle an error more gracefully than this.
    NSLog(@"No groups");
}];
iOS 8 and above:
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:fetchOptions];
PHAsset *lastAsset = [fetchResult lastObject];
PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.version = PHImageRequestOptionsVersionCurrent;
[[PHImageManager defaultManager] requestImageForAsset:lastAsset
                                           targetSize:self.photoLibraryButton.bounds.size
                                          contentMode:PHImageContentModeAspectFill
                                              options:requestOptions
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
                                            dispatch_async(dispatch_get_main_queue(), ^{
                                                [[self photoLibraryButton] setImage:result forState:UIControlStateNormal];
                                            });
                                        }];
Great answer from iBrad, worked almost perfectly for me. The exception being that it was returning images at their original orientation (e.g. upside down, -90°, etc.).
To fix this I simply changed fullResolutionImage to fullScreenImage.
Here:
UIImage *latestPhoto = [UIImage imageWithCGImage:[representation fullScreenImage]];
It now works a treat.
iBrad's example includes an iOS 8 snippet that apparently works, but I found myself confused by the return type he described. Here is a snippet that grabs the last image, including options for version and size requirements.
Of note are the ability to request a specific version (original, current) and size. In my case, as I wish to apply the returned image to a button, I request it sized and scaled to fit the button I'm applying it to:
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:fetchOptions];
PHAsset *lastAsset = [fetchResult lastObject];
PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.version = PHImageRequestOptionsVersionCurrent;
[[PHImageManager defaultManager] requestImageForAsset:lastAsset
                                           targetSize:self.photoLibraryButton.bounds.size
                                          contentMode:PHImageContentModeAspectFill
                                              options:requestOptions
                                        resultHandler:^(UIImage *result, NSDictionary *info) {
                                            dispatch_async(dispatch_get_main_queue(), ^{
                                                [[self photoLibraryButton] setImage:result forState:UIControlStateNormal];
                                            });
                                        }];
Well, here is a solution for how to load the last image from the gallery with Swift 3:
func loadLastImageThumb(completion: @escaping (UIImage) -> ()) {
    let imgManager = PHImageManager.default()
    let fetchOptions = PHFetchOptions()
    fetchOptions.fetchLimit = 1
    // Sort newest first; with fetchLimit = 1 the single result is the most recent photo.
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let fetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.image, options: fetchOptions)
    if let last = fetchResult.firstObject {
        let scale = UIScreen.main.scale
        let size = CGSize(width: 100 * scale, height: 100 * scale)
        let options = PHImageRequestOptions()
        imgManager.requestImage(for: last, targetSize: size, contentMode: PHImageContentMode.aspectFill, options: options, resultHandler: { (image, _) in
            if let image = image {
                completion(image)
            }
        })
    }
}
If you need more speed, you can also use PHImageRequestOptions and set those:
options.deliveryMode = .fastFormat
options.resizeMode = .fast
And this is the way you get it in your viewController (you should replace GalleryManager.manager with your class):
GalleryManager.manager.loadLastImageThumb { [weak self] (image) in
DispatchQueue.main.async {
self?.galleryButton.setImage(image, for: .normal)
}
}
Thanks for your answer iBrad Apps.
Just wanted to point out a guard for the special case when the user has no images in their camera roll (a strange case, I know):
// Within the group enumeration block, filter to enumerate just photos.
[group setAssetsFilter:[ALAssetsFilter allPhotos]];
//Check that the group has more than one picture
if ([group numberOfAssets] > 0) {
// Chooses the photo at the last index
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:([group numberOfAssets] - 1)] options:0 usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
// The end of the enumeration is signaled by asset == nil.
if (alAsset) {
ALAssetRepresentation *representation = [alAsset defaultRepresentation];
UIImage *latestPhoto = [UIImage imageWithCGImage:[representation fullScreenImage]];
[self.libraryButton setImage:latestPhoto forState:UIControlStateNormal];
}
}];
}
else {
//Handle this special case
}
Refer to the answer by Liam. fullScreenImage will return a scaled image fitting your device's screen size. To get the actual image size:
ALAssetRepresentation *representation = [alAsset defaultRepresentation];
ALAssetOrientation orientation = [representation orientation];
UIImage *latestPhoto = [UIImage imageWithCGImage:[representation fullResolutionImage] scale:[representation scale] orientation:(UIImageOrientation)orientation];
Quoting Apple's ALAssetRepresentation Class Reference on fullResolutionImage:
To create a correctly-rotated UIImage object from the CGImage, you use
imageWithCGImage:scale:orientation: or
initWithCGImage:scale:orientation:, passing the values of orientation
and scale.
I found a typo that I'm embarrassed to admit took me longer than it should have to figure out. Maybe it will save someone else some time.
This line was missing a colon after indexSetWithIndex:
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:[group numberOfAssets] - 1] options:0 usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
Building upon iBrad's answer, here's a quick & dirty Swift version that works for me in iOS 8.1:
let imgManager = PHImageManager.defaultManager()
var fetchOptions = PHFetchOptions()
fetchOptions.sortDescriptors = [NSSortDescriptor(key:"creationDate", ascending: true)]
if let fetchResult = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions) {
imgManager.requestImageForAsset(fetchResult.lastObject as PHAsset, targetSize: self.destinationImageView.frame.size, contentMode: PHImageContentMode.AspectFill, options: nil, resultHandler: { (image, _) in
self.destinationImageView.image = image
})
}
Note: this requires iOS 8.0+. Be sure to link the Photos framework and add "import Photos" in your file.
Here is a version in Swift which requests the data and converts it to a UIImage, as the provided version returned an empty UIImage every time.
let fetchOptions = PHFetchOptions()
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
let fetchResult = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)
if let lastAsset: PHAsset = fetchResult.lastObject as? PHAsset {
let manager = PHImageManager.defaultManager()
let imageRequestOptions = PHImageRequestOptions()
manager.requestImageDataForAsset(lastAsset, options: imageRequestOptions) {
(let imageData: NSData?, let dataUTI: String?,
let orientation: UIImageOrientation,
let info: [NSObject : AnyObject]?) -> Void in
if let imageDataUnwrapped = imageData, lastImageRetrieved = UIImage(data: imageDataUnwrapped) {
// do stuff with image
}
}
}
Here's a combination of iBrad's and Javier's answers (which worked great), but I am getting the thumbnail asset instead of the full-resolution image. Some others may find this handy.
- (void)setCameraRollImage {
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
[group setAssetsFilter:[ALAssetsFilter allPhotos]];
if ([group numberOfAssets] > 0) {
// Chooses the photo at the last index
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:([group numberOfAssets] - 1)] options:0 usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
// The end of the enumeration is signaled by asset == nil.
if (alAsset) {
UIImage *latestPhoto = [UIImage imageWithCGImage:[alAsset thumbnail]];
[self.cameraRollButton setImage:latestPhoto forState:UIControlStateNormal];
}
}];
}
} failureBlock: ^(NSError *error) {
}];
}
Xamarin.iOS version of the accepted answer (how to get the last image), including all notes from the other answers:
private void ChooseLastTakenPictureImplementation()
{
var library = new ALAssetsLibrary();
// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
library.Enumerate(ALAssetsGroupType.SavedPhotos, (ALAssetsGroup assetsGroup, ref bool stop) =>
{
if (stop || assetsGroup == null)
{
return;
}
//Xamarin does not support ref parameters in nested lambda expressions
var lambdaStop = false;
//Check that the group has more than one picture
if (assetsGroup.Count > 0)
{
// Within the group enumeration block, filter to enumerate just photos.
assetsGroup.SetAssetsFilter(ALAssetsFilter.AllPhotos);
// Chooses the photo at the last index
assetsGroup.Enumerate(NSEnumerationOptions.Reverse, (ALAsset result, int index, ref bool innerStop) =>
{
// The end of the enumeration is signaled by asset == nil.
if (result != null)
{
var representation = result.DefaultRepresentation;
var latestPhoto = new UIImage(representation.GetImage(), representation.Scale, (UIImageOrientation)representation.Orientation);
// Stop the enumerations
lambdaStop = innerStop = true;
// Do something interesting with the AV asset.
HandleImageAutoPick(latestPhoto);
}
});
stop = lambdaStop;
return;
}
else
{
//Handle this special case where user has no pictures
}
}, error =>
{
// Typically you should handle an error more gracefully than this.
Debug.WriteLine(error.Description);
});
}
This is a very cool approach, but one of the issues is that you have to be able to instantiate PHPhotoLibrary and the other Photos framework classes at runtime, because otherwise there will be link errors on iOS 7.x. Just wanted to point that out because I am running into these issues now.
Also, I believe you have to weak-link the Photos framework in order for the app to run on devices with both iOS 8.x and iOS 7.x installed (although I have not tested this yet).
One of the issues I am running into is how to instantiate the PHPhotoLibrary at runtime. Does anyone have code snippets for that?
Actually, for the app that I was working on, I did finally have to write runtime code for instantiating the PHPhotoLibrary class and calling Photos framework methods so the app would run on both iOS 7.x and iOS 8.x. Someone else may run into the same issues, so I provided the code below:
// PHPhotoLibrary_class will only be non-nil on iOS 8.x.x
Class PHPhotoLibrary_class = NSClassFromString(@"PHPhotoLibrary");
if (PHPhotoLibrary_class) {
/**
*
iOS 8.x code that has to be called dynamically at runtime and will not link on iOS 7.x ...
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
[PHAssetCollectionChangeRequest creationRequestForAssetCollectionWithTitle:title];
} completionHandler:^(BOOL success, NSError *error) {
if (!success) {
NSLog(#"Error creating album: %#", error);
}
}];
*/
// dynamic runtime code for code chunk listed above
id sharedPhotoLibrary = [PHPhotoLibrary_class performSelector:NSSelectorFromString(@"sharedPhotoLibrary")];
SEL performChanges = NSSelectorFromString(@"performChanges:completionHandler:");
NSMethodSignature *methodSig = [sharedPhotoLibrary methodSignatureForSelector:performChanges];
NSInvocation* inv = [NSInvocation invocationWithMethodSignature:methodSig];
[inv setTarget:sharedPhotoLibrary];
[inv setSelector:performChanges];
void(^firstBlock)() = ^void() {
Class PHAssetCollectionChangeRequest_class = NSClassFromString(@"PHAssetCollectionChangeRequest");
SEL creationRequestForAssetCollectionWithTitle = NSSelectorFromString(@"creationRequestForAssetCollectionWithTitle:");
[PHAssetCollectionChangeRequest_class performSelector:creationRequestForAssetCollectionWithTitle withObject:albumName];
};
void (^secondBlock)(BOOL success, NSError *error) = ^void(BOOL success, NSError *error) {
if (success) {
[assetsLib enumerateGroupsWithTypes:ALAssetsGroupAlbum usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
if (group) {
NSString *name = [group valueForProperty:ALAssetsGroupPropertyName];
if ([albumName isEqualToString:name]) {
groupFound = true;
handler(group, nil);
}
}
} failureBlock:^(NSError *error) {
handler(nil, error);
}];
}
if (error) {
NSLog(#"Error creating album: %#", error);
handler(nil, error);
}
};
// Set the first and second blocks.
[inv setArgument:&firstBlock atIndex:2];
[inv setArgument:&secondBlock atIndex:3];
[inv invoke];
}
else {
// code that always creates an album on iOS 7.x.x but fails
// in certain situations such as if album has been deleted
// previously on iOS 8.x
[assetsLib addAssetsGroupAlbumWithName:albumName
resultBlock:^(ALAssetsGroup *group) {
handler(group, nil);
} failureBlock:^(NSError *error) {
NSLog( #"Failed to create album: %#", albumName);
handler(nil, error);
}];
}
The following code works with iOS 7 and iOS 8. It also checks that there is at least one image in the filtered group. Before you execute the code you should check the album permission (a sketch of such a check is shown just below, followed by the method itself):
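This permission check is not part of the original answer; it is a minimal illustrative sketch assuming iOS 8+ and PHPhotoLibrary (on iOS 7, +[ALAssetsLibrary authorizationStatus] plays the same role):
switch ([PHPhotoLibrary authorizationStatus]) {
    case PHAuthorizationStatusAuthorized:
        // Already allowed -- safe to read the library.
        [self getLatestPhoto];
        break;
    case PHAuthorizationStatusNotDetermined:
        // Ask once; continue only if the user grants access.
        [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
            if (status == PHAuthorizationStatusAuthorized) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self getLatestPhoto];
                });
            }
        }];
        break;
    default:
        // Denied or restricted -- do not touch the library.
        NSLog(@"Photo library access denied or restricted");
        break;
}
The getLatestPhoto method referenced above is the one from the answer: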
// get the latest image from the album
-(void)getLatestPhoto
{
NSLog(#"MMM TGCameraViewController - getLatestPhoto");
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
// Within the group enumeration block, filter to enumerate just photos.
[group setAssetsFilter:[ALAssetsFilter allPhotos]];
// For this example, we're only interested in the last item [group numberOfAssets]-1 = last.
if ([group numberOfAssets] > 0) {
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:[group numberOfAssets]-1]
options:0
usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
// The end of the enumeration is signaled by asset == nil.
if (alAsset) {
ALAssetRepresentation *representation = [alAsset defaultRepresentation];
// Do something interesting with the AV asset.
UIImage *img = [UIImage imageWithCGImage:[representation fullScreenImage]];
// use the photo here ...
// we only need the first (most recent) photo -- stop the enumeration
*innerStop = YES;
}
}];
}
}
failureBlock: ^(NSError *error) {
// Typically you should handle an error more gracefully than this.
NSLog(#"No groups");
}];
}
(This code is a modified version from here.)
My company is having a big problem with getting correct size metadata when fetching PHAssets.
We have developed an iOS application that lets customers choose pictures from the library, get the size (in pixels) of each of them, calculate coordinates for adjusting to the gadgets we sell, then upload a high-quality version of the picture to our server to print the gadgets.
For some of our customers, the problem is that the pixel size of some of the high-quality versions of pictures sent does not match the pixelWidth and pixelHeight given by the PHAsset object.
To make an example, we have a picture that:
is reported to be 2096x3724 by PHAsset object
but, when full size image is requested, a 1536x2730 picture is generated
The picture is not in iCloud, and is sent by an iPhone 6 SE running iOS 10.2.
This is the code to get full size image version:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageForAsset:imageAsset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:imgOpts resultHandler:^(UIImage * result, NSDictionary * info) {
NSData * imageData = UIImageJPEGRepresentation(result, 0.92f);
//UPLOAD OF imageData TO SERVER HERE
}];
We also tried the requestImageDataForAsset method, but with no luck:
PHImageRequestOptions *imgOpts = [[PHImageRequestOptions alloc] init];
imgOpts.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
imgOpts.networkAccessAllowed = YES;
imgOpts.resizeMode = PHImageRequestOptionsResizeModeExact;
imgOpts.version = PHImageRequestOptionsVersionCurrent;
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
[imageManager requestImageDataForAsset:imageAsset options:imgOpts resultHandler:^(NSData * imageData, NSString * dataUTI, UIImageOrientation orientation, NSDictionary * info) {
//UPLOAD OF imageData TO SERVER HERE
}];
Getting the exact size from the high-resolution version of every picture before uploading is not an option for us, because it would significantly degrade performance when selecting a large number of assets from the library.
Are we missing or doing something wrong?
Is there a way to get the asset size in pixels without loading the full-resolution image into memory?
Thanks for helping
This is due to a bug in the Photos framework. Details about the bug can be found here.
Sometimes, after a photo is edited, a smaller version is created. This only occurs for some larger photos.
Calling either requestImageForAsset: (with PHImageManagerMaximumSize) or requestImageDataForAsset: (with PHImageRequestOptionsDeliveryModeHighQualityFormat) will read the data from the smaller file version, when trying to retrieve the edited version (PHImageRequestOptionsVersionCurrent).
The info in the callback of the above methods will point the path to the image. As an example:
PHImageFileURLKey = "file:///[...]DCIM/100APPLE/IMG_0006/Adjustments/IMG_0006.JPG";
Inspecting that folder, I was able to find another image, FullSizeRender.jpg - this one has the full size and contains the latest edits. Thus, one way would be to try and read from the FullSizeRender.jpg, when such a file is present.
Starting with iOS 9, it's also possible to fetch the latest edit, at highest resolution, using the PHAssetResourceManager:
// if (@available(iOS 9.0, *)) {
// check if a high quality edit is available
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:_asset];
PHAssetResource *hqResource = nil;
for (PHAssetResource *res in resources) {
if (res.type == PHAssetResourceTypeFullSizePhoto) {
// from my tests so far, this is only present for edited photos
hqResource = res;
break;
}
}
if (hqResource) {
PHAssetResourceRequestOptions *options = [[PHAssetResourceRequestOptions alloc] init];
options.networkAccessAllowed = YES;
long long fileSize = [[hqResource valueForKey:@"fileSize"] longLongValue];
NSMutableData *fullData = [[NSMutableData alloc] initWithCapacity:fileSize];
[[PHAssetResourceManager defaultManager] requestDataForAssetResource:hqResource options:options dataReceivedHandler:^(NSData * _Nonnull data) {
// append the data that we're receiving
[fullData appendData:data];
} completionHandler:^(NSError * _Nullable error) {
// handle completion, using `fullData` or `error`
// uti == hqResource.uniformTypeIdentifier
// orientation == UIImageOrientationUp
}];
}
else {
// use `requestImageDataForAsset:`, `requestImageForAsset:` or `requestDataForAssetResource:` with a different `PHAssetResource`
}
You can try this to fetch camera roll pics:
__weak __typeof(self) weakSelf = self;
PHFetchResult<PHAssetCollection *> *albums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumSelfPortraits options:nil];
[albums enumerateObjectsUsingBlock:^(PHAssetCollection * _Nonnull album, NSUInteger idx, BOOL * _Nonnull stop) {
PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.wantsIncrementalChangeDetails = YES;
options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d", PHAssetMediaTypeImage];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult<PHAsset *> *assets = [PHAsset fetchAssetsInAssetCollection:album options:options];
if(assets.count>0)
{
[assets enumerateObjectsUsingBlock:^(PHAsset * _Nonnull asset, NSUInteger idx, BOOL * _Nonnull stop) {
if(asset!=nil)
{
[[PHImageManager defaultManager] requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info)
{
dispatch_async(dispatch_get_main_queue(), ^{
[weakSelf addlocalNotificationForFilters:result];
// [weakSelf.buttonGalery setImage:result forState:UIControlStateNormal];
});
}];
*stop = YES;
}
else{
[weakSelf getlatestAferSelfie];
}
}];
}
}];
When I take a video using UIImagePickerController and save it to SavedPhotosAlbum using
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:(NSURL *)movieURL
                            completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save video fail: %@", error);
    } else {
        NSLog(@"Save video succeed.");
        [self getsavedvideo];
    }
}];
the saved file contains a name and creation date but not the GPS info. I am surprised by this since I am using the built-in camera and the image picker. Why wouldn't Apple just include the GPS coordinates (even if I had to ask the user's permission to use location services)?
Anyway, I digress.
I have not found any way to add the GPS coordinates to the video as it is being saved, and I haven't been able to find any way to add this information in the completionBlock.
If someone knows how, please let me know.
So I wrote a bit of code that is called after the video is saved ([self getsavedvideo]; above), which gets the saved video; I can get the movie name and creation date, and it shows there is no location info.
- (void)getsavedvideo {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // Within the group enumeration block, filter to enumerate just videos.
        [group setAssetsFilter:[ALAssetsFilter allVideos]];
        // Chooses the video at the last index
        [group enumerateAssetsWithOptions:NSEnumerationReverse usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
            // The end of the enumeration is signaled by asset == nil.
            if (alAsset) {
                ALAssetRepresentation *representation = [alAsset defaultRepresentation];
                // Stop the enumerations
                *stop = YES; *innerStop = YES;
                // Do something interesting with the AV asset.
                NSString *fileName = [representation filename];
                NSDate *myDate = [alAsset valueForProperty:ALAssetPropertyDate];
                CLLocation *location = [alAsset valueForProperty:ALAssetPropertyLocation];
                NSLog(@"fileName!!!! %@", fileName);
                NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
                [dateFormatter setDateFormat:@"dd-MM-yyyy 'at' HH:mm"];
                NSLog(@"date for this file is %@ %@", [[[alAsset defaultRepresentation] url] absoluteString], [dateFormatter stringFromDate:myDate]);
                NSLog(@"locate!!!! %@", location);
            }
        }];
    } failureBlock:^(NSError *error) {
        NSLog(@"No groups");
    }];
}
So does anyone know how to append the GPS information and save it back into the video's metadata?
I have been wracking my brain on this for a few days and haven't really found anything helpful.
Please help.
So I figured this out, for a device running iOS 8 or newer anyway.
Add this code to the -(void)getsavedvideo method that was called in the completionBlock above.
Make sure you have
#import <Photos/Photos.h>
if ([PHAsset class]) { // If this class is available, we're running iOS 8
PHFetchOptions *fetchOptions = [PHFetchOptions new];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
fetchOptions.fetchLimit = 1;
PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:fetchOptions];
PHAsset *lastImageAsset = [fetchResult lastObject];
[[PHImageManager defaultManager]requestImageForAsset:lastImageAsset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:nil resultHandler:^(UIImage *result, NSDictionary *info){
if ([info objectForKey:PHImageErrorKey] == nil && ![[info objectForKey:PHImageResultIsDegradedKey] boolValue]) {
NSArray *resources = [PHAssetResource assetResourcesForAsset:lastImageAsset];
NSString *orgFilename = ((PHAssetResource*)resources[0]).originalFilename;
[lastImageAsset requestContentEditingInputWithOptions:nil
completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
NSURL *imageURL = contentEditingInput.fullSizeImageURL;
NSString *urlstring = [imageURL absoluteString];
NSLog(#"urlstring%#",urlstring);
}];
CLLocationCoordinate2D locationNew = CLLocationCoordinate2DMake( currentLocation.coordinate.latitude, currentLocation.coordinate.longitude) ;
NSDate *nowDate = [NSDate date];
CLLocation *myLocation = [[CLLocation alloc ]initWithCoordinate:locationNew altitude:0.0 horizontalAccuracy:1.0 verticalAccuracy:1.0 timestamp:nowDate];
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
// Create a change request from the asset to be modified.
PHAssetChangeRequest *request = [PHAssetChangeRequest changeRequestForAsset:lastImageAsset];
// Set a property of the request to change the asset itself.
request.location = myLocation;
} completionHandler:^(BOOL success, NSError *error) {
NSLog(#"Finished updating asset. %#", (success ? #"Success." : error));
}];
}
}];
}
else {
//if you aren't running iOS 8 or newer I haven't found a way to add/change metadata, but I will keep looking. So if you leave this blank, no new metadata is added.
}
You can add other metadata too: just create the data in the form needed and then use request.(what you want to change) = new data.
You can check whether the data already exists, and whether there is metadata for it, by querying lastImageAsset (which was the PHAsset fetch result above) for the property in question, for example:
NSDate *originalFileDate = [lastImageAsset creationDate]; etc.
There is a list at
https://developer.apple.com/reference/photos/phasset
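For example, a hedged sketch of reading an existing property and writing a couple of others through the same change-request pattern used for the location above (lastImageAsset is the PHAsset fetched earlier; the favorite flag and the new date are purely illustrative values):
// Reading existing metadata is just a matter of the PHAsset accessors.
NSDate *originalFileDate = [lastImageAsset creationDate];   // may be nil
CLLocation *existingLocation = [lastImageAsset location];   // nil if there is no GPS metadata
// Writing goes through a change request, the same pattern as the location example.
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *request = [PHAssetChangeRequest changeRequestForAsset:lastImageAsset];
    request.creationDate = [NSDate date];   // e.g. overwrite the creation date
    request.favorite = YES;                 // or mark the asset as a favorite
} completionHandler:^(BOOL success, NSError *error) {
    NSLog(@"Finished updating asset metadata. %@", (success ? @"Success." : error));
}];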
I have no idea why this is so difficult. I'm trying to determine the file type of a PHAsset, specifically, I want to know if a given asset represents a GIF image or not.
Simply inspecting the asset's filename tells me it's an MP4:
[asset valueForKey:@"filename"] ==> "IMG_XXXX.MP4"
Does iOS convert GIFs to videos when they are saved to the device's image library? I've also tried fetching the image's data and looking at its dataUTI, but it just returns nil for GIFs (and, I'm assuming, for all videos as well). I'm fetching the image data as follows:
PHImageManager *manager = asset.imageManager ? asset.imageManager : [PHImageManager defaultManager];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
PHImageRequestOptions *o = [[PHImageRequestOptions alloc] init];
o.networkAccessAllowed = YES;
[manager requestImageDataForAsset:asset.asset options:o resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
dispatch_async(dispatch_get_main_queue(), ^{
CIImage *ciImage = [CIImage imageWithData:imageData];
if(completion) completion(imageData, dataUTI, orientation, info, ciImage.properties);
});
}];
});
the dataUTI returned from the above call is nil.
If anyone knows of a reliable way to determine a PHAsset's file type (I'm specifically looking for GIF's, but being able to determine for any type of file would be great) let me know!
Use PHAssetResource.
NSArray *resourceList = [PHAssetResource assetResourcesForAsset:asset];
[resourceList enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
PHAssetResource *resource = obj;
if ([resource.uniformTypeIdentifier isEqualToString:@"com.compuserve.gif"]) {
isGIFImage = YES;
}
}];
You can also find the uniformTypeIdentifier via the PHContentEditingInput class. For this, use the requestContentEditingInput function from PHAsset.
Don't forget to
import MobileCoreServices for kUTTypeGIF
Sample Swift 3.1 code:
let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true //for icloud backup assets
let asset : PHAsset = ..... //sampleAsset
asset.requestContentEditingInput(with: options) { (contentEditingInput, info) in
if let uniformTypeIdentifier = contentEditingInput?.uniformTypeIdentifier {
if uniformTypeIdentifier == (kUTTypeGIF as String) {
debugPrint("This asset is a GIF👍")
}
}
}
For Swift 3.0 and above
import MobileCoreServices
var isGIFImage = false
if let identifier = asset.value(forKey: "uniformTypeIdentifier") as? String
{
if identifier == kUTTypeGIF as String
{
isGIFImage = true
}
}
I guess since iOS 11, we can use:
if asset.playbackStyle == .imageAnimated {
// try to show gif animation
}
First of all, I am not sure what you mean by the GIF image.
Are you referring to a Live Photo or a Time-lapse?
However, if you want to check whether the current asset is a Live Photo or a Time-lapse, you can check like this:
if(asset.mediaSubtypes == PHAssetMediaSubtypePhotoLive)
{
// this is a Live Photo
}
if(asset.mediaSubtypes == PHAssetMediaSubtypeVideoTimelapse)
{
// this is a Time-lapse
}
For determining the generic file type of a PHAsset, you can check the following (a short sketch follows the list):
asset.mediaType == PHAssetMediaTypeImage
asset.mediaType == PHAssetMediaTypeVideo
asset.mediaType == PHAssetMediaTypeAudio
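As an illustration (not from the original answer), here is a hedged sketch that branches on these values for some PHAsset named asset; the logging is placeholder only:
switch (asset.mediaType) {
    case PHAssetMediaTypeImage:
        // Subtypes can narrow this down further (Live Photo, panorama, ...); PhotoLive is iOS 9.1+.
        if (asset.mediaSubtypes & PHAssetMediaSubtypePhotoLive) {
            NSLog(@"Live Photo");
        } else {
            NSLog(@"Still image");
        }
        break;
    case PHAssetMediaTypeVideo:
        if (asset.mediaSubtypes & PHAssetMediaSubtypeVideoTimelapse) {
            NSLog(@"Time-lapse");
        } else {
            NSLog(@"Video");
        }
        break;
    case PHAssetMediaTypeAudio:
        NSLog(@"Audio");
        break;
    default:
        NSLog(@"Unknown media type");
        break;
}
Both mediaType and mediaSubtypes are plain PHAsset properties, so this check does not require loading any image data.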
// phAsset is an object of type PHAsset
if let imageType = phAsset.value(forKey: "uniformTypeIdentifier") as? String {
if imageType == kUTTypeGIF as String {
//enter code here
}
}
I am an iOS developer. I want to get all images from the library, without UIImagePickerController, and take the first 10 images.
Any ideas?
There are lots of examples out there which will guide you on how to get images from the ALAssetsLibrary:
https://www.cocoacontrols.com/search?q=image+picker
Below is an example that gets the latest image from the library:
- (void)latestPhotoWithCompletion:(void (^)(UIImage *photo))completion
{
ALAssetsLibrary *library=[[ALAssetsLibrary alloc] init];
// Enumerate just the photos and videos group by using ALAssetsGroupSavedPhotos.
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
// Within the group enumeration block, filter to enumerate just photos.
[group setAssetsFilter:[ALAssetsFilter allPhotos]];
// For this example, we're only interested in the last item [group numberOfAssets]-1 = last.
if ([group numberOfAssets] > 0) {
[group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:[group numberOfAssets]-1] options:0
usingBlock:^(ALAsset *alAsset, NSUInteger index, BOOL *innerStop) {
// The end of the enumeration is signaled by asset == nil.
if (alAsset) {
ALAssetRepresentation *representation = [alAsset defaultRepresentation];
// Do something interesting with the AV asset.
UIImage *img = [UIImage imageWithCGImage:[representation fullScreenImage]];
// completion
completion(img);
// we only need the first (most recent) photo -- stop the enumeration
*innerStop = YES;
}
}];
}
} failureBlock: ^(NSError *error) {
// Typically you should handle an error more gracefully than this.
}];
}
Usage
__weak __typeof(self)wSelf = self;
[self latestPhotoWithCompletion:^(UIImage *photo) {
UIImageRenderingMode renderingMode = YES ? UIImageRenderingModeAlwaysOriginal : UIImageRenderingModeAlwaysTemplate;
[wSelf.switchCameraBut setImage:[photo imageWithRenderingMode:renderingMode] forState:UIControlStateNormal];
}];
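The snippet above only grabs the most recent photo. For the original question (the first 10 images, without UIImagePickerController), a hedged PhotoKit sketch for iOS 8+ could look like the following; the count of 10 and the 200x200 thumbnail size are assumptions, not requirements:
// Fetch the image assets sorted newest first and request small thumbnails for the first 10.
PHFetchOptions *fetchOptions = [[PHFetchOptions alloc] init];
fetchOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
PHFetchResult *assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:fetchOptions];
NSUInteger count = MIN((NSUInteger)10, assets.count);
PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
requestOptions.resizeMode = PHImageRequestOptionsResizeModeFast;
for (NSUInteger i = 0; i < count; i++) {
    PHAsset *asset = assets[i];
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(200, 200)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:requestOptions
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        // Collect or display each thumbnail here.
    }];
}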
How do I get access to the full-size images from iCloud? Every time I try to get such a picture, I get an image sized 256x342. I also don't see any progress.
Code:
PHFetchResult *result = [PHAsset fetchAssetsWithLocalIdentifiers:#[assetIdentifier] options:nil];
PHImageManager *manager = [PHImageManager defaultManager];
[result enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
options.synchronous = YES;
options.networkAccessAllowed = YES;
options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *info) {
NSLog(#"%f", progress);
};
[manager requestImageForAsset:asset targetSize:PHImageManagerMaximumSize contentMode:PHImageContentModeDefault options:options resultHandler:^(UIImage *resultImage, NSDictionary *info)
{
UIImage *image = resultImage;
NSLog(#"%#", NSStringFromCGSize(resultImage.size));
}];
}];
Until I tap the picture in the Photos app, the picture is of poor quality. But as soon as I tap on it, it is downloaded to the device and becomes available at full quality.
I think the below should get the full resolution image data:
[manager requestImageDataForAsset:asset
options:options
resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info)
{
UIImage *image = [UIImage imageWithData:imageData];
//...
}];
The entire Photos Framework (PhotoKit) is covered in the WWDC video: https://developer.apple.com/videos/wwdc/2014/#511
Hope this helps.
Edit:
The resultHandler can be called twice. This is explained in the video I linked to, at around 30:00. It could be that you are only getting the thumbnail, and the full image will come the second time it's called.
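One way to tell the two callbacks apart is to inspect the info dictionary inside the resultHandler. A hedged sketch, reusing the manager, asset and options from the question's code:
[manager requestImageForAsset:asset
                   targetSize:PHImageManagerMaximumSize
                  contentMode:PHImageContentModeDefault
                      options:options
                resultHandler:^(UIImage *resultImage, NSDictionary *info) {
    BOOL isDegraded = [info[PHImageResultIsDegradedKey] boolValue];
    if (isDegraded) {
        // Fast, low-quality preview; the handler should fire again with the full image.
    } else {
        // Final, full-quality image for this request.
        NSLog(@"final size: %@", NSStringFromCGSize(resultImage.size));
    }
}];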
I'm having some of the same issues. It is either a bug or poor documentation. I've been able to get around the issue by specifying a requested size of 2000x2000. The problem with this is that I do get the full-size image, but sometimes it comes back marked as degraded, so I keep waiting for a different image which never arrives. This is what I do to get around those issues:
self.selectedAsset = asset;
self.collectionView.allowsSelection = NO;
PHImageRequestOptions* options = [[[PHImageRequestOptions alloc] init] autorelease];
options.synchronous = NO;
options.version = PHImageRequestOptionsVersionCurrent;
options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
options.resizeMode = PHImageRequestOptionsResizeModeNone;
options.networkAccessAllowed = YES;
options.progressHandler = ^(double progress,NSError *error,BOOL* stop, NSDictionary* dict) {
NSLog(#"progress %lf",progress); //never gets called
};
[self.delegate choosePhotoCollectionVCIsGettingPhoto:YES]; //show activity indicator
__block BOOL isStillLookingForPhoto = YES;
currentImageRequestId = [[PHImageManager defaultManager] requestImageForAsset:asset targetSize:CGSizeMake(2000, 2000) contentMode:PHImageContentModeAspectFill options:options resultHandler:^(UIImage *result, NSDictionary *info) {
NSLog(#"result size:%#",NSStringFromCGSize(result.size));
BOOL isRealDealForSure = NO;
NSNumber* n = info[@"PHImageResultIsPlaceholderKey"]; //undocumented key so I don't count on it
if (n != nil && [n boolValue] == NO){
isRealDealForSure = YES;
}
if([info[PHImageResultIsInCloudKey] boolValue]){
NSLog(#"image is in the cloud"); //never seen this. (because I allowed network access)
}
else if([info[PHImageResultIsDegradedKey] boolValue] && !isRealDealForSure){
//do something with the small image...but keep waiting
[self.delegate choosePhotoCollectionVCPreviewSmallPhoto:result];
self.collectionView.allowsSelection = YES;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(3.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{ //random time of 3 seconds to get the full resolution in case the degraded key is lying to me. The user can move on but we will keep waiting.
if(isStillLookingForPhoto){
self.selectedImage = result;
[self.delegate choosePhotoCollectionVCPreviewFullPhoto:self.selectedImage]; //remove activity indicator and let the user move on
}
});
}
else {
//do something with the full result and get rid of activity indicator.
if(asset == self.selectedAsset){
isStillLookingForPhoto = NO;
self.selectedImage = result;
[self.delegate choosePhotoCollectionVCPreviewFullPhoto:self.selectedImage];
self.collectionView.allowsSelection = YES;
}
else {
NSLog(#"ignored asset because another was pressed");
}
}
}];
To get the full size image you need to check the info list.
I used this to test if the returned result is the full image, or a degraded version.
if ([[info valueForKey:#"PHImageResultIsDegradedKey"]integerValue]==0){
// Do something with the FULL SIZED image
} else {
// Do something with the regraded image
}
or you could use this to check if you got back what you asked for.
if ([[info valueForKey:#"PHImageResultWantedImageFormatKey"]integerValue]==[[info valueForKey:#"PHImageResultDeliveredImageFormatKey"]integerValue]){
// Do something with the FULL SIZED image
} else {
// Do something with the regraded image
}
There are a number of other, undocumented but useful, keys e.g.
PHImageFileOrientationKey = 3;
PHImageFileSandboxExtensionTokenKey = "/private/var/mobile/Media/DCIM/100APPLE/IMG_0780.JPG";
PHImageFileURLKey = "file:///var/mobile/Media/DCIM/100APPLE/IMG_0780.JPG";
PHImageFileUTIKey = "public.jpeg";
PHImageResultDeliveredImageFormatKey = 9999;
PHImageResultIsDegradedKey = 0;
PHImageResultIsInCloudKey = 0;
PHImageResultIsPlaceholderKey = 0;
PHImageResultRequestIDKey = 1;
PHImageResultWantedImageFormatKey = 9999;
Have fun.
Linasses
I believe it's related to you setting PHImageRequestOptionsDeliveryModeOpportunistic.
Note that this is not even supported for asynchronous mode (default).
Try PHImageRequestOptionsDeliveryModeHighQualityFormat instead.
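A hedged sketch of what that change might look like, reusing the 2000x2000 target size from the answer above (asset is the PHAsset being requested):
PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat; // resultHandler fires once, with the final image
options.networkAccessAllowed = YES;   // allow the iCloud download if needed
options.resizeMode = PHImageRequestOptionsResizeModeNone;
[[PHImageManager defaultManager] requestImageForAsset:asset
                                            targetSize:CGSizeMake(2000, 2000)
                                           contentMode:PHImageContentModeAspectFill
                                               options:options
                                         resultHandler:^(UIImage *result, NSDictionary *info) {
    // With high-quality delivery this should be the single, full-quality result.
}];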