I am creating a button to let the user choose an image from their photo library, and I would like to hide this button if there are no images in the user’s photo library.
// kUTTypeImage requires <MobileCoreServices/MobileCoreServices.h>.
BOOL stillImagesAvailable = [[UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypePhotoLibrary] containsObject:(NSString *)kUTTypeImage];
if (!stillImagesAvailable) {
// Hide button
return;
}
stillImagesAvailable = [UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypePhotoLibrary];
if (!stillImagesAvailable) {
// Hide button
return;
}
// Show button
+[UIImagePickerController isSourceTypeAvailable:] is documented to return NO if there are no photos in the library, but I'm seeing it return YES in this case on iOS 11 running on the simulator. Am I using this method wrong, or is the documentation incorrect, or am I running into a bug?
Is there another good way to detect whether there are any images in the user’s photo library?
UIImagePickerController doesn't expose this information. That class generally doesn't provide information about the user's photo library, merely some user-selected contents thereof. (The documentation you cite appears to be incorrect — I'd recommend filing a bug against the documentation for them to change it.)
(Aside: the "user-selected" part is important in iOS 11 and later: the image picker runs in a separate process, so your app gets access only to the picked assets, which means you don't have to ask the user for blanket read/write access to the Photos library through privacy settings.) Keep that privacy point in mind for further down in this answer, though...
If you need to learn about the contents of the user's Photos library, use the Photos framework. If specifically you want to know whether the library is "empty", you'll need to define what "empty" means for your app. No assets saved in the local library through iOS? No assets synced onto the device through iTunes? What if I have no assets, but I do have some empty albums?
Assuming one possible answer to those questions (no local or synced assets, don't care about albums), here's some (untested) code that should get your answer:
- (BOOL)isPhotoLibraryEmpty {
    // Requires the Photos framework (@import Photos;).
    PHFetchOptions *options = [PHFetchOptions new];
    // Count only assets saved locally or synced from iTunes; albums are ignored.
    options.includeAssetSourceTypes = PHAssetSourceTypeUserLibrary | PHAssetSourceTypeiTunesSynced;
    PHFetchResult *results = [PHAsset fetchAssetsWithOptions:options];
    return results.count == 0;
}
However, if this would be your app's only use of the Photos framework, it might be wiser to think about whether it's worthwhile to preemptively check for an empty library. If you use the Photos framework at all, your app needs blanket read/write access to the Photos library through the iOS privacy settings (that is, you provide an NSPhotoLibraryUsageDescription in your Info.plist, and iOS prompts the user for permission the first time you call any Photos API).
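If you do adopt the Photos framework, note that merely reading the authorization status does not present the prompt; only actual library access does. A minimal Swift sketch:

import Photos

// Reading the status alone does not show the permission prompt;
// any actual fetch against the library will.
switch PHPhotoLibrary.authorizationStatus() {
case .notDetermined:
    break // the user hasn't been asked yet; a fetch would interrupt them
case .authorized:
    break // safe to fetch without triggering a new prompt
default:
    break // denied or restricted; fetches will return empty results
}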
For example, if all you're using the Photos framework for is to check for an empty library so you know whether to disable a "pick a photo" button in your UI... getting that deep into the privacy/permissions system probably isn't worth it. (Now you're actively interrupting them with a privacy prompt instead of passively disabling a button.) It's probably better to just let the user do whatever your UI does for invoking the image picker, and let UIImagePickerController show an appropriate screen if the library is empty (which it does).
According to these SO questions (UIImagePickerController not asking for permission and No permission to pick a photo from the photo library):
If you want to select one image on iOS, you don't have to ask for permission to do it as the app doesn't actually access the gallery.
However, I can't find a way of doing it in Flutter. Packages like image_picker always ask for permission.
Has anyone succeeded in picking an image in Flutter on iOS without asking for permission?
From Apple documentation:
PHPickerViewController is a new picker that replaces UIImagePickerController. Its user interface matches that of the Photos app, supports search and multiple selection of photos and videos, and provides fluid zooming of content. Because the system manages its life cycle in a separate process, it’s private by default. The user doesn’t need to explicitly authorize your app to select photos, which results in a simpler and more streamlined user experience.
This library uses PHPickerViewController as seen here
The old UIImagePickerController allowed this on earlier iOS versions, but it has been deprecated since iOS 14.
The Flutter image_picker plugin uses PHPickerViewController in its iOS code (I checked its source on GitHub), and it lets you pick an image without requesting permission. I highly recommend using that plugin.
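For reference, here is a minimal native Swift sketch (assumed names, not the plugin's actual code) of what that looks like with PHPickerViewController on iOS 14+; presenting it does not trigger a Photos permission prompt:

import PhotosUI
import UIKit

// Sketch: PHPickerViewController runs out of process, so no Photos
// permission prompt is shown when it is presented.
final class PickerPresenter: NSObject, PHPickerViewControllerDelegate {
    func present(from viewController: UIViewController) {
        var config = PHPickerConfiguration()
        config.filter = .images       // still images only
        config.selectionLimit = 1     // single selection
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        viewController.present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        guard let provider = results.first?.itemProvider,
              provider.canLoadObject(ofClass: UIImage.self) else { return }
        provider.loadObject(ofClass: UIImage.self) { image, _ in
            // Use the picked UIImage here (delivered on a background queue).
        }
    }
}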
Try file_picker; it should work for you. It supports all platforms (including iOS and macOS) and various file types; you can specify custom file types, limit how many files can be selected, and pick files from the cloud (Google Drive, Dropbox, iCloud)...
First, add the latest file_picker as a dependency in your pubspec.yaml file.
Import the package in whichever file you want to use it, and you are good to go...
To pick a single file, use this code:
FilePickerResult? result = await FilePicker.platform.pickFiles();
if (result != null) {
  // path is nullable; File requires 'dart:io'.
  File file = File(result.files.single.path!);
} else {
  // User canceled the picker
}
To pick files with an extension filter:
FilePickerResult? result = await FilePicker.platform.pickFiles(
  type: FileType.custom,
  allowedExtensions: ['jpg', 'pdf', 'doc'],
);
You can find detailed usage here.
You can find the documentation here.
I am trying to get the PHAssets from the Camera Roll, iCloud, and Photo Stream in the same order in which they were added by the user. In iOS 11, I used:
PHFetchOptions *options = [[PHFetchOptions alloc] init];
PHFetchResult *fetchResults = [PHAsset fetchAssetsWithOptions:options];
But this stopped working in iOS 12: with this approach, iCloud images and Camera Roll images come back separately. Only two sort descriptor keys are supported, creationDate and modificationDate.
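For reference, applying those keys looks like this (a Swift sketch); the trouble is that neither key reflects the order in which the user added the assets:

import Photos

// Sort by creation date; "modificationDate" is the only other supported key path.
let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
let assets = PHAsset.fetchAssets(with: options)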
The second solution I tried was to fetch the photos from the 'User Library' smart album. The User Library behaves differently depending on whether the user has turned on iCloud Photo Library. If iCloud Photo Library is on, the User Library contains all the images from the Camera Roll and the iCloud library. If it is off, the User Library contains only the images from the Camera Roll. This returns the Camera Roll and iCloud images in the correct order. The problem is that it cannot fetch the images from Photo Stream.
A third solution: fetch the photos from the User Library if iCloud Photo Library is on, and if it is off, fetch them using the old method, which gives the correct order of Photo Stream and Camera Roll images. This solution also did not work.
Please check the following scenario:
1. iCloud Photo Library is off and Photo Stream is on. We use fetchAssetsWithOptions: with default options. The order is correct.
2. The user turns on iCloud Photo Library; Photo Stream automatically turns off. We fetch the photos from the User Library, which gives us all the images from iCloud as well as the Camera Roll, sorted with default options. The order is correct.
3. Now the user turns off iCloud Photo Library. All the iCloud images should be deleted from the device, but some of them are left in the Camera Roll; they are not deleted (maybe due to a bug in iCloud). As a result, these undeleted images always appear at the top. This also happens if the user downloads images from iCloud and they are not deleted when iCloud Photo Library is turned off.
So, is there any way to do this?
Updated for iOS 13, where using smartAlbumRecentlyAdded works better than smartAlbumUserLibrary because the behavior changed. (Feedback was sent to Apple about the change in behavior in iOS 13; hopefully a better solution will be at hand.)
Note that although smartAlbumRecentlyAdded returns items by the date they were added, you will not get ALL the items, just RECENT items.
var fetchResult: PHFetchResult<PHAssetCollection>
if #available(iOS 13.0, *) {
    // iOS 13 changed the behavior of the User Library smart album;
    // Recently Added preserves the "date added" order, but only for recent items.
    fetchResult = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumRecentlyAdded, options: nil)
} else {
    fetchResult = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
}
if let assetCollection = fetchResult.firstObject {
    self.allPhotos = PHAsset.fetchAssets(in: assetCollection, options: nil)
}
This question is a lot like Share data between two or more iPhone applications, except:
I'm looking for a way to do this in an iOS 8+ (or 9+) application using Swift.
I want to be able to use the sound file contained in the first app, so the easter egg I'm making is only available when the user has both apps installed.
Since this is part of an easter egg, I don't want to use any method that would cause anything extra to be displayed on screen, such as a browser redirect or some kind of permission popup.
(This basically rules out using the share sheet and custom URLs to pass data as described in the post above.)
I am using AVAudioPlayer and AVAudioSession in the first app to play the sound if that is at all helpful.
Use an App Group.
You can share actual NSData through NSUserDefaults:
if let userDefaults = NSUserDefaults(suiteName: <group>) {
    userDefaults.setObject(obj, forKey: key)
}
and retrieve from another app in the same group like so:
if let userDefaults = NSUserDefaults(suiteName: <group>) {
    if let obj = userDefaults.objectForKey(key) {
        // magic
    }
}
It appears that the only limitation on passing data through the user defaults is the device's storage capacity, and since NSUserDefaults accepts NSData as a storage format, it makes a prime candidate for sharing tidbits of information.
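Since the asset in question is a sound file, a sketch of an alternative under the same App Group is the shared container ("group.com.example.shared" is a placeholder identifier):

// One app writes the sound file into the group container; the other reads it.
let fileManager = NSFileManager.defaultManager()
if let container = fileManager.containerURLForSecurityApplicationGroupIdentifier("group.com.example.shared") {
    let soundURL = container.URLByAppendingPathComponent("easterEgg.caf")
    // e.g. write the file here in app 1, then try AVAudioPlayer(contentsOfURL: soundURL) in app 2
}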
If both apps are yours, you can implement a custom URL scheme in the second app, and then from the first app ask whether it knows how to open a URL with that scheme. If the answer is yes, the app is installed. The method is canOpenURL(_:), an instance method of UIApplication.
I vaguely remember that in iOS 9 and later, Apple added a restriction that you have to declare the URL schemes you are going to ask about under the LSApplicationQueriesSchemes key in your Info.plist. That won't prevent this approach from working, but it is an extra step you have to take.
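A sketch of that check ("myotherapp" is a made-up scheme; on iOS 9 and later it must also be listed under LSApplicationQueriesSchemes in the asking app's Info.plist):

// Returns true if the second app (which registers the scheme) is installed.
func isCompanionAppInstalled() -> Bool {
    guard let url = NSURL(string: "myotherapp://") else { return false }
    return UIApplication.sharedApplication().canOpenURL(url)
}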
Apple will introduce Live Photo in iOS 9/iPhone 6s. Where is the file format documented?
A live photo has two resources. They are tied together with an asset identifier (a UUID as a string).
A JPEG; this must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple Maker Note Asset Identifier key).
A QuickTime MOV encoded with H.264 at the appropriate frame rate (12-15 fps) and size (1080p). This MOV must have:
A top-level QuickTime metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata).
A timed metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]. The actual still image time matches the presentation timestamp of this metadata item; the payload seems to be just a single 0xFF byte (aka -1) and can be ignored. If using an AVAssetReader, you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.
The assetIdentifier is what ties the two items together and the timed metadata track is what tells the system where the still image sits in the movie timeline.
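For illustration, a sketch of reading the pairing identifier from the MOV side (Swift 2-era API names, as mentioned above; the file path is a placeholder):

import AVFoundation

// movURL is a placeholder path to the Live Photo's video resource.
let movURL = NSURL(fileURLWithPath: "livePhoto.mov")
let asset = AVURLAsset(URL: movURL)
let metadata = asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata)
let pairingIdentifier = metadata.filter {
    ($0.key as? String) == "com.apple.quicktime.content.identifier"
}.first?.stringValue
// pairingIdentifier should match the asset identifier stored in the
// JPEG's Apple Maker Note under key 17.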
Here's the link. Otherwise, here's the text:
Live Photos

Live Photos is a new feature of iOS 9 that allows users to capture and relive their favorite moments with richer context than traditional photos. When the user presses the shutter button, the Camera app captures much more content along with the regular photo, including audio and additional frames before and after the photo. When browsing through these photos, users can interact with them and play back all the captured content, making the photos come to life.

iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. There is new support in the Photos framework to fetch a PHLivePhoto object from the PHImageManager object, which is used to represent all the data that comprises a Live Photo. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView view takes care of displaying the image, handling all user interaction, and applying the visual treatments to play back the content.

You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user's photo library by using PHImageManager or UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the set of files originally exported by the sender.

Guidelines for Displaying Live Photos

It's important to remember that a Live Photo is still a photo. If you have to display a Live Photo in an environment that doesn't support PHLivePhotoView, it's recommended that you present it as a regular photo.

Don't display the extra frames and audio of a Live Photo separately. It's important that the content of the Live Photo be presented in a consistent way that uses the same visual treatment and interaction model in all apps.

It's recommended that you identify a photo as a Live Photo by placing the badge provided by the PHLivePhotoView class method livePhotoBadgeImageWithOptions:PHLivePhotoBadgeOptionsOverContent in the top-left corner of the photo.

Note that there is no support for providing the visual effect that users experience as they swipe through photos in the Photos app.

Guidelines for Sharing Live Photos

The data of a Live Photo is exported as a set of files in a PHAssetResource object. The set of files must be preserved as a unit when you upload them to a server. When you rebuild a PHLivePhoto with these files on the receiver side, the files are validated; loading fails if the files don't come from the same asset.

If your app lets users apply effects or adjustments to a photo before sharing it, be sure to apply the same adjustments to all frames of the Live Photo. Alternatively, if you don't support adjusting the entire contents of a Live Photo, share it as a regular photo and show an appropriate indication to the user.

If your app has UI for picking photos to share, you should let users play back the entire contents so they know exactly what they are sharing. When selecting photos to share in your app, users should also be able to turn a Live Photo off, so they can post it as a traditional photo.
Outside of the documentation: Live Photos are made up of two resources, an image and a MOV (QuickTime movie file). So every Live Photo has two 'actual' files connected by the wrapper of the Live Photo type.
A Live Photo is actually two files: the original JPEG image and a full-HD video.
The Uniform Type Identifier (UTI) for the format is kUTTypeLivePhoto / com.apple.live-photo:
@available(OSX 10.12, *)
public let kUTTypeLivePhoto: CFString

/*
 *  kUTTypeLivePhoto
 *
 *  Live Photo
 *
 *  UTI: com.apple.live-photo
 */
Some additional info about Live Photos:
Agreed, it has a .mov file extension.
It is saved in the directory /var/mobile/Media/DCIM/100APPLE/ alongside the JPG version of the photo.
Live Photos can be played even on devices without 3D Touch (I can play them on my iPad 2017 by long-pressing the photo).
They can even be played on old phones (such as the iPhone 5), even on iOS 8, if you install the PhotosLive tweak.
My application only gets images and their metadata from the iOS device using ALAssetsLibrary.
When the application starts loading the image list, this iOS prompt is shown:
"Application" Would Like to Use Your Current Location.
This allows access to location information in photos and videos.
Is there a way to change the code so that this prompt is not shown (not by using Settings > General...)?
I think the user doesn't understand why the application asks about location while picking an image.
If you need the metadata info, using ALAssetsLibrary is your only option. Using ALAssetsLibrary means that the user must grant permission to location services. The simple reason is that a photo's metadata might contain location/GPS data.
Cheers,
Hendrik
UPDATE: It will not request the location permission on iOS 6; iOS 6 prompts for Photos access instead. You can check the client's [[UIDevice currentDevice] systemVersion] when using ALAssetsLibrary.