When I try to fetch 10 images from my gallery, it takes very long until I get them.
Why is the image fetching asynchronous?
If I change requestOptions.deliveryMode to .fastFormat it gets fast, but I lose a lot of quality.
What is the best I can do?
func fetchPhotos() {
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    fetchOptions.fetchLimit = 10
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.image, options: fetchOptions)
    if fetchResult.count > 0 {
        let totalImageCountNeeded = 10 // <-- The number of images to fetch
        fetchPhotoAtIndex(0, totalImageCountNeeded, fetchResult)
    }
}

func fetchPhotoAtIndex(_ index: Int, _ totalImageCountNeeded: Int, _ fetchResult: PHFetchResult<PHAsset>) {
    let requestOptions = PHImageRequestOptions()
    requestOptions.deliveryMode = .highQualityFormat
    PHImageManager.default().requestImage(for: fetchResult.object(at: index) as PHAsset, targetSize: view.frame.size, contentMode: PHImageContentMode.aspectFill, options: requestOptions, resultHandler: { (image, _) in
        if let image = image {
            // Add the returned image to your array
            self.images += [image]
        }
        if index + 1 < fetchResult.count && self.images.count < totalImageCountNeeded {
            self.fetchPhotoAtIndex(index + 1, totalImageCountNeeded, fetchResult)
        } else {
            print("Completed array: \(self.images)")
            self.collectionView.reloadData()
        }
    })
}
I just tested your code and it took about a second to load 50 images; check whether there is something else in your code that could be slowing this process down.
If your intention is to have all the images loaded as soon as you present the view controller, consider loading the images on the previous screen.
The request is async because it may need to use the network (for example, when the photo has to be downloaded from iCloud). Async just means the work takes time and is performed outside of the synchronous execution of the application's threads; synchronous would mean the code executes in step with the application thread it runs on, so that thread would have to wait for the request to finish.
You'll have a tradeoff between quality and speed. The better the quality of the image, the larger it is, and therefore the longer it takes to load.
I'd suggest you download smaller chunks of photos, or rethink your UI so that fewer images are shown at any one time. Instead of 10 you could show 3 images and preload the next 3 with pagination, as in the sketch below.
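A minimal sketch of that pagination idea (the class name, pageSize, and the onImage callback are hypothetical; only the Photos calls come from the original code):

import Photos
import UIKit

// Hypothetical paging helper: fetches the next small batch of thumbnails on demand
// instead of requesting all images up front.
final class PagedPhotoLoader {
    private let pageSize = 3
    private var nextIndex = 0
    private let imageManager = PHImageManager.default()

    // Requests the next `pageSize` images and hands each one back on the main queue.
    func fetchNextPage(from fetchResult: PHFetchResult<PHAsset>,
                       targetSize: CGSize,
                       onImage: @escaping (UIImage) -> Void) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat
        options.isNetworkAccessAllowed = true   // allow iCloud downloads if needed

        let upperBound = min(nextIndex + pageSize, fetchResult.count)
        guard nextIndex < upperBound else { return }

        for index in nextIndex..<upperBound {
            let asset = fetchResult.object(at: index)
            imageManager.requestImage(for: asset,
                                      targetSize: targetSize,
                                      contentMode: .aspectFill,
                                      options: options) { image, _ in
                guard let image = image else { return }
                DispatchQueue.main.async { onImage(image) }
            }
        }
        nextIndex = upperBound
    }
}

Call fetchNextPage again (for example from willDisplay cell or scrollViewDidScroll) when the user nears the end of the currently loaded batch.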
Related
I am trying to fetch images from an album. This code fetches all images from a specific album, but on scrolling the app terminates with the error "Message from debugger: Terminated due to memory issue". Please check the code and find the error. (I want to fetch all albums and images like the "Lalalab" app, without memory warnings.)
func fatchImagesfromAlbum() {
    DispatchQueue.global(qos: .background).async {
        self.photoAssets = self.fetchResult as! PHFetchResult<AnyObject>
        let fetchOptions = PHFetchOptions()
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        self.photoAssets = PHAsset.fetchAssets(in: self.assetCollection, options: fetchOptions) as! PHFetchResult<AnyObject>

        for i in 0..<self.photoAssets.count {
            let asset = self.photoAssets.object(at: i)
            let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
            let options = PHImageRequestOptions()
            options.deliveryMode = .fastFormat
            options.isSynchronous = true
            self.imageManager.requestImage(for: asset as! PHAsset, targetSize: imageSize, contentMode: .aspectFill, options: options, resultHandler: { (image, info) -> Void in
                self.images.append(image!)
                let url: NSURL = info!["PHImageFileURLKey"] as! NSURL
                let urlString: String = url.path!
                let theFileName = (urlString as NSString).lastPathComponent
                print("file name\(info!)")
                self.imageName.append("\(theFileName)")
                self.imagePath.append("\(urlString)")
            })
            print(self.imagePath)
            print(self.imageName)
            DispatchQueue.main.async { [unowned self] in
                self.collectionView.reloadData()
            }
        }
    }
    PHPhotoLibrary.shared().register(self)
    if fetchResult == nil {
        let allPhotosOptions = PHFetchOptions()
        allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        fetchResult = PHAsset.fetchAssets(with: allPhotosOptions)
    }
}
I faced such an issue; it has nothing to do with making weak or unowned references. When objects have been created by your Objective-C code or by Cocoa classes, what you should do is deal with an autorelease pool: try to call your method inside autoreleasepool:
autoreleasepool {
    fatchImagesfromAlbum()
}
Should it be fetchImagesfromAlbum instead of fatchImagesfromAlbum?
Citing from the Advanced Memory Management Programming Guide:
Autorelease pool blocks provide a mechanism whereby you can relinquish ownership of an object, but avoid the possibility of it being deallocated immediately (such as when you return an object from a method). Typically, you don't need to create your own autorelease pool blocks, but there are some situations in which either you must or it is beneficial to do so.
However, it should not be required to execute the whole method in the autoreleasepool; the memory issue is probably caused by executing fetchAssets or requestImage iteratively (inside the for loop), so draining a pool once per iteration, as sketched below, is usually enough.
Refers to: Is it necessary to use autoreleasepool in a Swift program?
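A minimal sketch of that per-iteration pool, reusing the loop from the question (the surrounding properties photoAssets, imageManager, and images are assumed to exist as in the original code):

for i in 0..<self.photoAssets.count {
    autoreleasepool {
        let asset = self.photoAssets.object(at: i) as! PHAsset
        let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat
        options.isSynchronous = true
        self.imageManager.requestImage(for: asset, targetSize: imageSize, contentMode: .aspectFill, options: options) { image, _ in
            if let image = image {
                self.images.append(image)
            }
        }
        // Temporary objects created by the synchronous request are released here,
        // at the end of each pool block, instead of accumulating across the whole loop.
    }
}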
You need to create a dispatch group, then enter the group whenever you start a request inside the loop and leave it when the request finishes. After the loop has finished iterating you can use the group's notify and reload your UI, as sketched below.
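A minimal sketch of that pattern (photoAssets, imageManager, images, and collectionView are assumed to exist as in the question's code; the delivery mode should make the handler fire only once so enter/leave stay balanced):

let group = DispatchGroup()

for i in 0..<self.photoAssets.count {
    let asset = self.photoAssets.object(at: i) as! PHAsset
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat   // one callback per request keeps enter/leave balanced
    group.enter()                               // enter before starting each request
    self.imageManager.requestImage(for: asset, targetSize: CGSize(width: asset.pixelWidth, height: asset.pixelHeight), contentMode: .aspectFill, options: options) { image, _ in
        if let image = image {
            DispatchQueue.main.async { self.images.append(image) }   // serialize appends on the main queue
        }
        group.leave()                           // leave when this request has delivered its result
    }
}

// Reload the UI once, after every request has reported back.
group.notify(queue: .main) {
    self.collectionView.reloadData()
}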
I'm building a gallery app like the standard iOS Photos app (Swift 4.1).
I want to fetch the thumbnails, titles, and total number of images that I see when I launch the standard Photos app.
The Photos framework seems more complex than I thought, and it is not easy to find an explanation of how it works or what steps to take.
Can you tell me about this?
The minimum number of steps to achieve what you are asking is:
// import the framework
import Photos
// get the albums list
let albumList = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .albumRegular, options: nil)
// you can access the number of albums with
albumList.count
// individual objects with
let album = albumList.object(at: 0)
// eg. get the name of the album
album.localizedTitle
// get the assets in a collection
func getAssets(fromCollection collection: PHAssetCollection) -> PHFetchResult<PHAsset> {
    let photosOptions = PHFetchOptions()
    photosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
    photosOptions.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.image.rawValue)
    return PHAsset.fetchAssets(in: collection, options: photosOptions)
}

// eg.
albumList.enumerateObjects { (coll, _, _) in
    let result = self.getAssets(fromCollection: coll)
    print("\(coll.localizedTitle): \(result.count)")
}
// Now you can:
// access the count of assets in the PHFetchResult
result.count
// get an asset (eg. in a UITableView)
let asset = result.object(at: indexPath.row)
// get the "real" image
PHCachingImageManager.default().requestImage(for: asset, targetSize: CGSize(width: 200, height: 200), contentMode: .aspectFill, options: nil) { (image, _) in
    // do something with the image
}
I also suggest taking a look at the Apple sample code for the Photos framework; it is not hard to follow, together with the Photos framework documentation.
I have an array of image assets. I have to turn those assets into images, add them to an array, and upload them to Firebase Database. I have two issues with this.
Issue 1:
In a custom UICollectionViewCell I display all the images that the user has selected; I see 4 images in the cell when I've selected 4 images from Photos (I'm using a custom framework). Now, when I call the requestImage method, the array that is supposed to hold each asset from the asset array converted into a UIImage, assetsTurnedIntoImages, ends up with double the number of images. I read that this is related to PHImageRequestOptions, whether its isSynchronous property is true or false, and whether PHImageRequestOptions is nil. Obviously I'm missing something, because my code still doesn't work.
Issue 2:
As you can see from the code below, the targetSize gives me roughly a thumbnail-sized image. When I upload the image to storage, I don't need a thumbnail, I need its original size. If I set it to PHImageManagerMaximumSize I get an error:
"Connection to assetsd was interrupted or assetsd died"
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoPostCVCell", for: indexPath) as! PhotoPostCVCell

    if let takenImage = cameraPhotoUIImage {
        cell.cellImage.image = takenImage
    }

    if assets.count > 0 {
        let asset = assets[indexPath.row]
        let requestOptions = PHImageRequestOptions()
        requestOptions.isSynchronous = true // synchronous works better when grabbing all images
        requestOptions.deliveryMode = .opportunistic
        imageManager.requestImage(for: asset, targetSize: CGSize(width: 100, height: 100), contentMode: .aspectFill, options: requestOptions) { (image, _) in
            DispatchQueue.main.async {
                print("WE ARE IN")
                cell.cellImage.image = image!
                self.assetsTurnedIntoImages.append(image!)
            }
        }
    }
    return cell
}
Changing the delivery mode option can solve the problem:
options.deliveryMode = .highQualityFormat
I found that solution in the source code:
@available(iOS 8, iOS 8, *)
public enum PHImageRequestOptionsDeliveryMode : Int {

    @available(iOS 8, *)
    case opportunistic = 0 // client may get several image results when the call is asynchronous or will get one result when the call is synchronous

    @available(iOS 8, *)
    case highQualityFormat = 1 // client will get one result only and it will be as asked or better than asked

    @available(iOS 8, *)
    case fastFormat = 2 // client will get one result only and it may be degraded
}
To avoid the completion handler being called twice, just add an option to the request to make it synchronous:
let options = PHImageRequestOptions()
options.isSynchronous = true

let asset: PHAsset = self.photoAsset?[indexPath.item] as! PHAsset
PHImageManager.default().requestImage(for: asset, targetSize: CGSize(width: 1200, height: 1200), contentMode: .aspectFit, options: options, resultHandler: { (result, info) in
    if result != nil {
        // do your work here
    }
})
To avoid a crash while loading images, you should compress the image or reduce its size before doing further work, as in the sketch below.
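A minimal sketch of that idea (the 1200-point target size and 0.8 JPEG quality are assumptions, and asset is expected to come from your data source) that requests a bounded size and compresses before uploading:

let options = PHImageRequestOptions()
options.isSynchronous = true
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

// Ask Photos for a bounded size instead of PHImageManagerMaximumSize to keep memory in check.
let uploadTargetSize = CGSize(width: 1200, height: 1200)

PHImageManager.default().requestImage(for: asset, targetSize: uploadTargetSize, contentMode: .aspectFit, options: options) { image, _ in
    guard let image = image,
          let jpegData = UIImageJPEGRepresentation(image, 0.8) else { return }
    // Hand jpegData to your uploader here instead of the full-resolution original.
    print("uploading \(jpegData.count) bytes")
}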
I want to list all photos from "My Photo Stream"; here is my code:
private func fetchAssetCollection() {
    let result = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .albumMyPhotoStream, options: nil)
    result.enumerateObjects({ (collection, index, stop) in
        if let albumName = collection.localizedTitle {
            print("Album => \(collection.localIdentifier), \(collection.estimatedAssetCount), \(albumName) ")
        }
        let assResult = PHAsset.fetchAssets(in: collection, options: nil)
        let options = PHImageRequestOptions()
        options.resizeMode = .exact
        let scale = UIScreen.main.scale
        let dimension = CGFloat(78.0)
        let size = CGSize(width: dimension * scale, height: dimension * scale)
        assResult.enumerateObjects({ (asset, index, stop) in
            print("index \(index)")
            PHImageManager.default().requestImage(for: asset, targetSize: size, contentMode: .aspectFill, options: options) { (image, info) in
                if let name = asset.originalFilename {
                    print("photo \(name) \(index) \(asset.localIdentifier)")
                }
            }
        })
    })
}
extension PHAsset {
    var originalFilename: String? {
        var fname: String?
        if #available(iOS 9.0, *) {
            let resources = PHAssetResource.assetResources(for: self)
            if let resource = resources.first {
                fname = resource.originalFilename
            }
        }
        if fname == nil {
            // this is an undocumented workaround that works as of iOS 9.1
            fname = self.value(forKey: "filename") as? String
        }
        return fname
    }
}
It works, but the problem is that it prints duplicate records.
It prints 329*2 records, but I actually have 329 photos in my "My Photo Stream".
photo IMG_0035.JPG 10 0671E1F3-CB7C-459E-8111-FCB381175F29/L0/001
photo IMG_0035.JPG 10 0671E1F3-CB7C-459E-8111-FCB381175F29/L0/001
......
From the documentation for PHImageManager requestImage:
By default, this method executes asynchronously. If you call it from a background thread you may change the isSynchronous property of the options parameter to true to block the calling thread until either the requested image is ready or an error occurs, at which time Photos calls your result handler.
For an asynchronous request, Photos may call your result handler block more than once. Photos first calls the block to provide a low-quality image suitable for displaying temporarily while it prepares a high-quality image. (If low-quality image data is immediately available, the first call may occur before the method returns.) When the high-quality image is ready, Photos calls your result handler again to provide it. If the image manager has already cached the requested image at full quality, Photos calls your result handler only once. The PHImageResultIsDegradedKey key in the result handler’s info parameter indicates when Photos is providing a temporary low-quality image.
So either make the request synchronous, or check the PHImageResultIsDegradedKey value in the info dictionary to see whether this instance of the image is the one you actually want to keep or ignore, as in the sketch below.
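A minimal sketch of the second approach, dropped into the inner enumeration from the question (asset, size, and options as defined there):

PHImageManager.default().requestImage(for: asset, targetSize: size, contentMode: .aspectFill, options: options) { (image, info) in
    // Photos may call this handler first with a temporary, degraded image.
    let isDegraded = (info?[PHImageResultIsDegradedKey] as? Bool) ?? false
    if isDegraded {
        return  // skip the low-quality pass; wait for the final callback
    }
    if let name = asset.originalFilename {
        print("photo \(name) \(asset.localIdentifier)")  // now printed exactly once per asset
    }
}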
I would like to fetch all photos that are saved on the device, save them to my app, and then eventually (if the user allows it) delete the originals.
This is the whole class I created for this task:
class ImageAssetsManager: NSObject {
    let imageManager = PHCachingImageManager()

    func fetchAllImages() {
        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.Image.rawValue)
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        if #available(iOS 9.0, *) {
            options.fetchLimit = 5
        } else {
            // Fallback on earlier versions
        }
        let imageAssets = PHAsset.fetchAssetsWithOptions(options)
        print(imageAssets.count)
        self.getAssets(imageAssets)
    }

    func getAssets(assets: PHFetchResult) {
        var assetsToDelete: [PHAsset] = []
        assets.enumerateObjectsUsingBlock { (object, count, stop) in
            if object is PHAsset {
                let asset = object as! PHAsset
                let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
                let options = PHImageRequestOptions()
                options.deliveryMode = .FastFormat
                options.synchronous = true
                self.imageManager.requestImageForAsset(asset, targetSize: imageSize, contentMode: .AspectFill, options: options, resultHandler: { [weak self] image, info in
                    self?.addAssetToSync(image, info: info)
                    assetsToDelete.append(asset)
                })
            }
        }
        self.deleteAssets(assetsToDelete)
    }

    func addAssetToSync(image: UIImage?, info: [NSObject : AnyObject]?) {
        guard let image = image else {
            return
        }
        guard let info = info else {
            return
        }
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
            let imageData = UIImageJPEGRepresentation(image, 0.95)!
            let fileUrl = info["PHImageFileURLKey"] as! NSURL
            dispatch_async(dispatch_get_main_queue(), {
                let photoRootItem = DatabaseManager.sharedInstance.getPhotosRootItem()
                let ssid = DatabaseManager.sharedInstance.getSsidInfoByName(ContentManager.sharedInstance.ssid)
                let item = StorageManager.sharedInstance.createFile(imageData, name: fileUrl.absoluteString.fileNameWithoutPath(), parentFolder: photoRootItem!, ssid: ssid!)
            })
        })
    }

    func deleteAssets(assetsToDelete: [PHAsset]) {
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            PHAssetChangeRequest.deleteAssets(assetsToDelete)
        }, completionHandler: { success, error in
            guard let error = error else { return }
        })
    }
}
It's working, but my problem is that it only works for a limited number of photos. When I try it with all of them I get memory warnings and then the app crashes. I know why: I load all the photos into memory, and that's too much. I could fetch images with that fetch limit and loop, but I am not sure that is the best solution.
I was hoping for a solution that processes a few photos, releases the memory, and repeats until the end. That change would be somewhere in enumerateObjectsUsingBlock. I am not sure if it helps, but I don't even need the image itself; I just need to copy the image file from the device path to my app's sandbox path.
What's the best solution for this? How can I avoid the memory warnings and leaks? Thanks.
Change your dispatch_async calls to dispatch_sync. Then you will process photos one at a time as you walk through enumerateObjectsUsingBlock, instead of trying to process them all at the same time; a sketch of that change follows.
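A rough sketch of that change in addAssetToSync, keeping the question's Swift 2 style (the hop back to the main queue is left asynchronous here, which is an assumption on my part, to avoid deadlocking if this is ever called from the main thread):

func addAssetToSync(image: UIImage?, info: [NSObject : AnyObject]?) {
    guard let image = image else { return }
    guard let info = info else { return }

    // dispatch_sync blocks until this photo's JPEG data has been handled,
    // so only one image at a time is held in memory during the enumeration.
    dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
        let imageData = UIImageJPEGRepresentation(image, 0.95)!
        let fileUrl = info["PHImageFileURLKey"] as! NSURL
        dispatch_async(dispatch_get_main_queue(), {
            let photoRootItem = DatabaseManager.sharedInstance.getPhotosRootItem()
            let ssid = DatabaseManager.sharedInstance.getSsidInfoByName(ContentManager.sharedInstance.ssid)
            StorageManager.sharedInstance.createFile(imageData, name: fileUrl.absoluteString.fileNameWithoutPath(), parentFolder: photoRootItem!, ssid: ssid!)
        })
    })
}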