iOS - Fetch all photos from device and save them to app

I would like to fetch all photos that are saved on the device, save them to my app, and then eventually (if the user allows it) delete the originals.
This is the whole class I created for this task:
class ImageAssetsManager: NSObject {
    let imageManager = PHCachingImageManager()

    func fetchAllImages() {
        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.Image.rawValue)
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        if #available(iOS 9.0, *) {
            options.fetchLimit = 5
        } else {
            // Fallback on earlier versions
        }
        let imageAssets = PHAsset.fetchAssetsWithOptions(options)
        print(imageAssets.count)
        self.getAssets(imageAssets)
    }

    func getAssets(assets: PHFetchResult) {
        var assetsToDelete: [PHAsset] = []
        assets.enumerateObjectsUsingBlock { object, count, stop in
            if let asset = object as? PHAsset {
                let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
                let options = PHImageRequestOptions()
                options.deliveryMode = .FastFormat
                options.synchronous = true
                self.imageManager.requestImageForAsset(asset, targetSize: imageSize, contentMode: .AspectFill, options: options, resultHandler: { [weak self] image, info in
                    self?.addAssetToSync(image, info: info)
                    assetsToDelete.append(asset)
                })
            }
        }
        self.deleteAssets(assetsToDelete)
    }

    func addAssetToSync(image: UIImage?, info: [NSObject : AnyObject]?) {
        guard let image = image else {
            return
        }
        guard let info = info else {
            return
        }
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
            let imageData = UIImageJPEGRepresentation(image, 0.95)!
            let fileUrl = info["PHImageFileURLKey"] as! NSURL
            dispatch_async(dispatch_get_main_queue(), {
                let photoRootItem = DatabaseManager.sharedInstance.getPhotosRootItem()
                let ssid = DatabaseManager.sharedInstance.getSsidInfoByName(ContentManager.sharedInstance.ssid)
                let item = StorageManager.sharedInstance.createFile(imageData, name: fileUrl.absoluteString.fileNameWithoutPath(), parentFolder: photoRootItem!, ssid: ssid!)
            })
        })
    }

    func deleteAssets(assetsToDelete: [PHAsset]) {
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            PHAssetChangeRequest.deleteAssets(assetsToDelete)
        }, completionHandler: { success, error in
            guard let error = error else { return }
        })
    }
}
It's working, but my problem is that it only works for a limited number of photos. When I try it with all of them, I get memory warnings and then the app crashes. I know why: I am loading all of the photos into memory at once, and that's too much. I could fetch images with that fetch limit and loop, but I am not sure that's the best solution.
I was hoping for a solution that processes a few photos, releases the memory, and repeats until the end. That change would presumably be somewhere in enumerateObjectsUsingBlock. I'm not sure if it helps, but I don't even need the image itself; I just need to copy the image file from the device path to my app's sandbox path.
What's the best solution for this? How do I avoid memory warnings and leaks? Thanks

Change your dispatch_async calls to dispatch_sync. Then you will process photos one at a time as you walk through enumerateObjectsUsingBlock, instead of trying to process them all at the same time.
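For illustration, here is a rough sketch of that change applied to the question's addAssetToSync (same Swift 2 era GCD calls as the question; the DatabaseManager/StorageManager work is the asker's own and is only hinted at in the comments):
func addAssetToSync(image: UIImage?, info: [NSObject : AnyObject]?) {
    guard let image = image else { return }
    guard let info = info else { return }
    // dispatch_sync blocks enumerateObjectsUsingBlock until this photo has been
    // handled, so only one image is held in memory at a time.
    dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        let imageData = UIImageJPEGRepresentation(image, 0.95)!
        let fileUrl = info["PHImageFileURLKey"] as! NSURL
        // Write imageData to the app sandbox here (the StorageManager call from the
        // question). Keep any main-queue bookkeeping asynchronous: a nested
        // dispatch_sync onto the main queue would deadlock if this code is reached
        // from the main thread.
    }
}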

Related

Load Photo Library iOS 14 With Limited Access

I have an app that takes pictures that are stored in the photos library. I would like to be able to load just the images taken with the app into an in-app library. I have two functions that load the photos library and then request an individual image. The images are requested in a foreach loop. That works fine with full access. However, with limited access I get nothing. If I use the photo picker, I get the pictures that were selected.
My retrieval code is:
func loadLibrary() {
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
    assets = PHAsset.fetchAssets(with: fetchOptions)
}

func loadImage(_ asset: PHAsset) -> UIImage? {
    var image: UIImage? = nil
    let option = PHImageRequestOptions()
    option.isSynchronous = true
    option.isNetworkAccessAllowed = true
    option.resizeMode = .fast
    manager.requestImage(for: asset, targetSize: PHImageManagerMaximumSize, contentMode: .aspectFill, options: option) { img, err in
        guard let img = img else { return }
        image = img
    }
    return image
}
My saving code:
final class ImageSaver: NSObject, ObservableObject {
    public static let shared = ImageSaver()
    let objectDidChange = PassthroughSubject<Void, Never>()

    @Published var saved = false {
        didSet {
            if saved {
                objectDidChange.send()
            }
        }
    }

    func writeToPhotoAlbum(image: UIImage) {
        UIImageWriteToSavedPhotosAlbum(image, self, #selector(saveError), nil)
    }

    @objc func saveError(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
        if let error = error {
            print("Image not saved. Error: \(error)")
            self.saved = false
        } else {
            print("Save finished!")
            self.saved = true
        }
    }
}
In the WWDC 2020 session video "Handle the Limited Photos Library in Your App", it is stated that "When your app creates new assets they will automatically be included as part of the user's selection for the application." This is exactly the behavior I want, but it is not the behavior I am getting. Changing the privacy settings shows the fetch and load are working as expected.
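For comparison, here is one way to check at runtime which access level the app actually has on iOS 14 before fetching (a small sketch, not from the original post; the view controller parameter is just a placeholder for wherever the limited-library picker would be presented):
import Photos
import PhotosUI
import UIKit

func checkPhotoAccess(from viewController: UIViewController) {
    // .readWrite is needed to distinguish full access from limited access (iOS 14+).
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        switch status {
        case .authorized:
            print("Full library access: fetches return everything")
        case .limited:
            print("Limited access: fetches only return the user's current selection")
            DispatchQueue.main.async {
                // Optionally let the user extend the selection.
                PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)
            }
        default:
            print("No access")
        }
    }
}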

"Message from debugger: Terminated due to memory issue " with images

I am trying to fetch images from an album. This code fetches all images from a specific album, but on scrolling the app closes with the error "Message from debugger: Terminated due to memory issue". Please check the code and find the error. (I want to fetch all albums and images like the "Lalalab" app, without memory warnings.)
func fatchImagesfromAlbum() {
    DispatchQueue.global(qos: .background).async {
        self.photoAssets = self.fetchResult as! PHFetchResult<AnyObject>
        let fetchOptions = PHFetchOptions()
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        self.photoAssets = PHAsset.fetchAssets(in: self.assetCollection, options: fetchOptions) as! PHFetchResult<AnyObject>
        for i in 0..<self.photoAssets.count {
            let asset = self.photoAssets.object(at: i)
            let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
            let options = PHImageRequestOptions()
            options.deliveryMode = .fastFormat
            options.isSynchronous = true
            self.imageManager.requestImage(for: asset as! PHAsset, targetSize: imageSize, contentMode: .aspectFill, options: options, resultHandler: { (image, info) -> Void in
                self.images.append(image!)
                let url: NSURL = info!["PHImageFileURLKey"] as! NSURL
                let urlString: String = url.path!
                let theFileName = (urlString as NSString).lastPathComponent
                print("file name \(info!)")
                self.imageName.append("\(theFileName)")
                self.imagePath.append("\(urlString)")
            })
            print(self.imagePath)
            print(self.imageName)
            DispatchQueue.main.async { [unowned self] in
                self.collectionView.reloadData()
            }
        }
    }
    PHPhotoLibrary.shared().register(self)
    if fetchResult == nil {
        let allPhotosOptions = PHFetchOptions()
        allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        fetchResult = PHAsset.fetchAssets(with: allPhotosOptions)
    }
}
I faced the same issue; it has nothing to do with making references weak or unowned. When objects have been created by your Objective-C code or by Cocoa classes, what you should do is use an autoreleasepool. Try calling your method inside autoreleasepool:
autoreleasepool {
    fatchImagesfromAlbum()
}
Should it be fetchImagesfromAlbum instead of fatchImagesfromAlbum?
Citing from the Advanced Memory Management Programming Guide:
Autorelease pool blocks provide a mechanism whereby you can relinquish ownership of an object, but avoid the possibility of it being deallocated immediately (such as when you return an object from a method). Typically, you don’t need to create your own autorelease pool blocks, but there are some situations in which either you must or it is beneficial to do so.
However, it should not be necessary to execute the whole method inside the autoreleasepool; the memory issue is probably caused by calling fetchAssets or requestImage iteratively (inside the for loop), as in the sketch below.
Refers to: Is it necessary to use autoreleasepool in a Swift program?
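For example, a minimal sketch of wrapping only the per-asset work in an autoreleasepool (the property names are borrowed from the question's code):
for i in 0..<self.photoAssets.count {
    autoreleasepool {
        guard let asset = self.photoAssets.object(at: i) as? PHAsset else { return }
        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat
        options.isSynchronous = true
        self.imageManager.requestImage(for: asset,
                                       targetSize: CGSize(width: asset.pixelWidth, height: asset.pixelHeight),
                                       contentMode: .aspectFill,
                                       options: options) { image, info in
            // Collect whatever you need from `image` here; temporaries created in
            // this iteration are drained when the autoreleasepool block ends.
        }
    }
}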
You need to create a dispatch group and then enter and leave the group on each pass through the loop. After the loop has finished iterating, you can call notify on the dispatch group and reload your UI; see the sketch below.
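A rough sketch of that idea, again reusing the question's property names (the exact wiring is an assumption, not the asker's code):
let group = DispatchGroup()
let options = PHImageRequestOptions()
options.deliveryMode = .fastFormat
for i in 0..<self.photoAssets.count {
    guard let asset = self.photoAssets.object(at: i) as? PHAsset else { continue }
    group.enter()
    self.imageManager.requestImage(for: asset,
                                   targetSize: CGSize(width: asset.pixelWidth, height: asset.pixelHeight),
                                   contentMode: .aspectFill,
                                   options: options) { image, info in
        // Store the result, then balance the enter() call.
        group.leave()
    }
}
// Runs on the main queue once every leave() has matched an enter().
group.notify(queue: .main) {
    self.collectionView.reloadData()
}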

How to fetch all albums with images from iPhone without taking time in Swift?

I tried to fetch all images from an album. It fetches all images with their URLs and image data, but it does not load the images directly from the given URL path, so I need to save the images into the Documents directory first and then get the path, which takes too much time. I use the code below. I want to fetch images the way the iPhone Photos library does.
Please find the error.
func fatchImagesfromAlbum() {
    DispatchQueue.global(qos: .background).async {
        self.photoAssets = self.fetchResult as! PHFetchResult<AnyObject>
        let fetchOptions = PHFetchOptions()
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        self.photoAssets = PHAsset.fetchAssets(in: self.assetCollection, options: fetchOptions) as! PHFetchResult<AnyObject>
        for i in 0..<self.photoAssets.count {
            autoreleasepool {
                let asset = self.photoAssets.object(at: i)
                let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
                let options = PHImageRequestOptions()
                options.deliveryMode = .fastFormat
                options.isSynchronous = true
                options.isNetworkAccessAllowed = true
                self.imageManager.requestImage(for: asset as! PHAsset, targetSize: imageSize, contentMode: .aspectFill, options: options, resultHandler: { (image, info) -> Void in
                    if image != nil {
                        let image1 = image as! UIImage
                        let imageUrl = info!["PHImageFileURLKey"] as? NSURL
                        let imageName = imageUrl?.lastPathComponent
                        let urlString: String = imageUrl!.path!
                        let theFileName = (urlString as NSString).lastPathComponent
                        self.imageName.append("\(theFileName)")
                        self.imagePath.append("\(urlString)")
                        let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first!
                        let photoURL = NSURL(fileURLWithPath: documentDirectory)
                        let localPath = photoURL.appendingPathComponent(imageName!)
                        DispatchQueue.global(qos: .background).async {
                            if !FileManager.default.fileExists(atPath: localPath!.path) {
                                do {
                                    try UIImageJPEGRepresentation(image1, 0.1)?.write(to: localPath!)
                                    print("file saved")
                                } catch {
                                    print("error saving file")
                                }
                            } else {
                                print("file already exists")
                            }
                        }
                    }
                })
                DispatchQueue.main.async {
                    self.collectionView.reloadData()
                }
            }
        }
        self.hudHide()
    }
    PHPhotoLibrary.shared().register(self)
    if fetchResult == nil {
        let allPhotosOptions = PHFetchOptions()
        allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        fetchResult = PHAsset.fetchAssets(with: allPhotosOptions)
    }
}
I recommend simply using UIImagePickerController or, if your app requires multiple image selection, a third-party library like DKImagePickerController. As another user already mentioned in the comments, these only copy the image(s) the user selected into your app's directory, which saves processing time.
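For reference, a minimal UIImagePickerController setup might look roughly like this (a sketch using the same string-keyed info dictionary as the other answers in this thread, not the asker's code):
import UIKit

class PickerViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func pickPhoto() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
        // Only the selected image is handed to the app; no library-wide fetch is needed.
        if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
            print("picked image of size \(image.size)") // replace with the actual save into the app's directory
        }
        picker.dismiss(animated: true)
    }
}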

How to retrieve PHAsset from UIImagePickerController

I'm trying to retrieve a PHAsset; however, PHAsset.fetchAssets(withALAssetURLs:options:) is deprecated from iOS 8, so how can I properly retrieve a PHAsset?
I had the same issue. First, check permissions and request access:
let status = PHPhotoLibrary.authorizationStatus()
if status == .notDetermined {
    PHPhotoLibrary.requestAuthorization({ status in
    })
}
Just hook that up to whatever triggers your UIImagePickerController. The delegate call should now include the PHAsset in the userInfo.
guard let asset = info[UIImagePickerControllerPHAsset] as? PHAsset else { return }
Here is my solution:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if #available(iOS 11.0, *) {
        let asset = info[UIImagePickerControllerPHAsset]
    } else {
        if let assetURL = info[UIImagePickerControllerReferenceURL] as? URL {
            let result = PHAsset.fetchAssets(withALAssetURLs: [assetURL], options: nil)
            let asset = result.firstObject
        }
    }
}
The PHAsset will not appear in the didFinishPickingMediaWithInfo info dictionary unless the user has authorized photo library access, which did not happen for me just by presenting the picker. I added this in the Coordinator's init():
let status = PHPhotoLibrary.authorizationStatus()
if status == .notDetermined {
    PHPhotoLibrary.requestAuthorization({ status in
    })
}
I am not sure what you want.
Are you trying to target iOS 8?
This is how I fetch photos; it works on iOS (8.0 and later), macOS (10.11 and later), and tvOS (10.0 and later).
The code is commented where it may be confusing.
The first function sets the options for fetching the photos; the second function actually fetches them.
// import the Photos framework
import Photos

// in these arrays I store my images and assets
var images = [UIImage]()
var assets = [PHAsset]()

fileprivate func setPhotoOptions() -> PHFetchOptions {
    let fetchOptions = PHFetchOptions()
    fetchOptions.fetchLimit = 15
    let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
    fetchOptions.sortDescriptors = [sortDescriptor]
    return fetchOptions
}

fileprivate func fetchPhotos() {
    let allPhotos = PHAsset.fetchAssets(with: .image, options: setPhotoOptions())
    DispatchQueue.global(qos: .background).async {
        allPhotos.enumerateObjects({ (asset, count, stop) in
            let imageManager = PHImageManager.default()
            let targetSize = CGSize(width: 200, height: 200)
            let options = PHImageRequestOptions()
            options.isSynchronous = true
            imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
                if let image = image {
                    self.images.append(image)
                    self.assets.append(asset)
                }
                if count == allPhotos.count - 1 {
                    DispatchQueue.main.async {
                        // basically, here you can do what you want
                        // (after you finish retrieving your assets)
                        // I am reloading my collection view
                        self.collectionView?.reloadData()
                    }
                }
            })
        })
    }
}
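One possible call site (an assumption, not part of the original answer): request authorization once, for example in viewDidLoad, and then start the fetch.
PHPhotoLibrary.requestAuthorization { status in
    if status == .authorized {
        self.fetchPhotos()
    }
}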
Edit based on OP's clarification
You need to set the delegate and conform to UIImagePickerControllerDelegate, then implement the following function:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
Within that method, get the image like this:
let image = info[UIImagePickerControllerEditedImage] as! UIImage

Swift memory will not release

I have a class that gets all of the user's assets, gets the image for each, and then performs face detection on each of the images. I am calling my class from the viewDidLoad function like this:
override func viewDidLoad() {
    _ = autoreleasepool {
        return faceProcessing()
    }
}
And this is the class:
class faceProcessing {
    let manager = PHImageManager.default()
    let option = PHImageRequestOptions()

    init() {
        let realm = try! Realm()
        let assets = getAssets(realm: realm)
        option.isSynchronous = true
        option.deliveryMode = .fastFormat
        option.isNetworkAccessAllowed = true
        var featuresCount = 0
        for assetIndex in 0...assets.count-1 {
            featuresCount += autoreleasepool { () -> Int? in
                print(assetIndex)
                var facesCount = 0
                faceDetection(asset: assets[assetIndex]) { (features) in
                    facesCount = features
                }
                return facesCount
            }!
        }
        print(featuresCount)
    }

    func getAssets(realm: Realm) -> [PHAsset] {
        return autoreleasepool { () -> [PHAsset] in
            let images = realm.objects(Image.self).filter("asset != nil")
            let ids: [String] = images.map { $0.id }
            let options = PHFetchOptions()
            options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
            let assets = PHAsset.fetchAssets(withLocalIdentifiers: ids, options: options)
            var assetArray = [PHAsset]()
            if assets.count > 1 {
                for assetIndex in 0...assets.count-1 {
                    assetArray.append(assets[assetIndex])
                }
            }
            return assetArray
        }
    }

    func faceDetection(asset: PHAsset, completionHandler: @escaping (Int) -> Void) {
        manager.requestImageData(for: asset, options: option) { (data, responseString, imageOrient, info) in
            if data != nil {
                weak var faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
                let faces = faceDetector?.features(in: CIImage(data: data!)!)
                faceDetector = nil
                completionHandler((faces?.count)!)
            } else {
                print(info)
            }
        }
    }
}
The problem is that after the process of getting the images and detecting the faces finishes, the memory is not released. Before the process the memory footprint was about 60 MB; during the process it was between 400 MB and 500 MB; and when the process finished it only went back down to 300 MB. Is there a way to release all of it and get back to 60 MB after the process finishes?
Here is how it looks in Instruments:
It looks like the face detection is not releasing its memory, so is there a way I can release the CIDetector from memory?
