I'm facing a problem: users are reporting that the application crashes when loading the photo library into a collectionView. Most of the time they have more than 2-3 thousand photos. For me it works fine (I have fewer than 1000 photos in my library).
Question: is there a smarter way to display images in a collectionView that uses less memory?
P.S. Could having all photos stored in iCloud also cause a crash? Until now I thought the application does not download photos when loading them into a cell. Maybe someone can confirm or disprove that.
Here's my function:
func grabPhotos() {
    let imgManager = PHImageManager.default()

    let requestOptions = PHImageRequestOptions()
    requestOptions.isSynchronous = false
    requestOptions.deliveryMode = .opportunistic // Quality of images
    requestOptions.isNetworkAccessAllowed = true
    requestOptions.isSynchronous = true
    requestOptions.progressHandler = { (progress, error, stop, info) in
        print("progress: \(progress)")
    }

    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

    if let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .image, options: fetchOptions) {
        if fetchResult.count > 0 {
            for i in 0..<fetchResult.count {
                imgManager.requestImage(for: fetchResult.object(at: i), targetSize: CGSize(width: 150, height: 150), contentMode: .aspectFill, options: requestOptions, resultHandler: { image, error in
                    self.imageArray.append(image!)
                    self.kolekcija.reloadData()
                })
            }
        }
        else {
            print("You got no photos")
            kolekcija.reloadData()
        }
    }
}
The problem is this line:
self.imageArray.append(image!)
Never make an array of images. Images are huge, and you'll run out of memory.
This is a collection view. You simply supply the image on demand in collectionView(_:cellForItemAt:), and what you supply is a small version of the image (a thumbnail), the smallest that works for the display size. That way there are, at any time, only enough thumbnails in memory to fill the visible screen, which is not very much, and the runtime can release an image's memory when it scrolls off the screen.
That is what PHCachingImageManager is for. Look closely at how Apple handles this in their sample code:
https://developer.apple.com/library/archive/samplecode/UsingPhotosFramework/Listings/Shared_AssetGridViewController_swift.html
There is no array of images anywhere in that example.
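Here is a minimal sketch of that on-demand pattern, assuming a simple grid controller; the GridCell class, the "GridCell" reuse identifier, and the 150-point thumbnail size are illustrative, not taken from the original answer or Apple's sample:
import UIKit
import Photos

final class GridCell: UICollectionViewCell {
    let imageView = UIImageView()
    var representedAssetIdentifier: String?

    override init(frame: CGRect) {
        super.init(frame: frame)
        imageView.contentMode = .scaleAspectFill
        imageView.clipsToBounds = true
        contentView.addSubview(imageView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        imageView.frame = contentView.bounds
    }
}

final class PhotoGridViewController: UICollectionViewController {
    // Hold only the fetch result and a caching manager; no array of UIImages.
    var fetchResult: PHFetchResult<PHAsset>!
    let imageManager = PHCachingImageManager()
    let thumbnailSize = CGSize(width: 150, height: 150)

    override func viewDidLoad() {
        super.viewDidLoad()
        collectionView.register(GridCell.self, forCellWithReuseIdentifier: "GridCell")
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchResult = PHAsset.fetchAssets(with: .image, options: options)
    }

    override func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return fetchResult.count
    }

    override func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "GridCell", for: indexPath) as! GridCell
        let asset = fetchResult.object(at: indexPath.item)
        // Remember which asset this cell shows, so a late callback for a reused cell can't overwrite it.
        cell.representedAssetIdentifier = asset.localIdentifier
        imageManager.requestImage(for: asset, targetSize: thumbnailSize, contentMode: .aspectFill, options: nil) { image, _ in
            if cell.representedAssetIdentifier == asset.localIdentifier {
                cell.imageView.image = image
            }
        }
        return cell
    }
}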
Working in Swift 5. Get the photo assets, but don't load the photos until cellForItemAt is called. Happy coding.
var personalImages: PHFetchResult<PHAsset>!

func getPersonalPhotos() {
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    self.personalImages = PHAsset.fetchAssets(with: .image, options: fetchOptions)
    photoLibraryCollectionView.reloadData()
}
func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
    return personalImages.count
}
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "insertCellIdentifierHere", for: indexPath) as! InsertACollectionViewCellHere
    let asset = self.personalImages.object(at: indexPath.item)

    let itemWidth = photoLibraryCollectionView.frame.size.width / 2
    let itemSize = CGSize(width: itemWidth, height: itemWidth / 2) // adjust to your preferences

    PHImageManager.default().requestImage(for: asset, targetSize: itemSize, contentMode: .aspectFill, options: nil, resultHandler: { (result, info) in
        if result != nil {
            cell.photoImageView.image = result
        }
    })

    cell.layoutIfNeeded()
    cell.updateConstraintsIfNeeded()
    return cell
}
Related
I want to perform an operation on a video in the iphone camera roll, and I require an absolute URI as I will be using ffmpeg natively within my app.
Is it possible to operate on the video in place? or would I need to copy the video to a tmp dir, operate on it, and then write back to the camera roll?
I've read some docs and tutorials, and I'm answering below based on that research.
Is it possible to operate on the video in place?
Yes (by copying it to a temp dir) and No (not at the original location where the video is actually stored).
Take a look at this quote from the official docs:
Using PhotoKit, you can fetch and cache assets for display and
playback, edit image and video content or manage collections of
assets such as albums, Moments, and Shared Albums.
We don't have direct access to the location where the image/video is stored; instead we get raw data or a representation via PHAsset, and asset objects are immutable, so we can't perform operations on them directly. We need a PHAssetChangeRequest to create, delete, change the metadata of, or edit the content of a Photos asset.
would I need to copy the video to a temp dir, operate on it, and then write back to the camera roll?
Yep, that's the way to go.
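For the plain copy, edit, write-back route, here is a minimal sketch under the assumption that you have already exported the edited movie to a temporary file URL; the function name and the optional deletion of the original asset are illustrative:
import Photos

/// Saves an edited video file back to the photo library and optionally removes the original asset.
/// `editedFileURL` is assumed to point at a movie you exported to the temp directory.
func saveEditedVideo(at editedFileURL: URL, replacing original: PHAsset?, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges({
        // Create a brand-new asset from the exported file.
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: editedFileURL)
        // Optionally delete the source asset so only the edited copy remains.
        if let original = original {
            PHAssetChangeRequest.deleteAssets([original] as NSArray)
        }
    }, completionHandler: completion)
}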
If you have already fetched the assets and have the PHFetchResult object, try:
var video = PHAsset() // the video to be edited
…
if video.canPerform(.content) { // check if the selected PHAsset can be edited
    video.requestContentEditingInput(with: nil, completionHandler: { editingInput, _ in
        let videoAsset = editingInput?.audiovisualAsset // get tracks and metadata of the video and start editing
        let videoURL = (videoAsset as? AVURLAsset)?.url // this might be nil, so better to work with videoAsset

        /*
         Start editing your video here
         */

        guard let input = editingInput else { return }
        let output = PHContentEditingOutput(contentEditingInput: input)
        let outputURL = output.renderedContentURL // URL at which you write/export the edited video; it must be a .mov file

        let editedVideo = NSData() // suppose your video file name is editedVideo.mov; NSData is used since the final edited object type is unknown here
        editedVideo.write(to: outputURL, atomically: false)

        PHPhotoLibrary.shared().performChanges({
            let changeRequest = PHAssetChangeRequest(for: video)
            changeRequest.contentEditingOutput = output
        })
    })
}
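One detail the snippet above glosses over: a PHContentEditingOutput is expected to carry adjustmentData describing the edit when the change is committed, otherwise the performChanges call can fail. A hedged addition, to be set on output before committing the change; the format identifier, version, and payload below are placeholders:
// Assumed placeholders: use your own identifier/version and encode whatever
// describes your edit so it can be reapplied later.
output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.myapp.videoedit",
                                         formatVersion: "1.0",
                                         data: Data("edit-parameters".utf8))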
Or, if you're using the default image picker, you can get the tmp video URL using:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    let videoURL = info[UIImagePickerController.InfoKey.mediaURL] as! URL
    print(videoURL) // file is already in the tmp folder

    let video = info[UIImagePickerController.InfoKey.phAsset] as! PHAsset
    // implement the above code using this PHAsset
    // you will still need to request photo library changes, and save the edited video and/or delete the older one
}
I implemented something like this in my project; I hope it will help you.
I show all the items in a collection view and perform an action on selection; you can also get the URL of the selected video.
func getVideoFromCameraRoll() {
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.video.rawValue)
    videos = PHAsset.fetchAssets(with: options)
    videoLibraryCV.reloadData()
}
func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
    return videos.count
}

func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "Cell", for: indexPath)
    let asset = videos!.object(at: indexPath.row)
    let width: CGFloat = 150
    let height: CGFloat = 150
    let size = CGSize(width: width, height: height)

    cell.layer.borderWidth = 0.5
    cell.layer.borderColor = UIColor.lightGray.cgColor

    PHImageManager.default().requestImage(for: asset, targetSize: size, contentMode: PHImageContentMode.aspectFit, options: nil) { (image, userInfo) -> Void in
        let imageView = cell.viewWithTag(1) as! UIImageView
        imageView.image = image

        let labelView = cell.viewWithTag(2) as! UILabel
        labelView.text = String(format: "%02d:%02d", Int(asset.duration / 60), Int(asset.duration) % 60)
    }

    return cell
}
func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    let asset = videos!.object(at: indexPath.row)
    guard asset.mediaType == PHAssetMediaType.video else {
        print("Not a valid video media type")
        return
    }

    PHCachingImageManager().requestAVAsset(forVideo: asset, options: nil, resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [AnyHashable: Any]?) in
        let asset = asset as! AVURLAsset
        print(asset.url) // Here is the video URL
    })
}
I hope it will work for you ...:)
I've tried to fetch photos from the library. It works, but I only got 3 of the 9 photos in the library. Here's my code:
let options = PHFetchOptions()
let userAlbums = PHAssetCollection.fetchAssetCollections(with: PHAssetCollectionType.album, subtype: PHAssetCollectionSubtype.any, options: options)
let userPhotos = PHAsset.fetchKeyAssets(in: userAlbums.firstObject!, options: nil)
let imageManager = PHCachingImageManager()

userPhotos?.enumerateObjects({ (object: AnyObject!, count: Int, stop: UnsafeMutablePointer<ObjCBool>) in
    if object is PHAsset {
        let obj: PHAsset = object as! PHAsset

        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)

        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat
        options.isSynchronous = true

        imageManager.requestImage(for: obj, targetSize: CGSize(width: obj.pixelWidth, height: obj.pixelHeight), contentMode: .aspectFill, options: options, resultHandler: { img, info in
            self.images.append(img!)
        })
    }
})
When I checked images.count, it said 3. Can anyone help me find my mistake and get all the photos? Big thanks!
Try this. Note that PHAsset.fetchKeyAssets returns only an album's key assets, not its entire contents, which is likely why you only get a few images.
First, import Photos:
import Photos
Then declare a property to hold the fetch result, before viewDidLoad():
var allPhotos : PHFetchResult<PHAsset>? = nil
Now write the code to fetch photos in viewDidLoad():
/// Load Photos
PHPhotoLibrary.requestAuthorization { (status) in
    switch status {
    case .authorized:
        print("Good to proceed")
        let fetchOptions = PHFetchOptions()
        self.allPhotos = PHAsset.fetchAssets(with: .image, options: fetchOptions)
    case .denied, .restricted:
        print("Not allowed")
    case .notDetermined:
        print("Not determined yet")
    }
}
Now use this code to display an image from the fetch result:
/// Display Photo
let asset = allPhotos?.object(at: indexPath.row)
self.imageview.fetchImage(asset: asset!, contentMode: .aspectFit, targetSize: self.imageview.frame.size)
// Or Display image in Collection View cell
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = self.collectionView.dequeueReusableCell(withReuseIdentifier: "photoCell", for: indexPath) as! SelectPhotoCell
    let asset = allPhotos?.object(at: indexPath.row)
    cell.imgPicture.fetchImage(asset: asset!, contentMode: .aspectFit, targetSize: cell.imgPicture.frame.size)
    return cell
}
extension UIImageView {
    func fetchImage(asset: PHAsset, contentMode: PHImageContentMode, targetSize: CGSize) {
        let options = PHImageRequestOptions()
        options.version = .original
        PHImageManager.default().requestImage(for: asset, targetSize: targetSize, contentMode: contentMode, options: options) { image, _ in
            guard let image = image else { return }
            switch contentMode {
            case .aspectFill:
                self.contentMode = .scaleAspectFill
            case .aspectFit:
                self.contentMode = .scaleAspectFit
            default:
                break
            }
            self.image = image
        }
    }
}
Swift 5 - Update
First import Photos
import Photos
Create a variable to hold all the images
var images = [UIImage]()
Main function to grab the assets and request an image for each asset:
fileprivate func getPhotos() {
    let manager = PHImageManager.default()
    let requestOptions = PHImageRequestOptions()
    requestOptions.isSynchronous = false
    requestOptions.deliveryMode = .highQualityFormat
    // .highQualityFormat will return better quality photos

    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

    let results: PHFetchResult = PHAsset.fetchAssets(with: .image, options: fetchOptions)
    if results.count > 0 {
        for i in 0..<results.count {
            let asset = results.object(at: i)
            let size = CGSize(width: 700, height: 700)
            manager.requestImage(for: asset, targetSize: size, contentMode: .aspectFill, options: requestOptions) { (image, _) in
                if let image = image {
                    self.images.append(image)
                    self.collectionView.reloadData()
                } else {
                    print("error asset to image")
                }
            }
        }
    } else {
        print("no photos to display")
    }
}
First make sure your Info.plist contains the photo library usage description key (NSPhotoLibraryUsageDescription) so your app can ask for access, then use the code below to access all photos:
PHPhotoLibrary.requestAuthorization { (status) in
    switch status {
    case .authorized:
        print("You Are Authorized To Access")
        let fetchOptions = PHFetchOptions()
        let allPhotos = PHAsset.fetchAssets(with: .image, options: fetchOptions)
        print("Found number of: \(allPhotos.count) images")
    case .denied, .restricted:
        print("Not allowed")
    case .notDetermined:
        print("Not determined yet")
    }
}
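If you target iOS 14 or later, note that there is also a .limited status and a newer, access-level-based request API; a hedged sketch:
import Photos

if #available(iOS 14, *) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        switch status {
        case .authorized, .limited:
            // .limited means the user granted access to only a subset of the library.
            let allPhotos = PHAsset.fetchAssets(with: .image, options: PHFetchOptions())
            print("Found \(allPhotos.count) accessible images")
        case .denied, .restricted, .notDetermined:
            print("Not allowed")
        @unknown default:
            break
        }
    }
}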
I am trying to get all the photos from the camera roll using the Photos framework, but it's taking a lot of time to fetch them all.
Is there any way to add pagination to it, so I can fetch while scrolling?
var images = [UIImage]()
var assets = [PHAsset]()

fileprivate func assetsFetchOptions() -> PHFetchOptions {
    let fetchOptions = PHFetchOptions()
    //fetchOptions.fetchLimit = 40 //uncomment to limit photos
    let sortDescriptor = NSSortDescriptor(key: "creationDate", ascending: false)
    fetchOptions.sortDescriptors = [sortDescriptor]
    return fetchOptions
}

fileprivate func fetchPhotos() {
    let allPhotos = PHAsset.fetchAssets(with: .image, options: assetsFetchOptions())

    DispatchQueue.global(qos: .background).async {
        allPhotos.enumerateObjects({ (asset, count, stop) in
            //print(count)
            let imageManager = PHImageManager.default()
            let targetSize = CGSize(width: 200, height: 200)
            let options = PHImageRequestOptions()
            options.isSynchronous = true
            imageManager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: options, resultHandler: { (image, info) in
                if let image = image {
                    self.images.append(image)
                    self.assets.append(asset)
                }
                if count == allPhotos.count - 1 {
                    DispatchQueue.main.async {
                        self.collectionView?.reloadData()
                    }
                }
            })
        })
    }
}
allPhotos is of type PHFetchResult<PHAsset>, which is a lazy collection: it doesn't actually go out and get a photo until you ask for one, which is what .enumerateObjects is doing. You can grab the assets one at a time with the subscript operator, or get a range of objects with objects(at:), to page through the collection as needed.
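In practice that means you can keep just the PHFetchResult and request each thumbnail when its cell is configured, so there is effectively nothing to pre-load or paginate. A minimal sketch, assuming a PhotoCell cell class with an imageView (both names are illustrative):
// Keep only the fetch result; request each thumbnail when its cell is configured.
var allPhotos: PHFetchResult<PHAsset>!

func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "Cell", for: indexPath) as! PhotoCell
    let asset = allPhotos.object(at: indexPath.item) // the fetch result loads this asset lazily
    PHImageManager.default().requestImage(for: asset, targetSize: CGSize(width: 200, height: 200), contentMode: .aspectFit, options: nil) { image, _ in
        cell.imageView.image = image
    }
    return cell
}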
I'm relatively new to Swift.
I'm currently trying to grab videos stored in the photo library and display them in a collection view. After selecting a video in the collection view, I want to be able to play the video.
Right now I've written part of the function and I have 2 questions:
1. How should I store these videos? Can they be stored as UIImages? A lot of the other sources I found grabbed videos from online sources and just stored the video URL.
2. What should I do in the resultHandler? I assume that's where I store my videos into a global array.
Note: code below is in a function called getVideos()
let imgManager = PHImageManager.default()

let requestOption = PHVideoRequestOptions()
requestOption.isSynchronous = true
requestOption.deliveryMode = .highQualityFormat

let fetchOption = PHFetchOptions()
fetchOption.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

if let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .video, options: fetchOption) {
    if fetchResult.count > 0 {
        for i in 0...fetchResult.count {
            imgManager.requestAVAsset(forVideo: fetchResult.object(at: i) as! PHAsset, options: requestOption, resultHandler: { (<#AVAsset?#>, <#AVAudioMix?#>, <#[AnyHashable : Any]?#>) in
                <#code#>
            })
        }
    } else {
        print("Error: No Videos Found")
    }
}
First, add a variable to your view controller: var fetchResults: PHFetchResult<PHAsset>?
Then execute the fetch, in viewDidLoad for instance:
let fetchOption = PHFetchOptions()
fetchOption.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let fetchResults = PHAsset.fetchAssets(with: .video, options: fetchOption)
self.fetchResults = fetchResults
if fetchResults.count == 0 {
    print("Error: No Videos Found")
    return
}
In your collection view cell you have to add a UIImageView so that you can show a thumbnail in each cell; then just do the following in the collection view data source methods:
func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
    return fetchResults?.count ?? 0
}

func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "reuse", for: indexPath) as! TestCollectionViewCell
    let videoAsset = fetchResults!.object(at: indexPath.item)
    PHImageManager.default().requestImage(for: videoAsset, targetSize: cell.bounds.size, contentMode: .aspectFill, options: nil) { (image: UIImage?, info: [AnyHashable: Any]?) in
        cell.imageView.image = image
    }
    return cell
}
This can be improved upon, but it will be OK for a first try; specifically, look at using PHCachingImageManager instead of PHImageManager. More on this in the links below:
PHCachingImageManager
How to use PHCachingImageManager
How to play a video from a PHAsset is answered here:
Swift - Playing Videos from iOS PHAsset
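For completeness, here is a hedged sketch of playing the selected asset with AVPlayerViewController, using the fetchResults property from above; the network-access option is an assumption to cover iCloud-only videos:
import AVKit
import Photos

func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    let asset = fetchResults!.object(at: indexPath.item)
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true // allow downloading iCloud-only videos (assumption)
    PHImageManager.default().requestPlayerItem(forVideo: asset, options: options) { playerItem, _ in
        DispatchQueue.main.async {
            guard let playerItem = playerItem else { return }
            let playerController = AVPlayerViewController()
            playerController.player = AVPlayer(playerItem: playerItem)
            self.present(playerController, animated: true) {
                playerController.player?.play()
            }
        }
    }
}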
I have an array of image assets. I have to turn those assets into images, add them to an array and upload them to Firebase Database. I have 2 issues with this.
Issue 1:
In a custom UICollectionViewCell I display all the images the user has selected; I see 4 images in the cell when I've selected 4 images from Photos (I'm using a custom framework). But when I call the requestImage method, I get double the number of images in the array that's supposed to hold each asset from the asset array converted into a UIImage, called assetsTurnedIntoImages. I read that this is related to PHImageRequestOptions and whether its isSynchronous property is true or false, or whether PHImageRequestOptions is nil. Obviously I missed something, because my code still doesn't work.
Issue 2:
As you can see from the code below, the targetSize gives me a roughly thumbnail-sized image. When I upload the image to storage, I don't need a thumbnail, I need its original size. If I set it to PHImageManagerMaximumSize I get an error:
"Connection to assetsd was interrupted or assetsd died"
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoPostCVCell", for: indexPath) as! PhotoPostCVCell

    if let takenImage = cameraPhotoUIImage {
        cell.cellImage.image = takenImage
    }

    if assets.count > 0 {
        let asset = assets[indexPath.row]
        let requestOptions = PHImageRequestOptions()
        requestOptions.isSynchronous = true // synchronous works better when grabbing all images
        requestOptions.deliveryMode = .opportunistic

        imageManager.requestImage(for: asset, targetSize: CGSize(width: 100, height: 100), contentMode: .aspectFill, options: requestOptions) { (image, _) in
            DispatchQueue.main.async {
                print("WE ARE IN")
                cell.cellImage.image = image!
                self.assetsTurnedIntoImages.append(image!)
            }
        }
    }
    return cell
}
Changing the option could solve the problem:
options.deliveryMode = .highQualityFormat
I found that solution in the source code:
@available(iOS 8, iOS 8, *)
public enum PHImageRequestOptionsDeliveryMode : Int {

    @available(iOS 8, *)
    case opportunistic = 0 // client may get several image results when the call is asynchronous or will get one result when the call is synchronous

    @available(iOS 8, *)
    case highQualityFormat = 1 // client will get one result only and it will be as asked or better than asked

    @available(iOS 8, *)
    case fastFormat = 2 // client will get one result only and it may be degraded
}
To avoid the completion handler being called twice, add an option to this request to make it synchronous:
let options = PHImageRequestOptions()
options.isSynchronous = true

let asset: PHAsset = self.photoAsset?[indexPath.item] as! PHAsset
PHImageManager.default().requestImage(for: asset, targetSize: CGSize(width: 1200, height: 1200), contentMode: .aspectFit, options: options, resultHandler: { (result, info) in
    if result != nil {
        // do your work here
    }
})
To avoid a crash when loading the image, you should compress the image or reduce its size before working with it further.
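For example, instead of PHImageManagerMaximumSize you can request an image bounded to roughly the size you actually need for upload; the 1920-point cap and JPEG quality below are arbitrary assumptions, and asset stands for the PHAsset being uploaded:
let uploadOptions = PHImageRequestOptions()
uploadOptions.isSynchronous = true
uploadOptions.deliveryMode = .highQualityFormat
uploadOptions.resizeMode = .exact           // scale to the requested target size
uploadOptions.isNetworkAccessAllowed = true // fetch from iCloud if needed

// Cap the longer edge at ~1920 px rather than requesting the full-resolution original.
let maxDimension: CGFloat = 1920
let targetSize = CGSize(width: maxDimension, height: maxDimension)

PHImageManager.default().requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: uploadOptions) { image, _ in
    guard let image = image, let data = image.jpegData(compressionQuality: 0.8) else { return }
    // `data` is what you would upload; compressionQuality is another knob for payload size.
    print("Upload payload: \(data.count) bytes")
}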