Handling multiple images in Swift - iOS

I'm trying to develop an iOS app where users can pick images from the Camera Roll, Facebook, Instagram, etc., and edit them later (crop, filters, rotate).
Right now I'm having problems loading multiple images with the Photos framework: I fetch PHAssets, convert them to full-quality images, and show them in the next view.
When I load the full-quality images in SelectedImagesView, I get a memory warning and the app crashes, because I'm keeping all the images in memory.
How can I store a list of selected images (original and edited) and show them to the user as in the "SelectedImagesView" screenshot?
1) PickImagesView
2) SelectedImagesView
So far, this is how I get the images using the Photos framework:
1) Load the PHAssets from the selected album into a CollectionView.
2) When the user selects an image, store the PHAsset identifier in NSUserDefaults.
3) When the user finishes selecting images, load all selected identifiers and send them to the next view.
4) Load the images from the Photos library in SelectedImagesView with the following code:
func processImages() {
    for identifier in self.selectedAssets {
        let options = PHImageRequestOptions()
        options.deliveryMode = .FastFormat
        // request images no bigger than 1/3 the screen width
        let maxDimension = UIScreen.mainScreen().bounds.width / 3 * UIScreen.mainScreen().scale
        let size = CGSize(width: maxDimension, height: maxDimension)
        let assets = PHAsset.fetchAssetsWithLocalIdentifiers([identifier], options: nil)
        guard let asset = assets.firstObject as? PHAsset
            else { fatalError("no asset") }
        PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: PHImageManagerMaximumSize, contentMode: .AspectFill, options: options) { result, info in
            // probably some of this code is unnecessary, too,
            // but I'm not sure what you're doing here so leaving it alone
            let photo: Photo = Photo()
            photo.originalPhoto = result
            self.selectedImages.append(photo)
        }
    }
    self.tableView!.reloadData()
    self.tableView!.reloadInputViews()
}
But the memory cost is too high.
Thanks in advance.
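A note on the snippet above: the computed size is never used; the request passes PHImageManagerMaximumSize, so every selected photo is decoded at full resolution, which is exactly what exhausts memory. Below is a minimal sketch of the same method with a bounded target size, in current Swift syntax (Photo, selectedAssets, selectedImages, and tableView are the question's own names; everything else is standard Photos API):

import Photos
import UIKit

// Sketch: request a bounded thumbnail instead of PHImageManagerMaximumSize.
func processImages() {
    let scale = UIScreen.main.scale
    let maxDimension = UIScreen.main.bounds.width / 3 * scale
    let targetSize = CGSize(width: maxDimension, height: maxDimension)

    let options = PHImageRequestOptions()
    options.deliveryMode = .opportunistic
    options.isNetworkAccessAllowed = true

    let assets = PHAsset.fetchAssets(withLocalIdentifiers: selectedAssets, options: nil)
    assets.enumerateObjects { asset, _, _ in
        PHImageManager.default().requestImage(for: asset,
                                              targetSize: targetSize,   // bounded, not maximum
                                              contentMode: .aspectFill,
                                              options: options) { result, _ in
            guard let result = result else { return }
            let photo = Photo()
            photo.originalPhoto = result   // keep only the thumbnail in memory
            self.selectedImages.append(photo)
            self.tableView?.reloadData()   // delivery is asynchronous, so reload here
        }
    }
}

The full-quality original can then be requested for a single asset only when the user actually starts editing it.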

Related

Get video metadata, how to detect whether the video is taken from front- or back-camera

I want to know whether a video was taken with the front or the back camera. The info dictionary returns nil.
let asset: PHAsset!
let manager = PHImageManager.default()
if asset.mediaType == .video {
    let options = PHVideoRequestOptions()
    options.version = .original
    options.isNetworkAccessAllowed = true
    manager.requestAVAsset(forVideo: asset, options: options) { asset, audioMix, info in
        // info comes back nil here
    }
}
Surprisingly, you are able to use PHImageManager.requestImageDataAndOrientation for VIDEO assets.
Tested with Xcode 12.5 on an iPhone SE 2020 (iOS 14.6).
Observations:
let imageManager = PHImageManager.default()
imageManager.requestImageDataAndOrientation(for: videoPHAsset, options: nil) { data, dataUTI, orientation, info in
    if let data = data, let ciImage = CIImage(data: data) {
        print(ciImage.properties)
    }
}
Upon serializing ciImage.properties into JSON, I got the following results.
1. Video (back camera)
{"ColorModel":"RGB","PixelHeight":1920,"{Exif}":{"ColorSpace":1,"PixelXDimension":1080,"ExifVersion":[2,2,1],"FlashPixVersion":[1,0],"PixelYDimension":1920,"SceneCaptureType":0,"ComponentsConfiguration":[1,2,3,0]},"ProfileName":"HDTV","DPIHeight":72,"PixelWidth":1080,"{TIFF}":{"YResolution":72,"ResolutionUnit":2,"XResolution":72},"{JFIF}":{"DensityUnit":0,"YDensity":72,"JFIFVersion":[1,0,1],"XDensity":72},"Depth":8,"DPIWidth":72}
2. Video (front camera)
{"Depth":8,"{TIFF}":{"YResolution":72,"ResolutionUnit":2,"XResolution":72},"PixelHeight":1334,"{JFIF}":{"DensityUnit":0,"YDensity":72,"JFIFVersion":[1,0,1],"XDensity":72},"ProfileName":"sRGB IEC61966-2.1","PixelWidth":750,"ColorModel":"RGB","DPIHeight":72,"DPIWidth":72,"{Exif}":{"ColorSpace":1,"PixelXDimension":750,"ExifVersion":[2,2,1],"FlashPixVersion":[1,0],"PixelYDimension":1334,"SceneCaptureType":0,"ComponentsConfiguration":[1,2,3,0]}}
3. Screenshot
{"ColorModel":"RGB","{Exif}":{"PixelXDimension":750,"PixelYDimension":1334,"DateTimeOriginal":"2021:01:21 14:25:56","UserComment":"Screenshot"},"{PNG}":{"InterlaceType":0},"HasAlpha":true,"Depth":16,"{TIFF}":{"Orientation":1},"PixelHeight":1334,"ProfileName":"Display P3","PixelWidth":750,"Orientation":1}
4. Photo (has a TON of information and JSON serialization crashes, so I extracted only the "{Exif}" part)
{"ExifVersion":[2,3,1],"Flash":24,"LensModel":"iPhone SE (2nd generation) back camera 3.99mm f\\/1.8","OffsetTimeDigitized":"+05:30","SubsecTimeOriginal":"630","LensSpecification":[3.990000009536743,3.990000009536743,1.7999999523162842,1.7999999523162842],"ExposureMode":0,"CompositeImage":2,"LensMake":"Apple","FNumber":1.8,"OffsetTimeOriginal":"+05:30","PixelYDimension":3024,"ApertureValue":1.6959938128383605,"ExposureBiasValue":0,"MeteringMode":5,"ISOSpeedRatings":[400],"ShutterSpeedValue":4.6443251405083465,"SceneCaptureType":0,"FocalLength":3.99,"DateTimeOriginal":"2021:01:21 20:47:05","SceneType":1,"FlashPixVersion":[1,0],"ColorSpace":65535,"SubjectArea":[2013,1511,2217,1330],"PixelXDimension":4032,"FocalLenIn35mmFilm":28,"SubsecTimeDigitized":"630","OffsetTime":"+05:30","SensingMethod":2,"BrightnessValue":0.06030004492448258,"DateTimeDigitized":"2021:01:21 20:47:05","ComponentsConfiguration":[1,2,3,0],"WhiteBalance":0,"ExposureTime":0.04,"ExposureProgram":2}
The videos have "ExifVersion":[2,2,1] while the photo has "ExifVersion":[2,3,1]. The videos' EXIF doesn't provide any useful information about the camera at all, while the photo's EXIF does. All of the videos and photos were captured on the same phone.
At this point, I'm still clueless whether this information is even encoded into the video at all.
This may help you. If you fetch PHAssets filtered by PHAssetCollectionSubtype.smartAlbumSelfPortraits, those assets seem to be videos taken with the front camera.
let collection = PHAssetCollection.fetchAssetCollections(with: PHAssetCollectionType.smartAlbum,
                                                         subtype: PHAssetCollectionSubtype.smartAlbumSelfPortraits,
                                                         options: nil)
Even if a video doesn't contain faces, it is categorized as a selfie on iOS for now. I tested with an iPhone 12 mini on iOS 15.
Apple's documentation also says:
A smart album that groups all photos and videos captured using the
device’s front-facing camera.
https://developer.apple.com/documentation/photokit/phassetcollectionsubtype/smartalbumselfportraits
So maybe you can prefetch the selfie album's assets, then check whether an asset is contained in that album to detect that it was taken with the front camera.
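A minimal sketch of that containment check (the helper name is hypothetical; the Photos calls are standard):

import Photos

// Fetch the Selfies smart album once and collect the local identifiers of its assets.
func frontCameraAssetIdentifiers() -> Set<String> {
    var identifiers = Set<String>()
    let collections = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                              subtype: .smartAlbumSelfPortraits,
                                                              options: nil)
    collections.enumerateObjects { collection, _, _ in
        let assets = PHAsset.fetchAssets(in: collection, options: nil)
        assets.enumerateObjects { asset, _, _ in
            identifiers.insert(asset.localIdentifier)
        }
    }
    return identifiers
}

// Usage: videoAsset is some PHAsset whose mediaType is .video.
// let isFrontCamera = frontCameraAssetIdentifiers().contains(videoAsset.localIdentifier)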

How to get all images from the phone in iOS

I want to get all images from the phone, in order to build multiple selection in a photos library view. I am using the code below to get all the images. It works, but when the phone has 1000 images it takes a long time to fetch and show them all. How can I reduce the time it takes to get all images from the phone?
// get all images from photos
func getAllImagesFromPhotos() -> [Photos]? {
    let imgManager = PHImageManager.default()
    let requestOptions = PHImageRequestOptions()
    requestOptions.isSynchronous = true
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
    let fetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.image, options: fetchOptions)
    var allImages = [Photos]()
    for index in 0..<fetchResult.count {
        let asset = fetchResult.object(at: index) as PHAsset
        imgManager.requestImage(for: asset, targetSize: CGSize(width: 200, height: 200), contentMode: .aspectFill, options: requestOptions, resultHandler: { (uiimage, info) in
            if let image = uiimage {
                allImages.append(Photos(image: image, selected: false))
            }
        })
    }
    return allImages
}
To solve your problem you want to batch the image loading: load 10 images at a time and display them; when the user scrolls a bit more, load the next batch, and so on (see the sketch below).
If you load all images at once, there will of course be an unavoidable performance hit.
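A rough sketch of that batching idea (the class and property names are made up for illustration; only the Photos calls are real API):

import Photos
import UIKit

// Loads thumbnails in pages of `batchSize`; call loadNextBatch when the user
// scrolls near the end of what has been loaded so far.
final class BatchedPhotoLoader {
    private let batchSize = 10
    private let imageManager = PHImageManager.default()
    private let fetchResult: PHFetchResult<PHAsset>
    private var loadedCount = 0

    init(fetchResult: PHFetchResult<PHAsset>) {
        self.fetchResult = fetchResult
    }

    func loadNextBatch(targetSize: CGSize, onImage: @escaping (UIImage?) -> Void) {
        let upperBound = min(loadedCount + batchSize, fetchResult.count)
        guard loadedCount < upperBound else { return }   // nothing left to load
        for index in loadedCount..<upperBound {
            let asset = fetchResult.object(at: index)
            imageManager.requestImage(for: asset,
                                      targetSize: targetSize,
                                      contentMode: .aspectFill,
                                      options: nil) { image, _ in
                onImage(image)
            }
        }
        loadedCount = upperBound
    }
}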
What you could use instead is the UIImagePickerController class from UIKit; see Apple's documentation on UIImagePickerController.
If you don't want to use UIImagePickerController, then you have to load a smaller set of images at a time and allow the user to "load more" by scrolling further in the selection view.
EDIT: Multi-image selection
After reading your comment about wanting to select multiple images: UIImagePickerController is made for selecting a single image. However, you can simply not dismiss it upon selection, and instead show a toast message saying "Image selected" when the user selects an image, allowing them to select another one, and so on. If the user selects the same image again, you can show a toast saying "Image removed", since you can check whether the image has already been selected.
If you don't want to implement that workaround, your only choice is to batch-load images in smaller chunks of around 10 images or so, as I recommended in the original answer.
However, if you really don't want to implement that yourself, I recommend using a third-party library, something like ELCImagePickerController; there are hundreds of other libraries that do just that.
Hope this helps, good luck

iOS - fast loading images/videos from the phone with Swift (like Instagram does)

I am building an app where you can pick images from your phone or take new ones with the camera. This question is about picking existing images. At first I used a UIImagePickerController, where a new controller pops up and I can select the image I want. Later I changed the design approach and embedded all the phone's pictures in the main screen (the one that shows when you enter the app), where you can select an image by just tapping on it (so no new controller pops up). And there is the problem: reading the pictures and loading them into my collection view takes far too much time (reading the pictures is the main issue). I have over 1000 pictures on my phone, and it takes about 7 seconds for the app to read 100 pictures (then it crashes; not sure why, maybe a memory issue, I didn't look further into it because my main problem was the reading performance).
For reading I am using the Photos framework; this is the code:
let allPhotosResult = PHAsset.fetchAssets(with: PHAssetMediaType.image, options: nil)
let imageManager = PHCachingImageManager()
allPhotosResult.enumerateObjects({ (object: AnyObject!,
                                    count: Int,
                                    stop: UnsafeMutablePointer<ObjCBool>) in
    if object is PHAsset {
        let imgAsset = object as! PHAsset
        countI += 1   // countI is declared elsewhere
        if countI > 100 {
            return
        }
        let imageSize = CGSize(width: imgAsset.pixelWidth,
                               height: imgAsset.pixelHeight)
        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat
        options.isSynchronous = true
        imageManager.requestImage(for: imgAsset,
                                  targetSize: imageSize,
                                  contentMode: .aspectFill,
                                  options: options,
                                  resultHandler: { (image, info) -> Void in
            self._images.append(image!)
        })
    }
})
I quit at 100 images because then they are being displayed, but again, that's not the issue; I think I will be able to solve that.
So, is there a faster way to read all the images and load them as thumbnails into my collection view? I didn't own any of the popular photo apps, but I downloaded some of them to see how they perform, and Instagram is pretty impressive: all the images are there without any delay, which means there must be a way to get them all pretty fast. Maybe they fetch them in the background even when the app is not running?
Regards
P.S. - ah yes, I know I can do the reading asynchronously, but that doesn't solve my problem because the delay will still be there.
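For what it's worth, the dominant cost in the snippet above is requesting every image synchronously at its full pixel dimensions (imgAsset.pixelWidth × imgAsset.pixelHeight). Requesting a small, fixed thumbnail size asynchronously keeps both time and memory bounded; a minimal sketch in current Swift (names assumed):

import Photos
import UIKit

// Sketch: fetch thumbnails at a fixed small size instead of full pixel dimensions.
let allPhotos = PHAsset.fetchAssets(with: .image, options: nil)
let imageManager = PHCachingImageManager()
let thumbnailSize = CGSize(width: 200, height: 200)   // small and fixed

let requestOptions = PHImageRequestOptions()
requestOptions.deliveryMode = .fastFormat
// isSynchronous stays false (the default), so the main thread is never blocked.

allPhotos.enumerateObjects { asset, index, _ in
    imageManager.requestImage(for: asset,
                              targetSize: thumbnailSize,
                              contentMode: .aspectFill,
                              options: requestOptions) { image, _ in
        // Deliveries arrive asynchronously; update the cell at `index` here.
        print("thumbnail \(index): \(image?.size ?? .zero)")
    }
}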

Retrieve Multiple Images Using PHAsset

I'm trying to retrieve over 1,000 images from the user's camera roll with PHAsset, but it ends up crashing, or taking a long time even when it's just thumbnails. Here is the function where I retrieve the images:
func retrieveImages(thumbnail: Bool) {
    /* Retrieve the items in order of modification date, descending */
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "modificationDate",
                                                ascending: false)]
    /* Then get an object of type PHFetchResult that will contain
       all our image assets */
    let assetResults = PHAsset.fetchAssetsWithMediaType(.Image,
                                                        options: options)
    let imageManager = PHCachingImageManager()
    assetResults.enumerateObjectsUsingBlock { (object: AnyObject!,
                                               count: Int,
                                               stop: UnsafeMutablePointer<ObjCBool>) in
        if object is PHAsset {
            let asset = object as! PHAsset
            print("Inside If object is PHAsset, This is number 1")
            var imageSize: CGSize!
            if thumbnail == true {
                imageSize = CGSize(width: 100, height: 100)
            } else {
                imageSize = CGSize(width: self.cameraView.bounds.width, height: self.cameraView.bounds.height)
            }
            /* For faster performance, and maybe degraded image */
            let options = PHImageRequestOptions()
            options.deliveryMode = .FastFormat
            options.synchronous = true
            imageManager.requestImageForAsset(asset,
                                              targetSize: imageSize,
                                              contentMode: .AspectFill,
                                              options: options,
                                              resultHandler: { (image, _: [NSObject: AnyObject]?) -> Void in
                if thumbnail == true {
                    self.libraryImageThumbnails.append(image!)
                    self.collectionTable.reloadData()
                } else {
                    self.libraryImages.append(image!)
                    self.collectionTable.reloadData()
                }
            })
            /* The image is now available to us */
            print("enum for image, This is number 2")
            print("Inside If object is PHAsset, This is number 3")
        }
        print("Outside If object is PHAsset, This is number 4")
    }
}
Please tell me if any more information is needed. Thank you!
Generally you don't want to be loading tons of full-size or screen-sized images and keeping them in memory yourself. For that size, you're only going to be presenting one (or two or three, if you want to preload for screen-transition animations) at a time, so you don't need to fetch more than that at once.
Similarly, loading hundreds or thousands of thumbnails and caching them yourself will cause you memory issues. (Telling your collection view to reloadData over and over again causes a lot of churn, too.) Instead, let the Photos framework manage them — it has features for making sure that thumbnails are generated only when needed and cached between uses, and even for helping you with tasks like grabbing only the thumbnails you need as the user scrolls in a collection view.
Apple's Using Photos Framework sample code project shows several best practices for fetching images. Here's a summary of some of the major points:
1) AssetGridViewController, which manages a collection view of thumbnails, caches a PHFetchResult<PHAsset> that's given to it upon initialization. Info about the fetch result (like its count) is all that's needed for the basic management of the collection view.
2) AssetGridViewController requests thumbnail images from PHImageManager individually, only when the collection view asks for a cell to display. This means that we only create a UIImage when we know a cell is going to be seen, and we don't have to tell the collection view to reload everything over and over again.
On its own, step 2 would lead to slow scrolling, or rather, to slow catch-up of images loading into cells after you scroll. So, we also do:
3) Instead of using PHImageManager.default(), we keep an instance of PHCachingImageManager and tell it what set of images we want to "preheat" the cache for as the user scrolls.
4) AssetGridViewController has some extra logic for keeping track of the visible rect of the scroll view, so that the image manager knows to prepare thumbnails for the currently visible items (plus a little bit of scroll-ahead) and can free resources for items no longer visible if it needs to. Note that prefetching doesn't create UIImages, it just starts the necessary loading from disk or downloading from iCloud, so your memory usage stays small.
5) On selecting an item in the collection view (to segue to a full-screen presentation of that photo), AssetViewController uses the default PHImageManager to request a screen-sized (not full-sized) image. Opportunistic delivery means that we'll get a lower-quality version (basically the same thumbnail that Photos has already cached for us from its previous use in the collection view) right away, and a higher-quality version soon after (asynchronously, after which we automatically update the view).
If you need to support zooming (that sample code app doesn't, but it's likely a real app would), you could either request a full-size image right away after going to the full-screen view (at the cost of taking longer to transition from blown-up thumbnail to full-screen image), or start out by requesting a screen-sized image and then also request a full-size image for later delivery.
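The "preheating" workflow from points 3 and 4 might look roughly like this (a sketch: the visible-rect bookkeeping from the sample project is reduced to a plain index range, and the class and property names are assumptions):

import Photos
import UIKit

// Tell a PHCachingImageManager which assets are about to scroll on screen,
// so thumbnails are ready before cells request them.
final class ThumbnailPreheater {
    let manager = PHCachingImageManager()
    let thumbnailSize = CGSize(width: 100, height: 100)
    private var cachedAssets: [PHAsset] = []

    // Call from scrollViewDidScroll with the item indexes near the visible rect.
    func updateCachedAssets(fetchResult: PHFetchResult<PHAsset>, nearVisible indexes: [Int]) {
        // Stop caching what was previously preheated...
        manager.stopCachingImages(for: cachedAssets,
                                  targetSize: thumbnailSize,
                                  contentMode: .aspectFill,
                                  options: nil)
        // ...and start caching the new window of assets.
        cachedAssets = indexes
            .filter { $0 >= 0 && $0 < fetchResult.count }
            .map { fetchResult.object(at: $0) }
        manager.startCachingImages(for: cachedAssets,
                                   targetSize: thumbnailSize,
                                   contentMode: .aspectFill,
                                   options: nil)
    }
}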
In cellForRowAtIndexPath (or cellForItemAt for a collection view), use PHImageManager's requestImageForAsset method to lazy-load the image property of your UIImageView.
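A sketch of that per-cell lazy loading (PhotoCell, fetchResult, imageManager, and thumbnailSize are assumed to exist; the stale-cell guard follows the pattern in Apple's sample code):

import Photos
import UIKit

// Request each thumbnail only when its cell is about to be displayed.
func collectionView(_ collectionView: UICollectionView,
                    cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoCell",
                                                  for: indexPath) as! PhotoCell
    let asset = fetchResult.object(at: indexPath.item)
    let assetIdentifier = asset.localIdentifier
    cell.representedAssetIdentifier = assetIdentifier
    imageManager.requestImage(for: asset,
                              targetSize: thumbnailSize,
                              contentMode: .aspectFill,
                              options: nil) { image, _ in
        // The cell may have been reused for another asset by the time the
        // image arrives, so only set it if the identifier still matches.
        if cell.representedAssetIdentifier == assetIdentifier {
            cell.imageView.image = image
        }
    }
    return cell
}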

PHImageManager.requestImageForAsset returns nil when creating thumbnail for video

For some videos, requestImageForAsset completes with a UIImage that's nil. For other videos it works fine, and I haven't figured out why yet.
func createThumbnailForVideo(video: PHAsset) -> Future<NSURL> {
    let promise = Promise<NSURL>()
    let options = PHImageRequestOptions()
    options.synchronous = true
    imageManager.requestImageForAsset(video, targetSize: CGSizeMake(640, 640), contentMode: .AspectFill, options: options) { (image: UIImage!, info) -> Void in
        if image == nil {
            println("Error: Couldn't create thumbnail for video")
            promise.error(MyErrors.videoThumb())
        } else {
            if let thumbURL = self.savePhotoAsTemporaryFile(image) {
                promise.success(thumbURL)
            } else {
                promise.error(MyErrors.videoThumb())
            }
        }
    }
    return promise.future
}
I also get back the info dictionary for the request, but I don't know how to interpret the information:
[PHImageResultIsDegradedKey: 0, PHImageResultWantedImageFormatKey: 4037, PHImageResultIsPlaceholderKey: 0, PHImageResultIsInCloudKey: 0, PHImageResultDeliveredImageFormatKey: 9999]
I was having the same problem today. For me, I had to add the option to download the image if necessary. I think the image manager had a thumbnail-size version available, but since I hadn't allowed it to fetch the actual image from the network, it would return nil in the second callback. So to fix this I created a PHImageRequestOptions() object like this:
var options = PHImageRequestOptions()
options.networkAccessAllowed = true
Then send this as a param with your request:
PHImageManager.defaultManager().requestImageForAsset(asset, targetSize: size, contentMode: PHImageContentMode.AspectFill, options: options) { (image, info) -> Void in
    if (image != nil) {
        cell.imageView.image = image
    }
}
Once I did this, the image in the second callback was not nil. I think it is still a good idea to guard against a nil image so you don't set the image view's image to nil; I don't think you can assume that the image will always be there. Hope this helps!
EDIT: Just to clarify: in my case the closure was called twice for each request. The first time the image was not nil, and the second time it was. I think this is because a thumbnail-sized version was available but the full-size one was not; it needed network access to fetch the full-size image.
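Relatedly, you can tell the two deliveries apart with PHImageResultIsDegradedKey in the info dictionary; a small sketch in current Swift (asset, size, and imageView assumed from context):

import Photos
import UIKit

// Distinguish the degraded (thumbnail) delivery from the full-quality one.
let options = PHImageRequestOptions()
options.isNetworkAccessAllowed = true   // allow fetching the original from iCloud

PHImageManager.default().requestImage(for: asset,
                                      targetSize: size,
                                      contentMode: .aspectFill,
                                      options: options) { image, info in
    let isDegraded = (info?[PHImageResultIsDegradedKey] as? NSNumber)?.boolValue ?? false
    if let image = image, !isDegraded {
        // Final, full-quality delivery.
        imageView.image = image
    }
}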
I finally found the problem. The contentMode was .AspectFill but should have been .AspectFit. I guess I could have gotten it working by tweaking PHImageRequestOptionsDeliveryMode and PHImageRequestOptionsResizeMode if I had read the comments; however, .AspectFit was what I was looking for.
enum PHImageContentMode: Int {
    // Fit the asked size by maintaining the aspect ratio; the delivered image may not
    // necessarily be the asked targetSize (see PHImageRequestOptionsDeliveryMode and
    // PHImageRequestOptionsResizeMode)
    case AspectFit
    // Fill the asked size; some portion of the content may be clipped; the delivered
    // image may not necessarily be the asked targetSize (see PHImageRequestOptionsDeliveryMode
    // and PHImageRequestOptionsResizeMode)
    case AspectFill
}
I had a similar issue, but for photos rather than video. I had forgotten to specify the media type:
self.assetsFetchResults = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
The line above solved the issue for me.
