ArrayBuffer.getElementSlowPath: has anyone faced this error? - ios

In my project I show library images and videos to the user, but on some devices I get a crash like ArrayBuffer.getElementSlowPath. Can anyone guide me on how to replicate this issue? I got it from Crashlytics.
Here is my code for getting videos from PHAsset.
func getVideo(withCompletionHandler completion: @escaping CompletionHandler) {
    let fetchOptions = PHFetchOptions()
    let requestOptions = PHVideoRequestOptions()
    requestOptions.isNetworkAccessAllowed = false
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.video, options: fetchOptions)
    fetchResult.enumerateObjects({ (assest, index, isCompleted) in
        if assest.sourceType != PHAssetSourceType.typeiTunesSynced {
            PHImageManager.default().requestAVAsset(forVideo: assest, options: requestOptions, resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [AnyHashable: Any]?) in
                if let urlAsset = asset as? AVURLAsset {
                    let objAssest = GallaryAssets()
                    objAssest.objAssetsType = assetsType.videoType
                    objAssest.createdDate = assest.creationDate
                    objAssest.assetsDuration = assest.duration
                    objAssest.assetsURL = urlAsset.url
                    objAssest.localizationStr = assest.localIdentifier
                    objAssest.locationInfo = LocationInfo()
                    if let location = assest.location {
                        objAssest.locationInfo.Latitude = "\(location.coordinate.latitude)"
                        objAssest.locationInfo.Longitude = "\(location.coordinate.longitude)"
                    }
                    self.media.add(objAssest)
                }
                completion(self.media)
            })
        }
    })
}

I can't say for sure that this is the cause of your crash, but for anyone else finding this question through a Google search:
I encountered a similar cryptic stack trace with ArrayBuffer.getElementSlowPath in Crashlytics. After much debugging I reproduced the crash locally, and it turned out that a typed Swift array contained an element that was not of the expected type. This had happened because the array in question was actually bridged from Objective-C, which is less strict about the types of elements in arrays.
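A minimal sketch of that failure mode (the literal values are just for illustration): an NSArray holding a non-String element is force-bridged to [String]. The forced bridge keeps the Cocoa storage and defers the per-element type checks, so the trap surfaces later, in the standard library's slow path for Cocoa-backed array buffers, which is exactly where ArrayBuffer.getElementSlowPath shows up in the trace (depending on the Swift version it may instead fire at the cast itself):
import Foundation

let objcArray: NSArray = ["IMG_1006.MOV", 42] // mixed element types, legal in Objective-C
let names = objcArray as! [String]            // forced bridge; element checks are deferred
print(names[0])                               // fine, this element really is a String
print(names[1])                               // traps: __NSCFNumber is not an NSString
The defensive alternative is a conditional bridge, if let names = objcArray as? [String], which checks every element up front and fails gracefully.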

Maybe some video or resource is coming from iCloud; try removing your restrictive PHVideoRequestOptions (pass a default PHVideoRequestOptions() instead) and check your flow. Hopefully that solves your crash.
func getVideo(withCompletionHandler completion: @escaping CompletionHandler) {
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.video, options: fetchOptions)
    fetchResult.enumerateObjects({ (assest, index, isCompleted) in
        if assest.sourceType != PHAssetSourceType.typeiTunesSynced {
            PHImageManager.default().requestAVAsset(forVideo: assest, options: PHVideoRequestOptions(), resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [AnyHashable: Any]?) in
                if let urlAsset = asset as? AVURLAsset {
                    let objAssest = GallaryAssets()
                    objAssest.objAssetsType = assetsType.videoType
                    objAssest.createdDate = assest.creationDate
                    objAssest.assetsDuration = assest.duration
                    objAssest.assetsURL = urlAsset.url
                    objAssest.localizationStr = assest.localIdentifier
                    objAssest.locationInfo = LocationInfo()
                    if let location = assest.location {
                        objAssest.locationInfo.Latitude = "\(location.coordinate.latitude)"
                        objAssest.locationInfo.Longitude = "\(location.coordinate.longitude)"
                    }
                    self.media.add(objAssest)
                    isCompleted.pointee = true
                }
            })
        }
    })
    completion(self.media)
}

Are you removing a controller on completion? The enumerateObjects function executes synchronously; if you call the completion before it finishes, and the fetchResult object gets deallocated before the enumeration ends, this could cause issues.
Try calling the completion block after the enumeration ends, and if you only want one result, set the isCompleted pointer to true, which stops the enumeration:
func getVideo(withCompletionHandler completion: @escaping CompletionHandler) {
    let fetchOptions = PHFetchOptions()
    let requestOptions = PHVideoRequestOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.video, options: fetchOptions)
    fetchResult.enumerateObjects({ (assest, index, isCompleted) in
        if assest.sourceType != PHAssetSourceType.typeiTunesSynced {
            PHImageManager.default().requestAVAsset(forVideo: assest, options: requestOptions, resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [AnyHashable: Any]?) in
                if let urlAsset = asset as? AVURLAsset {
                    let objAssest = GallaryAssets()
                    objAssest.objAssetsType = assetsType.videoType
                    objAssest.createdDate = assest.creationDate
                    objAssest.assetsDuration = assest.duration
                    objAssest.assetsURL = urlAsset.url
                    objAssest.localizationStr = assest.localIdentifier
                    objAssest.locationInfo = LocationInfo()
                    if let location = assest.location {
                        objAssest.locationInfo.Latitude = "\(location.coordinate.latitude)"
                        objAssest.locationInfo.Longitude = "\(location.coordinate.longitude)"
                    }
                    self.media.add(objAssest)
                    isCompleted.pointee = true
                }
            })
        }
    })
    completion(self.media)
}

Related

How to fetch all albums with images from iPhone without delay in Swift?

I tried to fetch all images from an album. It fetches all images with their URLs and image data, but it does not load the images directly from the given URL path, so I have to download each image into the Documents directory and then read it from that path, which takes too much time. I use the code below. I want to fetch images the way the iPhone Photos library does.
Please point out the error.
func fetchImagesFromAlbum() {
    DispatchQueue.global(qos: .background).async {
        self.photoAssets = self.fetchResult as! PHFetchResult<AnyObject>
        let fetchOptions = PHFetchOptions()
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        self.photoAssets = PHAsset.fetchAssets(in: self.assetCollection, options: fetchOptions) as! PHFetchResult<AnyObject>
        for i in 0..<self.photoAssets.count {
            autoreleasepool {
                let asset = self.photoAssets.object(at: i)
                let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
                let options = PHImageRequestOptions()
                options.deliveryMode = .fastFormat
                options.isSynchronous = true
                options.isNetworkAccessAllowed = true
                self.imageManager.requestImage(for: asset as! PHAsset, targetSize: imageSize, contentMode: .aspectFill, options: options, resultHandler: { (image, info) -> Void in
                    if let image = image {
                        let imageUrl = info!["PHImageFileURLKey"] as? NSURL
                        let imageName = imageUrl?.lastPathComponent
                        let urlString: String = imageUrl!.path!
                        let theFileName = (urlString as NSString).lastPathComponent
                        self.imageName.append("\(theFileName)")
                        self.imagePath.append("\(urlString)")
                        let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first!
                        let photoURL = NSURL(fileURLWithPath: documentDirectory)
                        let localPath = photoURL.appendingPathComponent(imageName!)
                        DispatchQueue.global(qos: .background).async {
                            if !FileManager.default.fileExists(atPath: localPath!.path) {
                                do {
                                    try UIImageJPEGRepresentation(image, 0.1)?.write(to: localPath!)
                                    print("file saved")
                                } catch {
                                    print("error saving file")
                                }
                            } else {
                                print("file already exists")
                            }
                        }
                    }
                })
                DispatchQueue.main.async {
                    self.collectionView.reloadData()
                }
            }
        }
        self.hudHide()
    }
    PHPhotoLibrary.shared().register(self)
    if fetchResult == nil {
        let allPhotosOptions = PHFetchOptions()
        allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        fetchResult = PHAsset.fetchAssets(with: allPhotosOptions)
    }
}
I recommend simply using UIImagePickerController or, if your app requires multiple image selection, a third-party library like DKImagePickerController. As another user already mentioned in the comments, these will only copy the image(s) the user selected into your app's directory and save on processing time.
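For completeness, a minimal single-image sketch of that route (Swift 3-era API to match the question; the file name "picked.jpg" is arbitrary):
import UIKit

class PickerViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentPicker() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true, completion: nil)
    }

    // Only the picked image enters your process, so memory stays flat.
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
        if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
            let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            let target = documents.appendingPathComponent("picked.jpg")
            try? UIImageJPEGRepresentation(image, 0.9)?.write(to: target)
        }
        picker.dismiss(animated: true, completion: nil)
    }
}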

How can I access a video/image from an AVURLAsset URL in Swift 3.x?

I have an AVURLAsset URL, something like "file:///var/mobile/Media/DCIM/101APPLE/IMG_1006.MOV". How can I access the video at this URL in Swift?
This way you can fetch all videos:
DispatchQueue.global(qos: .background).async {
    PHPhotoLibrary.requestAuthorization { (status) -> Void in
        let allVidOptions = PHFetchOptions()
        // Fetch any media type you want; note .rawValue, not .hashValue.
        allVidOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.video.rawValue)
        allVidOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        // Now pass the options to fetchAssets.
        let allVids = PHAsset.fetchAssets(with: allVidOptions)
        print("All Videos Count \(allVids.count)")
        for index in 0..<allVids.count {
            let videoRequestOptions = PHVideoRequestOptions()
            videoRequestOptions.deliveryMode = .fastFormat // quality -> 360p mp4
            videoRequestOptions.version = .original
            PHImageManager.default().requestPlayerItem(forVideo: allVids[index], options: videoRequestOptions, resultHandler: { (playerItem, info) in
                let currentVideoUrlAsset = playerItem?.asset as? AVURLAsset
                let currentVideoFilePath = currentVideoUrlAsset!.url
                let fileExtension = currentVideoFilePath.pathExtension
                print(fileExtension)
                if fileExtension == "M4V" {
                    self.arrOfVideos.append(playerItem!)
                }
                print("My Videos Count \(self.arrOfVideos.count)")
                // The result handler runs asynchronously, so the array fills up after the loop itself has finished.
                DispatchQueue.main.async(execute: {
                    self.tblVideos.reloadData()
                })
            })
        }
    }
}
I think if you know the URL path, then you could do the following. Note that the string is already a file URL, so URL(string:) is the matching initializer; URL(fileURLWithPath:) expects a bare path like "/var/mobile/Media/DCIM/101APPLE/IMG_1006.MOV":
let url = URL(string: "file:///var/mobile/Media/DCIM/101APPLE/IMG_1006.MOV")!
let asset = AVURLAsset(url: url)
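On top of that, a minimal playback sketch (AVKit; presenting from a view controller is assumed). Paths under /var/mobile/Media/DCIM belong to the Photos library, not to your sandbox, so reading the file directly can fail; requesting the asset through PHImageManager, as in the answer above, is the reliable route:
import AVFoundation
import AVKit

let url = URL(string: "file:///var/mobile/Media/DCIM/101APPLE/IMG_1006.MOV")!
let asset = AVURLAsset(url: url)
let playerVC = AVPlayerViewController()
playerVC.player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
// From a presenting view controller:
// present(playerVC, animated: true) { playerVC.player?.play() }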
The Apple reference helps.

Swift memory will not release

I have a class that gets all the user's assets, loads each image, and then performs face detection on each of them. I am calling my class from the viewDidLoad function like this:
override func viewDidLoad() {
    super.viewDidLoad()
    _ = autoreleasepool {
        return faceProcessing()
    }
}
And this is the class:
class faceProcessing {
    let manager = PHImageManager.default()
    let option = PHImageRequestOptions()

    init() {
        let realm = try! Realm()
        let assets = getAssets(realm: realm)
        option.isSynchronous = true
        option.deliveryMode = .fastFormat
        option.isNetworkAccessAllowed = true
        var featuresCount = 0
        for assetIndex in 0...assets.count - 1 {
            featuresCount += autoreleasepool { () -> Int? in
                print(assetIndex)
                var facesCount = 0
                faceDetection(asset: assets[assetIndex]) { (features) in
                    facesCount = features
                }
                return facesCount
            }!
        }
        print(featuresCount)
    }

    func getAssets(realm: Realm) -> [PHAsset] {
        return autoreleasepool { () -> [PHAsset] in
            let images = realm.objects(Image.self).filter("asset != nil")
            let ids: [String] = images.map { $0.id }
            let options = PHFetchOptions()
            options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
            let assets = PHAsset.fetchAssets(withLocalIdentifiers: ids, options: options)
            var assetArray = [PHAsset]()
            if assets.count > 1 {
                for assetIndex in 0...assets.count - 1 {
                    assetArray.append(assets[assetIndex])
                }
            }
            return assetArray
        }
    }

    func faceDetection(asset: PHAsset, completionHandler: @escaping (Int) -> Void) {
        manager.requestImageData(for: asset, options: option) { (data, responseString, imageOrientation, info) in
            if data != nil {
                weak var faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
                let faces = faceDetector?.features(in: CIImage(data: data!)!)
                faceDetector = nil
                completionHandler((faces?.count)!)
            } else {
                print(info)
            }
        }
    }
}
The problem is that after the process of loading the images and detecting the faces finishes, the memory is not released. Before the process the memory was about 60 MB, during the process it was between 400 MB and 500 MB, and when the process finished it only went back to 300 MB. Is there a way to release all of that memory so I am back at 60 MB after the process finishes?
Here is how it looks in Instruments:
It looks like the face detection is not releasing its memory. Is there a way I could release the CIDetector from memory?
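One commonly suggested pattern, as a hedged sketch rather than a confirmed fix (the FaceCounter/countFaces names are hypothetical): create the CIDetector once and reuse it across images, and drain an autoreleasepool around each detection so the decoded CIImage buffers can be reclaimed between iterations. Core Image keeps internal caches, so some residual footprint may remain regardless:
import CoreImage
import Foundation

final class FaceCounter {
    // Creating a CIDetector is expensive; reuse one instance instead of one per image.
    private let detector = CIDetector(ofType: CIDetectorTypeFace,
                                      context: nil,
                                      options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

    func countFaces(in imageDatas: [Data]) -> Int {
        var total = 0
        for data in imageDatas {
            // Drain per image so intermediate buffers do not accumulate.
            autoreleasepool {
                if let ciImage = CIImage(data: data) {
                    total += detector?.features(in: ciImage).count ?? 0
                }
            }
        }
        return total
    }
}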

Not getting all videos of iPhone using PHCachingImageManager

I am using PHCachingImageManager().requestAVAssetForVideo to fetch videos from the iPhone, but I am not getting all of them: only videos stored in Photos are being fetched.
How can I get all the videos stored on the iPhone?
Here is the code.
let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "modificationDate", ascending: true)]
let assetResults = PHAsset.fetchAssetsWithMediaType(.Video, options: options)
for i in 0 ..< assetResults.count {
    let object: AnyObject = assetResults[i]
    if let asset = object as? PHAsset {
        let options = PHVideoRequestOptions()
        options.deliveryMode = .Automatic
        options.networkAccessAllowed = true
        options.version = .Current
        options.progressHandler = { (progress: Double, error: NSError?, stop: UnsafeMutablePointer<ObjCBool>, info: [NSObject: AnyObject]?) in
        }
        /* Now get the video */
        PHCachingImageManager().requestAVAssetForVideo(asset, options: options, resultHandler: { (asset: AVAsset?, audioMix: AVAudioMix?, info: [NSObject: AnyObject]?) -> Void in
            dispatch_async(dispatch_get_main_queue(), {
                /* Did we get the URL to the video? */
                if let urlAsset = asset as? AVURLAsset {
                    // use urlAsset.URL here
                }
            })
        })
    }
}
You need to have a look inside PHFetchOptions. Maybe some of your video assets are hidden?
var fetchOptions = PHFetchOptions()
fetchOptions.includeHiddenAssets = true
From the docs:
// Whether hidden assets are included in fetch results. Defaults to NO
public var includeHiddenAssets: Bool
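Putting that together, a minimal fetch in Swift 3 syntax (the question's code uses the older Swift 2 names, so adjust accordingly):
import Photos

let fetchOptions = PHFetchOptions()
fetchOptions.includeHiddenAssets = true
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "modificationDate", ascending: true)]
let allVideos = PHAsset.fetchAssets(with: .video, options: fetchOptions)
print("Fetched \(allVideos.count) videos, hidden ones included")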
Hope it helps.

iOS - Fetch all photos from device and save them to app

I would like to fetch all photos that are saved on the device, save them into my app, and then eventually (if the user allows it) delete the originals.
This is the whole class I created for this task:
class ImageAssetsManager: NSObject {
    let imageManager = PHCachingImageManager()

    func fetchAllImages() {
        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.Image.rawValue)
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        if #available(iOS 9.0, *) {
            options.fetchLimit = 5
        } else {
            // Fallback on earlier versions
        }
        let imageAssets = PHAsset.fetchAssetsWithOptions(options)
        print(imageAssets.count)
        self.getAssets(imageAssets)
    }

    func getAssets(assets: PHFetchResult) {
        var assetsToDelete: [PHAsset] = []
        assets.enumerateObjectsUsingBlock { (object, count, stop) in
            if let asset = object as? PHAsset {
                let imageSize = CGSize(width: asset.pixelWidth, height: asset.pixelHeight)
                let options = PHImageRequestOptions()
                options.deliveryMode = .FastFormat
                options.synchronous = true
                self.imageManager.requestImageForAsset(asset, targetSize: imageSize, contentMode: .AspectFill, options: options, resultHandler: { [weak self] image, info in
                    self?.addAssetToSync(image, info: info)
                    assetsToDelete.append(asset)
                })
            }
        }
        self.deleteAssets(assetsToDelete)
    }

    func addAssetToSync(image: UIImage?, info: [NSObject: AnyObject]?) {
        guard let image = image else { return }
        guard let info = info else { return }
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
            let imageData = UIImageJPEGRepresentation(image, 0.95)!
            let fileUrl = info["PHImageFileURLKey"] as! NSURL
            dispatch_async(dispatch_get_main_queue(), {
                let photoRootItem = DatabaseManager.sharedInstance.getPhotosRootItem()
                let ssid = DatabaseManager.sharedInstance.getSsidInfoByName(ContentManager.sharedInstance.ssid)
                let item = StorageManager.sharedInstance.createFile(imageData, name: fileUrl.absoluteString.fileNameWithoutPath(), parentFolder: photoRootItem!, ssid: ssid!)
            })
        })
    }

    func deleteAssets(assetsToDelete: [PHAsset]) {
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            PHAssetChangeRequest.deleteAssets(assetsToDelete)
        }, completionHandler: { success, error in
            guard let error = error else { return }
            print(error)
        })
    }
}
It's working, but only for a limited number of photos. When I try it with all of them, I get memory warnings and then the app crashes. I know why: I load all the photos into memory at once, and that is too much. I could fetch images with the fetch limit in a loop, but I am not sure that is the best solution.
I was hoping for an approach that processes a few photos, releases the memory, and repeats until the end; that change would be somewhere in enumerateObjectsUsingBlock. I am not sure if it matters, but I don't even need the decoded image. I just need to copy the image file from the device path to my app's sandbox path.
What's the best solution for this? How can I avoid the memory warnings and leaks? Thanks.
Change your dispatch_async calls to dispatch_sync. Then you will process photos one at a time as you walk through enumerateObjectsUsingBlock, instead of trying to process them all at the same time.
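A minimal sketch of that change inside addAssetToSync (Swift 2 GCD syntax to match the question; the body is elided where it matches the original). The synchronous hop to the background queue makes each photo finish encoding and writing before enumerateObjectsUsingBlock moves on, which bounds peak memory. Keep the hop back to the main queue asynchronous if getAssets runs on the main thread, or dispatch_sync there would deadlock:
func addAssetToSync(image: UIImage?, info: [NSObject: AnyObject]?) {
    guard let image = image else { return }
    // dispatch_sync blocks until this photo is processed,
    // so only one image's JPEG data lives in memory at a time.
    dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
        let imageData = UIImageJPEGRepresentation(image, 0.95)!
        // ... create the file in the sandbox as in the original ...
        dispatch_async(dispatch_get_main_queue(), {
            // database bookkeeping as in the original
        })
    })
}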
