I'm working on showing thumbnails for videos. Here is my code:
override func viewDidLoad() {
    super.viewDidLoad()
    for str in self.imgArray {
        let url = NSURL(string: str)
        let movieAsset = AVURLAsset(URL: url!, options: nil)
        let assetImageGenerator = AVAssetImageGenerator(asset: movieAsset)
        assetImageGenerator.appliesPreferredTrackTransform = true
        let frameRef = try! assetImageGenerator.copyCGImageAtTime(CMTimeMake(1, 2), actualTime: nil)
        let image = UIImage(CGImage: frameRef)
        self.imagesArray.append(image)
    }
}
With this I'm getting the thumbnails correctly. The issue is that there is a delay of about 5-10 seconds in generating each thumbnail image. Is there any way I could improve the speed of this code and generate the thumbnails faster?
I don't think there is a way to actually speed up that call itself, but try CMTimeMake(0, 10). It may make things faster, since some video files take a while to seek.
I think you need to cache the images you get from this code and consult the cache next time, so that it runs faster overall. There are a lot of ways to cache images; using NSCache is one option.
As a side note, it shouldn't take 5-10 seconds to get a thumbnail image; it usually takes less than a second.
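As a rough illustration of the caching idea, here is a minimal sketch using NSCache keyed by the video's URL string (the helper name is an assumption; the 0.5-second capture time matches the question's code):
import AVFoundation
import UIKit

// Simple in-memory cache: the key is the video URL string, the value is its thumbnail.
let thumbnailCache = NSCache<NSString, UIImage>()

// Hypothetical helper: returns a cached thumbnail if one exists,
// otherwise generates it once and stores it in the cache.
func cachedThumbnail(for urlString: String) -> UIImage? {
    if let cached = thumbnailCache.object(forKey: urlString as NSString) {
        return cached
    }
    guard let url = URL(string: urlString) else { return nil }
    let asset = AVURLAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    guard let cgImage = try? generator.copyCGImage(at: CMTimeMake(1, 2), actualTime: nil) else {
        return nil
    }
    let image = UIImage(cgImage: cgImage)
    thumbnailCache.setObject(image, forKey: urlString as NSString)
    return image
}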
I am developing an iOS video trimmer with Swift 4. I am trying to render a horizontal list of video thumbnails spread out over various durations, from both local video files and remote URLs. When I test it in the simulator, the thumbnails are generated in less than a second, which is OK. However, when I test this code on an actual device, the thumbnail generation is really slow and sometimes crashes. I tried to move the actual image generation to a background thread and then update the UI on the main thread when it completes, but that doesn't seem to work very well and the app crashes after rendering the screen a few times. I am not sure whether that is because I am navigating away from the screen while tasks are still trying to complete. I am trying to resolve this problem and have the app generate the thumbnails more quickly without crashing. Here is the code I am using. I would really appreciate any assistance with this issue.
func renderThumbnails(view: UIView, videoURL: URL, duration: Float64) {
    var offset: Float64 = 0
    for i in 0..<self.IMAGE_COUNT {
        DispatchQueue.global(qos: .userInitiated).async {
            offset = Float64(i) * (duration / Float64(self.IMAGE_COUNT))
            let thumbnail = thumbnailFromVideo(videoUrl: videoURL,
                                               time: CMTimeMake(Int64(offset), 1))
            DispatchQueue.main.async {
                self.addImageToView(image: thumbnail, view: view, index: i)
            }
        }
    }
}
static func thumbnailFromVideo(videoUrl: URL, time: CMTime) -> UIImage {
    let asset = AVAsset(url: videoUrl)
    let imgGenerator = AVAssetImageGenerator(asset: asset)
    imgGenerator.appliesPreferredTrackTransform = true
    do {
        let cgImage = try imgGenerator.copyCGImage(at: time, actualTime: nil)
        let uiImage = UIImage(cgImage: cgImage)
        return uiImage
    } catch {
    }
    return UIImage()
}
The first sentence of the documentation says not to do what you’re doing! And it even tells you what to do instead.
Generating a single image in isolation can require the decoding of a large number of video frames with complex interdependencies. If you require a series of images, you can achieve far greater efficiency using the asynchronous method, generateCGImagesAsynchronously(forTimes:completionHandler:), which employs decoding efficiencies similar to those used during playback.
(Emphasis on the asynchronous method is mine.)
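A minimal sketch of that asynchronous API applied to the trimmer above could look like this; it reuses the question's IMAGE_COUNT and addImageToView(image:view:index:), while the 600 timescale and the index lookup are my assumptions:
func renderThumbnails(view: UIView, videoURL: URL, duration: Float64) {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true

    // Build every requested time up front so the generator can decode the
    // video in one pass instead of seeking once per thumbnail.
    let step = duration / Float64(IMAGE_COUNT)
    let times: [NSValue] = (0..<IMAGE_COUNT).map { i in
        // A timescale of 600 keeps sub-second offsets instead of truncating to whole seconds.
        NSValue(time: CMTimeMakeWithSeconds(Float64(i) * step, 600))
    }

    generator.generateCGImagesAsynchronously(forTimes: times) { requestedTime, cgImage, _, result, _ in
        guard result == .succeeded, let cgImage = cgImage else { return }
        // The handler echoes back the requested time, so recover the slot index from it.
        guard let index = times.index(of: NSValue(time: requestedTime)) else { return }
        DispatchQueue.main.async {
            self.addImageToView(image: UIImage(cgImage: cgImage), view: view, index: index)
        }
    }
}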
I am getting an image as Data from the server. To show it I am using UIImage(data:). For one specific image the application crashes because of memory pressure. What could be the reason, and how do I fix it?
if let imgData = Utils.fetchDataFromDocumentDirectory(imageName: "test.jpg") {
    attachmentImgView.image = UIImage(data: imgData)
}
How big is the image? And are you downloading only one image or an array of them?
This might be caused by the size of the image you are trying to download.
My advice, if you are trying to show an image, is to create it from just the URL, like so:
let url = URL(string: "http://i.imgur.com/w5rkSIj.jpg")
let data = try? Data(contentsOf: url!)
if let imageData = data {
    let image = UIImage(data: imageData)
}
Even better, you can try the AsyncImageView third-party library, where you give it the URL of the image and it shows it asynchronously, without delays in the app.
Hope this helps!
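If you'd rather not add a third-party dependency, the same asynchronous idea can be sketched with plain URLSession (the helper name and the minimal error handling are assumptions):
import UIKit

// Download the image off the main thread, then assign it on the main thread.
// A sketch only: no caching and no cancellation handling.
func loadImage(into imageView: UIImageView, from urlString: String) {
    guard let url = URL(string: urlString) else { return }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        guard let data = data, let image = UIImage(data: data) else { return }
        DispatchQueue.main.async {
            imageView.image = image
        }
    }.resume()
}

// Usage:
// loadImage(into: attachmentImgView, from: "http://i.imgur.com/w5rkSIj.jpg")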
I downloaded images from the server with the Alamofire framework, using Alamofire.download. Each image is around 1 MB, but after each request memory usage increases a lot. After downloading 4 images the memory used is above 171 MB, and after that each image adds more than 35 MB.
The downloading code is:
Alamofire.download(mainReguest, to: self.destination)
    .downloadProgress { progress in
        self.progressView.progress = Float(progress.fractionCompleted)
    }
    .response { response in
        if response.error == nil, let imagePath = response.destinationURL?.path {
            let image = UIImage(contentsOfFile: imagePath)
            self.addNewImageToTheScrollView(img: image)
        }
    }
The destination is defined as:
let destination = DownloadRequest.suggestedDownloadDestination(for: .documentDirectory,
in: .userDomainMask,
with: [DownloadRequest.DownloadOptions.removePreviousFile])
The problem is with this...
UIImage(contentsOfFile: imagePath)
Specifically with the fact that you are (presumably) downloading a compressed format of your image.
However, UIImage is uncompressed in memory. So each pixel of your image will take 4 bytes of information (red, green, blue, alpha).
So if you have an image that is, for example, 5 megapixels, then 5,000,000 pixels * 4 bytes = 20 MB.
So I imagine your images are around 10 MP?
The best way around this is to optimise your image download. If you're displaying an image on an iPhone, there is no point downloading a 10 MP version of it; you might as well resize it to be much, much smaller.
Ideally, you should resize the images on the backend before the download happens.
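If resizing on the backend is not an option, a hedged on-device alternative is to downsample with ImageIO so the full-size bitmap is never decoded into memory (the function name and the 1024-pixel limit below are just examples):
import ImageIO
import UIKit

// Decode a large image file straight to a small bitmap. `maxDimension` is the
// longest side, in pixels, that the resulting image may have.
func downsampledImage(at url: URL, maxDimension: CGFloat) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }

    let thumbnailOptions = [kCGImageSourceCreateThumbnailFromImageAlways: true,
                            kCGImageSourceShouldCacheImmediately: true,
                            kCGImageSourceCreateThumbnailWithTransform: true,
                            kCGImageSourceThumbnailMaxPixelSize: maxDimension] as CFDictionary

    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}

// Usage in the response handler above (destinationURL is assumed non-nil here):
// let image = downsampledImage(at: response.destinationURL!, maxDimension: 1024)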
Your Alamofire code is OK. Most likely you have an issue somewhere in the UI part, for example because of the high image resolution. The first thing you have to do is localize the issue. To check the networking code, comment out all the UI-related code and run your app one more time:
Alamofire.download(mainReguest, to: self.destination)
    .downloadProgress { progress in
        // self.progressView.progress = Float(progress.fractionCompleted)
    }
    .response { response in
        if response.error == nil, let imagePath = response.destinationURL?.path {
            let image = UIImage(contentsOfFile: imagePath)
            print("Image is successfully downloaded!")
            // self.addNewImageToTheScrollView(img: image)
        }
    }
I am building an app where you can pick images from your phone or take new ones with the camera. The question is about picking existing images. At first I was using UIImagePickerController, where a new controller pops up and I can select the image I want. Later I decided to change that design approach and instead embed all the pictures from the phone in the main screen (the one that shows when you enter the app), where you can select an image just by tapping on it (so no new controller pops up). And there is the problem: reading the pictures and loading them into my collection view takes far too much time (reading the pictures is the main issue). I have over 1000 pictures on my phone, and it takes about 7 seconds for the app to read 100 pictures (then it crashes; I'm not sure why, maybe some memory issue, I didn't look further into it because my main problem was the reading performance).
For reading I am using the Photos framework (PHAsset); this is the code:
let allPhotosResult = PHAsset.fetchAssets(with: PHAssetMediaType.image, options: nil)
let imageManager = PHCachingImageManager()
allPhotosResult.enumerateObjects({ (object: AnyObject!, count: Int, stop: UnsafeMutablePointer<ObjCBool>) in
    if object is PHAsset {
        let imgAsset = object as! PHAsset
        countI += 1
        if countI > 100 {
            return
        }
        let imageSize = CGSize(width: imgAsset.pixelWidth,
                               height: imgAsset.pixelHeight)
        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat
        options.isSynchronous = true
        imageManager.requestImage(for: imgAsset,
                                  targetSize: imageSize,
                                  contentMode: .aspectFill,
                                  options: options,
                                  resultHandler: { (image, info) -> Void in
                                      self._images.append(image!)
                                  })
    }
})
I stop at 100 images because then they are displayed, but again, that's not the issue; I think I will be able to solve that.
So, is there a faster way to read all the images and load them as thumbnails into my collection view? I didn't have any of the popular photo apps installed, but I downloaded some of them to see how they perform, and Instagram is pretty impressive: all the images are there without any delay, which means there must be a way to get them all pretty fast. Maybe they fetch them in the background even when the app is not running?
Regards
P.S. Yes, I know I can run the reading asynchronously, but that doesn't solve my problem because the delay will still be there.
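For comparison, the usual pattern for photo grids is to request small, cell-sized thumbnails asynchronously instead of synchronous full-resolution images; a minimal sketch (the 200x200 target size is an assumption):
import Photos
import UIKit

let imageManager = PHCachingImageManager()
let thumbnailSize = CGSize(width: 200, height: 200)   // roughly the cell size, not the asset's full pixel size

let allPhotos = PHAsset.fetchAssets(with: .image, options: nil)

let options = PHImageRequestOptions()
options.deliveryMode = .opportunistic   // a fast, degraded image first, a better one later
options.isSynchronous = false           // do not block the enumerating thread

allPhotos.enumerateObjects { asset, index, _ in
    imageManager.requestImage(for: asset,
                              targetSize: thumbnailSize,
                              contentMode: .aspectFill,
                              options: options) { image, _ in
        // Hand the small thumbnail to the collection view cell at this index.
    }
}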
Let me give you some insight on my application itself.
To put it briefly, I am creating a social-networking app. Each post consists of an image, a profile picture, and a caption, and each post lives in my MySQL database. I am using my own framework to retrieve each post. However, once I retrieve a post I still have to fetch the profile picture and image using the URLs I got from the database. I would like to retrieve all the images at once rather than one after another.
As of now, there are about 5 posts in the database. Loading the images for one post takes about 4 seconds, and right now I load the images for one post and then retrieve the next one, in sequential order, so the whole process takes around 20 seconds. If there were, say, 50 posts, it would take an extremely long time to load them all. I have some knowledge of GCD (Grand Central Dispatch), but I don't know how to apply it in my app.
Here is my code for retrieving my posts and images:
ConnectionManager.sharedInstance.retrievePosts(UserInformationInstance.SCHOOL) {
    (result: AnyObject) in
    if let posts = result as? [[String: AnyObject]] {
        print("Retrieved \(posts.count) posts.")
        for post in posts {
            let postIDCurrent = post["id"] as? Int
            var UPVOTES = 0
            var UPVOTED: Bool!
            var query = ""
            if let profilePictureCurrent = post["profile_picture"] {
                // Loading profile picture image
                let url = NSURL(string: profilePictureCurrent as! String)
                let data = NSData(contentsOfURL: url!)
                let image = UIImage(data: data!)
                UserInformationInstance.postsProfilePictures.append(image!)
                print("added profile pic")
            } else {
                print("error")
            }
            if let postPictureCurrent = post["image"] {
                if (postPictureCurrent as! String != "") {
                    // Loading image associated with post
                    let url = NSURL(string: postPictureCurrent as! String)
                    let data = NSData(contentsOfURL: url!)
                    let image = UIImage(data: data!)
                    let imageArray: [AnyObject] = [postIDCurrent!, image!]
                    UserInformationInstance.postsImages.append(imageArray)
                    print("added image pic")
                }
            } else {
                print("error")
            }
            UserInformationInstance.POSTS.append(post)
        }
    } else {
        self.loadSearchUsers()
    }
}
So my question is, how can I retrieve all the images at the same time instead of retrieving one after the other?
It would be great if someone could give an explanation as well as some code :)
I would recommend revising your approach. If your server is fine (not busy and easily reachable), so that downloading is limited by the device's network bandwidth (X Mbps), then it does not matter whether you download the images concurrently or sequentially.
Let me show this. Downloading 10 files of size Y MB simultaneously takes the same total time as downloading them one after another, because each concurrent download runs at one tenth of the speed:
X/10 = download speed per file
Time = Amount / Speed
T = Y / (X/10) = 10 * Y / X
Now, if you are downloading sequentially:
T = 10 * (Y / X) = 10 * Y / X
I would recommend showing the posts immediately, as soon as you retrieve them from storage, then starting the image downloads asynchronously and setting each image once it has downloaded. That's the standard practice in the industry; consider the Facebook, Twitter, and Instagram apps.
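A minimal sketch of starting all the downloads at once with GCD might look like this; the dictionary keys mirror the question's post dictionaries, while the function name, the lock, and the completion shape are my assumptions:
import UIKit

// Kick off every download immediately and get a single callback when all of
// them have finished. A sketch only: no caching and no per-request error handling.
func loadAllImages(for posts: [[String: AnyObject]],
                   completion: @escaping ([Int: UIImage]) -> Void) {
    var images: [Int: UIImage] = [:]   // post id -> downloaded image
    let group = DispatchGroup()
    let lock = NSLock()                // protects `images` across completion threads

    for post in posts {
        guard let id = post["id"] as? Int,
              let urlString = post["image"] as? String,
              let url = URL(string: urlString) else { continue }

        group.enter()
        URLSession.shared.dataTask(with: url) { data, _, _ in
            if let data = data, let image = UIImage(data: data) {
                lock.lock()
                images[id] = image
                lock.unlock()
            }
            group.leave()
        }.resume()
    }

    // Called on the main queue once every task has called leave().
    group.notify(queue: .main) {
        completion(images)
    }
}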