I am trying to acquire a thumbnail from a video url. The video is a stream (HLS) with the m3u8 format.
I've already tried requestThumbnailImagesAtTimes from the MPMoviePlayerController, but that didn't work. Does anyone have a solution for that problem? If so how'd you do it?
If you don't want to use MPMoviePlayerController, you can do this:
AVAsset *asset = [AVAsset assetWithURL:sourceURL];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
CMTime time = CMTimeMake(1, 1);
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef); // CGImageRef won't be released by ARC
Here's an example in Swift:
func thumbnail(sourceURL sourceURL:NSURL) -> UIImage {
let asset = AVAsset(URL: sourceURL)
let imageGenerator = AVAssetImageGenerator(asset: asset)
let time = CMTime(seconds: 1, preferredTimescale: 1)
do {
let imageRef = try imageGenerator.copyCGImageAtTime(time, actualTime: nil)
return UIImage(CGImage: imageRef)
} catch {
print(error)
return UIImage(named: "some generic thumbnail")!
}
}
I prefer using AVAssetImageGenerator over MPMoviePlayerController because it is thread-safe, and you can have more than one instantiated at a time.
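As a rough sketch of that point (modern Swift syntax; the function name and parameters are just illustrative), you can create one generator per requested time and keep all of the work off the main thread:
import AVFoundation
import UIKit

// Illustrative sketch: one AVAssetImageGenerator per requested time, all off the main thread.
func thumbnails(for url: URL, atSeconds times: [Double], completion: @escaping ([UIImage]) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let asset = AVAsset(url: url)
        let images: [UIImage] = times.compactMap { seconds in
            let generator = AVAssetImageGenerator(asset: asset)
            generator.appliesPreferredTrackTransform = true
            let time = CMTime(seconds: seconds, preferredTimescale: 600)
            guard let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) else { return nil }
            return UIImage(cgImage: cgImage)
        }
        DispatchQueue.main.async { completion(images) }
    }
}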
Acquire a thumbnail from a video url.
NSString *strVideoURL = #"http://www.xyzvideourl.com/samplevideourl";
NSURL *videoURL = [NSURL URLWithString:strVideoURL] ;
MPMoviePlayerController *player = [[[MPMoviePlayerController alloc] initWithContentURL:videoURL]autorelease];
UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
player = nil;
Replace strVideoURL with your own video URL string.
You will get the thumbnail as output from the video, and the thumbnail is a UIImage.
-(UIImage *)loadThumbNail:(NSURL *)urlVideo
{
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:urlVideo options:nil];
AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generate.appliesPreferredTrackTransform=TRUE;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 60);
CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
NSLog(#"err==%#, imageRef==%#", err, imgRef);
return [[UIImage alloc] initWithCGImage:imgRef];
}
Add the AVFoundation framework to your project and don't forget to import <AVFoundation/AVFoundation.h>. Pass the path of your video saved in the documents directory as the parameter, and you will receive the image as a UIImage.
You can get the thumbnail from your video URL with the code below:
MPMoviePlayerController *player = [[[MPMoviePlayerController alloc] initWithContentURL:videoURL]autorelease];
UIImage *thumbnail = [player thumbnailImageAtTime:0.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
Maybe this is useful for someone else who faces the same problem. I needed an easy solution for creating a thumbnail for Images, PDFs and Videos. To solve that problem I've created the following Library (in Swift).
https://github.com/prine/ROThumbnailGenerator
The usage is very straightforward:
var thumbnailImage = ROThumbnail.getThumbnail(url)
Internally it has three different implementations, and it creates the thumbnail based on the file extension. You can easily add your own implementation if you need a thumbnail creator for another file extension.
Swift 3 Version :
func createThumbnailOfVideoFromFileURL(videoURL: String) -> UIImage? {
let asset = AVAsset(url: URL(string: videoURL)!)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMakeWithSeconds(Float64(1), 100)
do {
let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: img)
return thumbnail
} catch {
// Set a default image if Image is not acquired
return UIImage(named: "ico_placeholder")
}
}
For Swift 3.0:
func generateThumbnailForVideoAtURL(filePathLocal: NSString) -> UIImage? {
let vidURL = NSURL(fileURLWithPath:filePathLocal as String)
let asset = AVURLAsset(url: vidURL as URL)
let generator = AVAssetImageGenerator(asset: asset)
generator.appliesPreferredTrackTransform = true
let timestamp = CMTime(seconds: 1, preferredTimescale: 60)
do {
let imageRef = try generator.copyCGImage(at: timestamp, actualTime: nil)
let frameImg : UIImage = UIImage(cgImage: imageRef)
return frameImg
} catch let error as NSError {
print("Image generation failed with error ::\(error)")
return nil
}
}
For Swift 5 you can get it this way:
First import AVKit or AVFoundation to ViewController
import AVKit
Code:
// Get Thumbnail Image from URL
private func getThumbnailFromUrl(_ url: String?, _ completion: @escaping ((_ image: UIImage?)->Void)) {
guard let url = URL(string: url ?? "") else { return }
DispatchQueue.main.async {
let asset = AVAsset(url: url)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMake(value: 2, timescale: 1)
do {
let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: img)
completion(thumbnail)
} catch {
print("Error :: ", error.localizedDescription)
completion(nil)
}
}
}
Usage
@IBOutlet weak var imgThumbnail: UIImageView!
and then call the getThumbnailFromUrl method, passing the URL as a String:
self.getThumbnailFromUrl(videoURL) { [weak self] (img) in
guard let `self` = self else { return }
if let img = img {
self.imgThumbnail.image = img
}
}
Thank you
Swift 2 code with AVAssetImageGenerator:
func thumbnailImageForVideo(url:NSURL) -> UIImage?
{
let asset = AVAsset(URL: url)
let imageGenerator = AVAssetImageGenerator(asset: asset)
imageGenerator.appliesPreferredTrackTransform = true
var time = asset.duration
//If possible, don't take the very first frame (it could be completely black or white on camera videos)
time.value = min(time.value, 2)
do {
let imageRef = try imageGenerator.copyCGImageAtTime(time, actualTime: nil)
return UIImage(CGImage: imageRef)
}
catch let error as NSError
{
print("Image generation failed with error \(error)")
return nil
}
}
I am developing a video-based application in Swift 3. I have a video URL and a range slider matching the video duration, and the user can select any minimum and maximum value from the slider. Suppose the user has selected a min value of 3 sec and a max value of 7 sec; for that duration I need to generate a video thumbnail image. I am using AVAssetImageGenerator for this, and I tried both of the following approaches:
func createThumbnailOfVideoFromFileURL(_ strVideoURL: URL) -> UIImage?{
let asset = AVAsset(url: strVideoURL)
let assetImgGenerate : AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMake(1, 30)
let img = try? assetImgGenerate.copyCGImage(at: time, actualTime: nil)
guard let cgImage = img else { return nil }
let frameImg = UIImage(cgImage: cgImage)
return frameImg
}
func generateThumbnailForUrl(vidUrl:URL) -> UIImage {
let asset = AVURLAsset(url: vidUrl, options: nil)
let imgGenerator = AVAssetImageGenerator(asset: asset)
var thmbnlImg = UIImage()
do{
let cgImage = try imgGenerator.copyCGImage(at: CMTimeMake(0, 1), actualTime: nil)
thmbnlImg = UIImage(cgImage: cgImage)
thmbnlImg = thmbnlImg.imageRotatedByDegrees(degrees: 90.0, flip: false)
}
catch{
print(error)
}
// !! check the error before proceeding
return thmbnlImg
}
But the problem is that I get the same thumbnail image from both methods above, because I am not setting the duration in either of them. How can I use the minimum and maximum duration to generate a different thumbnail image for each duration? Please help me resolve my problem. Thank you!
Edit: I tried to set the duration like this:
let time: CMTime = CMTimeMakeWithSeconds(rangeSlider!.lowerValue, 1)
Then I get a different thumbnail image, but for some slider ranges I also get a nil thumbnail image. Does anyone have an idea how to set the preferredTimescale value in CMTimeMakeWithSeconds?
Try this code
static func generateThumbnail(videoUrl: String) -> UIImage? {
do {
let url = URL(string: videoUrl)
let asset = AVURLAsset(url: url!)
let imageGenerator = AVAssetImageGenerator(asset: asset)
imageGenerator.appliesPreferredTrackTransform = true
let cgImage = try imageGenerator.copyCGImage(at: CMTime(seconds: 2.0, preferredTimescale: 60),
actualTime: nil)
return UIImage(cgImage: cgImage)
} catch {
print(error.localizedDescription)
return nil
}
}
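Regarding the slider ranges mentioned in the question above, a minimal sketch (assuming the slider values are in seconds; the helper name is illustrative) is to build the CMTime with CMTime(seconds:preferredTimescale:) and tighten the generator's tolerances so the generated frame matches the requested time:
import AVFoundation
import UIKit

func thumbnail(from asset: AVAsset, atSeconds seconds: Double) -> UIImage? {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Without tight tolerances the generator may snap to a nearby keyframe
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    // 600 is a common timescale that represents typical frame rates exactly
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    guard let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) else { return nil }
    return UIImage(cgImage: cgImage)
}

// e.g. thumbnail(from: AVAsset(url: videoURL), atSeconds: Double(rangeSlider!.lowerValue))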
Assume your trimmed video URL is videoURL. After successfully trimming a video, add this code snippet. It extracts an image from the trimmed video at each second (meaning, if the duration of your trimmed video is 10 seconds, it extracts 10 images, one per second, and all of these images are saved in an array named videoFrames). Finally, you can do whatever you want with these images. You can also show an activity indicator while this process is going on. Hope this helps.
var videoFrames = [UIImage]()
let asset : AVAsset = AVAsset(url: videoURL as URL)
let videoDuration = CMTimeGetSeconds(asset.duration)
let integerValueOFVideoDuration = Int(videoDuration)
//start activity indicator here
for index in 0..<integerValueOFVideoDuration + 1 {
self.generateFrames(url: videoURL, fromTime: Float64(index))
}
func generateFrames(url: NSURL, fromTime: Float64) {
if videoFrames.count == integerValueOFVideoDuration {
//end activity indicator here
return
}
let asset: AVAsset = AVAsset(url: url as URL)
let assetImgGenerate: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
assetImgGenerate.maximumSize = CGSize(width: 300, height: 300)
assetImgGenerate.appliesPreferredTrackTransform = true
let time: CMTime = CMTimeMakeWithSeconds(fromTime, 600)
var img: CGImage?
do {
img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
} catch {
}
if img != nil {
let frameImg: UIImage = UIImage(cgImage: img!)
videoFrames.append(frameImg)
} else {
//return nil
}
}
I'm trying to extract frames as UIImages from a video in Swift. I found several Objective C solutions but I'm having trouble finding anything in Swift. Assuming the following is correct can someone either help me to convert the following to Swift or give me their own take on how to do this?
Source:
Grabbing the first frame of a video from UIImagePickerController?
- (UIImage *)imageFromVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time {
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
NSParameterAssert(asset);
AVAssetImageGenerator *assetIG =
[[AVAssetImageGenerator alloc] initWithAsset:asset];
assetIG.appliesPreferredTrackTransform = YES;
assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;
CGImageRef thumbnailImageRef = NULL;
CFTimeInterval thumbnailImageTime = time;
NSError *igError = nil;
thumbnailImageRef =
[assetIG copyCGImageAtTime:CMTimeMake(thumbnailImageTime, 60)
actualTime:NULL
error:&igError];
if (!thumbnailImageRef)
NSLog(#"thumbnailImageGenerationError %#", igError );
UIImage *image = thumbnailImageRef
? [[UIImage alloc] initWithCGImage:thumbnailImageRef]
: nil;
return image;
}
It actually did work.
func imageFromVideo(url: URL, at time: TimeInterval) -> UIImage? {
let asset = AVURLAsset(url: url)
let assetIG = AVAssetImageGenerator(asset: asset)
assetIG.appliesPreferredTrackTransform = true
assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels
let cmTime = CMTime(seconds: time, preferredTimescale: 60)
let thumbnailImageRef: CGImage
do {
thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
} catch let error {
print("Error: \(error)")
return nil
}
return UIImage(cgImage: thumbnailImageRef)
}
But remember that this function is synchronous and it's better not to call it on the main queue.
You can do either this:
DispatchQueue.global(qos: .background).async {
let image = self.imageFromVideo(url: url, at: 0)
DispatchQueue.main.async {
self.imageView.image = image
}
}
Or use generateCGImagesAsynchronously instead of copyCGImage.
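For reference, a minimal sketch of that asynchronous variant could look like this (modern Swift syntax; the helper name and the timescale of 60 are just illustrative):
import AVFoundation
import UIKit

// The completion handler of generateCGImagesAsynchronously may run on an
// arbitrary queue, so hop back to the main queue before touching UI.
func asyncImageFromVideo(url: URL, at seconds: Double, completion: @escaping (UIImage?) -> Void) {
    let generator = AVAssetImageGenerator(asset: AVURLAsset(url: url))
    generator.appliesPreferredTrackTransform = true
    let time = NSValue(time: CMTime(seconds: seconds, preferredTimescale: 60))
    generator.generateCGImagesAsynchronously(forTimes: [time]) { _, cgImage, _, result, _ in
        let image = (result == .succeeded && cgImage != nil) ? UIImage(cgImage: cgImage!) : nil
        DispatchQueue.main.async { completion(image) }
    }
}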
Here's a SWIFT 5 alternative to Dmitry's solution, to not have to worry about what queue you're on:
public func imageFromVideo(url: URL, at time: TimeInterval, completion: @escaping (UIImage?) -> Void) {
DispatchQueue.global(qos: .background).async {
let asset = AVURLAsset(url: url)
let assetIG = AVAssetImageGenerator(asset: asset)
assetIG.appliesPreferredTrackTransform = true
assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels
let cmTime = CMTime(seconds: time, preferredTimescale: 60)
let thumbnailImageRef: CGImage
do {
thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
} catch let error {
print("Error: \(error)")
return completion(nil)
}
DispatchQueue.main.async {
completion(UIImage(cgImage: thumbnailImageRef))
}
}
}
Here's how to use it:
imageFromVideo(url: videoUrl, at: 0) { image in
// Do something with the image here
}
You can do this easily on iOS. Below is a code snippet on how to do so with Swift.
let url = Bundle.main.url(forResource: "video_name", withExtension: "mp4")
let videoAsset = AVAsset(url: url!)
let t1 = CMTime(value: 1, timescale: 1)
let t2 = CMTime(value: 4, timescale: 1)
let t3 = CMTime(value: 8, timescale: 1)
let timesArray = [
NSValue(time: t1),
NSValue(time: t2),
NSValue(time: t3)
]
let generator = AVAssetImageGenerator(asset: videoAsset)
generator.requestedTimeToleranceBefore = .zero
generator.requestedTimeToleranceAfter = .zero
generator.generateCGImagesAsynchronously(forTimes: timesArray ) { requestedTime, image, actualTime, result, error in
guard let cgImage = image else { return }
let img = UIImage(cgImage: cgImage)
}
You can find the demo code here and the medium article here.
Here's an async/await version of @Dmitry's answer, for those who don't like completion handlers:
func imageFromVideo(url: URL, at time: TimeInterval) async throws -> UIImage {
try await withCheckedThrowingContinuation({ continuation in
DispatchQueue.global(qos: .background).async {
let asset = AVURLAsset(url: url)
let assetIG = AVAssetImageGenerator(asset: asset)
assetIG.appliesPreferredTrackTransform = true
assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels
let cmTime = CMTime(seconds: time, preferredTimescale: 60)
let thumbnailImageRef: CGImage
do {
thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
} catch {
continuation.resume(throwing: error)
return
}
continuation.resume(returning: UIImage(cgImage: thumbnailImageRef))
}
})
}
Usage:
let vidUrl = <#your url#>
do {
let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
// do something with image
} catch {
// handle error
}
Or like this if you're in an async throwing function:
func someThrowingFunc() async throws {
let vidUrl = <#your url#>
let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
// do something with image
}
I'm trying to get a thumbnail from a video and show it in my table view. Here is my code:
- (UIImage *)imageFromVideoURL:(NSURL *)contentURL {
AVAsset *asset = [AVAsset assetWithURL:contentURL];
// Get thumbnail at the very start of the video
CMTime thumbnailTime = [asset duration];
thumbnailTime.value = 25;
// Get image from the video at the given time
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:thumbnailTime actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return thumbnail;
}
But the image always returns black. What's wrong?
Using Swift 5, as an extension function on AVAsset:
import AVKit
extension AVAsset {
func generateThumbnail(completion: @escaping (UIImage?) -> Void) {
DispatchQueue.global().async {
let imageGenerator = AVAssetImageGenerator(asset: self)
let time = CMTime(seconds: 0.0, preferredTimescale: 600)
let times = [NSValue(time: time)]
imageGenerator.generateCGImagesAsynchronously(forTimes: times, completionHandler: { _, image, _, _, _ in
if let image = image {
completion(UIImage(cgImage: image))
} else {
completion(nil)
}
})
}
}
}
Usage:
AVAsset(url: url).generateThumbnail { [weak self] (image) in
DispatchQueue.main.async {
guard let image = image else { return }
self?.imageView.image = image
}
}
Use this :
Swift 3 -
func createThumbnailOfVideoFromFileURL(videoURL: String) -> UIImage? {
let asset = AVAsset(url: URL(string: videoURL)!)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMakeWithSeconds(Float64(1), 100)
do {
let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: img)
return thumbnail
} catch {
return UIImage(named: "ico_placeholder")
}
}
Important note:
You will need to cache the result, as this is resource intensive. Store the image in an array or model and check whether a thumbnail has already been created, so that cellForRowAtIndexPath reads from the cache/array instead of regenerating the image and does not cause a lag when scrolling your UITableView.
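A minimal sketch of that caching idea (the cache property, the videoURLs data source and the "VideoCell" identifier are assumptions for illustration; createThumbnailOfVideoFromFileURL is the function above):
// Property on your table view controller
let thumbnailCache = NSCache<NSString, UIImage>()

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "VideoCell", for: indexPath)
    let urlString = videoURLs[indexPath.row] // assumed data source of video URL strings
    if let cached = thumbnailCache.object(forKey: urlString as NSString) {
        cell.imageView?.image = cached
    } else {
        DispatchQueue.global(qos: .userInitiated).async {
            guard let image = self.createThumbnailOfVideoFromFileURL(videoURL: urlString) else { return }
            self.thumbnailCache.setObject(image, forKey: urlString as NSString)
            DispatchQueue.main.async {
                // Only update the cell if it is still visible at this index path
                tableView.cellForRow(at: indexPath)?.imageView?.image = image
            }
        }
    }
    return cell
}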
Just use this code. Pass your video URL and get an image.
+(UIImage *)getPlaceholderImageFromVideo:(NSString *)videoURL {
NSURL *url = [NSURL URLWithString:videoURL];
AVAsset *asset = [AVAsset assetWithURL:url];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CMTime time = [asset duration];
time.value = 0;
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return thumbnail;
}
Hope this is what you're looking for. Any concerns, get back to me. :)
//(Local URL)
NSURL *videoURL = [NSURL fileURLWithPath:filepath];// filepath is your video file path
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
AVAssetImageGenerator *generateImg = [[AVAssetImageGenerator alloc] initWithAsset:asset];
NSError *error = NULL;
CMTime time = CMTimeMake(1, 1);
CGImageRef refImg = [generateImg copyCGImageAtTime:time actualTime:NULL error:&error];
NSLog(#"error==%#, Refimage==%#", error, refImg);
UIImage *frameImage= [[UIImage alloc] initWithCGImage:refImg];
return frameImage;
Please check the link for "Generating Thumbnails from Videos"
https://littlebitesofcocoa.com/115-generating-thumbnails-from-videos
It's quite common for an app to need to display one or more thumbnails (small still-image previews) of what's in a video. However, depending on where the video is coming from, we might not have easy access to pre-made thumbnail(s) for it. Let's look at how we can use AVAssetImageGenerator to grab our own.
We start with a simple NSURL for the video; this can be local or remote. We'll create an AVAsset with it and pass that to a new AVAssetImageGenerator object. We'll configure the generator to apply the preferred track transform so our thumbnails are in the correct orientation.
import AVFoundation
let asset = AVAsset(URL: videoURL)
let durationSeconds = CMTimeGetSeconds(asset.duration)
let generator = AVAssetImageGenerator(asset: asset)
generator.appliesPreferredTrackTransform = true
let time = CMTimeMakeWithSeconds(durationSeconds/3.0, 600)
generator.generateCGImagesAsynchronouslyForTimes([NSValue(CMTime: time)]) {
    (requestedTime: CMTime, thumbnail: CGImage?, actualTime: CMTime, result: AVAssetImageGeneratorResult, error: NSError?) in
    if let thumbnail = thumbnail {
        dispatch_async(dispatch_get_main_queue()) {
            self.videoThumbnailImageView.image = UIImage(CGImage: thumbnail)
        }
    }
}
Swift 4 code for @Disha's answer:
let imageGenerator = AVAssetImageGenerator(asset: avAsset)
let time = CMTime(seconds: seconds, preferredTimescale: 600)
let times = [NSValue(time: time)]
imageGenerator.generateCGImagesAsynchronously(forTimes: times, completionHandler: {
requestedTime, image, actualTime, result, error in
guard let cgImage = image else
{
print("No image!")
return
}
let uiImage = UIImage(cgImage: cgImage)
UIImageWriteToSavedPhotosAlbum(uiImage, nil, nil, nil);
})
func getThumbnailFrom(path: URL) -> UIImage? {
do {
let asset = AVURLAsset(url: path , options: nil)
let imgGenerator = AVAssetImageGenerator(asset: asset)
imgGenerator.appliesPreferredTrackTransform = true
let timestamp = asset.duration
print("Timestemp: \(timestamp)")
let cgImage = try imgGenerator.copyCGImage(at: timestamp, actualTime: nil)
let thumbnail = UIImage(cgImage: cgImage)
return thumbnail
} catch let error {
print("*** Error generating thumbnail: \(error.localizedDescription)")
return nil
}
}
This code works.
SWIFT 5.2 VERSION:
You can find my original post here, where I took inspiration from rozochkin's answer.
func getCurrentFrame() -> UIImage? {
guard let player = self.player, let avPlayerAsset = player.currentItem?.asset else {return nil}
let assetImageGenerator = AVAssetImageGenerator(asset: avPlayerAsset)
assetImageGenerator.requestedTimeToleranceAfter = .zero
assetImageGenerator.requestedTimeToleranceBefore = .zero
assetImageGenerator.appliesPreferredTrackTransform = true
let imageRef = try! assetImageGenerator.copyCGImage(at: player.currentTime(), actualTime: nil)
let image = UIImage(cgImage: imageRef)
return image
}
IMPORTANT NOTES:
requestedTimeToleranceAfter and requestedTimeToleranceBefore should be set to .zero, because, according to source code, "The actual time of the generated images [...] may differ from the requested time for efficiency".
appliesPreferredTrackTransform must be set to TRUE (default is FALSE), otherwise you get a bad-rotated frame. With this property set to TRUE you get what you really see in the player.
I want to get thumbnail images of every frame from a video and then save these images in a mutable array of images.
I want to use these images to play as an animation.
NSURL* assetURL = [self.asset valueForProperty:ALAssetPropertyAssetURL];
NSDictionary* assetOptions = nil;
AVAsset* myAsset = [[AVURLAsset alloc] initWithURL:assetURL options:assetOptions];
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
int duration = CMTimeGetSeconds([myAsset duration]);
for(int i = 0; i<duration; i++)
{
CGImageRef imgRef = [self.imageGenerator copyCGImageAtTime:CMTimeMake(i, duration) actualTime:NULL error:nil];
UIImage* thumbnail = [[UIImage alloc] initWithCGImage:imgRef scale:UIViewContentModeScaleAspectFit orientation:UIImageOrientationUp];
[thumbnailImages addObject:thumbnail];
}
I am using the above code to get thumbnail images, but the problem is that for a 2-second video I only get 2 thumbnails, whereas I want 20 thumbnails (10 thumbnails per second).
So, how do I use CMTimeMake to get a thumbnail for every 0.1 second?
Code from reference site: Thumbnail image from Video
Objective - C
-(NSMutableArray *)generateThumbImages:(NSString *)filepath
{
    NSURL *url = [NSURL fileURLWithPath:filepath];
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    NSMutableArray *thumbnailImages = [NSMutableArray array];
    Float64 duration = CMTimeGetSeconds([asset duration]);
    for (Float64 i = 0.0; i < duration; i += 0.1)
    {
        CGImageRef imgRef = [imageGenerator copyCGImageAtTime:CMTimeMakeWithSeconds(i, 600) actualTime:NULL error:nil];
        if (imgRef == NULL) continue;
        UIImage *thumbnail = [[UIImage alloc] initWithCGImage:imgRef scale:1.0 orientation:UIImageOrientationUp];
        [thumbnailImages addObject:thumbnail];
        CGImageRelease(imgRef); // copyCGImageAtTime returns a +1 reference
    }
    return thumbnailImages;
}
Swift
func generateThumbImage(url : NSURL) -> UIImage{
var asset : AVAsset = AVAsset.assetWithURL(url) as! AVAsset
var assetImgGenerate : AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
var error : NSError? = nil
var time : CMTime = CMTimeMake(1, 30)
var img : CGImageRef = assetImgGenerate.copyCGImageAtTime(time, actualTime: nil, error: &error)
var frameImg : UIImage = UIImage(CGImage: img)!
return frameImg
}
@Kirit Modi's solution in Swift 3 with some small changes:
func generateThumbImage(url : URL) -> UIImage?{
let asset = AVAsset(url: url)
let assetImgGenerate : AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMake(1, 30)
let img = try? assetImgGenerate.copyCGImage(at: time, actualTime: nil)
guard let cgImage = img else { return nil }
let frameImg = UIImage(cgImage: cgImage)
return frameImg
}
The following code gets the thumbnail image from the video at 0.41 seconds.
NSString *str = [[self.videoArray objectAtIndex:i] valueForKey:@"vName"];
NSURL *videoURL = [NSURL URLWithString:str] ;
MPMoviePlayerController *player = [[[MPMoviePlayerController alloc] initWithContentURL:videoURL]autorelease];
UIImage *thumbnail = [player thumbnailImageAtTime:0.41 timeOption:MPMovieTimeOptionNearestKeyFrame];
player = nil;
extension UIImageView{
func getImageThumnail(forLink link: String) {
let asset = AVAsset(url: URL(string: link)!)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMakeWithSeconds(Float64(50), 100)
DispatchQueue.global(qos: .userInteractive).async {
do {
let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: img)
DispatchQueue.main.async {
self.image = thumbnail
}
} catch {
}
}
}
}
I am trying to acquire a thumbnail (of the first frame) from a video taken with the iPhone 3GS camera so I can display it. How can I do this?
-(UIImage *)generateThumbImage : (NSString *)filepath
{
NSURL *url = [NSURL fileURLWithPath:filepath];
AVAsset *asset = [AVAsset assetWithURL:url];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
imageGenerator.appliesPreferredTrackTransform = YES;
CMTime time = [asset duration];
time.value = 0;
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef); // CGImageRef won't be released by ARC
return thumbnail;
}
You can get the frame at a specific moment by setting time.value. For example, for the frame at 1 second:
time.value = 1000; // assuming the CMTime's timescale is 1000 (the value is in timescale units, not milliseconds)
The answer to this question is that, as of iOS 4.0, you can get thumbnails using AVFoundation. The following code, where the class property url is the movie URL, will do the trick (you can get the thumbnail at any time; in this example it's at time 0):
-(void)generateImage
{
AVURLAsset *asset=[[AVURLAsset alloc] initWithURL:self.url options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform=TRUE;
[asset release];
CMTime thumbTime = CMTimeMakeWithSeconds(0,30);
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error){
if (result != AVAssetImageGeneratorSucceeded) {
NSLog(#"couldn't generate thumbnail, error:%#", error);
}
[button setImage:[UIImage imageWithCGImage:im] forState:UIControlStateNormal];
thumbImg=[[UIImage imageWithCGImage:im] retain];
[generator release];
};
CGSize maxSize = CGSizeMake(320, 180);
generator.maximumSize = maxSize;
[generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:handler];
}
NSURL *videoURL = [NSURL fileURLWithPath:url];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
//Player autoplays audio on init
[player stop];
[player release];
Check this link for an alternative:
thumbnailImageAtTime: now deprecated - What's the alternative?
Best method I've found... MPMoviePlayerController thumbnailImageAtTime:timeOption
SWIFT 2.0
You can generate a thumbnail in Swift in two ways: 1. AVFoundation, 2. MPMoviePlayerController
1.
func generateThumnail(url : NSURL) -> UIImage? {
    let asset = AVAsset(URL: url)
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    let time = CMTimeMake(1, 30)
    do {
        let img = try assetImgGenerate.copyCGImageAtTime(time, actualTime: nil)
        return UIImage(CGImage: img)
    } catch {
        print(error)
        return nil
    }
}
2.
override func viewDidLoad() {
super.viewDidLoad()
var moviePlayer : MPMoviePlayerController! = MPMoviePlayerController(contentURL: moviePlayManager.movieURL)
moviePlayer.view.frame = CGRect(x: self.view.frame.origin.x, y: self.view.frame.origin.y, width:
self.view.frame.size.width, height: self.view.frame.height)
moviePlayer.fullscreen = true
moviePlayer.controlStyle = MPMovieControlStyle.None
NSNotificationCenter.defaultCenter().addObserver(self,
selector: "videoThumbnailIsAvailable:",
name: MPMoviePlayerThumbnailImageRequestDidFinishNotification,
object: nil)
let thumbnailTimes = 3.0
moviePlayer.requestThumbnailImagesAtTimes([thumbnailTimes],
timeOption: .NearestKeyFrame)
}
func videoThumbnailIsAvailable(notification: NSNotification){
if let player = moviePlayer{
let thumbnail =
notification.userInfo![MPMoviePlayerThumbnailImageKey] as? UIImage
if let image = thumbnail{
/* We got the thumbnail image. You can now use it here */
println("Thumbnail image = \(image)")
}
}
}
Swift 2 code:
func previewImageForLocalVideo(url:NSURL) -> UIImage?
{
let asset = AVAsset(URL: url)
let imageGenerator = AVAssetImageGenerator(asset: asset)
imageGenerator.appliesPreferredTrackTransform = true
var time = asset.duration
//If possible, don't take the very first frame (it could be completely black or white on camera videos)
time.value = min(time.value, 2)
do {
let imageRef = try imageGenerator.copyCGImageAtTime(time, actualTime: nil)
return UIImage(CGImage: imageRef)
}
catch let error as NSError
{
print("Image generation failed with error \(error)")
return nil
}
}
For Swift 3.0
func createThumbnailOfVideoFromFileURL(_ strVideoURL: String) -> UIImage?{
let asset = AVAsset(url: URL(string: strVideoURL)!)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMakeWithSeconds(Float64(1), 100)
do {
let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: img)
return thumbnail
} catch {
/* error handling here */
}
return nil
}
Swift 4
func generateThumbnail(for asset:AVAsset) -> UIImage? {
let assetImgGenerate : AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMake(value: 1, timescale: 2)
let img = try? assetImgGenerate.copyCGImage(at: time, actualTime: nil)
if img != nil {
let frameImg = UIImage(cgImage: img!)
return frameImg
}
return nil
}
How to use:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    picker.dismiss(animated: true) {
        guard let mediaType = info[.mediaType] as? String else { return }
        switch mediaType {
        case String(kUTTypeMovie): // requires `import MobileCoreServices`
            guard let url = info[.mediaURL] as? URL else { return }
            let asset = AVAsset(url: url)
            guard let img = self.generateThumbnail(for: asset) else {
                print("Error: Thumbnail could not be generated.")
                return
            }
            print("Image Size: \(img.size)")
        default:
            break
        }
    }
}
For Swift 5
import AVKit
Code:
// Get Thumbnail Image from URL
fileprivate func getThumbnailFromUrl(_ url: String?, _ completion: @escaping ((_ image: UIImage?)->Void)) {
guard let url = URL(string: url ?? "") else { return }
DispatchQueue.main.async {
let asset = AVAsset(url: url)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMake(value: 2, timescale: 1)
do {
let img = try assetImgGenerate.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: img)
completion(thumbnail)
} catch {
print("Error :: ", error.localizedDescription)
completion(nil)
}
}
}
Usage:
Take one image view:
@IBOutlet weak var imgThumbnail: UIImageView!
Then call the getThumbnailFromUrl method with the URL string as the parameter:
self.getThumbnailFromUrl(videoURL) { [weak self] (img) in
guard let _ = self else { return }
if let img = img {
self?.imgThumbnail.image = img
}
}
Please try this; if it is useful, please comment.
Thank you
Maybe this is useful for someone else who faces the same problem. I needed an easy solution for creating a thumbnail for Images, PDFs and Videos. To solve that problem I've created the following Library.
https://github.com/prine/ROThumbnailGenerator
The usage is very straightforward:
var thumbnailImage = ROThumbnail.getThumbnail(url)
Internally it has three different implementations, and it creates the thumbnail based on the file extension. You can easily add your own implementation if you need a thumbnail creator for another file extension.
Swift 2.1 Getting thumbnails at required time intervals
func getPreviewImageForVideoAtURL(videoURL: NSURL, atInterval: Int) -> UIImage? {
print("Taking pic at \(atInterval) second")
let asset = AVAsset(URL: videoURL)
let assetImgGenerate = AVAssetImageGenerator(asset: asset)
assetImgGenerate.appliesPreferredTrackTransform = true
let time = CMTimeMakeWithSeconds(Float64(atInterval), 100)
do {
let img = try assetImgGenerate.copyCGImageAtTime(time, actualTime: nil)
let frameImg = UIImage(CGImage: img)
return frameImg
} catch {
/* error handling here */
}
return nil
}
You will get the thumbnail image from a URL when you change "fileURLWithPath" to "URLWithString".
In my case it worked like this:
NSURL *url = [NSURL URLWithString:filepath];
AVAsset *asset = [AVAsset assetWithURL:url];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
CMTime time = [asset duration];
time.value = 0;
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return thumbnail;