I am recording a video from the iPhone camera using the AVCam sample code provided by Apple.
After the video is recorded, it is saved to the photos library.
A new view is then loaded, where I need to show an image thumbnail of the video.
I have a path to the video:
file://localhost/private/var/mobile/Applications/ED45DEFC-ABF9-4A5E-9102-21680CC1448E/tmp/output.mov
I can't seem to figure out how to get the first frame of the video to use as a thumbnail.
Any help would be greatly appreciated; thank you for your time.
EDIT
I ended up using the code below, but I'm not sure why it returns the image sideways:
- (UIImage*)loadImage {
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidURL options:nil];
AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
NSError *err = NULL;
CMTime time = CMTimeMake(1, 60);
CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
NSLog(#"err==%#, imageRef==%#", err, imgRef);
return [[UIImage alloc] initWithCGImage:imgRef];
}
To fix the thumbnail orientation, set appliesPreferredTrackTransform to YES on the AVAssetImageGenerator instance. If you add your own video composition, you'll need to include the appropriate transform to rotate the video as desired.
generate.appliesPreferredTrackTransform = YES;
Remember to release the obtained image reference with CGImageRelease.
To request multiple thumbnails, it's better to do it asynchronously with generateCGImagesAsynchronouslyForTimes:completionHandler:.
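For example, a minimal sketch of the asynchronous variant could look like this (the asset variable, the requested times, and self.thumbnailImageView are placeholders for your own values and UI):
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;
NSArray *times = @[[NSValue valueWithCMTime:CMTimeMake(1, 60)],
                   [NSValue valueWithCMTime:CMTimeMake(2, 1)],
                   [NSValue valueWithCMTime:CMTimeMake(4, 1)]];
[generator generateCGImagesAsynchronouslyForTimes:times
                                 completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                                                     AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumbnail = [UIImage imageWithCGImage:image]; // imageWithCGImage: retains the CGImage for us
        dispatch_async(dispatch_get_main_queue(), ^{
            self.thumbnailImageView.image = thumbnail; // placeholder: update your UI here
        });
    } else {
        NSLog(@"Thumbnail at %lld/%d failed: %@", requestedTime.value, requestedTime.timescale, error);
    }
}];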
Related
I am using JSQMessageViewController for chatting. When I capture a video, it is saved to the photo album, but I can't add it to the chat view; if I try to add the video, it is displayed in PNG format. Could anyone please help me solve this issue: how do I add the captured video to the chat view using JSQMessageViewController? After that, I would like to upload the video from the chat view using an API, so that every captured video is saved, sent to the API, and displayed in the chat view.
This code only creates the thumbnail that you want to display in the chat:
YOURIMAGEVIEW.image = [[self class] imageFromVideoUrl:YOUR_VIDEO_NSURL]; // imageFromVideoUrl: is a class method; pass the NSURL of your video
+ (UIImage *)imageFromVideoUrl:(NSURL *)videoUrl
{
AVAsset *asset = [AVAsset assetWithURL:videoUrl];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
imageGenerator.appliesPreferredTrackTransform = YES;
CMTime time = [asset duration];
time.value = 0; // keep the asset's timescale but move to the very first frame
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef); // CGImageRef won't be released by ARC
return thumbnail;
}
Display this thumbnail in the chat, and in the background call your API to upload the video to your server database. I hope this answer is clear and gives you an idea of what to do.
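As a rough sketch of the background upload part (the endpoint URL is a placeholder for your own API, and videoUrl is the local file URL of the captured video), something like NSURLSession can be used:
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/api/upload"]]; // your API endpoint
request.HTTPMethod = @"POST";
NSURLSessionUploadTask *uploadTask =
    [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                               fromFile:videoUrl // local file URL of the captured video
                                      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        if (error) {
            NSLog(@"Video upload failed: %@", error);
        } else {
            NSLog(@"Video upload finished");
        }
    }];
[uploadTask resume];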
Happy Coding.
I'm using the following code to extract the last frame from a video:
- (UIImage *)thumbnailFromVideoAtURL:(NSURL *)url
{
AVAsset *asset = [AVAsset assetWithURL:url];
CMTime thumbnailTime = [asset duration];
NSLog(#"value: %lld",thumbnailTime.value); //1650
NSLog(#"timescale: %d",thumbnailTime.timescale); //1000
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:thumbnailTime actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return thumbnail;
}
The code and the CMTime look fine (it's a 1.65-second video), but I'm still getting back the first frame of the video (it's a .mov file).
If I try the above code on longer videos, I get a frame from the middle of the video or a frame that's close to the end, but never the exact last frame.
Any idea what's the problem?
I'm trying to avoid using something like ffmpeg.
Thanks
You should configure the requestedTimeToleranceBefore and requestedTimeToleranceAfter properties of the AVAssetImageGenerator. They default to kCMTimePositiveInfinity, which lets the generator return the nearest key frame instead of the exact frame you requested. If you set them both to kCMTimeZero, you will get the exact frame you are interested in.
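Applied to the method above, that is essentially a two-line change (a sketch reusing the asset and thumbnailTime variables from the question):
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero; // do not accept an earlier frame
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;  // do not accept a later frame
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:thumbnailTime actualTime:NULL error:NULL];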
I've been going through many code samples for this, but I haven't found a working solution yet. I'm using Xcode 5.1.1 and an iPhone Retina 4-inch. I want to get a single image from a video by tapping on it. After that, I'll edit the image and apply the effects to the entire video. Thank you.
UPDATE:
I've found the code below for the same purpose, but it's not working on the simulator either. Can somebody tell me what the problem is?
- (UIImage *)generateThumbImage:(NSString *)filepath
{
NSURL *url = [NSURL fileURLWithPath:filepath];
AVAsset *asset = [AVAsset assetWithURL:url];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
CMTime time = [asset duration];
time.value = 0;
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef); // CGImageRef won't be released by ARC
return thumbnail;
}
This code generates an image at a particular time in the loaded video and converts it from a CGImage to a UIImage.
The code is commented to explain its parts.
- (UIImage*)loadImage {
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidURL options:nil];
AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset]; // Create an image generator for the asset.
NSError *err = NULL;
CMTime time = CMTimeMake(1, 60); // Select a time of 1/60th of a second.
CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err]; // Grab the frame at that time.
NSLog(@"err==%@, imageRef==%@", err, imgRef); // If something went wrong, the error is logged here.
UIImage *image = [[UIImage alloc] initWithCGImage:imgRef]; // Convert the CGImage to a UIImage so it can be displayed in an image view.
CGImageRelease(imgRef); // copyCGImageAtTime: returns a retained CGImageRef that ARC does not manage.
return image;
}
Have you tried the AVFoundation framework?
That's what it's for. The documentation is on the Apple site:
AV Foundation framework provides essential services for working with
time-based audiovisual media on iOS and OS X. Through a modern
Objective-C interface, you can easily play, capture, edit, or encode
media formats such as QuickTime movies and MPEG-4 files.
I am new to iOS. I want to create thumbnails for videos in a horizontal view; after tapping a thumbnail, the video should play. I have created a custom player, but I am unable to create the horizontal thumbnails.
You can create a thumbnail of the video using the code below:
- (UIImage *)getThumbNail:(NSString *)stringPath // stringPath is the path to your video file
{
NSURL *urlvideoImage =[NSURL fileURLWithPath:stringPath];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:urlvideoImage options:nil];
AVAssetImageGenerator *generateImg = [[AVAssetImageGenerator alloc] initWithAsset:asset];
NSError *error = NULL;
CMTime time = CMTimeMake(1, 65);
CGImageRef refImg = [generateImg copyCGImageAtTime:time actualTime:NULL error:&error];
NSLog(#"error==%#, Refimage==%#", error, refImg);
UIImage *FrameImage = [[UIImage alloc] initWithCGImage:refImg];
CGImageRelease(refImg); // the copied CGImageRef is not managed by ARC
return FrameImage;
}
Create a thumbnail from your video URL using the code above.
Set this image as a button's background image, or use an image view and add a tap gesture recognizer to receive its touch event, as sketched below.
On that tap or button press you can play the video with the relevant code. Hope this works for you.
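For instance, a minimal sketch of the image-view-plus-tap-gesture approach (the frame, stringPath, and the playback call are placeholders; getThumbNail: is the method above, assumed to live in the same class):
- (void)addVideoThumbnailForPath:(NSString *)stringPath {
    UIImageView *thumbView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 120, 90)]; // placeholder frame
    thumbView.image = [self getThumbNail:stringPath];
    thumbView.userInteractionEnabled = YES; // UIImageView ignores touches by default
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(thumbnailTapped)];
    [thumbView addGestureRecognizer:tap];
    [self.view addSubview:thumbView];
}

- (void)thumbnailTapped {
    // start your custom player here, e.g. present it with the video's URL
}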
In my app I record a movie, save it to the Photos album, and then request creation of a thumbnail using the following code:
self.player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL_];
NSArray *times = @[@(1.1)];
[self.player requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionNearestKeyFrame];
This works fine, but during thumbnail creation the MPMoviePlayerController plays the movie's sound, even though the movie is not visible, which is annoying.
How can I turn off the sound of just this particular MPMoviePlayerController? The MPMoviePlayerController class apparently has no property to control this.
Just add this line after creating your player:
self.player.shouldAutoplay = NO;
According to the answer in How to mute/unmute audio when playing video using MPMoviePlayerController?, it is not possible to turn off the sound of an MPMoviePlayerController.
However, I found in Creating Thumbnail for Video in iOS a much easier way to create thumbnails, which does not play sound and is synchronous, so no handling of asynchronous callbacks is required.
For convenience, I copy it here:
Add the AVFoundation framework to your app.
#import <AVFoundation/AVFoundation.h>
Add the following code:
UIImage *theImage = nil;
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 60);
CGImageRef imgRef = [generator copyCGImageAtTime:time actualTime:NULL error:&err];
theImage = [[UIImage alloc] initWithCGImage:imgRef];
CGImageRelease(imgRef);