I've been going through a lot of code for this, but I haven't found a working solution yet. I'm using Xcode 5.1.1 and an iPhone Retina 4-inch. I want to get a single image from a video by tapping on it. After that, I'll edit the image and apply the effects to the entire video. Thank you.
UPDATE:
I've found this code for the same purpose. It's not working on the simulator either. Can somebody tell me what the problem is?
- (UIImage *)generateThumbImage:(NSString *)filepath
{
    NSURL *url = [NSURL fileURLWithPath:filepath];
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CMTime time = [asset duration];
    time.value = 0;
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef); // CGImageRef won't be released by ARC
    return thumbnail;
}
This code generates an image at a particular time in the loaded video and converts it from a CGImage to a UIImage.
The code is commented to explain its parts.
- (UIImage *)loadImage {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidURL options:nil];
    AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset]; // Create the image generator for the video asset.
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60); // Select a time of 1/60th of a second.
    CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err]; // Get the frame at that time.
    NSLog(@"err==%@, imageRef==%@", err, imgRef); // If something is not as expected, you can log the error.
    return [[UIImage alloc] initWithCGImage:imgRef]; // Convert the CGImage to a UIImage so you can display it easily in an image view.
}
Have you tried the AVFoundation framework?
That's what it's for. The documentation is on the Apple site:
AV Foundation framework provides essential services for working with
time-based audiovisual media on iOS and OS X. Through a modern
Objective-C interface, you can easily play, capture, edit, or encode
media formats such as QuickTime movies and MPEG-4 files.
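As a rough sketch of what grabbing a single frame looks like with AVAssetImageGenerator (videoURL and seconds are placeholder names, not from the answer):
#import <AVFoundation/AVFoundation.h>

AVAsset *asset = [AVAsset assetWithURL:videoURL]; // videoURL is a placeholder NSURL for your movie file
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES; // respect the video's orientation

NSError *error = nil;
CMTime time = CMTimeMakeWithSeconds(seconds, 600); // seconds is the playback time you tapped at
CGImageRef imageRef = [generator copyCGImageAtTime:time actualTime:NULL error:&error];
UIImage *frame = imageRef ? [UIImage imageWithCGImage:imageRef] : nil; // nil if generation failed
CGImageRelease(imageRef); // safe even when imageRef is NULL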
I am building an application that records everything going on on my screen. I have many views and a camera view opening. I have tried:
ASScreenRecorder *recorder = [ASScreenRecorder sharedInstance];
if (recorder.isRecording) {
    [recorder stopRecordingWithCompletion:^{
        NSLog(@"Finished recording");
    }];
} else {
    [recorder startRecording];
    NSLog(@"Start recording");
}
from ASScreenRecorder, and also the Glimpse library, but every time it seems to record everything except the camera view, which shows up blank. Any suggestions?
It's not possible to take a screenshot of AVPlayer. You can use AVAssetImageGenerator to get the required image from the AVAsset and then merge/blend it with the screenshot you are getting from ASScreenRecorder.
Getting an image from the AVAsset (sample code):
AVAsset *asset = [AVAsset assetWithURL:sourceURL];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CMTime time = CMTimeMake(1, 1); // 1 second into the video
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
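The merge/blend step mentioned above isn't shown; a minimal sketch, assuming you already have the ASScreenRecorder frame as a UIImage and the thumbnail from the code above, could composite them with a bitmap context:
// Hypothetical helper: draws 'overlay' on top of 'background' inside the given rect.
UIImage *MergedImage(UIImage *background, UIImage *overlay, CGRect overlayRect)
{
    UIGraphicsBeginImageContextWithOptions(background.size, NO, background.scale);
    [background drawInRect:CGRectMake(0, 0, background.size.width, background.size.height)];
    [overlay drawInRect:overlayRect]; // place the player frame where the blank camera view was
    UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return merged;
}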
I am using JSQMessagesViewController for chatting. When I capture a video it is saved to the photo album, but I can't add it to the chat view; if I try to add the video, it is displayed as a PNG. Could anyone please help me solve this: how do I capture a video from the chat view using JSQMessagesViewController? After that I would like to upload the video from the chat view using the API; every captured video should be saved, sent to the API, and displayed in the chat view.
This code is only for creating the thumbnail that you want to display in the chat:
YOURIMAGEVIEW.image = [YOURCLASS imageFromVideoUrl:[NSURL fileURLWithPath:@"GIVE HERE PATH OF VIDEO"]]; // imageFromVideoUrl: is a class method and expects an NSURL
+ (UIImage *)imageFromVideoUrl:(NSURL *)videoUrl
{
    AVAsset *asset = [AVAsset assetWithURL:videoUrl];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    CMTime time = [asset duration];
    time.value = 0;
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef); // CGImageRef won't be released by ARC
    return thumbnail;
}
When you display this thumbnail in the chat, call your API in the background to upload the video to your server database. I hope this answer is clear and gives you an idea of what to do.
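A rough sketch of that background upload, assuming a plain HTTP POST with NSURLSession (the endpoint URL is a placeholder, not part of the original answer):
// Hypothetical upload: sends the recorded video file to your server in the background.
NSURL *uploadEndpoint = [NSURL URLWithString:@"https://example.com/api/videos"]; // placeholder endpoint
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadEndpoint];
request.HTTPMethod = @"POST";

NSURLSessionUploadTask *task =
    [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                               fromFile:videoUrl // the same NSURL used for the thumbnail
                                      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
                                          if (error) {
                                              NSLog(@"Upload failed: %@", error);
                                          } else {
                                              NSLog(@"Upload finished");
                                          }
                                      }];
[task resume];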
Happy Coding.
I'm using the following code to extract the last frame from a video:
- (UIImage *)thumbnailFromVideoAtURL:(NSURL *)url
{
    AVAsset *asset = [AVAsset assetWithURL:url];
    CMTime thumbnailTime = [asset duration];
    NSLog(@"value: %lld", thumbnailTime.value);       // 1650
    NSLog(@"timescale: %d", thumbnailTime.timescale); // 1000
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:thumbnailTime actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return thumbnail;
}
The code and the CMTime look fine (it's a 1.65-second video), but I'm still getting back the first frame of the video (it's a .mov file).
If I try the above code on longer videos, I get a frame from the middle of the video or a frame that's close to the end, but never the exact last frame.
Any idea what the problem is?
I'm trying to avoid using something like ffmpeg.
Thanks
You should configure the requestedTimeToleranceBefore and requestedTimeToleranceAfter properties of the AVAssetImageGenerator. They default to kCMTimePositiveInfinity, which means the generator is free to return the closest key frame instead of the exact frame. If you set them both to kCMTimeZero, you will get the exact frame you are interested in.
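Applied to the generator in the code above, that is a two-line change (a small sketch, not part of the original answer):
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero; // don't snap to an earlier key frame
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;  // don't snap to a later key frame
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:thumbnailTime actualTime:NULL error:NULL];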
How can I change or edit the default thumbnail of an existing video? For example, suppose I have a 5-minute video named xyz. In a list of videos, I want to show a specific image as xyz's thumbnail; I don't want to depend on the default thumbnail.
To generate the thumbnail image, I used the code block below and it's working fine.
- (UIImage *)generateThumbImage:(NSURL *)filepath
{
    NSURL *url = filepath;
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES; // To fix the thumbnail orientation; must be set before generating the image
    CMTime time = [asset duration];
    time.value = 1000; // Value is in units of the duration's timescale, not necessarily milliseconds
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef); // CGImageRef won't be released by ARC
    [thumbImage setImage:thumbnail];
    NSLog(@"Image set!!! %@", thumbnail);
    return thumbnail;
}
I am recording a video from the iPhone camera by using the AVCam code provided by Apple.
After the video is recorded, it is saved to the photos library.
A new view is then loaded; here I need an image thumbnail of the video.
I have a path to the video:
file://localhost/private/var/mobile/Applications/ED45DEFC-ABF9-4A5E-9102-21680CC1448E/tmp/output.mov
I can't seem to figure out how to get the first frame of the video to use as a thumbnail.
Any help would be very appreciated and thank you for your time.
EDIT
I ended up using this, but I'm not sure why it returns the image sideways:
- (UIImage *)loadImage {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidURL options:nil];
    AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
    NSLog(@"err==%@, imageRef==%@", err, imgRef);
    return [[UIImage alloc] initWithCGImage:imgRef];
}
To fix the thumbnail orientation, set appliesPreferredTrackTransform to YES on the AVAssetImageGenerator instance. If you add your own video composition, you'll need to include the right transform to rotate the video as wanted.
generate.appliesPreferredTrackTransform = YES;
Remember to release the obtained image reference with CGImageRelease.
To request multiple thumbnails, it's better to do it asynchronously with generateCGImagesAsynchronouslyForTimes:completionHandler:.
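A minimal sketch of the asynchronous variant, reusing the generate object from the code above (the requested times and the dispatch back to the main queue are only illustrative):
NSArray *times = @[ [NSValue valueWithCMTime:CMTimeMakeWithSeconds(1.0, 600)],
                    [NSValue valueWithCMTime:CMTimeMakeWithSeconds(2.0, 600)] ];
[generate generateCGImagesAsynchronouslyForTimes:times
                                completionHandler:^(CMTime requestedTime,
                                                    CGImageRef image,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumbnail = [UIImage imageWithCGImage:image]; // no CGImageRelease needed; the generator owns 'image' here
        dispatch_async(dispatch_get_main_queue(), ^{
            // e.g. update your image view with 'thumbnail'
        });
    } else {
        NSLog(@"Thumbnail generation failed: %@", error);
    }
}];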