I am developing a video application in which I need a video cover image that the user selects,
like in the image below.
I have searched Google many times for a custom controller but found nothing. Is there any custom controller for this, or any suggestion? It would be very helpful.
Actually, you can get a video thumbnail/cover image from the path of the video file using the code below, or you can get all the frames of the video file. However, Apple does not provide any view controller for choosing a cover-image thumbnail for a video. You have to pass the time manually to thumbnailImageAtTime:, or build that kind of view yourself and get the thumbnail images using the code below. Hope that is useful.
// imagesArray, times, and imageGenerator are ivars; imageGenerator must
// already have been created from the AVAsset, e.g.:
// imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
-(void)getAllImagesFromVideo
{
    imagesArray = [[NSMutableArray alloc] initWithCapacity:375];
    times = [[NSMutableArray alloc] initWithCapacity:375];

    for (Float64 i = 0; i < 15; i += 0.033) // for 25 fps in 15 sec of video
    {
        [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(i, 60)]];
    }

    [imageGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded)
        {
            // The generator owns `image`; wrapping it in a UIImage retains it.
            // Do not CGImageRelease() an image you did not copy.
            [imagesArray addObject:[UIImage imageWithCGImage:image]];
        }
    }];
}
The code above comes from Get all frames of Video IOS 6. It gives you every frame image of your video, so create your own view controller like the one in the image in your question: put all the images into a horizontal scroll view and let the user select one as the cover image. Alternatively, you can get a single image at a particular time directly, using the code below.
-(UIImage *)getThumbNail:(NSString *)stringPath
{
    NSURL *videoURL = [NSURL fileURLWithPath:stringPath];
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
    UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
    // The player autoplays audio on init, so stop it.
    [player stop];
    return thumbnail;
}
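To build the cover-picker view described above, here is a minimal sketch (assuming `imagesArray` already holds the extracted `UIImage`s; `coverTapped:` and the `coverImage` property are hypothetical names) that lays the thumbnails out in a horizontal UIScrollView of buttons:

```objc
// Sketch only: lays thumbnails out horizontally and records the tapped one.
- (void)showCoverPickerWithImages:(NSArray *)images
{
    UIScrollView *strip = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 100, self.view.bounds.size.width, 60)];
    CGFloat x = 0;
    for (NSUInteger idx = 0; idx < images.count; idx++) {
        UIButton *thumb = [UIButton buttonWithType:UIButtonTypeCustom];
        thumb.frame = CGRectMake(x, 0, 60, 60);
        [thumb setImage:images[idx] forState:UIControlStateNormal];
        thumb.tag = idx; // identifies the chosen frame
        [thumb addTarget:self action:@selector(coverTapped:) forControlEvents:UIControlEventTouchUpInside];
        [strip addSubview:thumb];
        x += 60;
    }
    strip.contentSize = CGSizeMake(x, 60);
    [self.view addSubview:strip];
}

- (void)coverTapped:(UIButton *)sender
{
    // coverImage is a hypothetical property on this controller.
    self.coverImage = [sender imageForState:UIControlStateNormal];
}
```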
I am very frustrated, because for the last two days I have been trying to edit a video frame and replace it at the same time (the frame's actual time) with the edited frame, but I am unable to do it.
I have seen many Stack Overflow links, but none fits my requirement; the iFrame extractor does not work:
Get image from a video frame in iPhone
In the meantime, I thought I would extract all the frames and save them in an array of dictionaries, each holding a frame and its time. Then, when I get a frame from the running video, after editing it I would loop over the saved frames, match the actual time against the captured (edited) frame's time, replace the original frame with the edited one, and finally write the video back out from all the frames.
To do so I used:
Get all frames of Video IOS 6
But it crashes after extracting some images.
I have written my code like this:
-(void)editMovie:(id)sender
{
    float FPS = 1;
    NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"IMG_0879" ofType:@"MOV"];
    NSURL *videoURL = [NSURL fileURLWithPath:videoPath];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    generator.requestedTimeToleranceBefore = kCMTimeZero;

    for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * FPS; i++) {
        @autoreleasepool {
            CMTime time = CMTimeMake(i, FPS);
            NSError *err;
            CMTime actualTime;
            CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
            [self saveImage:generatedImage atTime:actualTime]; // stores the frame and its time in arrFrame
            CGImageRelease(image);
        }
    }
}
-(void)saveImage:(UIImage *)image atTime:(CMTime)time
{
    float t = CMTimeGetSeconds(time); // time in seconds (time.timescale alone is not a timestamp)
    NSMutableDictionary *dict = [[NSMutableDictionary alloc] init];
    [dict setObject:image forKey:@"image"];
    [dict setObject:[NSString stringWithFormat:@"%f", t] forKey:@"time"];
    // Holding every frame as a UIImage in memory can easily exhaust RAM
    // and crash; consider writing the frames to disk instead.
    [arrFrame addObject:dict];
    NSLog(@"ArrayImg=%@", arrFrame);
}
My requirement is:
1. I need to play a video.
2. While the video is playing, I have to pause it and get the image (frame) at the pause time.
3. I have to edit the captured image and save it, replacing the original frame.
4. When I play the video again, the edited image should appear in the video.
Please give me an example if you have one. I have downloaded many projects and examples from the links given on Stack Overflow and other sites, but none of them is right, or even fulfils 20% of my requirement.
Please share any example or ideas you have. I would be much obliged. Thanks in advance.
I know this question has been asked many times on Stack Overflow, but my problem is different.
I am iterating over the albums of the Photos library to get all the videos and their thumbnails.
The problem is that my code is very slow at getting the thumbnail of each video. For example, there are 14 videos in my camera roll, and the total time taken to generate the thumbnails is around 3-4 seconds. I am using this code:
+(UIImage *)imageFromVideoAtURL:(NSURL *)contentURL {
    UIImage *theImage = nil;
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generator copyCGImageAtTime:time actualTime:NULL error:&err];
    theImage = [[[UIImage alloc] initWithCGImage:imgRef] autorelease];
    CGImageRelease(imgRef);
    [asset release];
    [generator release];
    return theImage;
}
I am looking for a way to get the thumbnails of all the videos very quickly, so that the user does not have to wait. I have seen apps on the App Store that do the same thing almost instantly. Please help.
I have also tried this code to generate the thumbnails; it is also very slow:
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:mediaUrl];
moviePlayer.shouldAutoplay = NO;
UIImage *thumbnail = [[moviePlayer thumbnailImageAtTime:0.0 timeOption:MPMovieTimeOptionNearestKeyFrame] retain];
[imageView setImage:thumbnail]; //imageView is a UIImageView
If you are loading videos from the photo library, each video should already have an embedded thumbnail. This can be accessed using the thumbnail or aspectRatioThumbnail methods of the ALAsset class.
So in your case the thumbnails could be loaded like this:
ALAssetsLibrary *lib = [ALAssetsLibrary new];
[lib assetForURL:contentURL resultBlock:^(ALAsset *asset) {
    CGImageRef thumb = [asset thumbnail];
    dispatch_async(dispatch_get_main_queue(), ^{
        // do any UI operation here with thumb
    });
} failureBlock:^(NSError *error) {
    NSLog(@"Could not load asset: %@", error);
}];
Please make sure to make any UIKit calls on the main queue, as the assetForURL:resultBlock:failureBlock: method may invoke the result block on a background thread.
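For the original problem of iterating the whole library, here is a sketch along the same lines (assuming photo-library access has been granted; `reloadThumbnailUI:` is a hypothetical UI-refresh method) that collects the embedded thumbnail of every video without decoding any video frames:

```objc
// Sketch: collect the system-precomputed thumbnails for all library videos.
- (void)loadAllVideoThumbnails
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSMutableArray *thumbs = [NSMutableArray array];
    [library enumerateGroupsWithTypes:ALAssetsGroupAll usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        if (group == nil) { // a nil group signals the end of the enumeration
            dispatch_async(dispatch_get_main_queue(), ^{
                [self reloadThumbnailUI:thumbs]; // hypothetical UI-refresh method
            });
            return;
        }
        [group setAssetsFilter:[ALAssetsFilter allVideos]];
        [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if (asset) {
                // The thumbnail is precomputed by the system, so this is fast.
                [thumbs addObject:[UIImage imageWithCGImage:[asset thumbnail]]];
            }
        }];
    } failureBlock:^(NSError *error) {
        NSLog(@"Library access failed: %@", error);
    }];
}
```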
I am new to iOS. I want to create thumbnails for videos in a horizontal view; after tapping a thumbnail, the video should play. I have created a custom player, but I am unable to create the horizontal thumbnail view.
You can create a thumbnail of a video using the code below:
// stringPath is the path of your video file
-(UIImage *)getThumbNail:(NSString *)stringPath
{
    NSURL *urlvideoImage = [NSURL fileURLWithPath:stringPath];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:urlvideoImage options:nil];
    AVAssetImageGenerator *generateImg = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    NSError *error = NULL;
    CMTime time = CMTimeMake(1, 65);
    CGImageRef refImg = [generateImg copyCGImageAtTime:time actualTime:NULL error:&error];
    NSLog(@"error==%@, Refimage==%@", error, refImg);
    UIImage *frameImage = [[UIImage alloc] initWithCGImage:refImg];
    CGImageRelease(refImg); // copyCGImageAtTime: returns an owned reference
    return frameImage;
}
Create a thumbnail from your video URL using the code above.
Set this image as a button's background image, or use an image view and add a tap gesture recognizer to catch its touch event.
On the button tap you can play the video with the relevant code. Hope that works for you.
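Putting those steps together, here is a sketch (the `thumbnailScrollView` and `videoPaths` properties are hypothetical, and `getThumbNail:` is the method above) of a tappable thumbnail button that plays its video:

```objc
// Sketch: add one thumbnail button per video to a horizontal scroll view.
- (void)addThumbnailButtonForVideoAtPath:(NSString *)path atIndex:(NSUInteger)index
{
    UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
    button.frame = CGRectMake(index * 130.0, 0, 120, 90);
    [button setImage:[self getThumbNail:path] forState:UIControlStateNormal];
    button.tag = index; // used to look the video up again on tap
    [button addTarget:self action:@selector(playVideo:) forControlEvents:UIControlEventTouchUpInside];
    [self.thumbnailScrollView addSubview:button]; // hypothetical horizontal scroll view
}

- (void)playVideo:(UIButton *)sender
{
    NSString *path = self.videoPaths[sender.tag]; // hypothetical array of video paths
    MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:path]];
    [self presentMoviePlayerViewControllerAnimated:player];
}
```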
I have an application that uploads videos to a server, and now I am trying to create a news feed that displays those video posts as a thumbnail or frame of the video, so that once the user clicks the thumbnail, the video plays.
I have the following code, used to grab the video from the server and play it:
- (void)openVideo {
    NSString *videoURLString = @"http://myite.com/dev/iphone/uploads/t69566.mov";
    NSURL *videoURL = [NSURL URLWithString:videoURLString];
    MPMoviePlayerViewController *moviePlayerView = [[[MPMoviePlayerViewController alloc] initWithContentURL:videoURL] autorelease];
    [self presentMoviePlayerViewControllerAnimated:moviePlayerView];
}
Now, of course, I am going to make openVideo dynamic as video posts come in, but I am stuck on taking this video and grabbing a frame or thumbnail from it to display.
Suggestions? Thoughts?
UPDATE:
I want to take a thumbnail from a video stored on the server, which actually brings me to a question:
when the user first uploads the video, would it be better to create a thumbnail right there, upload it to my server, and just associate it with the video when fetching posts to populate the news feed?
You can do it using AVAssetImageGenerator. Rough code, copied and pasted from a working project (so it might not be properly isolated):
_thumbnailView is a UIImageView that displays the thumbnail.
_activityView is a UIActivityIndicatorView that gets hidden when the thumbnail finishes loading.
AVPlayerItem *playerItem = [[AVPlayerItem playerItemWithURL:movieURL] retain];
AVAssetImageGenerator *_generator;
_generator = [[AVAssetImageGenerator assetImageGeneratorWithAsset:playerItem.asset] retain];
[playerItem release];

AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    UIImage *thumb = nil;
    if (result == AVAssetImageGeneratorSucceeded) {
        thumb = [UIImage imageWithCGImage:image];
    } else {
        DLog(@"%s - error: %@", __PRETTY_FUNCTION__, error);
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        // _thumbnailView is a UIImageView, so assign its image property;
        // setImage:forState: is a UIButton method and would not compile here.
        _thumbnailView.image = thumb;
        _thumbnailView.hidden = NO;
        _playButton.hidden = NO;
        [_activityView stopAnimating];
    });
};

[_generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(30, 30)]] completionHandler:handler];
Back to your question: in my opinion, a server-side generated thumbnail is usually better, as servers have far more processing power. Also, in order to generate the thumbnail on the device, you need to start downloading the actual video from the server.
Finally, note that you pass a CMTime parameter, built with a call to CMTimeMakeWithSeconds(). This might prove problematic, as you can easily hit a blank frame. This is why we use 30 as the parameter, so we avoid that situation in our videos.
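If you do generate the thumbnail on the device, one way to reduce the chance of grabbing the wrong frame is to request an exact time. This is a sketch, assuming `asset` is the AVAsset for the downloaded video; the zero tolerances trade decode speed for time accuracy:

```objc
// Sketch: request a frame at an exact time instead of the nearest keyframe.
AVAssetImageGenerator *gen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
gen.appliesPreferredTrackTransform = YES;       // respect the video's orientation
gen.requestedTimeToleranceBefore = kCMTimeZero; // exact-time decode is slower,
gen.requestedTimeToleranceAfter = kCMTimeZero;  // but lands on the requested frame

NSError *error = nil;
CMTime actualTime;
CGImageRef cgImage = [gen copyCGImageAtTime:CMTimeMakeWithSeconds(1.0, 600) actualTime:&actualTime error:&error];
UIImage *thumb = cgImage ? [UIImage imageWithCGImage:cgImage] : nil;
if (cgImage) CGImageRelease(cgImage); // copyCGImageAtTime: returns an owned reference
```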
This question already has answers here:
iPhone Read UIimage (frames) from video with AVFoundation
(5 answers)
Closed 4 years ago.
I'm using UIImagePickerController to allow the user to record a video and then trim it. I need to split that video into multiple frames and let the user choose one of them.
In order to show the frames, I likely must convert them to UIImage. How can I do this? I must use AVFoundation, but I couldn't find a tutorial on how to get and convert the frames.
Should I do the image capture with AVFoundation too? If so, do I have to implement the trimming myself?
I think the answer to this question is what you are looking for:
iPhone Read UIimage (frames) from video with AVFoundation.
There are two methods specified in the accepted answer; you can use either one according to your requirement.
Here is code to get images from a video at a given FPS.
1) Import:
#import <Photos/Photos.h>
2) In viewDidLoad:
videoUrl = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"VfE_html5" ofType:@"mp4"]];
[self createImage:5]; // 5 is frames per second (FPS); change the FPS as per your requirement.
3) Functions:
-(void)createImage:(int)withFPS {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    generator.requestedTimeToleranceBefore = kCMTimeZero;

    for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * withFPS; i++) {
        @autoreleasepool {
            CMTime time = CMTimeMake(i, withFPS);
            NSError *err;
            CMTime actualTime;
            CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
            [self savePhotoToAlbum:generatedImage]; // saves the image to the photo library, not to memory
            CGImageRelease(image); // copyCGImageAtTime: returns an owned reference
        }
    }
}
-(void)savePhotoToAlbum:(UIImage *)imageToSave {
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [PHAssetChangeRequest creationRequestForAssetFromImage:imageToSave];
    } completionHandler:^(BOOL success, NSError *error) {
        if (success) {
            NSLog(@"success.");
        }
        else {
            NSLog(@"fail: %@", error);
        }
    }];
}
You can also use the library VideoBufferReader (see GitHub), which is based on AVFoundation.
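For reading every decoded frame buffer directly, which is the kind of approach such libraries build on, here is a sketch using AVAssetReader. It assumes a local asset; converting each pixel buffer to a UIImage (e.g. via CIImage) is left as a comment:

```objc
// Sketch: read decoded frame buffers sequentially with AVAssetReader.
- (void)readFramesFromAsset:(AVURLAsset *)asset
{
    NSError *error = nil;
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    NSDictionary *settings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    AVAssetReaderTrackOutput *output = [[AVAssetReaderTrackOutput alloc] initWithTrack:track outputSettings:settings];
    [reader addOutput:output];
    [reader startReading];

    CMSampleBufferRef sample;
    while ((sample = [output copyNextSampleBuffer])) {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sample);
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sample);
        // Convert pixelBuffer to a UIImage via CIImage here if needed.
        NSLog(@"Frame %p at %f s", pixelBuffer, CMTimeGetSeconds(pts));
        CFRelease(sample); // copyNextSampleBuffer returns an owned reference
    }
}
```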