GPUImage - MovieWriter, nothing happens for a while after "setCompletionBlock" - iOS

I've been struggling with this for a couple of days now. I'm processing a video with a filter, and it saves the video just fine. However, after it's saved, it takes a long time for the UI to update. I can see the video in iTunes (via iTunes file sharing) long before the UI is updated.
I create the view like this and add it to my view controller. This is just so the user can preview the video and select a filter.
-(GPUImageView*)playClipWithClip:(MYClip*)clip
{
    _clip = clip;
    _filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 568, 320)];

    _movieFile = [[GPUImageMovie alloc] initWithURL:[self urlForCurrentClip]];
    _movieFile.runBenchmark = NO;
    _movieFile.playAtActualSpeed = YES;
    _movieFile.shouldRepeat = YES;

    [self changeFilter];
    return _filterView;
}
When the user wants to save the video I have this method:
-(void)saveClipWithFilter
{
    [_movieFile cancelProcessing];

    _movieFile.runBenchmark = YES;
    _movieFile.playAtActualSpeed = NO;
    _movieFile.shouldRepeat = NO;

    NSString *movieName = [self fileNameForGeneratedClip];
    NSString *generatedMovieNameWithPath = [NSString stringWithFormat:@"Documents/%@", movieName];
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:generatedMovieNameWithPath];

    unlink([pathToMovie UTF8String]);
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

    _movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(568, 320.0)];
    [_filter addTarget:_movieWriter];
    _movieWriter.shouldPassthroughAudio = NO;

    [_movieFile enableSynchronizedEncodingUsingMovieWriter:_movieWriter];
    [_movieWriter startRecording];
    [_movieFile startProcessing];

    __weak typeof(self) weakSelf = self;
    [_movieWriter setCompletionBlock:^{
        NSLog(@"**************************** DONE ****************************");
        [weakSelf.filter removeTarget:weakSelf.movieWriter];
        [weakSelf.movieWriter finishRecording];
        [weakSelf exitVideoEditingModeAndSave];
    }];
}
My [weakSelf exitVideoEditingModeAndSave] method is called, and that method in turn calls the delegate (my view controller).
The problem is that after my delegate is called and my NSLog appears, it takes about 10 seconds for the view to update. I know that the file is ready and has been saved.
Any ideas?

This is a threading issue: in your completion block, dispatch to the main thread before you update any UI elements.
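For example, a minimal sketch of the completion block from the question, with the UI update dispatched back onto the main queue (the writer teardown can stay where it is):

[_movieWriter setCompletionBlock:^{
    [weakSelf.filter removeTarget:weakSelf.movieWriter];
    [weakSelf.movieWriter finishRecording];
    // GPUImage fires this block on its video processing queue,
    // so hop to the main queue before touching any UI.
    dispatch_async(dispatch_get_main_queue(), ^{
        [weakSelf exitVideoEditingModeAndSave];
    });
}];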

Related

How to wait for delegate method execution before return statement?

I have a model object with a class method that checks whether the model object already exists: if it does, it returns it; if it doesn't, it creates it and then returns it. This class uses the VLC framework to generate data about video files and to generate a thumbnail, and this is where I'm having trouble.
The VLCThumbnailer returns the thumbnail via a delegate method once its fetchThumbnail method is called. The problem is that the delegate method isn't called until AFTER my class-creation method reaches its return statement. Here's a code example:
-(AnimuProfile*)createnewProfileforFilename:(NSString*)filename{
    NSURL *fileURL = [NSURL fileURLWithPath:filename];
    VLCMedia *media = [VLCMedia mediaWithURL:fileURL];

    FilenameParser *parser = [[FilenameParser alloc] init];
    NSArray *parsedFilename = [parser parseFilename:[filename lastPathComponent]];
    NSArray *mediaArray = [media tracksInformation];

    if (mediaArray.count != 0) {
        NSDictionary *videoTrackinfo = [mediaArray objectAtIndex:0];
        _fansubGroup = parsedFilename[0];
        _seriesTitle = parsedFilename[1];
        _episodeNumber = parsedFilename[2];
        _filename = [filename lastPathComponent];
        _filepathURL = fileURL;
        _filepathString = filename;
        _watched = NO;
        _progress = [VLCTime timeWithInt:0];
        _length = [[media length] stringValue];
        NSNumber *resolution = [videoTrackinfo valueForKey:@"height"];
        _resolution = [NSString stringWithFormat:@"%@p", resolution];
    }

    VLCMediaThumbnailer *thumbnailer = [VLCMediaThumbnailer thumbnailerWithMedia:media andDelegate:self];
    [thumbnailer fetchThumbnail];

    NSString *libPath = [NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *profileName = [[_filename lastPathComponent] stringByAppendingPathExtension:@"prf"];
    NSString *pathandProfileName = [libPath stringByAppendingPathComponent:profileName];
    [NSKeyedArchiver archiveRootObject:self toFile:pathandProfileName];
    return self;
}
And then the delegate methods:
#pragma mark VLC Thumbnailer delegate methods
- (void)mediaThumbnailerDidTimeOut:(VLCMediaThumbnailer *)mediaThumbnailerP{
    NSLog(@"Thumbnailer timed out on file %@", _filename);
    UIImage *filmstrip = [UIImage imageNamed:@"filmstrip"];
    _thumbnail = UIImagePNGRepresentation(filmstrip);
}

- (void)mediaThumbnailer:(VLCMediaThumbnailer *)mediaThumbnailer didFinishThumbnail:(CGImageRef)thumbnail{
    UIImage *image = [UIImage imageWithCGImage:thumbnail];
    _thumbnail = UIImagePNGRepresentation(image);
}
I know it's a no-no to lock the main thread waiting for the delegate method to be called, so what should be done in this instance?
Those delegate methods are being called on VLC's video processing thread. They aren't the main thread and, therefore, you shouldn't be calling random UIKit API directly in the return blocks.
You need to process the results when they are available. If VLC were implemented using modern patterns, it would be using completion blocks. But it isn't, so...
- (void)mediaThumbnailer:(VLCMediaThumbnailer *)mediaThumbnailer didFinishThumbnail:(CGImageRef)thumbnail
{
    dispatch_async(dispatch_get_main_queue(), ^{
        // ... process thumbnail and update UI accordingly here ...
    });
}
That is, your createnewProfileforFilename: method should start the processing, but not expect it to be finished until sometime later. Then, when that sometime later happens, you trigger the updating of the UI with the data that was processed in the background.
And, as you state, you should never block the main queue/thread.
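As a rough sketch of that idea, assuming a hypothetical thumbnailCompletion block property is added to the profile class (it is not in the original code), the delegate callback could deliver the result to whoever asked for it:

// In the class extension (hypothetical addition): set this before calling fetchThumbnail
@property (nonatomic, copy) void (^thumbnailCompletion)(void);

- (void)mediaThumbnailer:(VLCMediaThumbnailer *)mediaThumbnailer didFinishThumbnail:(CGImageRef)thumbnail
{
    UIImage *image = [UIImage imageWithCGImage:thumbnail];
    _thumbnail = UIImagePNGRepresentation(image);

    // Hand the finished thumbnail back on the main queue
    dispatch_async(dispatch_get_main_queue(), ^{
        if (self.thumbnailCompletion) {
            self.thumbnailCompletion();
        }
    });
}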
I was able to solve it by creating a separate class to act as the delegate, make the thumbnail fetch requests, and then handle them.
@property NSMutableArray *queue;
@end

@implementation ThumbnailWaiter

+(id)sharedThumbnailWaiter{
    static ThumbnailWaiter *singletonInstance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        singletonInstance = [[self alloc] init];
    });
    return singletonInstance;
}

-(id)init{
    self = [super init];
    if (self) {
        NSMutableArray *queue = [NSMutableArray array];
        _queue = queue;
    }
    return self;
}

-(void)requestThumbnailForProfile:(AnimuProfile*)profile{
    VLCMedia *media = [VLCMedia mediaWithURL:profile.filepathURL];
    VLCMediaThumbnailer *thumbnailer = [VLCMediaThumbnailer thumbnailerWithMedia:media andDelegate:self];
    [_queue addObject:profile];
    [thumbnailer fetchThumbnail];
}

#pragma mark VLC Thumbnailer delegate methods
- (void)mediaThumbnailerDidTimeOut:(VLCMediaThumbnailer *)mediaThumbnailerP{
}

- (void)mediaThumbnailer:(VLCMediaThumbnailer *)mediaThumbnailer didFinishThumbnail:(CGImageRef)thumbnail{
    UIImage *image = [UIImage imageWithCGImage:thumbnail];
    AnimuProfile *profile = _queue.firstObject;
    profile.thumbnail = UIImagePNGRepresentation(image);
    [profile saveProfile];
    [_queue removeObjectAtIndex:0];
}
It seems almost silly to have to do it this way, but it seems to be working.
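For what it's worth, usage from the calling side then looks something like this (a sketch, assuming profile is an already-created AnimuProfile):

// Hand the profile to the shared waiter instead of fetching the thumbnail inline
[[ThumbnailWaiter sharedThumbnailWaiter] requestThumbnailForProfile:profile];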

ios - How to properly export videos using GPUImage

I can't seem to get the GPUImageMovieWriter to work properly.
Despite the fact that the authors always suggest looking over their examples, none of them build and run properly for me.
So here's my code sample:
ALAsset *asset = (ALAsset *)[self galleryImages][indexPath.row];
NSURL *fileURLForInput = [asset valueForProperty:ALAssetPropertyAssetURL];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:fileURLForInput];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
NSURL *fileURLForOutput = [NSURL fileURLWithPath:[self getVideoLocalFilePathWithVideoID:videoID]];
GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:fileURLForOutput size:CGSizeMake(640, 480)];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
    PSLog(@"DONE!");
}];
Running the code sample results in an "EXC_BAD_ACCESS" error.
And if I were to remove this line:
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
The app does not crash but the output file has a size of 0 bytes and the completion block is never called.
Any ideas? :(
You need to call [movieFile addTarget:movieWriter] before you start processing the movie.
I think you need to make the GPUImageMovie and GPUImageMovieWriter variables strong properties or instance variables.
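Putting both answers together, a minimal sketch might look like the following; the strong properties on the owning view controller are an assumption, and since there is no filter in this chain the movie feeds the writer directly:

// Strong references so ARC keeps both objects alive for the duration of processing
@property (nonatomic, strong) GPUImageMovie *movieFile;
@property (nonatomic, strong) GPUImageMovieWriter *movieWriter;

// ...

self.movieFile = [[GPUImageMovie alloc] initWithURL:fileURLForInput];
self.movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:fileURLForOutput size:CGSizeMake(640, 480)];

// Connect the source to the writer before starting
[self.movieFile addTarget:self.movieWriter];

self.movieWriter.shouldPassthroughAudio = YES;
self.movieFile.audioEncodingTarget = self.movieWriter;
[self.movieFile enableSynchronizedEncodingUsingMovieWriter:self.movieWriter];

[self.movieWriter startRecording];
[self.movieFile startProcessing];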

Effect on video using GPUImage framework, but completion block of movie writer is never called?

self.movie = [[GPUImageMovie alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"merge_video" ofType:@"mp4"]]];
self.sketchFilter = [[GPUImageSketchFilter alloc] init];
[self.movie addTarget:self.sketchFilter];
self.movie.runBenchmark = YES;
self.movie.playAtActualSpeed = NO;

NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/MyMovie.m4v"];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

self.writer = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(320, 320)];
[self.sketchFilter addTarget:self.writer];
[self.writer setShouldPassthroughAudio:YES];
[self.movie setAudioEncodingTarget:self.writer];

__weak GPUImageMovieWriter *weakSelf = writer;
[self.movie enableSynchronizedEncodingUsingMovieWriter:weakSelf];
[weakSelf startRecording];
[self.movie startProcessing];

[weakSelf setCompletionBlock:^{
    NSLog(@"Effecting Completed Succefully");
    [self.sketchFilter removeTarget:weakSelf];
    [weakSelf finishRecording];
}];
You don't show where you define the writer instance variable, but I bet you created that as an instance variable and then created a writer property which is instead backed by a generated _writer variable.
This means that writer is nil in the above and your movie never even starts processing. Use self.writer instead.
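In other words, the writer references would go through the property, roughly like this (a sketch of the corrected lines only, keeping weak references inside the block to avoid a retain cycle):

__weak GPUImageMovieWriter *weakWriter = self.writer;
__weak GPUImageSketchFilter *weakFilter = self.sketchFilter;

[self.movie enableSynchronizedEncodingUsingMovieWriter:self.writer];
[self.writer startRecording];
[self.movie startProcessing];

[self.writer setCompletionBlock:^{
    NSLog(@"Effect completed successfully");
    [weakFilter removeTarget:weakWriter];
    [weakWriter finishRecording];
}];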

Blend two videos with chroma key using GPUImage

I'm trying to blend two videos using GPUImage. One of them (show.mov) contains a green background.
Here is show.mov and galaxy.mov.
I downloaded the latest version of GPUImage and changed the SimpleVideoFileFilter example:
- (void)viewDidLoad
{
    [super viewDidLoad];

    _movie = [[GPUImageMovie alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"galaxy" ofType:@"mov"]]];
    _greenMovie = [[GPUImageMovie alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"snow" ofType:@"mov"]]];

    _movie.playAtActualSpeed = YES;
    _greenMovie.playAtActualSpeed = YES;

    _filter = [[GPUImageChromaKeyBlendFilter alloc] init];
    [_greenMovie addTarget:_filter];
    [_movie addTarget:_filter];

    GPUImageView *filterView = (GPUImageView *)self.view;
    [_filter addTarget:filterView];

    [_greenMovie startProcessing];
    [_movie startProcessing];
}
When I run the project (no matter whether on a device or the simulator) I get just a blank white view, and after 14 seconds (the length of show.mov) I see only the last frame of the blended videos. Using the writer from the example creates a file on disk, but the file can't be opened.
I am using an iPhone 5 with iOS 7.0.3 and Xcode 5.0.2.
Did I miss something?
Thanks to @BradLarson, who pointed to a GitHub issue where I found the commit that caused this problem.
In GPUImageMovie.m I commented out two lines in the startProcessing method:
[inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    //runSynchronouslyOnVideoProcessingQueue(^{
        NSError *error = nil;
        AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error];
        if (!tracksStatus == AVKeyValueStatusLoaded)
        {
            return;
        }
        blockSelf.asset = inputAsset;
        [blockSelf processAsset];
        blockSelf = nil;
    //});
}];

GPUImage + Get Blank Video as output of video file overlay

I am using the GPUImage framework and it is very good. In my app, I record a video and then add an overlay to the recorded video (due to my requirements).
Step 1: Recording the video works fine and I get the correct video.
Step 2: Adding the overlay to the video file is where I run into the issue; I get a black video as output, with a small file size.
My code is as below:
self.movieFile = [[GPUImageMovie alloc] initWithURL:capturevideoURL];
self.movieFile.delegate = self;
self.movieFile.runBenchmark = YES;
self.movieFile.playAtActualSpeed = YES;
[self.movieFile addTarget:self.filter];

UIImage *inputImage = self.overlayImage;
GPUImagePicture *overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage];
[overlayPicture addTarget:self.filter];
[overlayPicture processImage];

// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)self.view;
[self.filter addTarget:filterView];

[CommonFunction DeleteFileFromApp:@"Movie.m4v"];
NSString *pathToMovie = [CommonFunction GetPathForFileName:@"Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

self.movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(320.0, 480.0)];
[self.filter addTarget:self.movieWriter];

// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
self.movieWriter.shouldPassthroughAudio = YES;
self.movieFile.audioEncodingTarget = self.movieWriter;
[self.movieFile enableSynchronizedEncodingUsingMovieWriter:self.movieWriter];

[self.movieWriter startRecording];
[self.movieFile startProcessing];
NSLog(@"Start Overlaying...");

__weak GPUImageNormalBlendFilter *filterlocal = self.filter;
__weak GPUImageMovieWriter *moviewriterlocal = self.movieWriter;
__weak typeof(self) weakSelf = self;
[movieWriter setCompletionBlock:^{
    [filterlocal removeTarget:moviewriterlocal];
    [moviewriterlocal finishRecording];
    [weakSelf loadVideo];
}];
Can you suggest how to solve this issue of the black video output?
Thanks
