I am using the GPUImage framework and it works well. In my app, I need to record a video and then add an overlay on the recorded video (per my requirements).
Step 1: recording the video works fine and I get the correct video.
Step 2: adding the overlay to the video file is where I hit the issue: I get a black video as output, with a small file size.
My code is below:
self.movieFile = [[GPUImageMovie alloc] initWithURL:capturevideoURL];
self.movieFile.delegate = self;
self.movieFile.runBenchmark = YES;
self.movieFile.playAtActualSpeed = YES;
[self.movieFile addTarget:self.filter];
UIImage *inputImage = self.overlayImage;
GPUImagePicture *overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage];
[overlayPicture addTarget:self.filter];
[overlayPicture processImage];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)self.view;
[self.filter addTarget:filterView];
[CommonFunction DeleteFileFromApp:@"Movie.m4v"];
NSString *pathToMovie = [CommonFunction GetPathForFileName:@"Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
self.movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(320.0, 480.0)];
[self.filter addTarget:self.movieWriter];
// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
self.movieWriter.shouldPassthroughAudio = YES;
self.movieFile.audioEncodingTarget = self.movieWriter;
[self.movieFile enableSynchronizedEncodingUsingMovieWriter:self.movieWriter];
[self.movieWriter startRecording];
[self.movieFile startProcessing];
NSLog(@"Start Overlaying...");
__weak GPUImageNormalBlendFilter *filterlocal = self.filter;
__weak GPUImageMovieWriter *moviewriterlocal = self.movieWriter;
__weak typeof(self) weakSelf = self;
[self.movieWriter setCompletionBlock:^{
[filterlocal removeTarget:moviewriterlocal];
[moviewriterlocal finishRecording];
[weakSelf loadVideo];
}];
Can you suggest how to solve this black video output issue?
Thanks
Related
I can't seem to get the GPUImageMovieWriter to work properly.
The authors always suggest looking over their examples, but none of them build and run properly for me.
So here's my code sample:
ALAsset *asset = (ALAsset *)[self galleryImages][indexPath.row];
NSURL *fileURLForInput = [asset valueForProperty:ALAssetPropertyAssetURL];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:fileURLForInput];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
NSURL *fileURLForOutput = [NSURL fileURLWithPath:[self getVideoLocalFilePathWithVideoID:videoID]];
GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:fileURLForOutput size:CGSizeMake(640, 480)];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
PSLog(@"DONE!");
}];
Running the code sample results in an EXC_BAD_ACCESS error.
If I remove this line:
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
the app no longer crashes, but the output file has a size of 0 bytes and the completion block is never called.
Any ideas? :(
You need to call [movieFile addTarget:movieWriter] before you start processing the movie.
I think you need to make the GPUImageMovie and GPUImageMovieWriter variables strong properties or instance variables.
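Putting both answers together, a minimal sketch (assuming `self` holds strong `movieFile` and `movieWriter` properties, and reusing `fileURLForInput`/`fileURLForOutput` from the question) might look like:

```objc
// Keep these as strong properties (or ivars) so ARC doesn't
// deallocate them while the movie is still being processed:
//   @property (strong, nonatomic) GPUImageMovie *movieFile;
//   @property (strong, nonatomic) GPUImageMovieWriter *movieWriter;

self.movieFile = [[GPUImageMovie alloc] initWithURL:fileURLForInput];
self.movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:fileURLForOutput
                                                            size:CGSizeMake(640, 480)];

// Without a target chain the writer never receives frames,
// which is why the output file stays at 0 bytes.
[self.movieFile addTarget:self.movieWriter];

self.movieWriter.shouldPassthroughAudio = YES;
self.movieFile.audioEncodingTarget = self.movieWriter;
[self.movieFile enableSynchronizedEncodingUsingMovieWriter:self.movieWriter];

[self.movieWriter startRecording];
[self.movieFile startProcessing];
```

This is only a sketch under those assumptions, not a drop-in fix; the key points are the strong references and the addTarget call before startProcessing.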
I've been struggling with this for a couple of days now. I'm processing a video with a filter, and it saves the video just fine. However, after it's saved, it takes a long time for the UI to update. I can see the video in iTunes (via iTunes file sharing) long before the UI is updated.
I create the view like this, and add that to my view controller. This is just so the user can preview the video and select filter.
-(GPUImageView*)playClipWithClip:(MYClip*)clip
{
_clip = clip;
_filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 568, 320)];
_movieFile = [[GPUImageMovie alloc] initWithURL:[self urlForCurrentClip]];
_movieFile.runBenchmark = NO;
_movieFile.playAtActualSpeed = YES;
_movieFile.shouldRepeat = YES;
[self changeFilter];
return _filterView;
}
When the user wants to save the video I have this method:
-(void)saveClipWithFilter
{
[_movieFile cancelProcessing];
_movieFile.runBenchmark = YES;
_movieFile.playAtActualSpeed = NO;
_movieFile.shouldRepeat = NO;
NSString *movieName = [self fileNameForGeneratedClip];
NSString *generatedMovieNameWithPath = [NSString stringWithFormat:@"Documents/%@", movieName];
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:generatedMovieNameWithPath];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
_movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(568, 320.0)];
[_filter addTarget:_movieWriter];
_movieWriter.shouldPassthroughAudio = NO;
[_movieFile enableSynchronizedEncodingUsingMovieWriter:_movieWriter];
[_movieWriter startRecording];
[_movieFile startProcessing];
__weak typeof(self) weakSelf = self;
[_movieWriter setCompletionBlock:^{
NSLog(@"**************************** DONE ****************************");
[weakSelf.filter removeTarget:weakSelf.movieWriter];
[weakSelf.movieWriter finishRecording];
[weakSelf exitVideoEditingModeAndSave];
}];
}
My method [weakSelf exitVideoEditingModeAndSave] is called, and that method in turn calls the delegate (my view controller).
The problem is that after my delegate is called and my NSLog appears, it takes about 10 seconds for the view to update. I know that the file is ready and has been saved.
Any ideas?
This is a threading issue: in your completion block, dispatch to the main thread before you update any UI elements.
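As a sketch, the completion block from the question could be adjusted like this (assuming the same weakSelf setup; dispatch_get_main_queue is standard GCD):

```objc
[_movieWriter setCompletionBlock:^{
    [weakSelf.filter removeTarget:weakSelf.movieWriter];
    [weakSelf.movieWriter finishRecording];

    // GPUImage invokes this block on its background video-processing
    // queue, so hop to the main queue before touching UIKit.
    dispatch_async(dispatch_get_main_queue(), ^{
        [weakSelf exitVideoEditingModeAndSave];
    });
}];
```

Any UI work inside exitVideoEditingModeAndSave then runs immediately on the main thread instead of waiting for the main run loop to pick it up by accident.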
I'm trying to blend two videos using GPUImage. One of them (show.mov) contains a green background.
Here is show.mov and galaxy.mov.
I downloaded the latest version of GPUImage and changed the SimpleVideoFileFilter example:
- (void)viewDidLoad
{
[super viewDidLoad];
_movie = [[GPUImageMovie alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"galaxy" ofType:@"mov"]]];
_greenMovie = [[GPUImageMovie alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"snow" ofType:@"mov"]]];
_movie.playAtActualSpeed = YES;
_greenMovie.playAtActualSpeed = YES;
_filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[_greenMovie addTarget:_filter];
[_movie addTarget:_filter];
GPUImageView *filterView = (GPUImageView *)self.view;
[_filter addTarget:filterView];
[_greenMovie startProcessing];
[_movie startProcessing];
}
When I run the project (no matter whether on a device or the simulator) I get just a blank white view, and after 14 seconds (the length of show.mov) I see only the last frame of the blended videos. Using the writer from the example creates a file on disk, but that file can't be opened.
I am using an iPhone 5 with iOS 7.0.3 and Xcode 5.0.2.
Did I miss something?
Thanks to @BradLarson, who pointed me to a GitHub issue where I found the commit that caused this problem.
In GPUImageMovie.m I commented out two lines in the startProcessing method:
[inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{
//runSynchronouslyOnVideoProcessingQueue(^{
NSError *error = nil;
AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error];
if (!tracksStatus == AVKeyValueStatusLoaded)
{
return;
}
blockSelf.asset = inputAsset;
[blockSelf processAsset];
blockSelf = nil;
//});
}];
I've tried the examples from the GPUImage library.
I tried the 'SimpleVideoFileFilter', but all I see is a black screen with the slider.
I tried to do the same myself and got images working perfectly, but I can't really understand the video processing.
Do the examples take the videos from the project itself, from a folder on the Mac, or from the iPhone's videos?
I don't have an Apple Developer account, so I can't test it on my device.
I found a way to put a random (.m4v) file into the iPhone Simulator and tried to play/filter the video.
Has anyone had this issue? For now I'm just trying to play the video from the iPhone Simulator or from the resource files; I don't really understand how it works.
I tried this link, with no luck.
Here is part of the example that we can find in GPUImage:
- (void)viewDidLoad
{
[super viewDidLoad];
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample_iPod" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = YES;
filter = [[GPUImagePixellateFilter alloc] init];
// filter = [[GPUImageUnsharpMaskFilter alloc] init];
[movieFile addTarget:filter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)self.view;
[filter addTarget:filterView];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/TestMovie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];
// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
[filter removeTarget:movieWriter];
[movieWriter finishRecording];
}];
}
Simple code that should filter the video, yet all I get is a black screen with the slider.
I'm talking about this project.
Movie playback does not currently work within the Simulator using GPUImage. You'll need to run this on an actual device for it to work.
I'm not sure why movie files don't output anything when run from the Simulator, but you're welcome to dig into the GPUImageMovie code and see what might be wrong. Since it runs fine when operated on actual hardware, it hasn't been a priority of mine to fix.
UPDATE: I now know that I'm sending messages to a deallocated instance. I think I've set up my project correctly, but I'm starting to doubt this.
I've read Brad's documentation and set up my project as non-ARC, which probably has something to do with the situation. I've added the linker flag -fobjc-arc and targeted GPUImage and my own target at iOS 4.3.
Has anyone out there with the same problem discovered what they were doing wrong?
I've been searching for an answer for days now, but I can't fix it and have run out of ideas.
Whenever I try to record a movie with the GPUImageMovieWriter, my app crashes with EXC_BAD_ACCESS on:
CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);
Full code:
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/file.mov"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[videoCamera addTarget:sepiaFilter];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
movieWriter.encodingLiveVideo = YES;
[sepiaFilter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
videoCamera.audioEncodingTarget = movieWriter;
[movieWriter startRecording];
Any help or pointers welcome!