I am trying to blend two videos using GPUImage. One of them (show.mov) contains a green background.
Here is show.mov and galaxy.mov.
I downloaded the latest version of GPUImage and modified the SimpleVideoFileFilter example:
- (void)viewDidLoad
{
    [super viewDidLoad];

    _movie = [[GPUImageMovie alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"galaxy" ofType:@"mov"]]];
    _greenMovie = [[GPUImageMovie alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"snow" ofType:@"mov"]]];

    _movie.playAtActualSpeed = YES;
    _greenMovie.playAtActualSpeed = YES;

    _filter = [[GPUImageChromaKeyBlendFilter alloc] init];

    [_greenMovie addTarget:_filter];
    [_movie addTarget:_filter];

    GPUImageView *filterView = (GPUImageView *)self.view;
    [_filter addTarget:filterView];

    [_greenMovie startProcessing];
    [_movie startProcessing];
}
When I run the project (no matter whether on a device or the simulator) I get just a blank white view, and after 14 seconds (the length of show.mov) I see only the last frame of the blended videos. Using the writer from the example creates a file on disk, but that file can't be opened.
I am using an iPhone 5 with iOS 7.0.3 and Xcode 5.0.2.
Did I miss something?
Thanks to @BradLarson, who pointed me to a GitHub issue where I found the commit that caused this problem.
In GPUImageMovie.m I commented out two lines in the startProcessing method:
[inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{
    //runSynchronouslyOnVideoProcessingQueue(^{
        NSError *error = nil;
        AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error];
        if (tracksStatus != AVKeyValueStatusLoaded)
        {
            return;
        }
        blockSelf.asset = inputAsset;
        [blockSelf processAsset];
        blockSelf = nil;
    //});
}];
Related
I am building an iOS app with AudioKit (version 4.5.3), and I find that the AKTimePitch class does not work for me. Here is my code (Objective-C, Xcode 10):
- (IBAction)startButton:(id)sender {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"burncalory" withExtension:@"m4a"];
    AKAudioFile *file = [[AKAudioFile alloc] initForReading:url error:nil];
    AKAudioPlayer *player = [[AKAudioPlayer alloc] initWithFile:file looping:NO lazyBuffering:YES error:nil completionHandler:^{
        NSLog(@"Finished!");
    }];

    AKTimePitch *akTimePitch = [[AKTimePitch alloc] init:player rate:2.0 pitch:1600 overlap:8];

    AudioKit.output = akTimePitch;
    [akTimePitch start];
    [AudioKit startAndReturnError:nil];
    [player playFrom:0.0];
}
I checked the playground (4.5.3), and the "Time Stretching and Pitch Shifting" sample works well.
Is there something wrong with the way I use AKTimePitch in my code, or something wrong with my audio file (example.m4a)? By the way, this audio file can be loaded and plays well with AKAudioPlayer.
After some testing I found that the parameters passed to the init method do not take effect, but after I add akTimePitch.pitch = 1600 before [player playFrom:0.0], the AKTimePitch effect works! I don't know why AKTimePitch *akTimePitch = [[AKTimePitch alloc] init:player rate:2.0 pitch:1600 overlap:8]; on its own does not work...
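For reference, a minimal sketch of that workaround, re-applying the parameters after init and before playback. It assumes AKTimePitch exposes rate and pitch as settable properties from Objective-C, which the line above already relies on for pitch:

AKTimePitch *akTimePitch = [[AKTimePitch alloc] init:player rate:2.0 pitch:1600 overlap:8];
// Workaround: the values passed to init appear to be ignored, so set the
// properties again before starting playback.
akTimePitch.rate = 2.0;
akTimePitch.pitch = 1600;

AudioKit.output = akTimePitch;
[akTimePitch start];
[AudioKit startAndReturnError:nil];
[player playFrom:0.0];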
In my iOS application, I'm trying to play a list of videos downloaded to the application's Documents directory. To achieve that, I used AVQueuePlayer. The following code leads to an app crash after the playlist has looped six or seven times.
@interface PlayYTVideoViewController () <NSURLConnectionDataDelegate, UITableViewDataSource, UITableViewDelegate>
{
    AVQueuePlayer *avQueuePlayer;
}

- (void)playlistLoop
{
    NSLog(@"%s - %d", __PRETTY_FUNCTION__, __LINE__);

    lastPlayedVideoNumber = 0;
    _loadingVideoLabel.hidden = YES;

    avPlayerItemsMutArray = [[NSMutableArray alloc] init];
    for (NSString *videoPath in clipUrlsMutArr)
    {
        NSURL *vidPathUrl = [NSURL fileURLWithPath:videoPath];
        AVPlayerItem *avpItem = [AVPlayerItem playerItemWithURL:vidPathUrl];
        [avPlayerItemsMutArray addObject:avpItem];
    }

    avPlayerItemsArray = [avPlayerItemsMutArray copy];

    for (AVPlayerItem *item in avPlayerItemsArray)
    {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(itemDidPlayToEndTime:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:item];
    }

    avQueuePlayer = [AVQueuePlayer queuePlayerWithItems:avPlayerItemsArray];
    avQueuePlayer.actionAtItemEnd = AVPlayerActionAtItemEndAdvance;

    introVideoLayer = [AVPlayerLayer playerLayerWithPlayer:avQueuePlayer];
    introVideoLayer.frame = _mpIntroVideoView.bounds;
    [_mpContainerView.layer addSublayer:introVideoLayer];

    [avQueuePlayer play];
}

- (void)itemDidPlayToEndTime:(NSNotification *)notification
{
    NSLog(@"%s - %d", __PRETTY_FUNCTION__, __LINE__);

    AVPlayerItem *endedAVPlayerItem = [notification object];
    [endedAVPlayerItem seekToTime:kCMTimeZero];

    for (AVPlayerItem *item in avPlayerItemsArray)
    {
        if (item == endedAVPlayerItem)
        {
            lastPlayedVideoNumber++;
            break;
        }
    }
    [self reloadVideoClipsTable];

    if ([endedAVPlayerItem isEqual:[avPlayerItemsArray lastObject]])
    {
        [self playlistLoop];
    }
}
After hitting this memory issue, I tried making some changes to the above code.
First I made avQueuePlayer public and declared it as a strong property:
@property (strong, nonatomic) AVQueuePlayer *avQueuePlayer;
By doing that I expected the avQueuePlayer variable to remain in memory until it is manually set to nil, but that didn't solve the problem.
Then I tried setting the player, the related arrays, and the layer to nil and recreating them for each new loop session:
if (avPlayerItemsMutArray != nil)
{
    avPlayerItemsMutArray = nil;
}
avPlayerItemsMutArray = [[NSMutableArray alloc] init];

if (avPlayerItemsArray != nil)
{
    avPlayerItemsArray = nil;
}
avPlayerItemsArray = [avPlayerItemsMutArray copy];

if (avQueuePlayer != nil)
{
    avQueuePlayer = nil;
}
avQueuePlayer = [AVQueuePlayer queuePlayerWithItems:avPlayerItemsArray];

if (introVideoLayer != nil)
{
    [introVideoLayer removeFromSuperlayer];
    introVideoLayer = nil;
}
introVideoLayer = [AVPlayerLayer playerLayerWithPlayer:avQueuePlayer];
But that also didn't solve the issue.
Next I tried removing the observer before everything is re-initialized in a new loop:
if (avPlayerItemsArray != nil)
{
    avPlayerItemsArray = nil;
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:nil];
}
But that also didn't help.
Next I used Instruments to examine memory usage and leaks. The application does not exceed 18 MB when it crashes, and more than 200 MB of memory remains free. Instruments is a little complicated, but I still didn't find any memory leaks related to this code.
Actually the error was not with the AVQueuePlayer. In my application I list all the videos in a table below the video playing view. Each row of that table shows a video thumbnail that I generate with the code below.
+ (UIImage *)createThumbForVideo:(NSString *)vidFileName
{
    NSString *videoFolder = [Video getVideoFolder];
    NSString *videoFilePath = [videoFolder stringByAppendingFormat:@"/trickbook/videos/edited/%@", vidFileName];
    NSURL *url = [NSURL fileURLWithPath:videoFilePath];

    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generateImg = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generateImg.appliesPreferredTrackTransform = YES;

    NSError *error = NULL;
    CMTime time = CMTimeMake(1, 65);
    CGImageRef refImg = [generateImg copyCGImageAtTime:time actualTime:NULL error:&error];
    UIImage *frameImage = [[UIImage alloc] initWithCGImage:refImg];
    CGImageRelease(refImg); // copyCGImageAtTime returns a +1 CGImageRef that must be released
    return frameImage;
}
Every time a video clip finishes playing, and also whenever the playlist begins a new loop, I update the table view. So I call the above method each time, and that is the reason for the memory issue.
As the solution, I call this method only once per video clip and store the returned UIImage in a mutable collection, as sketched below. That solved the issue.
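A minimal sketch of that caching idea, using an NSMutableDictionary keyed by file name rather than an array; the cache property and the wrapper method are hypothetical names, and it assumes createThumbForVideo: lives on the same view controller:

// Hypothetical cache, declared on the view controller and initialized once (e.g. in viewDidLoad):
// @property (strong, nonatomic) NSMutableDictionary *thumbnailCache;

- (UIImage *)thumbnailForVideo:(NSString *)vidFileName
{
    UIImage *cached = self.thumbnailCache[vidFileName];
    if (cached != nil)
    {
        return cached; // reuse the thumbnail generated on the first pass
    }

    UIImage *thumb = [[self class] createThumbForVideo:vidFileName];
    if (thumb != nil)
    {
        self.thumbnailCache[vidFileName] = thumb;
    }
    return thumb;
}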
The heading of the question and the tags may not match the answer exactly, but I thought this was worth keeping as a Q & A rather than deleting the post.
I need the iOS camera to take a picture without any input from the user. How would I go about doing this? This is my code so far:
- (void)initCapture
{
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] init];

    AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey,
                                    nil];
    [newStillImageOutput setOutputSettings:outputSettings];

    AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

    if ([newCaptureSession canAddInput:newVideoInput]) {
        [newCaptureSession addInput:newVideoInput];
    }
    if ([newCaptureSession canAddOutput:newStillImageOutput]) {
        [newCaptureSession addOutput:newStillImageOutput];
    }

    self.stillImageOutput = newStillImageOutput;
}
What else do I need to add and where do I go from here? I don't want to take a video, only a single still image. Also, how would I convert the image into a UIImage afterwards? Thanks
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureSession_Class/Reference/Reference.html
Look at AVCaptureSession to see how to take a picture without user input. You will need to add an AVCaptureDevice for the camera to the session.
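For example, a minimal sketch of adding the camera input, reusing the newCaptureSession variable from the question's code (error handling elided):

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (videoInput != nil && [newCaptureSession canAddInput:videoInput])
{
    // Use a real device input instead of the bare [[AVCaptureDeviceInput alloc] init]
    [newCaptureSession addInput:videoInput];
}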
You've got an AVCaptureStillImageOutput, stillImageOutput. Plus you need to have stored your capture session where you can get at it later. Call it self.sess. Then:
AVCaptureConnection *vc = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[self.sess startRunning];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:vc
                                                    completionHandler:
 ^(CMSampleBufferRef buf, NSError *err) {
     NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buf];
     UIImage *im = [UIImage imageWithData:data];
     dispatch_async(dispatch_get_main_queue(), ^{
         UIImageView *iv = [[UIImageView alloc] initWithFrame:someframe];
         iv.contentMode = UIViewContentModeScaleAspectFit;
         iv.image = im;
         [self.view addSubview:iv];
         [self.iv removeFromSuperview]; // in case we already did this
         self.iv = iv;
         [self.sess stopRunning];
     });
 }];
As @Salil says, Apple's official AVFoundation documentation is very useful; take a look at it to understand the main concepts. Then you can download the sample code to see how to achieve the goal.
Here is the sample code to take still images with a preview layer.
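As a rough illustration only (not the linked sample itself), attaching a preview layer to the capture session from the answer above, self.sess, looks something like this:

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.sess];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.frame = self.view.bounds;
// Insert behind other subviews so the live camera feed acts as a background
[self.view.layer insertSublayer:previewLayer atIndex:0];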
I've been struggling with this for a couple of days now. I'm processing a video with a filter, and it saves the video just fine. However, after it's saved, it takes a long time for the UI to update. I can see the video in iTunes (via iTunes file sharing) long before the UI is updated.
I create the view like this and add it to my view controller. This is just so the user can preview the video and select a filter.
- (GPUImageView *)playClipWithClip:(MYClip *)clip
{
    _clip = clip;
    _filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 568, 320)];

    _movieFile = [[GPUImageMovie alloc] initWithURL:[self urlForCurrentClip]];
    _movieFile.runBenchmark = NO;
    _movieFile.playAtActualSpeed = YES;
    _movieFile.shouldRepeat = YES;

    [self changeFilter];
    return _filterView;
}
When the user wants to save the video I have this method:
- (void)saveClipWithFilter
{
    [_movieFile cancelProcessing];

    _movieFile.runBenchmark = YES;
    _movieFile.playAtActualSpeed = NO;
    _movieFile.shouldRepeat = NO;

    NSString *movieName = [self fileNameForGeneratedClip];
    NSString *generatedMovieNameWithPath = [NSString stringWithFormat:@"Documents/%@", movieName];
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:generatedMovieNameWithPath];
    unlink([pathToMovie UTF8String]);
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

    _movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(568, 320.0)];
    [_filter addTarget:_movieWriter];
    _movieWriter.shouldPassthroughAudio = NO;
    [_movieFile enableSynchronizedEncodingUsingMovieWriter:_movieWriter];

    [_movieWriter startRecording];
    [_movieFile startProcessing];

    __weak typeof(self) weakSelf = self;
    [_movieWriter setCompletionBlock:^{
        NSLog(@"**************************** DONE ****************************");
        [weakSelf.filter removeTarget:weakSelf.movieWriter];
        [weakSelf.movieWriter finishRecording];
        [weakSelf exitVideoEditingModeAndSave];
    }];
}
My method [weakSelf exitVideoEditingModeAndSave]; is called, and that method in turn calls the delegate (my view controller).
The problem is that after my delegate is called and my NSLog output appears, it takes about 10 seconds for the view to update. I know that the file is ready and has been saved.
Any ideas?
This is a threading issue: in your completion block, dispatch to the main thread before you update any UI elements.
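For example, a minimal sketch applied to the completion block from the question, assuming exitVideoEditingModeAndSave only performs UI work:

[_movieWriter setCompletionBlock:^{
    [weakSelf.filter removeTarget:weakSelf.movieWriter];
    [weakSelf.movieWriter finishRecording];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit must only be touched on the main thread; GPUImage invokes this
        // completion block on its background video-processing queue.
        [weakSelf exitVideoEditingModeAndSave];
    });
}];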
I am using the GPUImage framework and it is very good. In my app, I record a video and then add an overlay on the recorded video (due to my requirements).
Step 1: Recording the video works fine and I get the correct video.
Step 2: Adding the overlay to the video file is where I run into the issue; I get a black video as output, with a small file size.
My code is as below:
self.movieFile = [[GPUImageMovie alloc] initWithURL:capturevideoURL];
self.movieFile.delegate = self;
self.movieFile.runBenchmark = YES;
self.movieFile.playAtActualSpeed = YES;
[self.movieFile addTarget:self.filter];

UIImage *inputImage = self.overlayImage;
GPUImagePicture *overlayPicture = [[GPUImagePicture alloc] initWithImage:inputImage];
[overlayPicture addTarget:self.filter];
[overlayPicture processImage];

// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)self.view;
[self.filter addTarget:filterView];

[CommonFunction DeleteFileFromApp:@"Movie.m4v"];
NSString *pathToMovie = [CommonFunction GetPathForFileName:@"Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

self.movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(320.0, 480.0)];
[self.filter addTarget:self.movieWriter];

// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
self.movieWriter.shouldPassthroughAudio = YES;
self.movieFile.audioEncodingTarget = self.movieWriter;
[self.movieFile enableSynchronizedEncodingUsingMovieWriter:self.movieWriter];

[self.movieWriter startRecording];
[self.movieFile startProcessing];
NSLog(@"Start Overlaying...");

__weak GPUImageNormalBlendFilter *filterlocal = self.filter;
__weak GPUImageMovieWriter *moviewriterlocal = self.movieWriter;
__weak typeof(self) weakSelf = self;
[self.movieWriter setCompletionBlock:^{
    [filterlocal removeTarget:moviewriterlocal];
    [moviewriterlocal finishRecording];
    [weakSelf loadVideo];
}];
Can you suggest how to solve this issue of black video output?
Thanks