I am trying to overlay some text on a video and have not had any success so far.
videoCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0, 0, 1, 1)];
mCurrentImage = [UIImage imageNamed:@"tex16"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:mCurrentImage smoothlyScaleOutput:NO];
[sourcePicture processImage];
customFilter = [[GPUFilter alloc] initWithFragmentShaderFromFile:@"shader"];
[videoCamera addTarget:cropFilter];
[cropFilter addTarget:customFilter atTextureLocation:0];
[sourcePicture addTarget:customFilter atTextureLocation:1];
[customFilter addTarget:mViewCameraPreview];
blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
UILabel *timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0f, 320.0f)];
timeLabel.font = [UIFont systemFontOfSize:17.0f];
timeLabel.text = @"Time: 0.0 s";
timeLabel.textAlignment = NSTextAlignmentCenter;
timeLabel.backgroundColor = [UIColor clearColor];
timeLabel.textColor = [UIColor whiteColor];
uiElementInput = [[GPUImageUIElement alloc] initWithView:timeLabel];
[customFilter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:mViewCameraPreview];
[videoCamera startCameraCapture];
Everything compiles and runs without throwing any exceptions; however, there is no text to be found.
Does anyone see what I am doing wrong?
Thanks.
Did you try performing an update of uiElementInput right after starting camera capture? If not, please try adding this code at the end of the code you provided:
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[customFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime){
[weakUIElementInput update];
}];
If that doesn't work, or if you've already done it, try a basic filter (no custom filter) to see whether the problem persists, as in the sketch below.
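For reference, here's a minimal sketch of that simplified chain, reusing the variables from your snippet (a plain GPUImageFilter stands in for the custom shader, and only the blend output drives the preview):
GPUImageFilter *basicFilter = [[GPUImageFilter alloc] init]; // passthrough filter, no custom shader
[videoCamera addTarget:basicFilter];
[basicFilter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:mViewCameraPreview];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[basicFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime frameTime){
    [weakUIElementInput update]; // re-render the label for every camera frame
}];
[videoCamera startCameraCapture];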
I keep getting this error on and off. I've seen some solutions that recommend using GPUImageNormalBlendFilter in the filter chain; however, doing so has resulted in a solid grey output.
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionFront];
filter = [[GPUImageFilter alloc] init];
[videoCamera addTarget:filter];
blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
animatedImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
[contentView addSubview:animatedImageView];
[contentView addSubview:[self watermark]];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:filteredVideoView];
(filteredVideoView is a GPUImageView.)
You must use setFrameProcessingCompletionBlock and invoke the uiElementInput's update method:
__weak typeof(self) weakSelf = self;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
[weakSelf.uiElementInput update];
}];
[videoCamera startCameraCapture];
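For context: GPUImageUIElement does not re-render on its own. It only pushes a new snapshot of its view into the pipeline when update is called, so driving update from the filter's per-frame completion block is what keeps the overlay present and in sync with the video.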
I have tried most of the GPUImage (GPUImageMovieWriter) questions related to exporting video. I wrote the code below to create and export the video, but it gives a black screen and does not export anything.
In the ViewController.h file:
{
GPUImageMovie *movieFile;
GPUImageOutput<GPUImageInput> *filter;
GPUImageMovieWriter *movieWriter;
GPUImageUIElement *uiElementInput;
}
@property (nonatomic,strong) IBOutlet UIView *vwVideo;
In the ViewController.m file:
- (void) editVideo {
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sophie" withExtension:@"mov"];
GPUImagePicture *overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"iphone-icon-180.png"] smoothlyScaleOutput:YES];
[overlay processImage];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter*)filter setBrightness:0.0];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
UIImageView *ivTemp = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 147, 59)];
ivTemp.image = [UIImage imageNamed:@"iphone-icon-180.png"];
[contentView addSubview:ivTemp];
UILabel *lblDemo = [[UILabel alloc] initWithFrame:CGRectMake(0, 100, 100, 30)];
lblDemo.text = @"Did";
lblDemo.font = [UIFont systemFontOfSize:30];
lblDemo.textColor = [UIColor redColor];
lblDemo.tag = 1;
lblDemo.hidden = YES;
lblDemo.backgroundColor = [UIColor clearColor];
[contentView addSubview:lblDemo];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)vwVideo;
[filter addTarget:filterView];
[movieFile addTarget:filter];
[blendFilter addTarget:filterView];
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update];
}];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mov"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSLog(@"%@",pathToMovie);
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
NSLog(@"Saved");
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
}
This code does not export the video to the Documents directory, nor does it save anything. Please help me: how do I add a watermark to the video?
Note: I also tried AVAssetWriter and AVMutableComposition to create the video, but that works in iOS 7 and not in iOS 8.
I tried your code, and I found that you used the wrong target. I have modified it; you can try it again.
NSURL *sampleURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video" ofType:@"mp4"]];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = NO;
filter = [[GPUImageSepiaFilter alloc] init];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0,
0,
[[UIScreen mainScreen] bounds].size.width,
[[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
UIImageView *ivTemp = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 147, 59)];
ivTemp.image = [UIImage imageNamed:@"db"];
[contentView addSubview:ivTemp];
UILabel *lblDemo = [[UILabel alloc] initWithFrame:CGRectMake(0, 100, 200, 30)];
lblDemo.text = @"Did---啊啊啊";
lblDemo.font = [UIFont systemFontOfSize:30];
lblDemo.textColor = [UIColor redColor];
lblDemo.tag = 1;
lblDemo.hidden = YES;
lblDemo.backgroundColor = [UIColor clearColor];
[contentView addSubview:lblDemo];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:self.view.frame];
[filterView setBackgroundColor:[UIColor redColor]];
[movieFile addTarget:filter];
[blendFilter addTarget:filterView];
[self.view addSubview:filterView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[weakUIElementInput update];
}];
// In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mov"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSLog(@"%@",pathToMovie);
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[blendFilter addTarget:movieWriter];
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
NSLog(@"Saved");
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
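The decisive change is the writer's source: the original code attached movieWriter to filter, which carries only the un-blended video frames, so the overlay never reached the exported file. Attaching it to blendFilter, as above, records the composited output, and the on-screen filterView is fed from blendFilter for the same reason.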
I'm trying to add a GPUImagePicture and a GPUImageUIElement on a video.
GPUImageUIElement is working, but I have a problem with the GPUImagePicture: I only see it on the first frame, and then it disappears.
Here's my code:
filter = [[GPUImageBrightnessFilter alloc] init];
[(GPUImageBrightnessFilter *)filter setBrightness:0];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
GPUImagePicture *overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];
GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
[transformFilter forceProcessingAtSize:CGSizeMake(73, 83)];
[transformFilter setAffineTransform:CGAffineTransformMakeScale(0.7, 0.7)];
[overlay addTarget:transformFilter];
[overlay processImage];
UIView *subview1 = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 400)];
UILabel *temperaturaText = [[UILabel alloc]initWithFrame:CGRectMake(77, 100, 105, 60)];
temperaturaText.text = @"test";
[subview1 addSubview:temperaturaText];
uiElementInput = [[GPUImageUIElement alloc] initWithView:subview1];
[filter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[blendFilter addTarget:filterView];
[overlay addTarget:filterView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
[weakUIElementInput update];
}];
[blendFilter addTarget:movieWriter];
Your problem is that you define your overlay input picture as a local variable within your setup method. You aren't holding a strong reference to it, so it will be deallocated the instant that method finishes, which also removes the image texture and its output from your processing pipeline.
If you want to hold on to an input image, you need to make overlay an instance variable of your class, as you did for your camera or movie inputs above. Then it will persist and the framework can keep using it as an input, as sketched below.
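A minimal sketch of that change, using the same names as your snippet (only the ownership changes; the targets stay as they were):
In the class's ivar block:
{
    GPUImagePicture *overlay; // lives as long as the controller, so its texture stays in the pipeline
}
In the setup method, assign to the ivar instead of declaring a local:
overlay = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Logo.png"] smoothlyScaleOutput:YES];
[overlay addTarget:transformFilter];
[overlay processImage];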
I am trying to implement brightness, contrast, and exposure filters in a single view, the same as you see in the iPhoto app. I have tried to set up a group filter to do this, but it shows a white screen instead of the modified picture. Here is the code I applied.
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc]init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc]init];
[brightnessFilter setBrightness:brightnessValue];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc]init];
[contrastFilter setContrast:contrastValue];
GPUImageExposureFilter *exposureFilter =[[GPUImageExposureFilter alloc]init];
[exposureFilter setExposure:exposureValue];
[groupFilter addFilter:brightnessFilter];
[groupFilter addFilter:contrastFilter];
[groupFilter addFilter:exposureFilter];
GPUImagePicture *stillImage = [[GPUImagePicture alloc] initWithImage:self.imageToModify];
[stillImage addTarget:groupFilter];
[stillImage processImage];
previewPicture.image = [groupFilter imageFromCurrentlyProcessedOutputWithOrientation:self.imageToModify.imageOrientation];
I even tried applying each filter individually, but it still shows a white image. Is the above code correct?
I have also tried using GPUImageFilterPipeline instead of GPUImageFilterGroup but still having the same issue.
For the record, the image is a still image and not a live feed.
You have missed some code statements needed for this, which are given below.
[brightnessFilter addTarget: contrastFilter];
[contrastFilter addTarget: exposureFilter];
[(GPUImageFilterGroup *) groupFilter setInitialFilters:[NSArray arrayWithObject: brightnessFilter]];
[(GPUImageFilterGroup *) groupFilter setTerminalFilter:exposureFilter];
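Put together with the code from the question, the full still-image chain would look something like this (a sketch using the question's brightnessValue, contrastValue, and exposureValue; on newer GPUImage releases you may also need useNextFrameForImageCapture before processImage):
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc] init];
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
[brightnessFilter setBrightness:brightnessValue];
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
[contrastFilter setContrast:contrastValue];
GPUImageExposureFilter *exposureFilter = [[GPUImageExposureFilter alloc] init];
[exposureFilter setExposure:exposureValue];
// Register the filters with the group
[groupFilter addFilter:brightnessFilter];
[groupFilter addFilter:contrastFilter];
[groupFilter addFilter:exposureFilter];
// Chain them internally and declare the group's entry and exit points
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:exposureFilter];
[groupFilter setInitialFilters:[NSArray arrayWithObject:brightnessFilter]];
[groupFilter setTerminalFilter:exposureFilter];
GPUImagePicture *stillImage = [[GPUImagePicture alloc] initWithImage:self.imageToModify];
[stillImage addTarget:groupFilter];
[stillImage processImage];
previewPicture.image = [groupFilter imageFromCurrentlyProcessedOutputWithOrientation:self.imageToModify.imageOrientation];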
Thanks
Here is the full code to apply brightness, contrast, and saturation to a video.
I took the reference code from this link and made changes to it.
Note: this code uses the GPUImage framework.
1) In the .h file:
#import "GPUImage.h"
@interface ViewController : UIViewController
{
GPUImageMovie *movieFile;
GPUImageMovieWriter *movieWriter;
GPUImageUIElement *uiElementInput;
}
2) In the .m file:
- (void)editVideo
{
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"Sample Video" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
movieFile.runBenchmark = YES;
movieFile.playAtActualSpeed = YES;
GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc]init]; //1
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init]; //2
GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init]; //3
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init]; //4
[(GPUImageBrightnessFilter*)brightnessFilter setBrightness:0.10]; // change value between -1.00 to 1.00
[(GPUImageContrastFilter*)contrastFilter setContrast:1.48]; // change value between 0.00 to 4.00
[(GPUImageSaturationFilter*)saturationFilter setSaturation:2.00]; //change value between 0.00 to 2.00
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init]; //5
blendFilter.mix = 0.0;
/* ************************************************** */
UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height-20)];
contentView.backgroundColor = [UIColor clearColor];
uiElementInput = [[GPUImageUIElement alloc] initWithView:contentView];
[brightnessFilter addTarget:contrastFilter];
[contrastFilter addTarget:saturationFilter];
[saturationFilter addTarget:blendFilter];
[uiElementInput addTarget:blendFilter];
[(GPUImageFilterGroup *) groupFilter setInitialFilters:[NSArray arrayWithObject: brightnessFilter]];
[(GPUImageFilterGroup *) groupFilter setTerminalFilter:blendFilter];
[movieFile addTarget:brightnessFilter];
[movieFile addTarget:contrastFilter];
[movieFile addTarget:saturationFilter];
// Only rotate the video for display, leave orientation the same for recording
GPUImageView *filterView = (GPUImageView *)vwVideo;
[brightnessFilter addTarget:filterView];
[contrastFilter addTarget:filterView];
[saturationFilter addTarget:filterView];
[blendFilter addTarget:filterView];
[brightnessFilter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update];
}];
[contrastFilter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update];
}];
[saturationFilter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
if (frameTime.value/frameTime.timescale == 2) {
[contentView viewWithTag:1].hidden = NO;
}
[uiElementInput update];
}];
//In addition to displaying to the screen, write out a processed version of the movie to disk
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:pathToMovie] size:CGSizeMake(640.0, 480.0)];
[brightnessFilter addTarget:movieWriter];
[contrastFilter addTarget:movieWriter];
[blendFilter addTarget:movieWriter];
// Configure this for video from the movie file, where we want to preserve all video frames and audio samples
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);
}];
}
Note: please run this demo example on a device, not in the simulator, to get the correct result.
After the processing is done you will find the processed video on your device; for a different effect, change the values passed to setBrightness, setContrast, and setSaturation (see the code).
Very similar to this answer, except I want to generate a histogram for a still image.
Below is what I'm doing, and it's giving a histogram with all 0 data. Is there some trick to getting this working?
GPUImageFilter *filter = [[GPUImageHistogramFilter alloc] initWithHistogramType:kGPUImageHistogramRGB];
GPUImagePicture *original = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[original addTarget:gammaFilter];
[gammaFilter addTarget:filter];
GPUImageHistogramGenerator *histogramGraph = [[GPUImageHistogramGenerator alloc] init];
[histogramGraph forceProcessingAtSize:CGSizeMake(256.0, 330.0)];
[filter addTarget:histogramGraph];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 0.75;
[blendFilter forceProcessingAtSize:CGSizeMake(256.0, 330.0)];
[original addTarget:blendFilter];
[histogramGraph addTarget:blendFilter];
[blendFilter addTarget:gpuImageView];
[original processImage];
Brad changed some of GPUImage's internals in the latest releases to improve memory management (and it does); you now have to tell a filter to hold onto its frame for still-image capture with -useNextFrameForImageCapture.
UIImage *inputImage = [UIImage imageNamed:@"Lambeau.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *stillImageFilter = [[GPUImageSepiaFilter alloc] init];
[stillImageSource addTarget:stillImageFilter];
[stillImageFilter useNextFrameForImageCapture];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentFramebuffer];
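Applied to the histogram chain above, the same pattern would presumably mean calling useNextFrameForImageCapture on whichever output you capture the image from (e.g. blendFilter) immediately before [original processImage]; without it, the framebuffer can be returned to the cache before you read it, which may be why the histogram data comes back as all zeros.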