Objective-C: Save zoomed captured image from camera with GPUImage? - iOS

I have implemented zoom functionality in the camera with GPUImage. But when I capture an image from the camera with zoom applied and save it, it is saved as a normal picture (no zoom applied). I want the image saved to the album to match whichever zoom mode I captured it in. How can I solve this problem? Any suggestion would be great. Thanks, guys. My code:
- (void)viewDidLoad {
    [super viewDidLoad];
    self.library = [[ALAssetsLibrary alloc] init];
    [self setViewLayOut];
    [self setupFilter];
    [self setZoomFunctionlityOnCamera];
}
- (void)setupFilter
{
    videoCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    switch (filterType)
    {
        case GPUIMAGE_COLORINVERT:
        {
            self.title = @"Color Negative";
            filter = [[GPUImageColorInvertFilter alloc] init];
        }
        break;
        case GPUIMAGE_GRAYSCALE:
        {
            self.title = @"Black and White Positive";
            filter = [[GPUImageGrayscaleFilter alloc] init];
        }
        break;
        default:
            filter = [[GPUImageFilter alloc] init];
            self.title = @"Color Positive";
            break;
    }
    videoCamera.runBenchmark = YES;
    filterView = (GPUImageView *)cameraView;
    [filter addTarget:filterView];
    [videoCamera addTarget:filter];
    [videoCamera startCameraCapture];
}
- (IBAction)clickPhotoBtn:(id)sender {
    if (!isCameraPermissionAccessed) {
        [self showAccessDeinedMessage:@"Camera permission denied" withMessage:@"To enable, please go to settings and allow camera permission for this app."];
        return;
    }
    [videoCamera capturePhotoAsJPEGProcessedUpToFilter:filter withCompletionHandler:^(NSData *processedJPEG, NSError *error){
        if (error != nil)
        {
            [self showErrorMessage:@"Unable to capture image"];
            return;
        }
        else {
            UIImage *image = [UIImage imageWithData:processedJPEG];
            if (filterType == GPUIMAGE_GRAYSCALE) {
                GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];
                GPUImageColorInvertFilter *stillImageFilter = [[GPUImageColorInvertFilter alloc] init];
                [stillImageSource addTarget:stillImageFilter];
                [stillImageFilter useNextFrameForImageCapture];
                [stillImageSource processImage];
                UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentFramebuffer];
                UIImageWriteToSavedPhotosAlbum(currentFilteredVideoFrame, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            }
            else {
                UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            }
        }
    }];
}

Use the code below; it may be helpful to you. The still camera always returns the full (un-zoomed) frame, so crop the captured image to the zoomed region before saving:
+ (UIImage *)croppedImageWithImage:(UIImage *)image zoom:(CGFloat)zoom
{
    // Crop the centered 1/zoom portion of the image, which corresponds
    // to the region that was visible in the zoomed preview.
    CGFloat zoomReciprocal = 1.0f / zoom;
    CGPoint offset = CGPointMake(image.size.width * ((1.0f - zoomReciprocal) / 2.0f),
                                 image.size.height * ((1.0f - zoomReciprocal) / 2.0f));
    CGRect croppedRect = CGRectMake(offset.x, offset.y,
                                    image.size.width * zoomReciprocal,
                                    image.size.height * zoomReciprocal);
    CGImageRef croppedImageRef = CGImageCreateWithImageInRect([image CGImage], croppedRect);
    UIImage *croppedImage = [[UIImage alloc] initWithCGImage:croppedImageRef scale:[image scale] orientation:[image imageOrientation]];
    CGImageRelease(croppedImageRef);
    return croppedImage;
}
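For context, here is a minimal sketch of how the helper could be called from the capture completion handler. It assumes `currentScale` holds the zoom factor applied by your pinch gesture and that the helper is defined on the same class:
[videoCamera capturePhotoAsJPEGProcessedUpToFilter:filter withCompletionHandler:^(NSData *processedJPEG, NSError *error){
    if (error != nil) {
        return;
    }
    UIImage *image = [UIImage imageWithData:processedJPEG];
    // Crop to the zoomed region before saving, so the saved photo
    // matches what was shown in the preview.
    UIImage *zoomedImage = [[self class] croppedImageWithImage:image zoom:currentScale];
    UIImageWriteToSavedPhotosAlbum(zoomedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}];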

Related

iOS - Unable to save image into custom folder after deleting that custom folder from photos album

I want to save an image into a custom folder on iPhone. I have done it.
But I am facing a strange problem: when I run my code, it works fine. But when I delete that custom folder from the iPhone and then run my code again, the custom folder is not created.
If I change the custom folder name in my code, it is created with the new name. But I want to keep the same folder name even after the user deletes that custom folder; I don't want to create another custom folder.
My Code:
@interface CameraVC ()
{
    ALAssetsLibrary *library;
    GPUImageStillCamera *videoCamera;
}

- (IBAction)clickPhotoBtn:(id)sender {
    if (!isCameraPermissionAccessed) {
        [self showAccessDeinedMessage:@"Camera permission denied" withMessage:@"To enable, please go to settings and allow camera permission for this app."];
        return;
    }
    [videoCamera capturePhotoAsJPEGProcessedUpToFilter:filter withCompletionHandler:^(NSData *processedJPEG, NSError *error){
        if (error != nil)
        {
            [self showErrorMessage:@"Unable to capture image"];
            return;
        }
        else {
            UIImage *image = [UIImage imageWithData:processedJPEG];
            // Crop the captured image to the currently zoomed region.
            CGFloat zoomReciprocal = 1.0f / currentScale;
            CGPoint offset = CGPointMake(image.size.width * ((1.0f - zoomReciprocal) / 2.0f), image.size.height * ((1.0f - zoomReciprocal) / 2.0f));
            CGRect croppedRect = CGRectMake(offset.x, offset.y, image.size.width * zoomReciprocal, image.size.height * zoomReciprocal);
            CGImageRef croppedImageRef = CGImageCreateWithImageInRect([image CGImage], croppedRect);
            UIImage *croppedImage = [[UIImage alloc] initWithCGImage:croppedImageRef scale:[image scale] orientation:[image imageOrientation]];
            CGImageRelease(croppedImageRef);
            if (filterType == GPUIMAGE_GRAYSCALE && !isPhotoBlackAndWhite) {
                GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:croppedImage];
                GPUImageColorInvertFilter *stillImageFilter = [[GPUImageColorInvertFilter alloc] init];
                [stillImageSource addTarget:stillImageFilter];
                [stillImageFilter useNextFrameForImageCapture];
                [stillImageSource processImage];
                UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentFramebuffer];
                [self.library saveImage:currentFilteredVideoFrame toAlbum:@"myCustomFolder" withCompletionBlock:^(NSError *error) {
                    if (error != nil) {
                        [self showErrorMessage:@"Unable to save image in photos album, please ensure this app has permission to access the photos album to save images."];
                        return;
                    }
                    else {
                        NSString *message = @"Image saved";
                        UIAlertView *toast = [[UIAlertView alloc] initWithTitle:nil
                                                                        message:message
                                                                       delegate:nil
                                                              cancelButtonTitle:nil
                                                              otherButtonTitles:nil, nil];
                        [toast show];
                        int duration = 1;
                        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, duration * NSEC_PER_SEC), dispatch_get_main_queue(), ^{
                            [toast dismissWithClickedButtonIndex:0 animated:YES];
                        });
                    }
                }];
            }
        }
    }];
}
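Note that saveImage:toAlbum:withCompletionBlock: is not part of ALAssetsLibrary itself; it presumably comes from a custom-album category such as ALAssetsLibrary+CustomPhotoAlbum. A commonly reported cause of this behavior is that once the user deletes an app-created album, addAssetsGroupAlbumWithName: invokes its resultBlock with a nil group (iOS still treats the name as taken), so the category never finds a group to add the asset to. Below is a minimal find-or-create sketch of what such a category typically does, with logging at each step so you can see where it fails; saveImage:toAlbumNamed: is a hypothetical method name:
// A sketch under the assumption that self.library is the ALAssetsLibrary above.
- (void)saveImage:(UIImage *)image toAlbumNamed:(NSString *)albumName
{
    [self.library writeImageToSavedPhotosAlbum:image.CGImage
                                   orientation:(ALAssetOrientation)image.imageOrientation
                               completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) { NSLog(@"write failed: %@", error); return; }
        // Try to create the album; the resultBlock receives nil if iOS
        // believes the album already exists (including after user deletion).
        [self.library addAssetsGroupAlbumWithName:albumName
                                      resultBlock:^(ALAssetsGroup *group) {
            if (group == nil) {
                NSLog(@"Album '%@' reported as existing (possibly deleted by the user).", albumName);
            }
            // Either way, enumerate albums and add the asset to the matching one.
            [self.library enumerateGroupsWithTypes:ALAssetsGroupAlbum
                                        usingBlock:^(ALAssetsGroup *candidate, BOOL *stop) {
                if ([[candidate valueForProperty:ALAssetsGroupPropertyName] isEqualToString:albumName]) {
                    [self.library assetForURL:assetURL
                                  resultBlock:^(ALAsset *asset) { [candidate addAsset:asset]; }
                                 failureBlock:^(NSError *e) { NSLog(@"asset lookup failed: %@", e); }];
                    *stop = YES;
                }
            } failureBlock:^(NSError *e) { NSLog(@"enumeration failed: %@", e); }];
        } failureBlock:^(NSError *e) { NSLog(@"album creation failed: %@", e); }];
    }];
}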

iOS capture image from GPUImage

I am trying to capture an image in an app that uses GPUImage. I have the camera set up like this:
self.videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPresetHigh
    cameraPosition:AVCaptureDevicePositionBack];
_videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
[_videoCamera startCameraCapture];
[_videoCamera addTarget:sample1ImageView];
and I use a custom filter:
radFilter = [[GPUImageCustomFilter alloc] init];
[_videoCamera addTarget:cusFilter];
[cusFilter addTarget:imageView];
I then use this code for the camera capture:
[_videoCamera pauseCameraCapture];
[radFilter forceProcessingAtSize:CGSizeMake(600, 600)];
[radFilter useNextFrameForImageCapture];
UIImage *capturedImage = [radFilter imageFromCurrentFramebuffer];
UIImageWriteToSavedPhotosAlbum(capturedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
[_videoCamera resumeCameraCapture];
and all I get is white pictures, with RGB 0,0,0.
I tried saving both in an IBAction and in a rac_signalForControlEvents; I used dispatch, but nothing changed. Can anyone tell me what I am doing wrong?
Thank you,
Alex
Try using GPUImageStillCamera like this.
In your .h file:
GPUImageStillCamera *stillCamera;
GPUImageView *filterView;
In your .m file's viewDidLoad:
selectedFilter = [[GPUImageFilter alloc] init];
filterView = [[GPUImageView alloc] init];
stillCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionFront];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
[stillCamera addTarget:selectedFilter];
[selectedFilter addTarget:filterView];
[stillCamera startCameraCapture];
On the UIButton's click event for capturing the image, do this; I hope it helps:
[stillCamera capturePhotoAsImageProcessedUpToFilter:selectedFilter withCompletionHandler:^(UIImage *processedImage, NSError *error)
{
    UIImageWriteToSavedPhotosAlbum(processedImage, self, nil, nil);
}];
You can use this code to capture the image.
- (UIImage *)screenshot {
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
- (IBAction)btnCaptureClicked:(id)sender
{
    [videoCamera pauseCameraCapture];
    [filter useNextFrameForImageCapture];
    //[filter imageFromCurrentFramebuffer];
    UIImage *capturedImage = [self screenshot];
    if (capturedImage != nil)
    {
        UIImageWriteToSavedPhotosAlbum(capturedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
    }
    [videoCamera resumeCameraCapture];
}

How to apply brightness, contrast, gamma on image while moving slider in iOS?

Hi, I am using GPUImage filters to apply brightness to an image, but it does not work, and I could not find where the problem is. My code is like this:
- (void)viewDidLoad
{
    self.slider.hidden = YES;
    slider.minimumValue = 0.0;
    slider.maximumValue = 1.0;
    slider.value = 0.1;
}
- (IBAction)brightnessClicked:(id)sender
{
    self.slider.hidden = NO;
}
- (IBAction)sliderMoved:(id)sender
{
    image_p = [[GPUImagePicture alloc] initWithImage:image3];
    GPUImageBrightnessFilter *filter1 = [[GPUImageBrightnessFilter alloc] init];
    [image_p addTarget:filter1];
    [image_p processImage];
    // Note: this sets brightness on 'filter', not the 'filter1' added as a target above.
    [(GPUImageBrightnessFilter *)filter setBrightness:[(UISlider *)sender value]];
    // UIImageWriteToSavedPhotosAlbum([filter1 imageFromCurrentlyProcessedOutput], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
    //self.thirdImgView.image = [image3 brightness:(1+value-0.5)];
    // self.thirdImgView.image = ;
}
If it's not possible with GPUImage filters, tell me another way. I have tried several approaches, but none work, so please suggest how to do this. Any help is appreciated.
Try this:
- (void)viewDidLoad
{
    [sliderChange setMinimumValue:-0.5];
    [sliderChange setMaximumValue:0.5];
    [sliderChange setValue:0.0];
    brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
}
- (IBAction)upDateSliderValue:(id)sender
{
    GPUImagePicture *fx_image;
    fx_image = [[GPUImagePicture alloc] initWithImage:originalImage];
    [brightnessFilter setBrightness:self.sliderChange.value];
    [fx_image addTarget:brightnessFilter];
    [fx_image processImage];
    UIImage *final_image = [brightnessFilter imageFromCurrentlyProcessedOutput];
    self.selectedImageView.image = final_image;
}
- (UIImage *)getNewImageFromImage:(UIImage *)image withBrightness:(CGFloat)brightnessValue
{
    // Pass the image through a brightness filter.
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];
    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    [brightnessFilter setBrightness:brightnessValue];
    [stillImageSource addTarget:brightnessFilter];
    [stillImageSource processImage];
    return [brightnessFilter imageFromCurrentlyProcessedOutputWithOrientation:UIImageOrientationUp];
}
To make it darker than the default, pass negative values.
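For completeness, a minimal sketch of wiring the helper to the slider, assuming originalImage and selectedImageView from the snippets above:
- (IBAction)sliderMoved:(UISlider *)sender
{
    // Re-filter the untouched original each time so brightness
    // adjustments don't accumulate on an already-filtered image.
    self.selectedImageView.image = [self getNewImageFromImage:originalImage
                                               withBrightness:sender.value];
}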

MKMapSnapshotter with MKPolylineRenderer problems

I tried implementing this answer: https://stackoverflow.com/a/22716610 to the problem of adding overlays to MKMapSnapshotter in iOS 7 (can't use the renderInContext method). I did this as shown below, but the returned image has only the map with no overlays. Forgive me, I am quite new to this. Thanks.
- (void)mapViewDidFinishRenderingMap:(MKMapView *)mapView fullyRendered:(BOOL)fullyRendered
{
    if (mapView.tag == 100) {
        MKMapSnapshotOptions *options = [[MKMapSnapshotOptions alloc] init];
        options.region = mapView.region;
        options.size = mapView.frame.size;
        options.scale = [[UIScreen mainScreen] scale];
        MKMapSnapshotter *snapshotter = [[MKMapSnapshotter alloc] initWithOptions:options];
        [snapshotter startWithCompletionHandler:^(MKMapSnapshot *snapshot, NSError *error) {
            if (error) {
                NSLog(@"[Error] %@", error);
                return;
            }
            UIImage *image = snapshot.image;
            UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
            {
                [image drawAtPoint:CGPointMake(0, 0)];
                CGContextRef context = UIGraphicsGetCurrentContext();
                CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
                CGContextSetLineWidth(context, 5.0f);
                CGContextBeginPath(context);
                bool first = YES;
                NSArray *overlays = mapView.overlays;
                for (id <MKOverlay> overlay in overlays) {
                    CGPoint point = [snapshot pointForCoordinate:overlay.coordinate];
                    if (first)
                    {
                        first = NO;
                        CGContextMoveToPoint(context, point.x, point.y);
                    }
                    else {
                        CGContextAddLineToPoint(context, point.x, point.y);
                    }
                }
                UIImage *compositeImage = UIGraphicsGetImageFromCurrentImageContext();
                NSData *data = UIImagePNGRepresentation(compositeImage);
                placeToSave = data;
                NSLog(@"MapView Snapshot Saved.");
                //show image for debugging
                UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 200, 320, 320)];
                imageView.image = compositeImage;
                [self.view addSubview:imageView];
            }
            UIGraphicsEndImageContext();
        }];
        [mapView setHidden:YES];
    }
}
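Two likely issues in the snippet above: the path is built but never stroked, and overlay.coordinate yields only one center point per overlay rather than the polyline's vertices. A sketch of the drawing section with both points addressed (assuming the overlays are MKPolyline instances; adapt as needed):
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
CGContextSetLineWidth(context, 5.0f);
CGContextBeginPath(context);
for (id<MKOverlay> overlay in mapView.overlays) {
    if (![overlay isKindOfClass:[MKPolyline class]]) continue;
    MKPolyline *polyline = (MKPolyline *)overlay;
    // Walk the polyline's actual vertices instead of its single center coordinate.
    CLLocationCoordinate2D coords[polyline.pointCount];
    [polyline getCoordinates:coords range:NSMakeRange(0, polyline.pointCount)];
    for (NSUInteger i = 0; i < polyline.pointCount; i++) {
        CGPoint point = [snapshot pointForCoordinate:coords[i]];
        if (i == 0) {
            CGContextMoveToPoint(context, point.x, point.y);
        } else {
            CGContextAddLineToPoint(context, point.x, point.y);
        }
    }
}
// Without this call the path is never drawn into the context.
CGContextStrokePath(context);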

GPUImage imageFromCurrentFramebuffer returning nil sometimes for GPUImageLookupFilter and its subclasses

I have been using GPUImage in my project and I am running into a problem where imageFromCurrentFramebuffer returns nil for some of the GPUImageLookupFilters.
I subclassed GPUImageFilterGroup like in GPUImageAmatorkaFilter; my code is as follows:
- (MTLookupFilter *)initWithLookupImageName:(NSString *)lookupImageName {
    self = [super init];
    if (self) {
        UIImage *image = [UIImage imageNamed:lookupImageName];
        self.lookupImageSource = [[GPUImagePicture alloc] initWithImage:image];
        GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];
        [self addFilter:lookupFilter];
        [self.lookupImageSource addTarget:lookupFilter atTextureLocation:1];
        [self.lookupImageSource processImage];
        self.initialFilters = [NSArray arrayWithObjects:lookupFilter, nil];
        self.terminalFilter = lookupFilter;
    }
    return self;
}
I have several objects of this class added to an array, and I use:
- (IBAction)filterAction:(id)sender {
    NSInteger index = arc4random() % self.filtersArray.count;
    id filter = self.filtersArray[index];
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:self.fullImage];
    UIImage *filteredimage = nil;
    [stillImageSource addTarget:filter];
    [stillImageSource processImage];
    [filter useNextFrameForImageCapture];
    filteredimage = [filter imageFromCurrentFramebuffer];
    if (filteredimage) {
        self.imageView.image = filteredimage;
    } else {
        NSLog(@"Filtered image is nil");
    }
}
The image returned from imageFromCurrentFramebuffer is sometimes nil, and I do not understand its cause. I would be thankful for any help. Sometimes the image is nil even for the stock filters GPUImageAmatorkaFilter, GPUImageSoftEleganceFilter, and GPUImageMissEtikateFilter, so I know it is not a problem with my subclass.
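Two GPUImage usage requirements commonly explain intermittent nil results here: useNextFrameForImageCapture must be called before processImage (in the snippet above it is called after), and the GPUImagePicture source must stay strongly referenced until readback completes, or its framebuffer may be reclaimed. A minimal reordered sketch, assuming a stillImageSource property is added to the class to hold the source:
- (IBAction)filterAction:(id)sender {
    NSInteger index = arc4random_uniform((uint32_t)self.filtersArray.count);
    GPUImageOutput<GPUImageInput> *filter = self.filtersArray[index];
    // Keep a strong reference so the source isn't deallocated
    // before the framebuffer is read back.
    self.stillImageSource = [[GPUImagePicture alloc] initWithImage:self.fullImage];
    [self.stillImageSource addTarget:filter];
    // Must be called BEFORE processImage, otherwise the framebuffer
    // may not be retained and imageFromCurrentFramebuffer returns nil.
    [filter useNextFrameForImageCapture];
    [self.stillImageSource processImage];
    UIImage *filteredImage = [filter imageFromCurrentFramebuffer];
    if (filteredImage) {
        self.imageView.image = filteredImage;
    } else {
        NSLog(@"Filtered image is nil");
    }
}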
