Adding Photo Frame To Camera Image on iOS

I am working to create a view in my app that will allow you to take a picture, and then will automatically draw a frame around that image. The issue I am facing is that there is some blank space around the top and sides when I do this. Here is my code:
UPDATE: Running this on an iPhone 6S produces the image shown below; on a 6S Plus, it produces the final image.
- (void)viewDidLoad {
    [super viewDidLoad];
    UIBarButtonItem *takePhoto = [[UIBarButtonItem alloc] initWithTitle:@"Take Picture" style:UIBarButtonItemStylePlain target:self action:@selector(takingPhoto)];
    UIBarButtonItem *savePhoto = [[UIBarButtonItem alloc] initWithTitle:@"Save" style:UIBarButtonItemStylePlain target:self action:@selector(saveIt)];
    self.navigationItem.rightBarButtonItems = @[takePhoto, savePhoto];
}
- (void)takingPhoto {
    ipc = [[UIImagePickerController alloc] init];
    ipc.delegate = self;
    ipc.allowsEditing = YES;
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        ipc.sourceType = UIImagePickerControllerSourceTypeCamera;
        [self presentViewController:ipc animated:YES completion:NULL];
    }
    else
    {
        ipc.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
        [self presentViewController:ipc animated:YES completion:NULL];
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    self.imgView.image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [self applyFilter];
    [picker dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [picker dismissViewControllerAnimated:YES completion:nil];
}
- (void)applyFilter {
    NSLog(@"Running");
    UIImage *borderImage = [UIImage imageNamed:@"IMG_8055.PNG"];
    NSData *dataFromImage = UIImageJPEGRepresentation(self.imgView.image, 1);
    CIImage *beginImage = [CIImage imageWithData:dataFromImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *border = [CIImage imageWithData:UIImagePNGRepresentation(borderImage)];
    CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"]; //@"CISoftLightBlendMode"
    [filter setDefaults];
    [filter setValue:border forKey:@"inputImage"];
    [filter setValue:beginImage forKey:@"inputBackgroundImage"];
    CIImage *outputImage = [filter valueForKey:@"outputImage"];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg); // release the CGImage to avoid leaking it
    self.imgView.image = newImg;
}
In the simulator, it looks like this:
However, when I look in saved photos after I have saved it, it looks like this:
Why does it have the white space on top and right side?
Here is the aforementioned 6S Plus image:
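
One likely cause: CISourceOverCompositing places each input image at its own extent, so if the border PNG's pixel size does not match the photo's (which can differ between devices), the output extent becomes the union of the two rectangles and the uncovered strip renders as blank. A minimal sketch, assuming the border is meant to cover the photo exactly, that scales the border to the photo's extent before compositing:

//scale the border so its extent coincides with the photo's extent
//(beginImage, border, and filter are the objects built in applyFilter above)
CGRect photoRect = [beginImage extent];
CGRect borderRect = [border extent];
CGAffineTransform fit = CGAffineTransformMakeScale(photoRect.size.width / borderRect.size.width,
                                                   photoRect.size.height / borderRect.size.height);
CIImage *scaledBorder = [border imageByApplyingTransform:fit];
[filter setValue:scaledBorder forKey:@"inputImage"];
[filter setValue:beginImage forKey:@"inputBackgroundImage"];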

Related

Objective C - Image from camera to set on UIButton

I am trying to set an image from the camera on my button, but it does not accept the cropped image; by default it uses the original width and height captured by the camera, and because of this the image looks shrunken from top to bottom.
When I tap the UIButton the camera opens, and after the image is captured it is shown on the UIButton:
- (IBAction)ProfImageBtnCliked:(id)sender
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    [picker setAllowsEditing:YES];
    picker.delegate = self;
    if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear] || [UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront])
    {
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
        [self presentViewController:picker animated:YES completion:nil];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Alert" message:@"No Camera Available" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:nil];
    NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1.0);
    [ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
    ProfImgButton.userInteractionEnabled = NO;
    ProfImgButton.contentMode = UIViewContentModeScaleAspectFill;
    ProfImgButton.clipsToBounds = YES;
    CGSize constraint = CGSizeMake(200, 200);
    NSData *imgData = UIImageJPEGRepresentation([self imageWithImage:[UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
    NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
    NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    NSLog(@"%@", imageString);
    //assign base64 image to image string
    Base64ImageString = [imageString stringByReplacingOccurrencesOfString:@"+" withString:@"%2B"];
}
//reduce image size
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
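
The helper above draws the image straight into newSize, ignoring its aspect ratio, which matches the "shrunk from top to bottom" symptom described. A sketch of an aspect-fill variant (the method name is mine, not from the question) that scales proportionally and crops the overflow instead of stretching:

//scale preserving aspect ratio, filling newSize and cropping any overflow
//(hypothetical helper, not part of the original code)
- (UIImage *)imageWithImage:(UIImage *)image aspectFilledToSize:(CGSize)newSize
{
    CGFloat scale = MAX(newSize.width / image.size.width,
                        newSize.height / image.size.height);
    CGSize scaled = CGSizeMake(image.size.width * scale, image.size.height * scale);
    CGRect drawRect = CGRectMake((newSize.width - scaled.width) / 2.0,
                                 (newSize.height - scaled.height) / 2.0,
                                 scaled.width, scaled.height);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:drawRect];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}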
You are setting the original image taken from your camera in your code here:
[picker dismissViewControllerAnimated:YES completion:nil];
NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1.0);
[ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
while you are resizing the image below:
CGSize constraint = CGSizeMake(200, 200);
NSData *imgData = UIImageJPEGRepresentation([self imageWithImage:[UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
This data is not used anywhere; you are not setting it on your ProfImgButton.
All you need to do is set the resized image data on your button:
[ProfImgButton setImage:[UIImage imageWithData:imgData] forState:UIControlStateNormal];
Your cropping code is wrong. Write it like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:nil];
    CGSize size = CGSizeMake(200, 200);
    UIImage *originalImage = [info objectForKey:@"UIImagePickerControllerEditedImage"];
    //CROP image code
    UIGraphicsBeginImageContext(size);
    [originalImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    //now assign image to button
    [ProfImgButton setImage:croppedImage forState:UIControlStateNormal];
    ProfImgButton.userInteractionEnabled = NO;
    ProfImgButton.contentMode = UIViewContentModeScaleAspectFill;
    ProfImgButton.clipsToBounds = YES;
    NSData *imgData = UIImageJPEGRepresentation(croppedImage, 0);
    NSLog(@"Size of Image(bytes):%lu", (unsigned long)[imgData length]);
    NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
    NSLog(@"%@", imageString);
    //assign base64 image to image string
    Base64ImageString = [imageString stringByReplacingOccurrencesOfString:@"+" withString:@"%2B"];
}

Photo App with Slider

I am trying to create my first app for iOS. I would like an app that can load a photo from the library or take a new one with the camera. After I take or load a picture, I would like it to go through a filter, for example sepia (as used in every tutorial), and to be able to change the filter's value live with a slider on the screen. I will put my source code here, because I don't know if I am doing it right. This is everything from my ViewController.m file.
What should I do now to get it working?
#import "ViewController.h"
#interface ViewController ()
{
CIContext *context;
CIFilter *filter;
CIImage *beginImage;
}
#property (nonatomic, retain) UIDocumentInteractionController *documentController;
#end
#implementation ViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
    NSURL *fileNamePath = [NSURL fileURLWithPath:filePath];
    CIImage *inputImage = [CIImage imageWithContentsOfURL:fileNamePath];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:kCIInputImageKey, inputImage, @"inputIntensity", @0.8, nil];
    CIImage *outputImage = [filter outputImage];
    self.ImageView.image = [UIImage imageWithCIImage:outputImage];
}
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (IBAction)TakePhoto:(id)sender {
    picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    [picker setSourceType:UIImagePickerControllerSourceTypeCamera];
    [self presentViewController:picker animated:YES completion:NULL];
}

- (IBAction)ChoosePhoto:(id)sender {
    picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    [picker setSourceType:UIImagePickerControllerSourceTypePhotoLibrary];
    [self presentViewController:picker animated:YES completion:NULL];
}

- (void)imagePickerController:(nonnull UIImagePickerController *)picker didFinishPickingMediaWithInfo:(nonnull NSDictionary<NSString *,id> *)info {
    image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    [self.ImageView setImage:image];
    [self dismissViewControllerAnimated:YES completion:NULL];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [self dismissViewControllerAnimated:YES completion:NULL];
}

- (IBAction)amountSliderValueChanged:(id)sender
{
    float slideValue = _Slider.value;
    [filter setValue:@(slideValue) forKey:@"inputIntensity"];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    self.ImageView.image = newImage;
    CGImageRelease(cgimg);
}
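
As written, the slider can never work: amountSliderValueChanged: reads the filter, context, and beginImage instance variables, but viewWillAppear: only fills local variables that shadow them, so the ivars stay nil (note also that the built-in sepia filter is named CISepiaTone, not CISepia). A sketch, under those assumptions, of viewWillAppear: assigning the ivars instead:

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Assign the ivars declared in the class extension; do not redeclare locals.
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
    beginImage = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:filePath]];
    context = [CIContext contextWithOptions:nil];
    filter = [CIFilter filterWithName:@"CISepiaTone"
                        keysAndValues:kCIInputImageKey, beginImage,
                                      @"inputIntensity", @0.8, nil];
    self.ImageView.image = [UIImage imageWithCIImage:[filter outputImage]];
}

After picking a new photo, the same idea applies: rebuild beginImage from the chosen UIImage (for example via UIImageJPEGRepresentation and imageWithData:) and set it back on the filter with setValue:forKey:, or the slider will keep filtering the bundled image.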

Image from UIImagePickerController to UIImage

I'm trying to load a camera/album photo into an "add border" loop, but I can't get it to work.
The app takes a photo or picks one from the album, then lets the user choose between 4 different borders. But as the user switches borders, it doesn't revert to the original photo; it just layers the borders on top of each other.
I've tried using
UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];
but it doesn't like 'info'.
Code below.
// ABViewController.m
#import "ABViewController.h"
#import <QuartzCore/QuartzCore.h>

@interface ABViewController ()
@property (nonatomic) IBOutlet UIImageView *imageView;
@property (nonatomic) IBOutlet UIButton *border1;
@property (nonatomic) IBOutlet UIButton *border2;
@property (nonatomic) IBOutlet UIButton *border3;
@property (nonatomic) IBOutlet UIButton *border4;
@end

@implementation ABViewController
@synthesize imageView;
@synthesize border1, border2, border3, border4;

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    border1.tag = 1;
    border2.tag = 2;
    border3.tag = 3;
    border4.tag = 4;
    imageView.image = [UIImage imageNamed:@"a7qIB4OiXsz5Q0BG4ipZV0FLF5_jB71p24h7Oa2ExYU.jpg"];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
//Border code
- (IBAction)borderButtonTapped:(UIButton *)sender {
    UIImage *image = (UIImage *)[@"UIImagePickerControllerOriginalImage"];
    UIImage *borderImage = [UIImage imageNamed:[NSString stringWithFormat:@"borderImage%i.png", sender.tag]];
    NSData *dataFromImage = UIImageJPEGRepresentation(imageView.image, 1);
    CIImage *beginImage = [CIImage imageWithData:dataFromImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *border = [CIImage imageWithData:UIImagePNGRepresentation(borderImage)];
    CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"]; //@"CISoftLightBlendMode"
    [filter setDefaults];
    [filter setValue:border forKey:@"inputImage"];
    [filter setValue:beginImage forKey:@"inputBackgroundImage"];
    CIImage *outputImage = [filter valueForKey:@"outputImage"];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    imageView.image = newImg;
}
// Camera CODE
- (IBAction)TakePhoto {
    picker = [[UIImagePickerController alloc] init];
    picker.delegate = self;
    [picker setSourceType:UIImagePickerControllerSourceTypeCamera];
    [picker setShowsCameraControls:YES];
    [picker setAllowsEditing:YES];
    [self presentViewController:picker animated:YES completion:NULL];
    [picker release];
}

- (IBAction)ChooseExisting {
    picker2 = [[UIImagePickerController alloc] init];
    picker2.delegate = self;
    [picker2 setSourceType:UIImagePickerControllerSourceTypePhotoLibrary];
    [picker2 setAllowsEditing:YES];
    [self presentViewController:picker2 animated:YES completion:NULL];
    [picker2 release];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    image = [info objectForKey:UIImagePickerControllerEditedImage];
    [imageView setImage:image];
    [self dismissViewControllerAnimated:YES completion:NULL];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [self dismissViewControllerAnimated:YES completion:NULL];
}
@end
Try the following...
Create an NSDictionary reference to hold onto the info:
@implementation ABViewController {
    NSDictionary *imageInfo;
}
In the imagePickerController:didFinishPickingMediaWithInfo: method add:
imageInfo = info;
In the borderButtonTapped: method, replace:
UIImage *image = (UIImage *)[@"UIImagePickerControllerOriginalImage"];
with:
UIImage *image = [imageInfo valueForKey:UIImagePickerControllerOriginalImage];
Edit:
and change:
NSData *dataFromImage = UIImageJPEGRepresentation(imageView.image, 1);
to:
NSData *dataFromImage = UIImageJPEGRepresentation(image, 1);
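
Equivalently, if only the picked image is needed rather than the whole info dictionary, an ivar holding the UIImage itself is a little simpler; a sketch under that assumption:

//keep the picked image in an ivar instead of the whole info dictionary
@implementation ABViewController {
    UIImage *pickedImage;
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    pickedImage = [info objectForKey:UIImagePickerControllerEditedImage]; // matches the existing delegate code
    [imageView setImage:pickedImage];
    [self dismissViewControllerAnimated:YES completion:NULL];
}

Then borderButtonTapped: composites over pickedImage each time (NSData *dataFromImage = UIImageJPEGRepresentation(pickedImage, 1);), so switching borders always starts from the unmodified photo instead of layering onto the previous result.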

iOS - Setting blurred image on top of other views, odd issues

So, I've got an odd scenario.
In my iOS app, I'm trying to blur the content area of the screen when a popover is opened.
I have this working when using Core Image, but only with Gaussian blur; none of the other blurs work, which is odd.
I tried doing the same with GPUImage, and it blurs far faster, but it doesn't actually put the view on top of the other views!
To summarize: in the source below, setBlurOnView works properly; however, setBlurOnViewWithGPUImage does not. The blur view (tag 6110) is created, but the app doesn't actually blur.
Note: This is on iOS 6, in the simulator.
Here's the relevant source:
// ScreenBlur.m
#import <QuartzCore/QuartzCore.h>
#import <CoreImage/CoreImage.h>
#import <GPUImage/GPUImage.h>
#import "ScreenBlur.h"
#import "GlobalData.h"
#import "Logger.h"

@implementation ScreenBlur
+ (void)setBlurOnViewWithGPUImage:(UIView *)view {
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:[self captureScreenInRect:view.frame inView:view]];
    GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];
    [imageSource addTarget:blur];
    [imageSource processImage];
    [self setImage:[imageSource imageFromCurrentlyProcessedOutput] toView:view];
}

+ (void)setBlurOnView:(UIView *)view {
    //http://stackoverflow.com/questions/17041669/creating-a-blurring-overlay-view
    CIImage *inputImage = [CIImage imageWithCGImage:[self captureScreenInRect:view.frame inView:view].CGImage];
    //CIContext *context = [CIContext contextWithOptions:nil];
    if ([GlobalData getInstance].ciContext == nil) {
        [Logger Log:@"ciContext does not exist, creating..." fromClass:@"ScreenBlur"];
        // [GlobalData getInstance].ciContext = [CIContext contextWithOptions:nil]; //cpu context
        [GlobalData getInstance].ciContext = [CIContext contextWithEAGLContext:[[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]];
    }
    //set up the blur filter
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:3.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    // CIGaussianBlur has a tendency to shrink the image a little;
    // this ensures it matches up exactly to the bounds of our original image
    CGImageRef cgImage = [[GlobalData getInstance].ciContext createCGImage:result fromRect:[inputImage extent]];
    [self setImage:[UIImage imageWithCGImage:cgImage] toView:view];
}
+ (void)setImage:(UIImage *)blurredImage toView:(UIView *)view {
    UIView *blurView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, blurredImage.size.width, blurredImage.size.height)];
    [blurView setBackgroundColor:[UIColor colorWithPatternImage:blurredImage]];
    [blurView setTag:6110];
    //set the image as the foreground for the view
    [view addSubview:blurView];
    [view bringSubviewToFront:blurView];
}
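
// Note (sketch, not part of the original source): colorWithPatternImage: tiles
// the image at its pixel size and ignores the screen scale, so on Retina
// screens the overlay can come out double-sized or tiled. An alternative,
// assuming the blur should simply cover the target view, is a UIImageView
// pinned to the view's bounds (the method name here is hypothetical):
+ (void)setImageUsingImageView:(UIImage *)blurredImage toView:(UIView *)view {
    UIImageView *blurView = [[UIImageView alloc] initWithFrame:view.bounds];
    blurView.image = blurredImage;
    blurView.contentMode = UIViewContentModeScaleToFill;
    blurView.tag = 6110;
    [view addSubview:blurView];
    [view bringSubviewToFront:blurView];
}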
//same as the method above, but resizes the screenshot before applying the blur for increased performance at the expense of image quality.
+ (void)setBlurOnViewPerformanceMode:(UIView *)view {
    //http://stackoverflow.com/questions/17041669/creating-a-blurring-overlay-view
    UIImage *screenShot = [self imageWithImage:[self captureScreenInRect:view.frame inView:view] scaledToSize:CGSizeMake(view.frame.size.width / 2, view.frame.size.height / 2)];
    CIImage *inputImage = [CIImage imageWithCGImage:screenShot.CGImage];
    //CIContext *context = [CIContext contextWithOptions:nil];
    if ([GlobalData getInstance].ciContext == nil) {
        [Logger Log:@"ciContext does not exist, creating..." fromClass:@"ScreenBlur"];
        // [GlobalData getInstance].ciContext = [CIContext contextWithOptions:nil]; //cpu context
        [GlobalData getInstance].ciContext = [CIContext contextWithEAGLContext:[[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]];
    }
    //set up the blur filter
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:3.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgImage = [[GlobalData getInstance].ciContext createCGImage:result fromRect:[inputImage extent]];
    [self setImage:[self imageWithImage:[UIImage imageWithCGImage:cgImage] scaledToSize:view.frame.size] toView:view];
}
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    //UIGraphicsBeginImageContext(newSize);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

+ (void)removeBlurFromView:(UIView *)view {
    for (UIView *subView in view.subviews) {
        if (subView.tag == 6110) {
            [subView removeFromSuperview];
        }
    }
}

+ (UIImage *)captureScreenInRect:(CGRect)captureFrame inView:(UIView *)view {
    CALayer *layer = view.layer;
    UIGraphicsBeginImageContext(view.bounds.size);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureFrame);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenImage;
}
@end
And then in my view controller, it's simply called with
[ScreenBlur setBlurOnView:self.view];
I found a workaround for this (or, who knows, maybe this is how it was supposed to be done).
//ScreenBlur.m
+ (GPUImageView *)getBlurredImageWithGPUImageFromView:(UIView *)view {
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:[self captureScreenInRect:view.frame inView:view] smoothlyScaleOutput:YES];
    GPUImageFastBlurFilter *blur = [[GPUImageFastBlurFilter alloc] init];
    [blur setBlurPasses:3];
    [imageSource addTarget:blur];
    GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:view.frame];
    [blur addTarget:filteredView];
    [imageSource processImage];
    return filteredView;
}

//ViewController.m
//blur the main screen
GPUImageView *blurred = [ScreenBlur getBlurredImageWithGPUImageFromView:self.view];
[blurred setTag:6110];
[self.view addSubview:blurred];
[self.view bringSubviewToFront:blurred];

didFinishPickingMediaWithInfo orientation issue

I'm trying to manage image rotation in imagePickerController:didFinishPickingMediaWithInfo:, but after several attempts I have not found a solution. My last try is:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // load the storyboard by name
    UIStoryboard *storyboard = [UIStoryboard storyboardWithName:[ApplicationSupport getStoryBoardName] bundle:nil];
    UploadViewController *uploadView = [storyboard instantiateViewControllerWithIdentifier:@"ViewUploadFile"];
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];
    NSDictionary *metadataImage = [info valueForKey:UIImagePickerControllerMediaMetadata];
    uploadView.imageToUpload = image;
    if ([[metadataImage objectForKey:@"Orientation"] intValue] == 3) {
        UIImage *imageRotated = [UIImage imageWithCGImage:[image CGImage] scale:[image scale] orientation:UIImageOrientationDown];
        uploadView.dataToUpload = UIImagePNGRepresentation(imageRotated);
    } else {
        uploadView.dataToUpload = UIImagePNGRepresentation(image);
    }
    // ETC...
}
I can't and don't want to use UIImageJPEGRepresentation. I need to rotate my photo by 180° when the iPad is in a specific position.
Any idea?
Have a look at this: http://www.catamount.com/blog/1015/uiimage-extensions-for-cutting-scaling-and-rotating-uiimages/
Just call:
UIImage *rotatedImage = [originalImage imageRotatedByDegrees:180.0];
Or, for a 180° rotation (dataToUpload expects NSData, so wrap the rotated image in a PNG representation):
uploadView.dataToUpload = UIImagePNGRepresentation([UIImage imageWithCGImage:[image CGImage] scale:1.0 orientation:UIImageOrientationDown]);
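
Note that UIImagePNGRepresentation ignores a UIImage's imageOrientation (PNG data, unlike JPEG, carries no EXIF orientation), which is why orientation-only fixes can appear to do nothing. A sketch, assuming the goal is PNG data rotated by 180°, that bakes the rotation into the pixels before encoding:

// Redraw so the orientation is baked into the bitmap, then PNG-encode.
UIImage *oriented = [UIImage imageWithCGImage:[image CGImage]
                                        scale:image.scale
                                  orientation:UIImageOrientationDown];
UIGraphicsBeginImageContextWithOptions(oriented.size, NO, oriented.scale);
[oriented drawInRect:CGRectMake(0, 0, oriented.size.width, oriented.size.height)];
UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
uploadView.dataToUpload = UIImagePNGRepresentation(normalized);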
