Trying to load a camera/album photo into an 'add border' flow, but I can't get it to work.
The app takes a photo or picks one from the album, then lets the user choose between 4 different borders. But as the user switches borders it doesn't revert to the original photo; it just layers the borders on top of each other.
I've tried using
UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];
but it doesn't like 'info', since that dictionary only exists inside the picker delegate method.
Code below.
// ABViewController.m
#import "ABViewController.h"
#import <QuartzCore/QuartzCore.h>
@interface ABViewController ()
@property (nonatomic) IBOutlet UIImageView *imageView;
@property (nonatomic) IBOutlet UIButton *border1;
@property (nonatomic) IBOutlet UIButton *border2;
@property (nonatomic) IBOutlet UIButton *border3;
@property (nonatomic) IBOutlet UIButton *border4;
@end
@implementation ABViewController
@synthesize imageView;
@synthesize border1, border2, border3, border4;
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
border1.tag =1;
border2.tag =2;
border3.tag =3;
border4.tag =4;
imageView.image = [UIImage imageNamed:@"a7qIB4OiXsz5Q0BG4ipZV0FLF5_jB71p24h7Oa2ExYU.jpg"];
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
//Border code
- (IBAction)borderButtonTapped:(UIButton *)sender {
UIImage* image=(UIImage*)[@"UIImagePickerControllerOriginalImage"];
UIImage *borderImage = [UIImage imageNamed:[NSString stringWithFormat:@"borderImage%i.png", sender.tag]];
NSData *dataFromImage = UIImageJPEGRepresentation(imageView.image, 1);
CIImage *beginImage= [CIImage imageWithData:dataFromImage];
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *border =[CIImage imageWithData:UIImagePNGRepresentation(borderImage)];
CIFilter *filter= [CIFilter filterWithName:@"CISourceOverCompositing"]; //@"CISoftLightBlendMode"];
[filter setDefaults];
[filter setValue:border forKey:@"inputImage"];
[filter setValue:beginImage forKey:@"inputBackgroundImage"];
CIImage *outputImage = [filter valueForKey:@"outputImage"];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
imageView.image = newImg;
}
// Camera CODE
- (IBAction)TakePhoto {
picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
[picker setSourceType:UIImagePickerControllerSourceTypeCamera];
[picker setShowsCameraControls:YES];
[picker setAllowsEditing:YES];
[self presentViewController:picker animated:YES completion:NULL];
[picker release];
}
- (IBAction)ChooseExisting {
picker2 = [[UIImagePickerController alloc] init];
picker2.delegate = self;
[picker2 setSourceType:UIImagePickerControllerSourceTypePhotoLibrary];
[picker2 setAllowsEditing:YES];
[self presentViewController:picker2 animated:YES completion:NULL];
[picker2 release];
}
- (void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
image = [info objectForKey:UIImagePickerControllerEditedImage];
[imageView setImage:image];
[self dismissViewControllerAnimated:YES completion:NULL];
}
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
[self dismissViewControllerAnimated:YES completion:NULL];
}
@end
Try the following...
Create a NSDictionary reference to hold onto the info:
@implementation ABViewController {
NSDictionary *imageInfo;
}
In the imagePickerController:didFinishPickingMediaWithInfo: method, add:
imageInfo = info;
In the borderButtonTapped method, replace:
UIImage* image=(UIImage*)[@"UIImagePickerControllerOriginalImage"];
with:
UIImage *image = [imageInfo valueForKey:UIImagePickerControllerOriginalImage];
Edit:
and change:
NSData *dataFromImage = UIImageJPEGRepresentation(imageView.image, 1);
to:
NSData *dataFromImage = UIImageJPEGRepresentation(image, 1);
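Putting those pieces together, the delegate method and the border method would look roughly like this (a minimal sketch based on the snippets above, not a drop-in replacement for the rest of the class):
// Keep the picker's info dictionary so the untouched original is always available.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    imageInfo = info;
    imageView.image = [info objectForKey:UIImagePickerControllerEditedImage];
    [self dismissViewControllerAnimated:YES completion:NULL];
}

- (IBAction)borderButtonTapped:(UIButton *)sender {
    // Start from the original photo every time, so borders never stack.
    UIImage *image = [imageInfo valueForKey:UIImagePickerControllerOriginalImage];
    UIImage *borderImage = [UIImage imageNamed:[NSString stringWithFormat:@"borderImage%ld.png", (long)sender.tag]];

    CIImage *beginImage = [CIImage imageWithData:UIImageJPEGRepresentation(image, 1)];
    CIImage *border = [CIImage imageWithData:UIImagePNGRepresentation(borderImage)];

    CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [filter setValue:border forKey:kCIInputImageKey];
    [filter setValue:beginImage forKey:kCIInputBackgroundImageKey];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:filter.outputImage fromRect:[filter.outputImage extent]];
    imageView.image = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
}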
Related
I am working to create a view in my app that will allow you to take a picture, and then will automatically draw a frame around that image. The issue I am facing is that there is some blank space around the top and sides when I do this. Here is my code:
UPDATE: If I run this on a 6S, I get the image shown below. On a 6S Plus, I get the final image.
-(void)viewDidLoad {
[super viewDidLoad];
UIBarButtonItem *takePhoto = [[UIBarButtonItem alloc] initWithTitle:@"Take Picture" style:UIBarButtonItemStylePlain target:self action:@selector(takingPhoto)];
UIBarButtonItem *savePhoto = [[UIBarButtonItem alloc] initWithTitle:@"Save" style:UIBarButtonItemStylePlain target:self action:@selector(saveIt)];
self.navigationItem.rightBarButtonItems = @[takePhoto, savePhoto];
}
- (void)takingPhoto{
ipc = [[UIImagePickerController alloc] init];
ipc.delegate = self;
ipc.allowsEditing = YES;
if([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
{
ipc.sourceType = UIImagePickerControllerSourceTypeCamera;
[self presentViewController:ipc animated:YES completion:NULL];
}
else
{
ipc.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
[self presentViewController:ipc animated:YES completion:NULL];
}
}
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
self.imgView.image = [info objectForKey:UIImagePickerControllerOriginalImage];
[self applyFilter];
[picker dismissViewControllerAnimated:YES completion:nil];
}
-(void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
[picker dismissViewControllerAnimated:YES completion:nil];
}
- (void)applyFilter {
NSLog(#"Running");
UIImage *borderImage = [UIImage imageNamed:#"IMG_8055.PNG"];
NSData *dataFromImage = UIImageJPEGRepresentation(self.imgView.image, 1);
CIImage *beginImage= [CIImage imageWithData:dataFromImage];
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *border =[CIImage imageWithData:UIImagePNGRepresentation(borderImage)];
CIFilter *filter= [CIFilter filterWithName:@"CISourceOverCompositing"]; //@"CISoftLightBlendMode"];
[filter setDefaults];
[filter setValue:border forKey:@"inputImage"];
[filter setValue:beginImage forKey:@"inputBackgroundImage"];
CIImage *outputImage = [filter valueForKey:@"outputImage"];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
self.imgView.image = newImg;
}
In the simulator, it looks like this:
However, when I look in saved photos after I have saved it, it looks like this:
Why does it have the white space on top and right side?
Here is the aforementioned 6S Plus image:
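One likely cause, though it can't be confirmed from the question alone: CISourceOverCompositing produces an output whose extent is the union of the two input extents, so if IMG_8055.PNG is larger than (or a different aspect ratio from) the photo, the extra area ends up blank in the rendered and saved image. A sketch of applyFilter that scales the border to the photo's extent and crops the result, assuming the border is meant to cover the photo exactly:
// Sketch only: avoids blank margins by matching the border's extent to the photo's.
- (void)applyFilter {
    UIImage *borderImage = [UIImage imageNamed:@"IMG_8055.PNG"];
    CIImage *beginImage = [CIImage imageWithData:UIImageJPEGRepresentation(self.imgView.image, 1)];
    CIImage *border = [CIImage imageWithData:UIImagePNGRepresentation(borderImage)];

    // Scale the border so its extent matches the photo's extent.
    CGRect photoRect = beginImage.extent;
    CGRect borderRect = border.extent;
    CIImage *scaledBorder = [border imageByApplyingTransform:
        CGAffineTransformMakeScale(photoRect.size.width / borderRect.size.width,
                                   photoRect.size.height / borderRect.size.height)];

    CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [filter setValue:scaledBorder forKey:kCIInputImageKey];
    [filter setValue:beginImage forKey:kCIInputBackgroundImageKey];

    // Crop to the photo's extent so nothing outside it is rendered or saved.
    CIImage *outputImage = [filter.outputImage imageByCroppingToRect:photoRect];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:photoRect];
    self.imgView.image = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
}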
I am trying to turn a white imageView.image into different colors programmatically. The outlines are currently white.
I am trying to use this solution with CIFilter but the images don't appear at all.
Here is the pasted code for reference:
- (UIImage *) filterImage: (CIImage *)beginImage color: (UIColor *) color{
CIImage *output = [CIFilter filterWithName:@"CIColorMonochrome" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:1.0], @"inputColor", [[CIColor alloc] initWithColor:color], nil].outputImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgiimage = [context createCGImage:output fromRect:output.extent];
UIImage *newImage = [UIImage imageWithCGImage:cgiimage];
CGImageRelease(cgiimage);
return newImage;
}
Solution .h file
#import <UIKit/UIKit.h>
@interface ViewController : UIViewController
@property (weak, readwrite, nonatomic) IBOutlet UIImageView *imgView;
@end
Solution .m file
#import "ViewController.h"
@interface ViewController ()
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
CIImage *inputImage = [CIImage imageWithCGImage:[UIImage imageNamed:@"NJDLv"].CGImage];
self.imgView.image = [self filterImage:inputImage color:[UIColor yellowColor]];
}
- (UIImage *) filterImage: (CIImage *)beginImage color: (UIColor *) color{
CIColor *ciColor = [CIColor colorWithCGColor:color.CGColor];
CIImage *output = [CIFilter filterWithName:@"CIColorMonochrome"
keysAndValues:kCIInputImageKey, beginImage,
@"inputIntensity", @(1.0),
@"inputColor", ciColor, nil].outputImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgiimage = [context createCGImage:output fromRect:output.extent];
UIImage *newImage = [UIImage imageWithCGImage:cgiimage];
CGImageRelease(cgiimage);
return newImage;
}
@end
I am trying to set an image from the camera on my button, but it does not use the cropped image; by default it uses the original width and height captured by the camera, and because of this the image looks shrunk from top to bottom.
When I tap the UIButton the camera opens, and after the image is captured it is shown on this UIButton.
- (IBAction)ProfImageBtnCliked:(id)sender
{
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
[picker setAllowsEditing:YES];
picker.delegate = self;
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear] || [UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront] )
{
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
[self presentViewController:picker animated:YES completion:nil];
}
else
{
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Alert" message:@"No Camera Available" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
[alert show];
}
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
[picker dismissViewControllerAnimated:YES completion:Nil];
NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], 1.0);
[ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
ProfImgButton.userInteractionEnabled=NO;
ProfImgButton.contentMode=UIViewContentModeScaleAspectFill;
ProfImgButton.clipsToBounds=YES;
CGSize constraint = CGSizeMake(200,200);
NSData *imgData = UIImageJPEGRepresentation([self imageWithImage:[UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
NSLog(@"Size of Image(bytes):%lu",(unsigned long)[imgData length]);
NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
NSLog(@"%@",imageString);
//assign base64 image to image string
Base64ImageString=[imageString stringByReplacingOccurrencesOfString:@"+" withString:@"%2B"];
}
//reduce image size
-(UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize;
{
UIGraphicsBeginImageContext( newSize );
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
You are setting the original image taken from your camera in your code here
[picker dismissViewControllerAnimated:YES completion:Nil];
NSData *dataForImage = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerEditedImage"], 1.0);
[ProfImgButton setImage:[UIImage imageWithData:dataForImage] forState:UIControlStateNormal];
while you are resizing the image below
CGSize constraint = CGSizeMake(200,200);
NSData *imgData = UIImageJPEGRepresentation([self imageWithImage [UIImage imageWithData:dataForImage] scaledToSize:constraint], 0);
NSLog(#"Size of Image(bytes):%lu",(unsigned long)[imgData length]);
This data is not used anywhere; you are not setting it on your ProfImgButton.
All you need to do is set the resized image data on your button:
[ProfImgButton setImage:[UIImage imageWithData:imgData] forState:UIControlStateNormal];
Your cropping code is wrong. Write it like this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
[picker dismissViewControllerAnimated:YES completion:nil];
CGSize size = CGSizeMake(200,200);
UIImage *originalImage = [info objectForKey:@"UIImagePickerControllerEditedImage"];
//CROP image code
UIGraphicsBeginImageContext(size);
[originalImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//now assign image to button
[ProfImgButton setImage:croppedImage forState:UIControlStateNormal];
ProfImgButton.userInteractionEnabled=NO;
ProfImgButton.contentMode=UIViewContentModeScaleAspectFill;
ProfImgButton.clipsToBounds=YES;
NSData *imgData = UIImageJPEGRepresentation(croppedImage , 0);
NSLog(#"Size of Image(bytes):%lu",(unsigned long)[imgData length]);
NSString *imageString = [imgData base64EncodedStringWithOptions:NSDataBase64EncodingEndLineWithLineFeed];
NSLog(#"%#",imageString);
//assine base64 image to image string
Base64ImageString=[imageString stringByReplacingOccurrencesOfString:#"+" withString:#"%2B"];
}
I am trying to create my first app for iOS. I would like an app that can load a photo from the library or take a new one with the camera. After taking or loading a photo, I would like it to go through a filter, for example sepia (as used in every tutorial), and I want to be able to change the value right away with a slider on the screen. I will put my source code here, because I don't know if I am doing it right. This is everything from my ViewController.m file.
What should I do now to get it working?
#import "ViewController.h"
@interface ViewController ()
{
CIContext *context;
CIFilter *filter;
CIImage *beginImage;
}
@property (nonatomic, retain) UIDocumentInteractionController *documentController;
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}

- (void) viewWillAppear:(BOOL)animated {
[super viewWillAppear:animated];
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
NSURL *fileNamePath = [NSURL fileURLWithPath:filePath];
CIImage *inputImage = [CIImage imageWithContentsOfURL:fileNamePath];
CIFilter *filter = [CIFilter filterWithName:@"CISepia" keysAndValues:kCIInputImageKey, inputImage, @"inputIntensity", @0.8, nil];
CIImage *outputImage = [filter outputImage];
self.ImageView.image = [UIImage imageWithCIImage:outputImage];
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)TakePhoto:(id)sender {
picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
[picker setSourceType:UIImagePickerControllerSourceTypeCamera];
[self presentViewController:picker animated:YES completion:NULL];
}
- (IBAction)ChoosePhoto:(id)sender {
picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
[picker setSourceType:UIImagePickerControllerSourceTypePhotoLibrary];
[self presentViewController:picker animated:YES completion:NULL];
}
-(void)imagePickerController:(nonnull UIImagePickerController *)picker didFinishPickingMediaWithInfo:(nonnull NSDictionary<NSString *,id> *)info {
image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
[self.ImageView setImage:image];
[self dismissViewControllerAnimated:YES completion:NULL];
}
-(void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
[self dismissViewControllerAnimated:YES completion:NULL];
}
- (IBAction)amountSliderValueChanged:(id)sender
{
float slideValue = _Slider.value;
[filter setValue:@(slideValue)
forKey:@"inputIntensity"];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage
fromRect:[outputImage extent]];
UIImage *newImage = [UIImage imageWithCGImage:cgimg];
self.ImageView.image = newImage;
CGImageRelease(cgimg);
}
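A few issues are visible in the pasted code itself: the built-in sepia filter is named CISepiaTone (not CISepia), the local CIFilter *filter declared in viewWillAppear shadows the filter instance variable, and the picked photo is never fed into the beginImage/context/filter ivars, so amountSliderValueChanged: operates on a nil filter. A rough sketch (not tested; outlet and ivar names taken from the question) of a picker delegate that populates those ivars instead:
// Sketch: build the Core Image pipeline from the picked photo and keep it in the ivars.
-(void)imagePickerController:(nonnull UIImagePickerController *)picker didFinishPickingMediaWithInfo:(nonnull NSDictionary<NSString *,id> *)info {
    UIImage *picked = [info objectForKey:UIImagePickerControllerOriginalImage];
    [self.ImageView setImage:picked];

    beginImage = [CIImage imageWithCGImage:picked.CGImage];
    context = [CIContext contextWithOptions:nil];
    filter = [CIFilter filterWithName:@"CISepiaTone"
                        keysAndValues:kCIInputImageKey, beginImage,
                                      @"inputIntensity", @0.8, nil];

    [self dismissViewControllerAnimated:YES completion:NULL];
}
With beginImage, context and filter populated this way, the existing amountSliderValueChanged: should re-render the image as the slider moves.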
I need to load an image from a URL into a UIImageView. Each time the URL will be different because I am parsing JSON from a website. I am able to get the URL. However, when I run the app, there is no image in the UIImageView.
This is my code:
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view.
if ([eventNameDesc containsString:@"src="]) {
eventNameDesc = @"";
}
else {
eventDescription.text = eventNameDesc;
}
self.navigationItem.rightBarButtonItem=[[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemAction target:self action:@selector(shareButtonPressed)];
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
-(IBAction)shareButtonPressed {
NSString *shareText = eventNameDesc;
NSArray *itemsToShare = @[shareText];
UIActivityViewController *activityVC = [[UIActivityViewController alloc] initWithActivityItems:itemsToShare applicationActivities:nil];
activityVC.excludedActivityTypes = @[];
[self presentViewController:activityVC animated:YES completion:nil];
}
- (void)loadImage {
NSData* imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:self.imageURL]];
UIImage* image = [[UIImage alloc] initWithData:imageData];
[self performSelectorOnMainThread:#selector(displayImage:) withObject:image waitUntilDone:NO];
}
- (void)displayImage:(UIImage *)image {
[self.ImageView setImage:image]; //UIImageView
}
@end
What is wrong with my code? Somebody please help.
You can make a UIImage object using the code provided here
After that you can add your UIImage object to any UIImageView object.
This is the code that works for me.
UIImage *anImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:#"http://4.bp.blogspot.com/-K7vl-ShXrNc/VQU-D0NTFgI/AAAAAAAAAOg/aBIUOwF2nEQ/s1600/contact.png"]]];
UIImageView *temp = [[UIImageView alloc] initWithImage:anImage];
[self.view addSubview:temp];
Here is a screenshot.
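As a side note, dataWithContentsOfURL: blocks whatever thread it runs on. A common variant is to download on a background queue and only touch UIKit back on the main queue; a sketch using the same hard-coded URL as above:
NSURL *url = [NSURL URLWithString:@"http://4.bp.blogspot.com/-K7vl-ShXrNc/VQU-D0NTFgI/AAAAAAAAAOg/aBIUOwF2nEQ/s1600/contact.png"];
dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
    // Fetch off the main thread so the UI stays responsive.
    NSData *imageData = [NSData dataWithContentsOfURL:url];
    UIImage *anImage = [UIImage imageWithData:imageData];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit must only be touched on the main queue.
        UIImageView *temp = [[UIImageView alloc] initWithImage:anImage];
        [self.view addSubview:temp];
    });
});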