I have an issue with the new iOS 7 photo filters feature.
My app has a photo library. While showing photo thumbnails in a UICollectionView, I receive images with filters and crops already applied. There are two methods that return "ready for use" images:
[asset thumbnail]
[[asset defaultRepresentation] fullScreenImage]
On the contrary, when I want to share the full-size image, I receive the unchanged photo without any filters applied:
[[asset defaultRepresentation] fullResolutionImage]
The same happens when I read the image data through getBytes:fromOffset:length:error:.
Is it possible to get the full-size image with the filters appropriately applied?
So far I have figured out only one way to get what I want. All assets store their modification info (filters, crops, etc.) in the metadata dictionary under the key @"AdjustmentXMP". We can interpret this data and apply all the filters to the fullResolutionImage, as in this SO answer. Here is my complete solution:
...
ALAssetRepresentation *assetRepresentation = [asset defaultRepresentation];
CGImageRef fullResImage = [assetRepresentation fullResolutionImage];

// The XMP describing the edits (filters, crops, ...) made in Photos.app
NSString *adjustment = [[assetRepresentation metadata] objectForKey:@"AdjustmentXMP"];
if (adjustment) {
    NSData *xmpData = [adjustment dataUsingEncoding:NSUTF8StringEncoding];
    CIImage *image = [CIImage imageWithCGImage:fullResImage];

    NSError *error = nil;
    NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                                 inputImageExtent:image.extent
                                                            error:&error];
    CIContext *context = [CIContext contextWithOptions:nil];
    if (filterArray && !error) {
        // Chain the reconstructed filters, feeding each one's output into the next
        for (CIFilter *filter in filterArray) {
            [filter setValue:image forKey:kCIInputImageKey];
            image = [filter outputImage];
        }
        fullResImage = [context createCGImage:image fromRect:[image extent]];
    }
}
UIImage *result = [UIImage imageWithCGImage:fullResImage
                                      scale:[assetRepresentation scale]
                                orientation:(UIImageOrientation)[assetRepresentation orientation]];
Related
I am not able to get the original image back after applying a filter to the image view. I want the original image again when moving the slider from 1 to 0 (max to min value, i.e. in the reverse direction).
Below is the code for applying the sharpness effect:
- (IBAction)actionSharp:(UISlider *)sender {
    demoImage = _affectView.image;
    CIImage *image = [CIImage imageWithCGImage:demoImage.CGImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    NSNumber *testNSNumber = [NSNumber numberWithFloat:_sharpActionWithSlider.value];
    NSLog(@"ffghhn %f", _sharpActionWithSlider.value);
    CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CISharpenLuminance" keysAndValues:@"inputImage", image, nil];
    [gaussianBlurFilter setDefaults];
    [gaussianBlurFilter setValue:testNSNumber forKey:kCIInputSharpnessKey];
    [gaussianBlurFilter setDefaults];
    CIImage *result2 = [gaussianBlurFilter valueForKey:kCIOutputImageKey];
    CGImageRef cgImage = [context createCGImage:result2 fromRect:[result2 extent]];
    UIImage *sharp = [UIImage imageWithCGImage:cgImage];
    UIImage *p = sharp;
    self.affectView.image = p;
    self.affectView.alpha = sender.value;
    CGImageRelease(cgImage);
}
Copy the image you are adding the filter to and keep it aside. Add effects to the copied image and show them in your _affectView image view.
On every slider action, get a reference to your original image and apply the effect to that, rather than to the image taken from the image view.
Create a global variable mainImage:
UIImage *mainImage;
In your viewDidLoad, assign the original image to mainImage:
mainImage = [UIImage imageNamed:@"yourMainImage"]; // assign from a local source or a URL
In your IBAction, on every slider action, get a reference to the original image instead of fetching the image from the image view:
- (IBAction)actionSharp:(UISlider *)sender {
    // Always start from the untouched original image
    UIImage *demoImage = mainImage;
    CIImage *image = [CIImage imageWithCGImage:demoImage.CGImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    NSNumber *sharpness = [NSNumber numberWithFloat:_sharpActionWithSlider.value];
    NSLog(@"sharpness %f", _sharpActionWithSlider.value);

    CIFilter *sharpenFilter = [CIFilter filterWithName:@"CISharpenLuminance"];
    [sharpenFilter setDefaults];                                    // set defaults first ...
    [sharpenFilter setValue:image forKey:kCIInputImageKey];
    [sharpenFilter setValue:sharpness forKey:kCIInputSharpnessKey]; // ... then override the sharpness

    CIImage *result = [sharpenFilter outputImage];
    CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
    self.affectView.image = [UIImage imageWithCGImage:cgImage];
    self.affectView.alpha = sender.value;
    CGImageRelease(cgImage);
}
You are getting your demo image from _affectView, applying a filter to it, and then saving it in _affectView again. The next time you retrieve _affectView into a UIImage, it will not be the original one but the modified one, so it is impossible to get back to the original.
Just save your original image in demo once at the start (outside this method) and always work from demo, as sketched below.
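A minimal sketch of that idea, assuming an instance variable (called demo here, matching the wording above) that is assigned once, e.g. in viewDidLoad, before the slider is touched:

UIImage *demo; // keep the untouched original here, assigned once up front

- (IBAction)actionSharp:(UISlider *)sender {
    // Always filter the original, never the image currently shown in _affectView
    CIImage *input = [CIImage imageWithCGImage:demo.CGImage];
    CIFilter *sharpen = [CIFilter filterWithName:@"CISharpenLuminance"];
    [sharpen setValue:input forKey:kCIInputImageKey];
    [sharpen setValue:[NSNumber numberWithFloat:sender.value] forKey:kCIInputSharpnessKey];

    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *output = [sharpen outputImage];
    CGImageRef cgImage = [context createCGImage:output fromRect:[output extent]];
    self.affectView.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}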
I want to convert an image to pure black & white. I tried and got the result shown in the left image attached, but according to the requirements of the application the result should be the right image attached.
I have used lots of CIFilters (CIColorMonochrome, CIColorControls, CIPhotoEffectTonal, etc.), but none of them works.
Please check the code below and the attached result images.
- (UIImage *)imageBlackAndWhite
{
    CIImage *beginImage = [CIImage imageWithData:UIImageJPEGRepresentation(self.captureImage.image, .1)];
    //CIImage *beginImage = [CIImage imageWithCGImage:self.CGImage];
    CIImage *output = [CIFilter filterWithName:@"CIColorMonochrome" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:1.0], @"inputColor", [[CIColor alloc] initWithColor:[UIColor whiteColor]], nil].outputImage;
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgiimage = [context createCGImage:output fromRect:output.extent];
    UIImage *newImage = [UIImage imageWithCGImage:cgiimage];
    CGImageRelease(cgiimage);
    return newImage;
}
(Attached images: my result on the left, the expected result on the right.)
You can use the blend mode kCGBlendModeLuminosity with an alpha value of 1.0.
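A minimal sketch of that approach (the helper method name is illustrative, not from the original code): the image is drawn over a white background with the luminosity blend mode, so only its brightness is kept.

// Hypothetical helper that desaturates an image by drawing it with kCGBlendModeLuminosity.
- (UIImage *)blackAndWhiteImageFromImage:(UIImage *)image
{
    CGRect rect = CGRectMake(0, 0, image.size.width, image.size.height);
    UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);

    // White background; the luminosity blend then keeps only the image's luminance.
    [[UIColor whiteColor] setFill];
    UIRectFill(rect);
    [image drawInRect:rect blendMode:kCGBlendModeLuminosity alpha:1.0];

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}

Note that this produces a grayscale image; if the requirement is a strictly two-tone (black or white) result, an additional thresholding step would still be needed.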
I'm creating a photo app that involves filters. The filters work, but they stay permanent and overlap each other. I want, for example, the sepia filter to be removed from the image and replaced with the instant filter when the instant filter button is pressed. How do I achieve this? Here is some code from the project:
- (IBAction)sepiaFilter:(id)sender {
    CIImage *beginImage = [CIImage imageWithData:UIImagePNGRepresentation(self.picture.image)];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cging = [context createCGImage:outputImage fromRect:[outputImage extent]];
    self.picture.image = [UIImage imageWithCGImage:cging];
    CGImageRelease(cging);
}

- (IBAction)instantFilter:(id)sender {
    CIImage *beginImage = [CIImage imageWithData:UIImagePNGRepresentation(self.picture.image)];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filterTwo = [CIFilter filterWithName:@"CIPhotoEffectInstant" keysAndValues:kCIInputImageKey, beginImage, nil];
    CIImage *outputImage = [filterTwo outputImage];
    CGImageRef cging = [context createCGImage:outputImage fromRect:[outputImage extent]];
    self.picture.image = [UIImage imageWithCGImage:cging];
    CGImageRelease(cging);
}
(I actually have nine filters, but I just need an idea on how to do this) Thanks for any help
It's quite simple. You are using self.picture.image as the input as well as the output. Keep the original UIImage separately for input and use self.picture.image only for output, so the only thing that changes in your code is the first line of both filter actions, i.e.
CIImage *beginImage = [CIImage imageWithData: UIImagePNGRepresentation(self.originalImage)];
Hope this helps.
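For illustration, a minimal sketch of how the nine actions could share one helper, assuming a hypothetical originalImage property that is set once when the photo is loaded (filter parameters such as the sepia intensity are omitted for brevity):

// Assumed to exist and be assigned once, e.g. when the photo is picked:
// @property (strong, nonatomic) UIImage *originalImage;

- (IBAction)sepiaFilter:(id)sender {
    [self showFilteredImageWithFilterNamed:@"CISepiaTone"];
}

- (IBAction)instantFilter:(id)sender {
    [self showFilteredImageWithFilterNamed:@"CIPhotoEffectInstant"];
}

// Hypothetical helper: always filters the untouched original, so effects never stack.
- (void)showFilteredImageWithFilterNamed:(NSString *)filterName {
    CIImage *beginImage = [CIImage imageWithData:UIImagePNGRepresentation(self.originalImage)];
    CIFilter *filter = [CIFilter filterWithName:filterName];
    [filter setValue:beginImage forKey:kCIInputImageKey];

    CIImage *outputImage = [filter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cging = [context createCGImage:outputImage fromRect:[outputImage extent]];
    self.picture.image = [UIImage imageWithCGImage:cging];
    CGImageRelease(cging);
}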
When a user makes some changes (cropping, red-eye removal, ...) to photos in the built-in Photos.app on iOS, the changes are not applied to the fullResolutionImage returned by the corresponding ALAssetRepresentation.
However, the changes are applied to the thumbnail and the fullScreenImage returned by the ALAssetRepresentation.
Furthermore, information about the applied changes can be found in the ALAssetRepresentation's metadata dictionary via the key @"AdjustmentXMP".
I would like to apply these changes to the fullResolutionImage myself to preserve consistency. I've found out that on iOS 6+, CIFilter's filterArrayFromSerializedXMP:inputImageExtent:error: can convert this XMP metadata into an array of CIFilters:
ALAssetRepresentation *rep;
NSString *xmpString = rep.metadata[@"AdjustmentXMP"];
NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];

CIImage *image = [CIImage imageWithCGImage:rep.fullResolutionImage];

NSError *error = nil;
NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                             inputImageExtent:image.extent
                                                        error:&error];
if (error) {
    NSLog(@"Error during CIFilter creation: %@", [error localizedDescription]);
}

CIContext *context = [CIContext contextWithOptions:nil];

for (CIFilter *filter in filterArray) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = [filter outputImage];
}
However, this works only for some filters (cropping, auto-enhance) but not for others like red-eye removal. In these cases, the CIFilters have no visible effect. Therefore, my questions:
Is anyone aware of a way to create a red-eye removal CIFilter? (In a way consistent with the Photos.app. The filter with the key kCIImageAutoAdjustRedEye is not enough; e.g., it does not take parameters for the position of the eyes.)
Is there a possibility to generate and apply these filters under iOS 5?
ALAssetRepresentation *representation = [[self assetAtIndex:index] defaultRepresentation];

// Create a buffer to hold the asset's image data and copy the data into it
uint8_t *buffer = (Byte *)malloc(representation.size);
NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];
if (length == 0)
    return nil;

// Wrap the buffer in an NSData object; the buffer is freed together with the data object
NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:representation.size freeWhenDone:YES];

// Set up a dictionary with a UTI hint. The UTI hint identifies the type
// of image we are dealing with (that is, a JPEG, PNG, or possibly a RAW file).
NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   (id)[representation UTI], kCGImageSourceTypeIdentifierHint, nil];

// Create a CGImageSource with the NSData. An image source can
// contain any number of thumbnails and full images.
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);
[adata release];

// Get a copy of the image properties from the CGImageSourceRef.
CFDictionaryRef imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
CFNumberRef imageWidth = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);

int w = 0;
int h = 0;
CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
CFNumberGetValue(imageHeight, kCFNumberIntType, &h);

// Clean up memory
CFRelease(imagePropertiesDictionary);
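If the goal is the full-resolution bitmap itself rather than just its dimensions, the same image source can also produce the CGImage; a minimal continuation of the snippet above (still manual reference counting, and still without the AdjustmentXMP edits discussed earlier):

// Create the full-resolution image from the image source (index 0) and clean up.
CGImageRef fullResRef = CGImageSourceCreateImageAtIndex(sourceRef, 0, NULL);
CFRelease(sourceRef);

UIImage *fullResImage = nil;
if (fullResRef) {
    fullResImage = [UIImage imageWithCGImage:fullResRef];
    CGImageRelease(fullResRef);
}
return fullResImage;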
So I've been using Core Image to apply filters to images. Everything is good, except when I try to apply the same filter over and over again the application just quits; I guess it's a memory leak.
Here's the code:
- (UIImage *)applyFilter:(UIImage *)picture
{
    UIImageOrientation originalOrientation = picture.imageOrientation;
    CGFloat originalScale = picture.scale;

    CIImage *beginImage = [CIImage imageWithCGImage:picture.CGImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg =
        [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg scale:originalScale orientation:originalOrientation];

    beginImage = nil;
    context = nil;
    filter = nil;
    outputImage = nil;
    cgimg = nil;

    [beginImage release];
    [context release];
    [filter release];
    [outputImage release];

    //CGImageRelease(CGImageRef) method.
    CGImageRelease(cgimg);

    return newImg;
}
And to filter, I simply do:
UIImage *ima = [self.filter applyFilter:self.imageView.image];
imageView.image = ima;
applyFilter is a method of a Filter class that I created.
You set the variables to nil before you call release, so the release has no effect. But you should not release most of this stuff anyway. You only need to release objects that you created (I hope the following list is complete):
Objective-C objects returned by methods whose names start with alloc, init, copy, or new
Core Foundation objects returned by functions whose names contain Create or Copy
Delete these lines and it should be fine:
beginImage = nil;
context = nil;
filter = nil;
outputImage = nil;
cgimg = nil;
[beginImage release];
[context release];
[filter release];
[outputImage release];
You need to keep the line CGImageRelease(cgimg); because the function used to get cgimg contains Create – you create it, you release it.
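For illustration, a minimal sketch of the cleaned-up method under manual reference counting (nothing added beyond the original logic; the autoreleased Core Image objects are left alone and only the created CGImage is released):

- (UIImage *)applyFilter:(UIImage *)picture
{
    UIImageOrientation originalOrientation = picture.imageOrientation;
    CGFloat originalScale = picture.scale;

    CIImage *beginImage = [CIImage imageWithCGImage:picture.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg
                                          scale:originalScale
                                    orientation:originalOrientation];

    // createCGImage follows the Create rule, so this is the only object we own here.
    CGImageRelease(cgimg);
    return newImg;
}

Creating a CIContext is relatively expensive, so if the filter is applied repeatedly it is usually worth keeping a single context around (for example in an instance variable) instead of building a new one on every call.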