UIImageWriteToSavedPhotosAlbum() doesn't save cropped image - ios

I'm trying to save a cropped image to the camera roll.
(I need to do it programmatically, I can't have the user edit it)
This is my (still quite basic) cut and save code:
- (void)cutAndSaveImage:(UIImage *)rawImage
{
    CIImage *workingImage = [[CIImage alloc] initWithImage:rawImage];
    CGRect croppingRect = CGRectMake(0.0f, 0.0f, 3264.0f, 1224.0f);
    CIImage *croppedImage = [workingImage imageByCroppingToRect:croppingRect];
    UIImage *endImage = [UIImage imageWithCIImage:croppedImage scale:1.0f orientation:UIImageOrientationRight];
    self.testImage.image = endImage;
    UIImageWriteToSavedPhotosAlbum(rawImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
    UIImageWriteToSavedPhotosAlbum(endImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
The method is called within:
    - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
I first create a CIImage using the raw UIImage.
Then I get a cropped CIImage using an instance method of the first one.
After that I create a new UIImage using the cropped CIImage.
At this point, to have some feedback, I set the new cropped UIImage as the backing image of a UIImageView. This works, and I can clearly see the image cropped exactly how I desired.
When I try to save it to the camera roll, however, things stop working.
I can't save the newly created endImage.
As you can see, I added a line to save the original UIImage too, just for comparison. The original one saves normally.
Another confusing thing is that the NSError object passed to the image:didFinishSavingWithError:contextInfo: callback is nil. (The callback is executed normally for both save attempts.)
EDIT:
Just made an experiment:
NSLog(#"rawImage: %# - rawImage.CGImage: %#", rawImage, rawImage.CGImage);
NSLog(#"endImage: %# - endImage.CGImage: %#", endImage, endImage.CGImage);
It looks like only the rawImage (coming from the UIImagePickerController) possesses a backing CGImageRef. The other one, created from a CIImage, doesn't.
Can it be that UIImageWriteToSavedPhotosAlbum works using the backing CGImageRef?

Can it be that UIImageWriteToSavedPhotosAlbum works using the backing CGImageRef?
Correct. A CIImage is not an image, and a UIImage backed only by a CIImage is not an image either; it is just a kind of wrapper. Why are you using CIImage at all here? You aren't using CIFilter so this makes no sense. Or if you are using CIFilter, you must render through a CIContext to get the output as a bitmap.
You can crop easily by drawing into a smaller graphics context.
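A minimal sketch of that approach (the helper name croppedImage:toRect: is mine, and the rect is assumed to be in the source image's coordinate space):
    - (UIImage *)croppedImage:(UIImage *)source toRect:(CGRect)rect
    {
        UIGraphicsBeginImageContextWithOptions(rect.size, NO, source.scale);
        // Offset the drawing so the desired rect lands at the context's origin.
        [source drawAtPoint:CGPointMake(-rect.origin.x, -rect.origin.y)];
        UIImage *cropped = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return cropped; // CGImage-backed, so UIImageWriteToSavedPhotosAlbum can save it.
    }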

As the documentation for UIImage's CGImage property says:
If the UIImage object was initialized using a CIImage object, the value of the property is NULL.
You can generate a CGImage-backed UIImage from a CIImage by rendering through a CIContext, like this (Swift 1.x-era syntax):
    let lecturePicture = UIImage(data: NSData(contentsOfURL: NSURL(string: "http://i.stack.imgur.com/Xs4RX.jpg")!)!)!
    let controlsFilter = CIFilter(name: "CIColorControls")
    controlsFilter.setValue(CIImage(image: lecturePicture), forKey: kCIInputImageKey)
    controlsFilter.setValue(1.5, forKey: kCIInputContrastKey)
    // Rendering through a CIContext produces a CGImage, so the resulting UIImage has a real bitmap.
    let displayImage = UIImage(CGImage: CIContext(options: nil).createCGImage(controlsFilter.outputImage, fromRect: controlsFilter.outputImage.extent()))!
    displayImage

Related

The reason for converting an instance of CIImage into an instance of CGImage and only then into UIImage

I was reading an article about Core Image where I saw the following lines:
    if let output = filter?.valueForKey(kCIOutputImageKey) as? CIImage {
        let cgimgresult = context.createCGImage(output, fromRect: output.extent)
        let result = UIImage(CGImage: cgimgresult)
        imageView?.image = result
    }
As you can see, the CIImage instance is first converted into a CGImage instance and only then into a UIImage one. After doing some research I found out that it had something to do with the scale of the image within the image view's bounds.
I wonder: is that (getting the right scale for display purposes) the only reason we need to do all those conversions, given that UIImage already has an initializer that takes a CIImage as an argument?
The UIImage reference documents the return value of the CIImage-based initializer as follows:
An initialized UIImage object. In Objective-C, this method returns nil if the ciImage parameter is nil.
and like @matt wrote here
UIImage's CIImage is not nil only if the UIImage is backed by a CIImage already (e.g. because it was generated by imageWithCIImage:).
So a UIImage created with the direct init
    UIImage(ciImage: ciImage)
is just a wrapper: it has no backing CGImage (and in Objective-C the init itself returns nil when the ciImage parameter is nil). That's why we should initialize the UIImage via a CGImage, not a CIImage directly.
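A quick Objective-C demonstration of the pitfall (someCIImage stands for any CIImage you already have):
    UIImage *wrapped = [UIImage imageWithCIImage:someCIImage];
    NSLog(@"%@", wrapped.CGImage ? @"has a CGImage" : @"CGImage is NULL"); // logs "CGImage is NULL"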

inputImage = nil : Filtered Image not displaying?

I've been developing an image filtering app with the help of online tutorials on bitFountain. The user should be able to select a photo they have added to an album and then either add a filter to it or delete it.
My delete functionality runs fine, but adding a filter is not working.
I have logged three of the filter instances to the console as they are returned by the method, but each comes back with inputImage = nil.
2015-10-19 10:41:53.634 FilterApp[78451:28732768] <CISepiaTone: inputImage=nil inputIntensity=1>
2015-10-19 10:41:53.634 FilterApp[78451:28732768] <CIGaussianBlur: inputImage=nil inputRadius=1>
2015-10-19 10:41:53.635 FilterApp[78451:28732768] <CIColorClamp: inputImage=nil inputMinComponents=[0.2 0.2 0.2 0.2] inputMaxComponents=[0.9 0.9 0.9 0.9]>
What exactly does inputImage=nil mean?
I'm not sure where the code could be going wrong, or if this problem is even external to the code.
(Using Xcode 7 and the iPhone 4s simulator.)
Edit: This was the code used to convert to UIImage.
- (UIImage *)filteredImageFromImage:(UIImage *)image andFilter:(CIFilter *)filter
{
    CIImage *unfilteredImage = [[CIImage alloc] initWithCGImage:image.CGImage];
    [filter setValue:unfilteredImage forKey:kCIInputImageKey];
    CIImage *filteredImage = [filter outputImage];
    CGRect extent = [filteredImage extent];
    CGImageRef cgImage = [self.context createCGImage:filteredImage fromRect:extent];
    UIImage *finalImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // createCGImage follows the Create rule, so release it.
    return finalImage;
}
This problem occurs when you work with a CIImage directly: a UIImage backed only by a CIImage has no bitmap, so when you display it you get nothing. Convert your CIImage to a CGImage-backed UIImage instead.
e.g.
    // Here `generator` is assumed to be an AVAssetImageGenerator.
    CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
    UIImage *imageDisplay = [UIImage imageWithCGImage:image];
It may be helpful to you.
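If inputImage is nil because image.CGImage is nil (for example, when the UIImage itself only wraps a CIImage), a guard along these lines (a sketch, not from the original answer) avoids feeding nil into the filter:
    // Fall back to -initWithImage: when there is no backing CGImage.
    CIImage *unfilteredImage = image.CGImage
        ? [[CIImage alloc] initWithCGImage:image.CGImage]
        : [[CIImage alloc] initWithImage:image];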

Video stream in AVSampleBufferDisplayLayer doesn't show up in screenshot

I've been using the new Video Toolbox methods to take an H.264 video stream and display it in a view controller using AVSampleBufferDisplayLayer. This all works as intended and the stream looks great. However, when I try to take a screenshot of the entire view, the contents of the AVSampleBufferDisplayLayer (i.e. the decompressed video stream) do not show up. The snapshot shows all the other UI elements (buttons, labels, etc.), but where the layer should be it shows only the background color (which I had set to bright blue), not the live video feed.
In the method below (inspired by this post) I take the SampleBuffer from my stream and queue it to be displayed on the AVSampleBufferDisplayLayer. Then I call my method imageFromLayer: to get the snapshot as a UIImage. (I then either display that UIImage in the UIImageView imageDisplay, or I save it to the device's local camera roll to verify what the UIImage looks like. Both methods yield the same result.)
- (void)h264VideoFrame:(CMSampleBufferRef)sample
{
    [self.AVSampleDisplayLayer enqueueSampleBuffer:sample];
    dispatch_sync(dispatch_get_main_queue(), ^(void) {
        UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer];
        [self.imageDisplay setImage:snapshot];
    });
}
Here I simply take the contents of the AVSampleBufferDisplayLayer and attempt to convert it to a UIImage. If I pass the entire screen into this method as the layer, all other UI elements like labels/buttons/images will show up except for the AVDisplayLayer. If I pass in just the AVDisplayLayer, I get a solid blue image (since the background color is blue).
- (UIImage *)imageFromLayer:(CALayer *)layer
{
    UIGraphicsBeginImageContextWithOptions([layer frame].size, YES, 1.0);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    //UIImageWriteToSavedPhotosAlbum(outputImage, self, nil, nil);
    UIGraphicsEndImageContext();
    return outputImage;
}
I've tried using UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer.presentationLayer]; and .modelLayer, but that didn't help. I've tried queueing the sample buffer and waiting before taking a snapshot, and I've tried messing with the opacity and xPosition of the AVDisplayLayer... I've even tried setting different values for the CMTimebase of the AVDisplayLayer. Any hints are appreciated!
Also, according to this post and this post, other people are having similar troubles with snapshots in iOS 8.
I fixed this by switching from the AVSampleBufferDisplayLayer to a VTDecompressionSession. In the VTDecompressionSession's didDecompress output callback, I send the decompressed image (a CVImageBufferRef) into the following method to get a screenshot of the video stream and turn it into a UIImage.
- (void)screenshotOfVideoStream:(CVImageBufferRef)imageBuffer
{
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:ciImage
                                                   fromRect:CGRectMake(0, 0,
                                                                       CVPixelBufferGetWidth(imageBuffer),
                                                                       CVPixelBufferGetHeight(imageBuffer))];
    UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
    [self doSomethingWithOurUIImage:image];
    CGImageRelease(videoImage);
}
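For reference, here is a hedged sketch of how the decompression output callback might hand the image buffer to that method (bridging the view controller through the session's refCon is my assumption):
    static void didDecompress(void *decompressionOutputRefCon, void *sourceFrameRefCon,
                              OSStatus status, VTDecodeInfoFlags infoFlags,
                              CVImageBufferRef imageBuffer,
                              CMTime presentationTimeStamp, CMTime presentationDuration)
    {
        if (status != noErr || imageBuffer == NULL) return;
        // Assumption: the view controller was passed as decompressionOutputRefCon
        // when the VTDecompressionSession was created.
        MyViewController *vc = (__bridge MyViewController *)decompressionOutputRefCon;
        [vc screenshotOfVideoStream:imageBuffer];
    }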

Reset button to original image from filtered image

I'm adding filters to the image in my image view, but when I tap the filter buttons the filters stack on top of each other. Can you help me create a reset or undo button that goes back to the original image? I get the original image from the camera or the camera roll.
- (IBAction)filter:(id)sender {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *image = [CIImage imageWithCGImage:imageview.image.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:image forKey:kCIInputImageKey];
    [filter setValue:@1.0f forKey:@"inputIntensity"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgImage = [context createCGImage:result fromRect:result.extent];
    UIImage *uiImage = [[UIImage alloc] initWithCGImage:cgImage];
    CGImageRelease(cgImage);
    [self.imageview setImage:uiImage];
}
Your filters are stacking because you're modifying the image in the imageview, and then putting that modified image back into the imageview. You should store your original image in a separate property and always start your modifications from that. For example, a property like:
    @property (nonatomic, strong) UIImage *originalImage;
Then in your method, use the original image instead of the one already in the image view:
    CIImage *image = [CIImage imageWithCGImage:self.originalImage.CGImage];
Storing data in member variables/properties is also just better practice than storing data in the view.
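With that in place, the reset button becomes a one-liner; a sketch (the action name reset: is mine):
    - (IBAction)reset:(id)sender {
        // Restore the unmodified photo from the camera or camera roll.
        [self.imageview setImage:self.originalImage];
    }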

UIImagePNGRepresentation returns nil after CIFilter

Running into a problem getting a PNG representation for a UIImage after having rotated it with CIAffineTransform. First, I have a category on UIImage that rotates an image 90 degrees clockwise. It seems to work correctly when I display the rotated image in a UIImageView.
- (UIImage *)cwRotatedRepresentation
{
    // Not very precise, stop yelling at me.
    CGAffineTransform xfrm = CGAffineTransformMakeRotation(-(6.28 / 4.0));
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:self.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIAffineTransform"];
    [filter setValue:inputImage forKey:@"inputImage"];
    [filter setValue:[NSValue valueWithBytes:&xfrm objCType:@encode(CGAffineTransform)]
              forKey:@"inputTransform"];
    CIImage *result = [filter valueForKey:@"outputImage"];
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
    return [[UIImage alloc] initWithCIImage:result];
}
However, when I try to actually get a PNG for the newly rotated image, UIImagePNGRepresentation returns nil.
- (NSData *)getPNG
{
    UIImage *myImg = [UIImage imageNamed:@"canada"];
    myImg = [myImg cwRotatedRepresentation];
    NSData *d = UIImagePNGRepresentation(myImg);
    // d == nil :(
    return d;
}
Is core image overwriting the PNG headers or something? Is there a way around this behavior, or a better means of achieving the desired result of a PNG representation of a UIImage rotated 90 degrees clockwise?
Not yelling, but -M_PI_2 will give you the 90° constant you want with maximum precision :)
The only other thing that I see is you probably want to be using [result extent] instead of [inputImage extent], unless your image is known to be square.
Neither of those would make UIImagePNGRepresentation fail, though. The actual problem: you create a CGImage and then build the returned UIImage from the CIImage instead, so the result has no backing bitmap. Using initWithCGImage: with the CGImage you already created should give better results.
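Concretely, a sketch of the corrected tail of cwRotatedRepresentation, returning the CGImage-backed image and releasing the CGImage:
    CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
    UIImage *rotated = [[UIImage alloc] initWithCGImage:cgImage];
    CGImageRelease(cgImage); // createCGImage follows the Create rule.
    return rotated;          // CGImage-backed, so UIImagePNGRepresentation works.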