UIImageWriteToSavedPhotosAlbum - Failed to encode image for saved photos - ios

In my main ViewController, I generate an image from UIGraphicsGetImageFromCurrentImageContext(). The image is assigned to a UIImage, and I've tested it by placing it in a UIImageView in the main view, so I can see it; it works just fine. However, when I tap the button I've assigned to save it, I get the error: -3304 "Failed to encode image for saved photos."
Screenshot code:
CGSize mySize = CGSizeMake(530, 350);
UIGraphicsBeginImageContext(mySize);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(context, -444, -313);
[self.view.layer renderInContext:context];
// self.screenshot is a UIImage declared in the .h
self.screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Save code:
UIImageWriteToSavedPhotosAlbum(screenshot, self, @selector(imageWasSavedSuccessfully:didSaveWithError:contextInfo:), NULL);
I'm not sure whether I'm failing to meet the requirements for camera-roll images, or whether that function can only be used in conjunction with UIImagePickerController. Thanks for your help.

For posterity, the issue was that the image to be saved was nil.

Try to save your image like this:
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil);
or maybe
UIImageWriteToSavedPhotosAlbum(self.screenshot, nil, nil, nil);
or maybe even
UIImageWriteToSavedPhotosAlbum(screenshot, self, nil, nil);
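Since the root cause turned out to be a nil image, a defensive sketch might help. This is hypothetical code, not from the question (the action name `saveScreenshot:` is made up), but the completion callback uses the exact signature UIKit requires for the selector argument:

```objectivec
// Guard against a nil image before handing it to the photo album.
- (IBAction)saveScreenshot:(id)sender {
    if (self.screenshot == nil) {
        NSLog(@"screenshot is nil -- nothing to save");
        return;
    }
    UIImageWriteToSavedPhotosAlbum(self.screenshot, self,
        @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

// The callback must have exactly this signature.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error
  contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Save failed: %@", error.localizedDescription);
    } else {
        NSLog(@"Image saved");
    }
}
```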

Related

iOS always add white background to images

I am writing an app that processes images. When I change the alpha of an image to 0, the image becomes transparent. Then I save the image to the photo library:
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
But when I open this image again, I see that it has a white background. Does anybody know why?
Did you save it as JPEG? Try this:
NSData *pngdata = UIImagePNGRepresentation(image);
UIImage *img = [UIImage imageWithData:pngdata];
UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);

How do I programmatically combine multiple UIImages into one long UIImage?

I'm using the code from "How can I programmatically put together some UIImages to have one big UIImage?" to combine multiple screenshots into one long vertical image. However, I'm having trouble calling the function that combines multiple images together:
imageContainer = [UIImage imageByCombiningImage:imageContainer withImage:viewImage];
How do I call this UIImage+combine category to merge the library of images together?
Here's my Code:
- (void)printScreen {
    // this method starts the screenshot-taking process
    // this line is necessary to capture the completion of the scrollView animation
    _webView.scrollView.delegate = self;
    // save the first image
    UIGraphicsBeginImageContextWithOptions(_webView.bounds.size, NO, 0);
    [_webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
    // scroll to the next section
    [_webView.scrollView scrollRectToVisible:CGRectMake(0, _webView.frame.size.height, _webView.frame.size.width, _webView.frame.size.height) animated:YES];
}
- (void)scrollViewDidEndScrollingAnimation:(UIScrollView *)scrollView {
    // at this point the _webView has scrolled to the next section
    // save the offset to make the code a little easier to read
    CGFloat offset = _webView.scrollView.contentOffset.y;
    UIGraphicsBeginImageContextWithOptions(_webView.bounds.size, NO, 0);
    // note that the line below will not work if you replace _webView.layer with _webView.scrollView.layer
    [_webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image1 = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(image1, nil, nil, nil);
    // if we are not done yet, scroll to the next section
    if (offset < _webView.scrollView.contentSize.height) {
        [_webView.scrollView scrollRectToVisible:CGRectMake(0, _webView.frame.size.height + offset, _webView.frame.size.width, _webView.frame.size.height) animated:YES];
    }
    // note: image2 is never assigned in this snippet, which is part of the problem
    UIImage *image = [UIImage imageByCombiningImage:image1 withImage:image2];
    [[self theNewImageView] setImage:image];
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    NSLog(@"%@ printScreen Image", image);
    NSLog(@"%@ printScreen Image2", image2);
}
Edit:
At this point I've tried a lot of different things. I'm new to Objective-C development, so I've been digging deep into the Apple developer docs and other good sources like Stack Overflow and Lynda.
I'm seeing two different entries in the logs, printScreen Image2 and printScreen Image, but I just can't get the function to call the new category.
I'm able to write all the images as separate pieces to the photo album, and I can merge image1 or image2 into one image, but not both images or all images.
I figured it out! It was right in front of my face the entire time. I needed to add the call to the category within scrollViewDidEndScrollingAnimation and then save with UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);. I had probably tried a couple of versions of this solution already, but I renamed some variables early on to align with my project, which screwed everything up.
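The category itself never appears in the question. As a reference point, a minimal sketch of a vertical-stacking `imageByCombiningImage:withImage:` (an assumption about what the linked answer implements, not the original code) could look like:

```objectivec
// UIImage+Combine -- hypothetical category that stacks two images
// vertically into one tall image.
@interface UIImage (Combine)
+ (UIImage *)imageByCombiningImage:(UIImage *)firstImage
                         withImage:(UIImage *)secondImage;
@end

@implementation UIImage (Combine)
+ (UIImage *)imageByCombiningImage:(UIImage *)firstImage
                         withImage:(UIImage *)secondImage {
    // Degrade gracefully when one input is nil (the bug in the question).
    if (!firstImage)  return secondImage;
    if (!secondImage) return firstImage;
    CGSize size = CGSizeMake(MAX(firstImage.size.width, secondImage.size.width),
                             firstImage.size.height + secondImage.size.height);
    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    [firstImage drawAtPoint:CGPointZero];
    [secondImage drawAtPoint:CGPointMake(0, firstImage.size.height)];
    UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return combined;
}
@end
```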

Keep transparency in screenshot iOS

I am taking screenshot of a particular View in my Xib file with the following code...
UIView *captureView = self.view;
UIGraphicsBeginImageContextWithOptions(captureView.bounds.size, NO, 0.0f);
[captureView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// note: the originally posted code passed an undefined viewImage here
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil);
It works fine and saves a JPG image to the camera roll.
But the problem is that there is another UIImageView on top of my view, and that UIImageView contains a semi-transparent image.
My screenshot doesn't preserve that transparency.
I want to keep the transparency as it is on the actual screen.
How can I preserve the transparency in the screenshot?
If you specify NO for the opaque parameter, your image must include an alpha channel for this to work; check that your image has one.
JPGs don't support transparency, so as soon as the image is converted to JPG, the alpha is gone.
This is a known limitation of UIImageWriteToSavedPhotosAlbum: it doesn't keep PNG.
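If preserving the alpha channel matters, one workaround from that era (a sketch, assuming the AssetsLibrary framework, which has since been deprecated in favor of Photos) is to write the raw PNG bytes instead of a UIImage, so nothing re-encodes them as JPEG:

```objectivec
#import <AssetsLibrary/AssetsLibrary.h>

// Writing the already-encoded PNG data directly avoids the JPEG
// re-encode that drops the alpha channel. `screenshot` is assumed
// to be the UIImage captured earlier.
NSData *pngData = UIImagePNGRepresentation(screenshot);
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:pngData
                                 metadata:nil
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save failed: %@", error);
    } else {
        NSLog(@"Saved PNG to %@", assetURL);
    }
}];
```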
Try this; this code works for me:
UIGraphicsBeginImageContext(baseViewOne.frame.size);
[[baseViewOne layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage *screenshota = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Also check the Cocoa Coder screen-shots example:
NSData *imdata = UIImagePNGRepresentation(_snapshotImgView.image);
UIImage *snapshotPNG = [UIImage imageWithData:imdata];
UIImageWriteToSavedPhotosAlbum(snapshotPNG, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);

Save scene to camera roll

I'm working on an augmented reality app for iPhone, using the sample code "ImageTargets" from the Vuforia SDK. I'm using my own images as templates and my own model to augment the scene (just a few vertices in OpenGL). The next thing I want to do is save the scene to the camera roll when a button is pushed. I created the button as well as the method the button responds to. Here comes the tricky part: when I press the button the method gets called and an image is saved properly, but the image is completely white, showing only the button icon (like this: http://tinypic.com/r/16c2kjq/5).
- (void)saveImage {
    UIGraphicsBeginImageContext(self.view.layer.frame.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, self,
        @selector(image:didFinishSavingWithError:contextInfo:), nil);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error
  contextInfo:(void *)contextInfo {
    NSLog(@"Image Saved");
}
I have these two methods in the ImageTargetsParentViewController class, but I also tried saving the view from ARParentViewController (and even moved the methods to that class). Has anyone found a solution to this? I'm not sure which view to save, and/or whether there are any tricky parts to saving a view that contains OpenGL ES. Thanks for any reply.
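A likely cause of the white capture: renderInContext: only renders Core Animation content, not what OpenGL ES draws into its layer, so the GL area comes out blank. A common workaround (a sketch only, assuming an ES 2.0 context whose framebuffer is `width` x `height` pixels and is current when called; the helper name is made up) is to read the pixels straight out of the framebuffer:

```objectivec
#import <OpenGLES/ES2/gl.h>
#import <UIKit/UIKit.h>

// Hypothetical helper: read the currently bound framebuffer into a UIImage.
static UIImage *GLSnapshot(GLint width, GLint height) {
    NSInteger byteCount = width * height * 4;
    GLubyte *pixels = (GLubyte *)malloc(byteCount);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // Wrap the raw RGBA buffer in a CGImage.
    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, pixels, byteCount, NULL);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef imageRef = CGImageCreate(width, height, 8, 32, width * 4,
        colorSpace, kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast,
        provider, NULL, NO, kCGRenderingIntentDefault);

    // GL rows are bottom-up, so flip vertically while drawing.
    UIGraphicsBeginImageContext(CGSizeMake(width, height));
    CGContextRef cg = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(cg, 0, height);
    CGContextScaleCTM(cg, 1.0, -1.0);
    CGContextDrawImage(cg, CGRectMake(0, 0, width, height), imageRef);
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    free(pixels);
    return snapshot;
}
```

The resulting UIImage can then be passed to UIImageWriteToSavedPhotosAlbum as usual.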
Try this code to save the photo:
- (void)saveImage {
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *imagee = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGRect rect = CGRectMake(0, 0, 320, 480);
    CGImageRef imageRef = CGImageCreateWithImageInRect([imagee CGImage], rect);
    UIImage *img = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);
}

Merging two UIImages into one to be saved to the library

I would like to know how I can merge two UIImages into one and save the result to the library. For saving images I'm using a UIButton. Here's a snippet of how I save a UIImageView's image:
- (IBAction)getPhoto:(id)sender {
    UIImage *imageToSave = imageOverlay.image;
    UIImageWriteToSavedPhotosAlbum(imageToSave, nil, nil, nil);
}
I've looked online and read about UIGraphicsBeginImageContext. I found an example, but I couldn't understand how to apply it to my case. Here's what I've got so far:
- (UIImage *)addImage:(UIImage *)image secondImage:(UIImage *)image2
{
    UIGraphicsBeginImageContext(image.size);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    [image2 drawInRect:CGRectMake(10, 10, image2.size.width, image2.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Right now I have two UIImageViews, imageOverlay and imageView. If I use the method above, how do I pass the return value to UIImageWriteToSavedPhotosAlbum? I hope someone can point me in the right direction.
Thanks very much.
It seems like you don't have the declaration for the method in your header file. In the .h file of the class/view controller from which you're running the IBAction, add the first line of your method, - (UIImage *)addImage:..., and end it with a ; before the @end. That should work, though you'd also have to implement it in the .m file that goes with that .h file.
Try this:
- (IBAction)getPhoto:(id)sender {
    UIImage *imageToSave = [self addImage:imageOverlay.image secondImage:imageView.image];
    UIImageWriteToSavedPhotosAlbum(imageToSave, nil, nil, nil);
}
Create an NSLog and pass the images through it like this (image1 and image2 are your images):
NSLog(@"testing images aren't nil. Details for image1: %@ image2: %@", image1, image2);
I'm a newbie too, and I don't really know a more sophisticated way to check whether an image is nil, but I found that passing a nil image to the statement above prints nil instead of the image's details. If you get lots of information back, even if you don't understand all of it (as I usually don't), at least you know the images aren't nil.