Saving an image displayed in a UIImageView - ios

I have an array of images that are displayed in a UICollectionView.
When a cell in the collection view is pressed, that image is pushed to a view controller and displayed in a UIImageView.
I want to then be able to press a button and save the image to the users camera roll.
But I'm having some trouble doing so...
I think I'm on the right lines with the code but can't get it all to work together:
- (IBAction)onClickSavePhoto:(id)sender {
    UIImage *img = [UIImage imageNamed:@"whichever image is currently displayed in the image view"];
    UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);
}
How can I change the code so the user can save the image that is displayed in the image view?
Thanks in advance!
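For reference, since the image is already loaded into the image view, a minimal sketch would simply read it back from the view's image property (the imageView outlet name below is an assumption):
- (IBAction)onClickSavePhoto:(id)sender {
    // Grab whatever the image view is currently showing (outlet name assumed).
    UIImage *img = self.imageView.image;
    if (img) {
        UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);
    }
}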
UPDATE:
Found the solution to the problem in another post:
Save image in UIImageView to iPad Photos Library

How to save an image to the library:
You can use this function:
UIImageWriteToSavedPhotosAlbum(UIImage *image,
                               id completionTarget,
                               SEL completionSelector,
                               void *contextInfo);
You only need completionTarget, completionSelector and contextInfo if you want to be notified when the UIImage is done saving, otherwise you can pass in nil.
More info is in Apple's documentation for UIImageWriteToSavedPhotosAlbum.
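For example, a minimal sketch using the completion callback; the selector must have exactly this signature (as documented for UIImageWriteToSavedPhotosAlbum), while the wrapper method name is mine:
- (void)savePhoto:(UIImage *)img {
    UIImageWriteToSavedPhotosAlbum(img, self,
                                   @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Save failed: %@", error.localizedDescription);
    } else {
        NSLog(@"Image saved to the photo library");
    }
}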
A supposedly faster way to save an image to the library than using UIImageWriteToSavedPhotosAlbum:
There's a much faster way than UIImageWriteToSavedPhotosAlbum to do this, using ALAssetsLibrary from the AssetsLibrary framework (iOS 4.0+):
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[image CGImage]
                          orientation:(ALAssetOrientation)[image imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        // TODO: error handling
    } else {
        // TODO: success handling
    }
}];
// for non-ARC projects:
// [library release];
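Note that ALAssetsLibrary was later deprecated; on iOS 8 and up the Photos framework is the replacement. A minimal sketch (not from the original answer):
#import <Photos/Photos.h>

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromImage:image];
} completionHandler:^(BOOL success, NSError *error) {
    // TODO: handle success/error (this block is called on an arbitrary queue)
}];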
Get an image of whatever is in the UIImageView, as a screenshot:
iOS 7 has a new method that lets you draw a view hierarchy into the current graphics context. This can be used to get a UIImage very quickly.
This is a category method on UIView that returns the view as a UIImage:
- (UIImage *)takeSnapshot {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, [UIScreen mainScreen].scale);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    // old style: [self.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
It is considerably faster than the existing renderInContext: method.
Reference : https://developer.apple.com/library/ios/qa/qa1817/_index.html
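Usage is then a one-liner on whatever view you want to capture; here, snapshotting the image view from the original question (the imageView outlet name is an assumption):
UIImage *snapshot = [self.imageView takeSnapshot];
UIImageWriteToSavedPhotosAlbum(snapshot, nil, nil, nil);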
UPDATE FOR SWIFT: An extension that does the same:
extension UIView {
    func takeSnapshot() -> UIImage {
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, false, UIScreen.mainScreen().scale)
        self.drawViewHierarchyInRect(self.bounds, afterScreenUpdates: true)
        // old style: self.layer.renderInContext(UIGraphicsGetCurrentContext())
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}

Related

Programmatic Screenshot with Camera View

I am using the code below to take a screenshot of the screen, but the camera view comes out all black (all other UI elements are fine; I need the camera view as well as the UI elements).
All the answers to similar questions that I've found are either extremely old/deprecated, or in Swift. If anyone has a simple Objective-C solution to this problem, it would be much appreciated.
Thanks!
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
    UIGraphicsBeginImageContextWithOptions(self.view.window.bounds.size, NO, [UIScreen mainScreen].scale);
} else {
    UIGraphicsBeginImageContext(self.view.window.bounds.size);
}
[self.view.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSData *imageData = UIImagePNGRepresentation(image);
if (imageData) {
    [imageData writeToFile:@"Screenshot.png" atomically:YES];
    NSLog(@"Screenshot write successful");
} else {
    NSLog(@"Error while taking screenshot");
}
If I understand correctly, this can help you capture the camera (video) layer: https://stackoverflow.com/a/37789235/1678018
You can then add that image on top of the screen image (which you already have from UIGraphicsGetImageFromCurrentImageContext).
An example of drawing one UIImage on top of another: Add UIImage on top of another UIImage
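A minimal sketch of that compositing step (screenImage, cameraImage and cameraViewFrame are assumed names, not from the linked answer):
// Draw the screen capture first, then the camera image on top of it.
UIGraphicsBeginImageContextWithOptions(screenImage.size, NO, screenImage.scale);
[screenImage drawInRect:CGRectMake(0, 0, screenImage.size.width, screenImage.size.height)];
[cameraImage drawInRect:cameraViewFrame]; // wherever the camera preview sits on screen (assumed)
UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();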

Video stream in AVSampleBufferDisplayLayer doesn't show up in screenshot

I've been using the new Video Toolbox methods to take an H.264 video stream and display it in a view controller using AVSampleBufferDisplayLayer. This all works as intended and the stream looks great. However, when I try to take a screenshot of the entire view, the contents of the AVSampleBufferDisplayLayer (i.e. the decompressed video stream) do not show up in the snapshot. The snapshot shows all other UI buttons/labels/etc. but the screenshot only shows the background color of the AVSampleBufferDisplayLayer (which I had set to bright blue) and not the live video feed.
In the method below (inspired by this post) I take the SampleBuffer from my stream and queue it to be displayed on the AVSampleBufferDisplayLayer. Then I call my method imageFromLayer: to get the snapshot as a UIImage. (I then either display that UIImage in the UIImageView imageDisplay, or I save it to the device's local camera roll to verify what the UIImage looks like. Both methods yield the same result.)
- (void)h264VideoFrame:(CMSampleBufferRef)sample
{
    [self.AVSampleDisplayLayer enqueueSampleBuffer:sample];
    dispatch_sync(dispatch_get_main_queue(), ^(void) {
        UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer];
        [self.imageDisplay setImage:snapshot];
    });
}
Here I simply take the contents of the AVSampleBufferDisplayLayer and attempt to convert it to a UIImage. If I pass the entire screen into this method as the layer, all other UI elements like labels/buttons/images will show up except for the AVDisplayLayer. If I pass in just the AVDisplayLayer, I get a solid blue image (since the background color is blue).
- (UIImage *)imageFromLayer:(CALayer *)layer
{
    UIGraphicsBeginImageContextWithOptions([layer frame].size, YES, 1.0);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    //UIImageWriteToSavedPhotosAlbum(outputImage, self, nil, nil);
    UIGraphicsEndImageContext();
    return outputImage;
}
I've tried using UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer.presentationLayer]; and .modelLayer, but that didn't help. I've tried queueing the sample buffer and waiting before taking a snapshot, I've tried messing with the opacity and xPosition of the AVDisplayLayer... I've even tried setting different values for the CMTimebase of the AVDisplayLayer. Any hints are appreciated!
Also, according to this post and this post, other people are having similar trouble with snapshots in iOS 8.
I fixed this by switching from AVSampleDisplayLayer to VTDecompressionSession. In the VTDecompression didDecompress callback method, I send the decompressed image (CVImageBufferRef) into the following method to get a screenshot of the video stream and turn it into a UIImage.
- (void)screenshotOfVideoStream:(CVImageBufferRef)imageBuffer
{
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:ciImage
                                                   fromRect:CGRectMake(0, 0,
                                                                       CVPixelBufferGetWidth(imageBuffer),
                                                                       CVPixelBufferGetHeight(imageBuffer))];
    UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
    [self doSomethingWithOurUIImage:image];
    CGImageRelease(videoImage);
}
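For context, a rough sketch of the VideoToolbox wiring that would feed this method; the controller class name and refCon handling are assumptions, and only the callback signature comes from the VideoToolbox headers:
#import <VideoToolbox/VideoToolbox.h>

// Output callback registered when creating the VTDecompressionSession.
static void didDecompress(void *decompressionOutputRefCon,
                          void *sourceFrameRefCon,
                          OSStatus status,
                          VTDecodeInfoFlags infoFlags,
                          CVImageBufferRef imageBuffer,
                          CMTime presentationTimeStamp,
                          CMTime presentationDuration)
{
    if (status != noErr || imageBuffer == NULL) {
        return; // decode failed or the frame was dropped
    }
    // The refCon is whatever pointer was passed at session creation (assumed to be the controller).
    MyStreamViewController *controller = (__bridge MyStreamViewController *)decompressionOutputRefCon;
    [controller screenshotOfVideoStream:imageBuffer];
}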

UIImage loaded from managed object data doesn't show up in UIImageView

I've got a pretty simple iOS app (adapted from basic master-detail boilerplate).
I've got RestKit set up to load data from a server. If an object's image URL gets updated, I download the image (using an AFHTTPClient subclass) and save its data using UIImagePNGRepresentation(image). Pretty straightforward.
So now, I've got a database that's already populated with objects - including their imageData. But for some reason, though I can get a UIImage instance from the data, that UIImage won't show up in a UIImageView.
I've got a category on the auto-generated NSManagedObject subclass, which (among other things) pulls the image data, and returns a UIImage instance:
@implementation Artwork (Helpers)

// ...

- (UIImage *)image {
    if (self.imageData) {
        return [UIImage imageWithData:self.imageData];
    }
    return nil;
}

@end
In my detail view, I have a UIImageView, whose image is set from the above method. Here's the relevant bit from my detail view controller. It gets called just before the segue, and works fine for setting the description text, but doesn't set the image correctly.
- (void)configureView {
    // Update the user interface for the detail item (an Artwork instance in this case).
    if (self.detailItem) {
        // this works just fine
        self.detailDescriptionText.text = self.detailItem.rawDescription;

        // ... but this doesn't! Nothing is shown in the image view.
        UIImage *image = self.detailItem.image;
        if (image) {
            // Yes, the UIImage *is* there
            NSLog(@"UIImage instance: %@, size: %fx%f", image, image.size.width, image.size.height);
            // ... but this doesn't seem to have any effect
            self.imageView.image = image;
        }
    }
}
The NSLog call prints:
UIImage instance: <UIImage: 0x109a0d090>, size: 533.000000x300.000000
so it certainly seems like the UIImage object exists and has been unpacked from the data just like it should. But nothing shows up in the UIImageView.
Interestingly, if I set up a simple touch-listener on the detail view controller, I can show the image using the exact same code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UIImage *image = self.detailItem.image;
    if (image) {
        NSLog(@"UIImage instance: %@, size: %fx%f", image, image.size.width, image.size.height);
        self.imageView.image = image;
    }
}
That works perfectly - tap the screen and the image shows up immediately, and the NSLog call prints:
UIImage instance: <UIImage: 0x10980a7e0>, size: 533.000000x300.000000
So there really is image data, and it does get unpacked into a proper UIImage - but it won't show up.
So, all in all, it seems like there's some sort of timing or threading issue. But here I'm drawing a blank.
Make sure to set your image on the main thread :)
dispatch_async(dispatch_get_main_queue(), ^(void) {
    /* your code here */
});
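Applied to the asker's configureView, that would look roughly like this (names taken from the question):
UIImage *image = self.detailItem.image;
if (image) {
    // Update the image view on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });
}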

GPUImage output image is missing in screen capture

I am trying to capture a portion of the screen to post the image on social media.
I am using the following code to capture the screen.
- (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
The above code captures the screen perfectly.
Problem :
My UIView contains a GPUImageView with the filtered image. When I try to capture the screen using the above code, the GPUImageView portion does not contain the filtered image.
I am using GPUImageSwirlFilter with a static image (no camera). I have also tried
UIImage *outImage = [swirlFilter imageFromCurrentFramebuffer];
but it's not returning an image.
Note: The following code works and renders the swirl effect perfectly, but I want the same image in a UIImage object.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
    GPUImageSwirlFilter *swirlFilter = [[GPUImageSwirlFilter alloc] init];
    swirlLevel = 4;
    [swirlFilter setAngle:(float)swirlLevel/10];
    UIImage *inputImage = [UIImage imageNamed:gi.wordImage];
    GPUImagePicture *swirlSourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage];
    inputImage = nil;
    [swirlSourcePicture addTarget:swirlFilter];
    dispatch_async(dispatch_get_main_queue(), ^{
        [swirlFilter addTarget:imgSwirl];
        [swirlSourcePicture processImage];
        // This works perfectly and I have the filtered image in my imgSwirl. But I want the
        // filtered image in a UIImage to use elsewhere, e.g. for posting on social media.
        sharingImage = [swirlFilter imageFromCurrentFramebuffer]; // This also returns nothing.
    });
});
1) Am I doing something wrong with GPUImage's imageFromCurrentFramebuffer?
2) Why does the screen-capture code not include the GPUImageView portion in the output image?
3) How do I get the filtered image into a UIImage?
First, -renderInContext: won't work with a GPUImageView, because a GPUImageView renders using OpenGL ES. -renderInContext: does not capture from CAEAGLLayers, which are used to back views presenting OpenGL ES content.
Second, you're probably getting a nil image in the latter code because you've forgotten to set -useNextFrameForImageCapture on your filter before triggering -processImage. Without that, your filter won't hang on to its backing framebuffer long enough to capture an image from it. This is due to a recent change in the way that framebuffers are handled in memory (although this change did not seem to get communicated very well).
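Concretely, the asker's main-queue block would gain one call before processImage; this is a sketch based on the GPUImage API described above, reusing the question's variable names:
dispatch_async(dispatch_get_main_queue(), ^{
    [swirlFilter addTarget:imgSwirl];
    [swirlFilter useNextFrameForImageCapture]; // keep the framebuffer around for capture
    [swirlSourcePicture processImage];
    sharingImage = [swirlFilter imageFromCurrentFramebuffer];
});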

Save scene to camera roll

I'm working on an augmented reality app for iPhone and I'm using the sample code "ImageTargets" from the Vuforia SDK. I'm using my own images as templates and my own model to augment the scene (just a few vertices in OpenGL). The next thing I want to do is save the scene to the camera roll after pushing a button. I created the button as well as the method the button responds to. Here comes the tricky part: when I press the button the method gets called and an image is saved, but the image is completely white, showing only the button icon (like this http://tinypic.com/r/16c2kjq/5).
- (void)saveImage {
    UIGraphicsBeginImageContext(self.view.layer.frame.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, self,
                                   @selector(image:didFinishSavingWithError:contextInfo:), nil);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error
                                             contextInfo:(void *)contextInfo {
    NSLog(@"Image Saved");
}
I have these two methods in the ImageTargetsParentViewController class, but I also tried saving the view from ARParentViewController (and even moved the methods to that class). Has anyone found a solution to this? I'm not sure which view to save, or whether there are any tricky parts to saving a view that contains OpenGL ES content. Thanks for any reply.
Try using this code to save the photo:
- (void)saveImage {
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *imagee = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGRect rect = CGRectMake(0, 0, 320, 480);
    CGImageRef imageRef = CGImageCreateWithImageInRect([imagee CGImage], rect);
    UIImage *img = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);
}
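As noted in the snapshot answer earlier on this page, renderInContext: generally does not capture OpenGL ES content, which is likely why the saved image is white. A hedged alternative sketch using the iOS 7 snapshot API, which can capture GL-backed views in many cases (whether it works here depends on how Vuforia renders its scene):
- (void)saveImage {
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, [UIScreen mainScreen].scale);
    // drawViewHierarchyInRect: snapshots the rendered view, not just the layer tree.
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
}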
