Is [UIImage imageWithCIImage:scale:orientation] broken on iOS 10? - ios

The following code no longer works on iOS 10: the image data remains unrotated.
I have confirmed that this code works on iOS 8 and 9.
CIImage *i = [[CIImage alloc] initWithImage:image];
imageView.image = [UIImage imageWithCIImage:i scale:image.scale orientation:UIImageOrientationRight];
Has anyone run into this same problem? Is this a bug, or an intended change?

My guess is that something has changed in how UIImageView handles image orientation flags. I can't find the change documented anywhere, but at least the code below works. Taken from here.
- (UIImage *)rotateImage:(UIImage *)sourceImage clockwise:(BOOL)clockwise
{
    CGSize size = sourceImage.size;
    UIGraphicsBeginImageContext(CGSizeMake(size.height, size.width));
    [[UIImage imageWithCGImage:[sourceImage CGImage]
                         scale:1.0
                   orientation:clockwise ? UIImageOrientationRight : UIImageOrientationLeft]
     drawInRect:CGRectMake(0, 0, size.height, size.width)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
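Another workaround, offered only as a sketch, is to render the CIImage through a CIContext so the pixels are baked into a CGImage before the orientation flag is applied; UIImageView displays plain CGImage-backed images reliably. This is not a confirmed explanation of the iOS 10 behavior, just an alternative path:
// Render the CIImage to a CGImage first, then wrap it with the desired orientation.
CIImage *ci = [[CIImage alloc] initWithImage:image];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cg = [context createCGImage:ci fromRect:[ci extent]];
imageView.image = [UIImage imageWithCGImage:cg
                                      scale:image.scale
                                orientation:UIImageOrientationRight];
CGImageRelease(cg);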

If you need to change the CIImage's orientation, please try this.
let newCIImage: CIImage
if #available(iOS 11.0, *) {
    newCIImage = myCIImage.oriented(.right)
} else {
    newCIImage = myCIImage.oriented(forExifOrientation: Int32(CGImagePropertyOrientation.right.rawValue))
}

Related

Snapshot is not working?

I am new to mobile programming. I am working on H.264 video rendering in an iOS application using the VideoToolbox framework. It has a feature to take a snapshot while rendering the video, but whenever I take a snapshot I get a black screen only.
I tried the following methods to capture the rendering video, but each returns only a black screen:
1. renderInContext
2. drawViewHierarchyInRect
3. snapshotViewAfterScreenUpdates
// snapshot code
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, YES, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
mImageView.image = snapshotImage;
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(snapshotImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
Check this out; the following chunk of code works for me to take a snapshot of the screen:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(APP_DELEGATE.window.bounds.size, NO, [[UIScreen mainScreen] scale]);
else
    UIGraphicsBeginImageContext(APP_DELEGATE.window.bounds.size);
[APP_DELEGATE.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *data = UIImagePNGRepresentation(image);
I guess it will help you; let me know if it does.
I haven't worked with video yet, but a simple snapshot of a UIView with subviews on it works fine:
+ (UIImage *)makeSnapShot:(UIView *)view image:(UIImageView *)imageView
{
    CGFloat offset_x = /*your_value*/;
    CGFloat offset_y = /*your_value*/;
    UIGraphicsBeginImageContext(view.bounds.size);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGRect rect = CGRectMake(offset_x, offset_y, imageView.bounds.size.width, imageView.bounds.size.height);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], rect);
    image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return image;
}
Not sure if this is what you're looking for, but if you need to get a snapshot of the VTDecompressionSession, you can send the CVImageBuffer that you get from the decodeFrame callback into this method to get a UIImage. You can also add your CIContext to the parameters list instead of using the temporaryContext.
+ (UIImage *)UIImageFromCVImageBufferRef:(CVImageBufferRef)imageBuf
{
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuf];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext
                             createCGImage:ciImage
                             fromRect:CGRectMake(0, 0,
                                                 CVPixelBufferGetWidth(imageBuf),
                                                 CVPixelBufferGetHeight(imageBuf))];
    UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
    CGImageRelease(videoImage);
    return image;
}
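As a usage sketch (the callback name and the MyRenderer class are assumptions; only the conversion call is the method above), this can be called from the VTDecompressionSession output callback, which hands you the decoded CVImageBufferRef:
#import <UIKit/UIKit.h>
#import <VideoToolbox/VideoToolbox.h>
// Hypothetical decompression output callback; the wiring around the
// conversion call is illustrative only.
static void didDecompressFrame(void *decompressionOutputRefCon,
                               void *sourceFrameRefCon,
                               OSStatus status,
                               VTDecodeInfoFlags infoFlags,
                               CVImageBufferRef imageBuffer,
                               CMTime presentationTimeStamp,
                               CMTime presentationDuration)
{
    if (status != noErr || imageBuffer == NULL) {
        return; // decode failed or no pixel buffer for this frame
    }
    UIImage *snapshot = [MyRenderer UIImageFromCVImageBufferRef:imageBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Display or save the snapshot on the main thread.
        UIImageWriteToSavedPhotosAlbum(snapshot, nil, NULL, NULL);
    });
}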
func takeScreenshot(_ shouldSave: Bool = true) {
    var screenshotImage: UIImage?
    let layer = UIApplication.shared.keyWindow!.layer
    let scale = UIScreen.main.scale
    UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale)
    self.view.drawHierarchy(in: self.view.bounds, afterScreenUpdates: true)
    screenshotImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    if let image = screenshotImage, shouldSave {
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}

How to set the value of the scale property for a resizable image in iOS?

How do I set the value of the scale property for a resizable image in iOS?
- (IBAction)changeToLdpi:(id)sender
{
    myImage = [myImage resizableImageWithCapInsets:UIEdgeInsetsMake(50, 150, 50, 150) resizingMode:UIImageResizingModeStretch];
    //self.scale=2.0;
    CIImage *scaledImg = [[CIImage alloc] initWithImage:myImage];
    myImage = [UIImage imageWithCIImage:scaledImg scale:2.0 orientation:UIImageOrientationLeft];
    [self->imageView setImage:myImage];
    //[_view addSubview:imageView];
}
I have used this code, but the scale:2.0 is not working.
How should I do this?
Use this method; it works:
myImage= [myImage resizableImageWithCapInsets:UIEdgeInsetsMake(50, 150,50,150)];
UIImage *picture = [UIImage imageWithCGImage:[myImage CGImage] scale:2.0 orientation:UIImageOrientationLeft];
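As a quick sanity check (just a sketch, not part of the original answer), you can log the resulting point size and scale; with scale:2.0 the reported point size should be half of the source's pixel dimensions:
// Hypothetical verification: with scale 2.0, a 300x300-pixel image reports
// a 150x150-point size, which is what UIKit uses for layout.
NSLog(@"size: %@ scale: %f", NSStringFromCGSize(picture.size), picture.scale);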

Stretched UIView background gets cut off during screenshot

So, I am taking a screenshot of a subclassed UIView that I save into the device's photo stream.
Problem:
I use resizableImageWithCapInsets to add a stretched background to my UIView, but this background gets cut off on the right side and I have no idea why. If someone could help me out it would be highly appreciated.
I add the stretched background to my UIView the following way:
[diagramBase addSubview:[self addTileBackgroundOfSize:diagramBase.frame
                                              andType:@"ipad_diagram_border.png"]];
Which calls this method:
- (UIImageView *)addTileBackgroundOfSize:(CGRect)frame
                                 andType:(NSString *)type
{
    frame.origin.x = 0.0f;
    frame.origin.y = 0.0f;
    UIImageView *backgroundView = [[UIImageView alloc] initWithFrame:frame];
    UIImage *image = [UIImage imageNamed:type];
    UIEdgeInsets insets = UIEdgeInsetsMake(10.0f, 10.0f, 10.0f, 10.0f);
    UIImage *backgroundImage = [image resizableImageWithCapInsets:insets];
    backgroundView.image = backgroundImage;
    return backgroundView;
}
The actual screenshot is done with this method (RINDiagramView is the name of my subclassed UIView, which I am taking a screenshot of). The rotation is in there because I need the image rotated when I save it, but I commented that part out, and it is not what makes the background act weird.
- (UIImage *)createSnapshotOfView:(RINDiagram *)view
{
    CGRect rect = [view bounds];
    rect.size.height = rect.size.height - 81.0f;
    UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:context];
    UIImage *capturedScreen = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                     scale:1.0
                                               orientation:UIImageOrientationLeft];
    return finalImage;
}
I use Xcode 5.1 and everything is done programmatically (no storyboard and such). The base SDK is iOS 7.1.
If you're doing iOS 7+, you can use the new drawViewHierarchyInRect:afterScreenUpdates: and related methods, which Apple says are really performant.
Even if you're targeting iOS 6 you should give it a try to see if you get the same problem.
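A minimal sketch of that path, assuming diagramView stands in for the RINDiagram instance you want to capture:
// iOS 7+ snapshot via drawViewHierarchyInRect:afterScreenUpdates:.
UIGraphicsBeginImageContextWithOptions(diagramView.bounds.size, NO, 0.0f);
[diagramView drawViewHierarchyInRect:diagramView.bounds afterScreenUpdates:YES];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();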
Try using the correct scale?
UIImage *finalImage = [[UIImage alloc] initWithCGImage: capturedScreen.CGImage
scale: [[UIScreen mainScreen] scale]
orientation: UIImageOrientationLeft];
Use a different UIViewContentMode? (See the snippet below for where to set it.)
UIViewContentModeScaleToFill -> check if you can see the edges
UIViewContentModeScaleAspectFit -> check if you can see the edges, even if the position is incorrect
UIViewContentModeScaleAspectFill -> check the right-side edge
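For example, a one-line sketch against the addTileBackgroundOfSize:andType: helper above (backgroundView is the UIImageView it creates; which mode to keep is whatever the checks above point to):
// Try different content modes on the stretched background image view.
backgroundView.contentMode = UIViewContentModeScaleToFill;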
The reason you get an image cut off on the right side is this line:
UIImage *finalImage = [[UIImage alloc] initWithCGImage: capturedScreen.CGImage
scale: 1.0
orientation: UIImageOrientationLeft];
You set the image orientation to left, so the context treats the left side as the top. Since you also subtract from the height, the result is that the right side gets cut off.
As for the rotation, I added some code to yours. Hope it is helpful.
- (UIImage *)createSnapshotOfView:(UIView *)view
{
    CGRect rect = [view bounds];
    rect.size.height = rect.size.height - 81.0f;
    UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();
    view.transform = CGAffineTransformMakeRotation(M_PI_2);
    [view.layer renderInContext:context];
    UIImage *capturedScreen = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImage *finalImage = [[UIImage alloc] initWithCGImage:capturedScreen.CGImage
                                                     scale:1.0
                                               orientation:UIImageOrientationLeft];
    view.transform = CGAffineTransformMakeRotation(0);
    return finalImage;
}
UIGraphicsBeginImageContext(self.window.bounds.size);
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData * data = UIImagePNGRepresentation(image);
[data writeToFile:@"foo.png" atomically:YES];
For a Retina display, change the first line to this:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(self.window.bounds.size, NO, [UIScreen mainScreen].scale);
else
    UIGraphicsBeginImageContext(self.window.bounds.size);
Adjust your size as needed; hope this helps.

Move, rotate and scale UIImages, then merge them! in iPhone

The rotated image does not merge correctly in this code.
I am posting my code; please give me a solution.
- (void)mergeImage:(UIImage *)imageA withImage:(UIImage *)imageB
{
    UIImage *image0 = imageA;
    UIImage *image1 = overlayimg.image;
    CGSize newImageSize = CGSizeMake(overlayimg.frame.size.width, overlayimg.frame.size.height);
    UIGraphicsBeginImageContext(newImageSize);
    [image0 drawInRect:CGRectMake(overlayimg.frame.origin.x, overlayimg.frame.origin.y, newImageSize.width, newImageSize.height)];
    NSLog(@"Last Rotation:%f", templastrotate);
    overlayimg.transform = CGAffineTransformMakeRotation(templastrotate);
    CGContextConcatCTM(UIGraphicsGetCurrentContext(), overlayimg.transform);
    [image1 drawInRect:CGRectMake(overlayView.frame.origin.x, overlayView.frame.origin.y, overlayView.frame.size.width, overlayView.frame.size.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    imageView.contentMode = UIViewContentModeScaleAspectFill & UIViewContentModeScaleAspectFit;
    imageView.image = newImage;
    UIGraphicsEndImageContext();
}
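For reference, a minimal sketch of compositing a rotated overlay onto a base image; the method name, parameters, and the choice to rotate around the overlay's centre are assumptions here, not code from the question:
// Draw the base image, then draw the overlay rotated about its own centre.
- (UIImage *)mergeBase:(UIImage *)baseImage
               overlay:(UIImage *)overlayImage
                inSize:(CGSize)size
          overlayFrame:(CGRect)overlayFrame
              rotation:(CGFloat)rotation
{
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    [baseImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    CGContextSaveGState(ctx);
    CGContextTranslateCTM(ctx, CGRectGetMidX(overlayFrame), CGRectGetMidY(overlayFrame));
    CGContextRotateCTM(ctx, rotation);
    [overlayImage drawInRect:CGRectMake(-overlayFrame.size.width / 2.0,
                                        -overlayFrame.size.height / 2.0,
                                        overlayFrame.size.width,
                                        overlayFrame.size.height)];
    CGContextRestoreGState(ctx);
    UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return merged;
}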

Save UIView to a transparent PNG

I have a UIView and I want it to be stored as a transparent PNG, i.e. without the UIView background color...
I am currently using this code and it's working OK but with the background color :(
UIGraphicsBeginImageContext(self.bounds.size);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage* image1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image1);
[imageData writeToFile:filePath atomically:YES];
So does anyone have any idea how to get this image as a transparent one?
Thank you in advance.
In my case, I forgot the opaque property. It should be set to NO:
view.opaque = NO;
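A minimal sketch of where that fits relative to the render call (view is a stand-in for your own UIView):
// Make the view non-opaque and its background clear *before* rendering,
// so the resulting PNG keeps its alpha channel.
view.opaque = NO;
view.backgroundColor = [UIColor clearColor];
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();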
Change the background color to [UIColor clearColor], draw the image, and then set the background color back to the original color.
Since GUI updates will fire only on the next runloop cycle, the user should not see any flickering.
UIColor *color = self.backgroundColor;
self.backgroundColor = [UIColor clearColor];
UIGraphicsBeginImageContext(self.bounds.size);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image1);
[imageData writeToFile:filePath atomically:YES];
self.backgroundColor = color;
As an addition to Gilad's answer: on a Retina display this may cause some quality issues. To get a Retina-aware context you can use this code from this post.
UIColor *color = self.backgroundColor;
self.backgroundColor = [UIColor clearColor];
// This is for the Retina render check
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, [UIScreen mainScreen].scale);
else
    UIGraphicsBeginImageContext(self.bounds.size);
// And it goes up to here; the rest stays the same
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(image1);
[imageData writeToFile:filePath atomically:YES];
self.backgroundColor = color;
For Objective-C, you have to pass NO for the opaque parameter:
+ (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, [[UIScreen mainScreen] scale]);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
For Swift, you have to pass false for the opaque parameter:
func imageWithView(inView: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(inView.bounds.size, false, 0.0)
    if let context = UIGraphicsGetCurrentContext() {
        inView.layer.render(in: context)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
    return nil
}
