I am trying to save a UIView with components in it to an image and save that image to the Photos library.
For that I am using the following code:
let image = scrollContent.asImage() // UIImage(view: scrollContent)
UIImageWriteToSavedPhotosAlbum(image, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
and one of these two extensions (the second one is better):
extension UIImage {
    convenience init(view: UIView) {
        UIGraphicsBeginImageContext(view.frame.size)
        view.layer.render(in: UIGraphicsGetCurrentContext()!)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        self.init(cgImage: image!.cgImage!)
    }
}
//Or
extension UIView {
    func asImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { rendererContext in
            layer.render(in: rendererContext.cgContext)
        }
    }
}
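For reference, the completion callback that the #selector above points to typically has the standard UIKit signature shown below (this method is not part of the original post, just a sketch):
@objc func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
    if let error = error {
        // Saving failed (for example, because of a missing Photos permission)
        print("Could not save image: \(error.localizedDescription)")
    } else {
        print("Image saved to Photos")
    }
}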
My problem is that sometimes this code saves only a cut-off version of my image to Photos, even though the view is displayed completely. How can that be? Is there a max size or something?
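One thing that may be worth trying for very large content views (a sketch, not a confirmed fix): render with an explicit UIGraphicsImageRendererFormat at a lower scale, since rendering a huge view at full screen scale produces a very large bitmap. The default scale of 1 here is an assumption; adjust as needed.
extension UIView {
    // Sketch: render at an explicit (lower) scale to keep the bitmap small.
    func asImage(scale: CGFloat = 1) -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        format.scale = scale
        let renderer = UIGraphicsImageRenderer(bounds: bounds, format: format)
        return renderer.image { rendererContext in
            layer.render(in: rendererContext.cgContext)
        }
    }
}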
Related
I have a UIView that can be drawn on like a finger paint application, but sometimes it is not visible. I want to be able to take a screenshot of it even when it is not visible. I also want a screenshot where it is visible, but without any subviews; I just want the UIView itself. This is the method I have tried:
func snapshot() -> UIImage? {
    UIGraphicsBeginImageContext(self.frame.size)
    guard let context = UIGraphicsGetCurrentContext() else {
        return nil
    }
    self.layer.render(in: context)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    if image == nil {
        return nil
    }
    return image
}
func snapshot() -> UIImage {
    UIGraphicsBeginImageContextWithOptions(bounds.size, self.isOpaque, UIScreen.main.scale)
    layer.render(in: UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image!
}
To get a view rendered as a UIImage, you could introduce a very simple protocol and extend UIView with it.
protocol Renderable {
    var render: UIImage { get }
}

extension UIView: Renderable {
    var render: UIImage {
        UIGraphicsImageRenderer(bounds: bounds).image { context in
            layer.render(in: context.cgContext)
        }
    }
}
and now it's super easy to get the image of any view
let image: UIImage = someView.render
Then if you plan to share it or save it, you probably want to convert it to Data:
let data: Data? = image.pngData()
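If sharing is the goal, here is a short usage sketch with UIActivityViewController (it assumes this code runs inside a presenting view controller, and someView is whatever view you rendered):
// Sketch: share the rendered image via the standard share sheet.
let activityController = UIActivityViewController(activityItems: [someView.render], applicationActivities: nil)
present(activityController, animated: true)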
I am not sure what you mean by "when it is not visible", but this should work as long as the view is in the view hierarchy and is properly laid out. I have been using this method in many apps for sharing stuff and it has never failed me.
And of course there is no need for a protocol; feel free to use only the render computed property. It's just a matter of preference.
Documentation:
UIGraphicsImageRenderer, image(actions:)
I want to implement a simple sharing function. In my app, if the user long-presses on a UITableViewCell, it presents a UIAlertController (action sheet) with some buttons.
The first one allows sharing content with friends (via UIActivityViewController).
My goal is to save the cell's text with a brand watermark in the right corner.
For now, I'm using this extension to convert a UIView to a UIImage:
extension UIView {
    func convertToImage() -> UIImage {
        if #available(iOS 10.0, *) {
            let renderer = UIGraphicsImageRenderer(bounds: bounds)
            return renderer.image { rendererContext in
                layer.render(in: rendererContext.cgContext)
            }
        } else {
            UIGraphicsBeginImageContext(self.frame.size)
            self.layer.render(in: UIGraphicsGetCurrentContext()!)
            let image = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return UIImage(cgImage: image!.cgImage!)
        }
    }
}
But I have some problems with it. It works only when the view is presented on screen. If I try to get an image from a UIView which is not shown on the screen, I get an empty image.
Even with a view controller:
let vc = ViewController()
let image = vc.view.convertToImage()
// image is empty
I don't want the user to see the watermarked content on screen; I need the watermark to be added only to the version saved to the camera roll.
Can I do it?
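One approach worth trying (a sketch, under the assumption that the off-screen view only needs a concrete size and a layout pass before rendering, not actual on-screen presentation; the 375x667 size is an assumption):
// Sketch: give the off-screen view an explicit frame, force layout, then render it.
let vc = ViewController()
vc.view.frame = CGRect(x: 0, y: 0, width: 375, height: 667)
vc.view.setNeedsLayout()
vc.view.layoutIfNeeded()
let watermarkedImage = vc.view.convertToImage()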
I am working on an extension to convert a UIView to a UIImage, but I am facing a strange issue: I get the correct image in the iOS Simulator but a black image on a real device. Below is my code:
extension UIView {
    func screenshotImage() -> UIImage {
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, false, 0.0)
        self.drawHierarchy(in: self.bounds, afterScreenUpdates: true)
        let screenShot = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return screenShot!
    }
}
Can anyone explain what I am doing wrong that I am not able to get the correct image on a real device?
EDIT
Observations:
Whenever I pass a UIView to this extension in the simulator, I get a perfect image of that view.
Whenever I pass a UIView to this extension on a real device, I get an image which is completely black instead of showing the elements in that UIView, unlike the simulator result.
extension UIImage {
    class func imageWithView(view: UIView) -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.isOpaque, 0.0)
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
        let img = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return img
    }
}
Hope this helps!
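If drawHierarchy(in:afterScreenUpdates:) still comes back black on the device, one common culprit is taking the snapshot before the view is actually attached to a window and rendered on screen. A sketch of guarding against that (viewToCapture is an assumed optional UIView property, not from the original post):
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Only capture once the view is in a window and has been drawn on screen.
    guard let target = viewToCapture, target.window != nil else { return }
    if let image = UIImage.imageWithView(view: target) {
        // use the image here, e.g. save or share it
    }
}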
I have a UIView canvas which I would like to save a screenshot of (along with its subviews) to the camera roll when I press the UIBarButton shareBarButton. However, when I press shareBarButton, the image that appears in the camera roll is a completely black screen.
Any help on why this is happening would be greatly appreciated! I'm also open to suggestions on other (perhaps better) ways to save UIViews onto the camera roll.
This is the method attached to shareBarButton:
func share(sender: UIBarButtonItem) {
    let masterpiece = canvas.snapshotViewAfterScreenUpdates(true)
    let image = convertViewToImage(masterpiece)
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
This is the helper function which converts the UIView canvas into a UIImage:
func convertViewToImage(masterpiece: UIView) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(masterpiece.bounds.size, masterpiece.opaque, UIScreen().scale)
    masterpiece.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let img = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return img
}
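A later answer in this thread points out that the snapshot-view step itself is a likely cause of the empty render. Here is a sketch (in current Swift, so the syntax differs from the question's code) that renders the canvas directly instead of a snapshot view:
// Sketch: skip snapshotView(afterScreenUpdates:) and render the canvas layer directly.
@objc func share(sender: UIBarButtonItem) {
    let renderer = UIGraphicsImageRenderer(bounds: canvas.bounds)
    let image = renderer.image { context in
        canvas.layer.render(in: context.cgContext)
    }
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}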
To take a screenshot:
func captureView(view: UIView) -> UIImage {
    let size: CGSize = self.imageViewObject.image.size
    UIGraphicsBeginImageContext(size)
    self.imageViewObject.image.drawInRect(CGRectMake(0, 0, size.width, size.height))
    self.imageOverLayObject.image.drawInRect(CGRectMake(0, 0, size.width, size.height))
    let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}
I'm trying to render/draw the snapshot view of a UIView in a context to get a UIImage, then set it as a CALayer's contents.
I try this method to get the snapshot view:
snapshotViewAfterScreenUpdates:
then I convert the UIView to a UIImage:
UIGraphicsBeginImageContext(view.bounds.size);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
//I'm trying this method too
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:NO];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The problem is that with the code above I get an empty image.
If you need a snapshot UIImage in the first place, just use the method in your second code block. Create a UIView category like:
@implementation UIView (takeSnapshot)

- (UIImage *)takeASnapshot {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, [UIScreen mainScreen].scale);
    [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

@end
That's enough. You don't need to snapshot a view and then convert it to an image; in your case that extra step might be causing the problem.
Had a similar issue when building interactive animations with floating snapshots. Here is what we did:
UIView extension:
extension UIView {

    public func snapshot(scale: CGFloat = 0, isOpaque: Bool = false, afterScreenUpdates: Bool = true) -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(bounds.size, isOpaque, scale)
        drawHierarchy(in: bounds, afterScreenUpdates: afterScreenUpdates)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }

    public enum CASnapshotLayer: Int {
        case `default`, presentation, model
    }

    /// The method drawViewHierarchyInRect:afterScreenUpdates: performs its operations on the GPU as much as possible.
    /// In comparison, the method renderInContext: performs its operations inside of your app's address space and does
    /// not use the GPU-based process for performing the work.
    /// https://stackoverflow.com/a/25704861/1418981
    public func caSnapshot(scale: CGFloat = 0, isOpaque: Bool = false,
                           layer layerToUse: CASnapshotLayer = .default) -> UIImage? {
        var isSuccess = false
        UIGraphicsBeginImageContextWithOptions(bounds.size, isOpaque, scale)
        if let context = UIGraphicsGetCurrentContext() {
            isSuccess = true
            switch layerToUse {
            case .default:
                layer.render(in: context)
            case .model:
                layer.model().render(in: context)
            case .presentation:
                layer.presentation()?.render(in: context)
            }
        }
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return isSuccess ? image : nil
    }
}
Usage example (inside an interactive animation):
private func makeSnapshot(view: UIView, snapshotLayer: UIView.CASnapshotLayer) -> UIView? {
    // There are 3 ways of taking a snapshot:
    // 1. Replicate view: `view.snapshotView(afterScreenUpdates: true)`
    // 2. Draw hierarchy: `view.drawHierarchy(in: bounds, afterScreenUpdates: true)`
    // 3. Render layer: `layer.render(in: context)`
    //
    // Only #3 works reliably during UINavigationController animated transitions.
    // For modally presented UI, a trick combining #1 and #2 also seems to work,
    // i.e. `view.snapshotView(afterScreenUpdates: true).snapshot()`.
    //
    // If this call causes the error listed below, it indicates that something is wrong with one of the views' layout:
    // [Snapshotting] View (_UIReplicantView) drawing with afterScreenUpdates:YES inside CoreAnimation commit is not supported.
    // See also: https://stackoverflow.com/a/29676207/1418981
    if let image = view.caSnapshot(layer: snapshotLayer) {
        let imageView = ImageView(image: image)
        imageView.clipsToBounds = true
        imageView.contentMode = .scaleAspectFit
        snapshots[view] = image
        return imageView
    }
    return nil
}
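A short usage sketch for the presentation-layer case (the animatingView and containerView names are assumptions, not from the original answer):
// Sketch: freeze a mid-animation frame by rendering the presentation layer.
if let frozen = animatingView.caSnapshot(layer: .presentation) {
    let overlay = UIImageView(image: frozen)
    overlay.frame = animatingView.frame
    containerView.addSubview(overlay)
}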