Screenshot of a UIWebView that is not currently on screen - iOS

I'm trying to generate thumbnails of several pages that will be displayed through a UIWebView. The problem is that all of the images come out gray or as the background color I set for the UIWebView. Could it be that I am not allowing the web view enough time to load the page?
I believe I want to do the following:
Create a UIWebView that is not visible on the screen
Set up a graphics context
For each page:
    Call loadRequest: on the web view
    Render the web view into the graphics context
    Capture a screenshot
End the graphics context
Here is the code I have for a single image:
UIWebView *screenWebView = _screenshotWebView;
UIView *screenView = _screenshotView;

if ([UIScreen instancesRespondToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2.0f) {
    UIGraphicsBeginImageContextWithOptions(screenView.bounds.size, NO, 2.0f);
} else {
    UIGraphicsBeginImageContext(screenView.bounds.size);
}

[screenWebView loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:path ofType:@"html"] isDirectory:NO]]];
[screenView.layer renderInContext:UIGraphicsGetCurrentContext()];

UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
I defined _screenshotWebView and _screenshotView within a XIB file because I thought I may not have been initializing them correctly.
Any thoughts? Thanks!
Edit: Here is some code that implements this functionality in case someone else needs it: https://github.com/rmcl/webview_screenshot

As you theorized, there isn't time for the UIWebView to load the page. That method will asynchronously start the page load; the UIWebView probably won't get a chance to do any work until you've returned to the run loop.
Try implementing webViewDidFinishLoad: and do your rendering after it has completed.
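For example, a minimal sketch of that approach, assuming the controller is the web view's delegate and keeps the result in a hypothetical thumbnails array:

- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    // Render only after the page has actually finished loading.
    UIGraphicsBeginImageContextWithOptions(webView.bounds.size, NO, [UIScreen mainScreen].scale);
    [webView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // self.thumbnails is a hypothetical property; store or display the image however you need.
    [self.thumbnails addObject:thumbnail];
}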

Related

Capture a screen shot before a Share Extension's view appears

I'm creating a share extension that currently captures web content via JavaScript and gets the resulting NSDictionary under NSExtensionJavaScriptPreprocessingResultsKey. That's all working great.
The next phase of my implementation is to get a screen shot of the web view of the page in Safari that the user is on.
Does anyone know if there's a pattern for taking a screenshot of the web page before the extension's view loads? Clearly viewWillAppear: won't work, but I'll paste the code I have nonetheless:
- (void)viewWillAppear:(BOOL)animated
{
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
    [self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
    UIImage *screenShotImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    self.imageView.image = screenShotImage;
    [super viewWillAppear:animated];
}

Screenshot Safari from Share Extension

Is it possible to take a screenshot of the currently visible zone of the web view in Safari from a Share Extension? I could use the windows, but UIApplication isn't available in extensions, so I can't access that window.
You can't, since UIApplication can't be reached from an extension. You cannot get the first UIWindow, which is the Safari layer, so you have to work with the JavaScript preprocessing file that extensions have. Create a JavaScript file that, when run in Safari, generates a base64 string with the image data of the currently visible zone. Retrieve that string through the kUTTypePropertyList identifier in your extension. Since that gives you NSData, generate the UIImage from it with +imageWithData:. That is what you're looking for, without having to load the page again; it avoids a second load and a bad image if the webpage requires a login.
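A rough sketch of the native side of that approach; the "imageData" key and the imageView property are assumptions, standing in for whatever your own preprocessing JavaScript and UI actually use:

#import <MobileCoreServices/MobileCoreServices.h>

- (void)loadSharedScreenshot
{
    NSExtensionItem *item = self.extensionContext.inputItems.firstObject;
    NSItemProvider *provider = item.attachments.firstObject;
    [provider loadItemForTypeIdentifier:(NSString *)kUTTypePropertyList
                                options:nil
                      completionHandler:^(NSDictionary *results, NSError *error) {
        // The values produced by the preprocessing JavaScript live under this key.
        NSDictionary *jsResults = results[NSExtensionJavaScriptPreprocessingResultsKey];
        NSString *base64 = jsResults[@"imageData"]; // key name is whatever your JS file chooses
        NSData *data = [[NSData alloc] initWithBase64EncodedString:base64
                                                            options:NSDataBase64DecodingIgnoreUnknownCharacters];
        UIImage *image = [UIImage imageWithData:data];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = image; // imageView is a placeholder for wherever you show it
        });
    }];
}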
As far as I know, you can't unless you invoke the API you need dynamically, and even so you might run into context permission issues and App Store approval issues.
An alternative might be passing the current Safari URL to your extension, loading it in a hidden UIWebView, and rendering that view into a UIImage, but you will lose the information about the currently visible zone...
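If you go that route, a sketch of pulling the shared URL out of the extension context might look like this; self.hiddenWebView is a placeholder for a web view you create and keep off screen yourself:

#import <MobileCoreServices/MobileCoreServices.h>

- (void)loadSharedURL
{
    NSExtensionItem *item = self.extensionContext.inputItems.firstObject;
    for (NSItemProvider *provider in item.attachments) {
        if ([provider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeURL]) {
            [provider loadItemForTypeIdentifier:(NSString *)kUTTypeURL
                                        options:nil
                              completionHandler:^(NSURL *url, NSError *error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    // Load the page again in the hidden web view and render it
                    // in webViewDidFinishLoad:, as discussed above.
                    [self.hiddenWebView loadRequest:[NSURLRequest requestWithURL:url]];
                });
            }];
        }
    }
}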
Edit: So the below works in the Simulator but does not work on the device. I'm presently looking for a solution as well.
You can't get just the visible area of Safari, but you can get a screenshot with a little ingenuity. The following method captures a screenshot from a ShareViewController.
func captureScreen() -> UIImage
{
    // Get the "screenshot" view.
    let view = UIScreen.mainScreen().snapshotViewAfterScreenUpdates(false)

    // Add the screenshot view as a subview of the ShareViewController's view.
    self.view.addSubview(view)

    // Now screenshot *this* view.
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, false, 0)
    self.view.drawViewHierarchyInRect(view.bounds, afterScreenUpdates: true)
    let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    // Finally, remove the subview.
    view.removeFromSuperview()

    return image
}
This is the approved way to capture the screenshot of a webpage in a share extension:
for (NSExtensionItem *item in self.extensionContext.inputItems) {
    for (NSItemProvider *itemProvider in item.attachments) {
        [itemProvider loadPreviewImageWithOptions:@{NSItemProviderPreferredImageSizeKey: [NSValue valueWithCGSize:CGSizeMake(60.0f, 60.0f)]} completionHandler:^(UIImage *item, NSError *_Null_unspecified error) {
            // Request whatever size you want; however, note that the returned image 'item'
            // will not necessarily be the size you requested, so code should handle that case.
            // Use the UIImage however you wish here.
        }];
    }
}

I'm trying to take a screenshot of the entire screen, and then resize a webview. The webview refuses to resize

I have a web view, and when the user taps a link, the app takes a screenshot of the screen, then loads the page, and then removes the image from the screen when the page is done.
I also need the web view to resize while the image is visible.
With the code that makes the image in place, the web view won't resize anymore. If I take out the code that makes the image, the web view does resize. However, with the image code in place, it DOES move the web view up to the correct origin, but it does not then make the web view larger. Here is my code; it starts running the instant the user taps a link:
UIGraphicsBeginImageContext(CGSizeMake(1024, 768));
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

if (!_coverImageView) _coverImageView = [[UIImageView alloc] init];
_coverImageView.frame = CGRectMake(0.0f, 0.0f, 1024.0f, 768.0f);
_coverImageView.image = viewImage;
[self.view addSubview:_coverImageView];

_webView.frame = CGRectMake(0.0f, 0.0f, 1024.0f, 768.0f);
Then, when the webview is finished loading:
[_coverImageView removeFromSuperview];
I'm completely lost as to how this affects my webview. So again, this code does successfully take the image, put it on the screen, and then MOVE the webview up, but it does not resize the webview.

How to optimize memory usage in UIImage

I'm trying to take a screenshot of a UIWebView and send it via an observer to a UIImageView in another class.
I'm using this method to take the screenshot:
- (UIImage *)takeScreenshoot {
    @autoreleasepool {
        UIGraphicsBeginImageContext(CGSizeMake(self.view.frame.size.width, self.view.frame.size.height));
        CGContextRef context = UIGraphicsGetCurrentContext();
        [self.webPage.layer renderInContext:context];
        UIImage *__weak screenShot = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return screenShot;
    }
}
But then I have a problem: every time I take a screenshot with this method, memory usage grows by about 10-15 MB and is never released. And if I take a screenshot in every webViewDidFinishLoad:, you can imagine how much memory that can consume!
How can I fix that issue?
If possible, try to use -[UIScreen snapshotViewAfterScreenUpdates:], which returns a UIView.
This is a snapshot of the currently displayed content (a snapshot of the app).
Apple even says this method is "faster than trying to render the contents of the screen into a bitmap image yourself."
According to your code, you are passing the bitmap image along just to display it in another UIImageView, so I think using the UIScreen method is appropriate here.
To display only the UIWebView part:
Create another UIView instance and set its frame to the frame of your webView.
Now add the snapshot view as a subview of the created view and set its frame such that only the webView portion is displayed, as in the sketch below.
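A rough sketch of that idea, assuming the web view's frame is expressed in the same coordinates as the screen (the property names come from the question's code):

// Snapshot the whole screen without rendering it into a bitmap.
UIView *snapshot = [[UIScreen mainScreen] snapshotViewAfterScreenUpdates:NO];

// Clip the snapshot so only the web view's area shows.
UIView *container = [[UIView alloc] initWithFrame:self.webPage.frame];
container.clipsToBounds = YES;
snapshot.frame = CGRectOffset(snapshot.frame,
                              -self.webPage.frame.origin.x,
                              -self.webPage.frame.origin.y);
[container addSubview:snapshot];
[self.view addSubview:container];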
Try calling CGContextRelease(context); after you have got your screenshot.
Or, as @Greg said, remove that line and use UIGraphicsGetCurrentContext() directly.
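Following the second suggestion, the method could be trimmed to something like this; dropping the __weak qualifier (which otherwise lets the image be released before it is returned) is an extra change beyond what the answer says:

- (UIImage *)takeScreenshoot {
    @autoreleasepool {
        UIGraphicsBeginImageContext(self.view.frame.size);
        // Render straight into the current context; there is no separate CGContextRef to manage.
        [self.webPage.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return screenShot;
    }
}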

Generating thumbnails of local webpages on iOS

I have a set of local HTML pages that I would like batch generate thumbnails for on the fly (I only want to show the thumbnails, not the full web pages). This is the way I'm accomplishing this:
NSString* path = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:link];
NSURL* url = [NSURL fileURLWithPath:path];
NSURLRequest* request = [NSURLRequest requestWithURL:url];
UIWebView* webView = [[UIWebView alloc] initWithFrame:CGRectMake(0, 0, 725, 1004)];
webView.delegate = cell;
[webView loadRequest:request];
[self.view addSubview:webView]; // doesn't work without this line, but then the UIWebView is onscreen
Then in the delegate:
- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    [self performSelector:@selector(render:) withObject:webView afterDelay:0.01f];
}

- (void)render:(id)obj
{
    UIWebView *webView = (UIWebView *)obj;
    CGSize thumbsize = CGSizeMake(96, 72);
    UIGraphicsBeginImageContext(thumbsize);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGFloat scalingFactor = thumbsize.width / webView.frame.size.width;
    CGContextScaleCTM(context, scalingFactor, scalingFactor);
    [webView.layer renderInContext:context];
    UIImage *resultImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    self.thumbnail.image = resultImage;
}
Here are my questions:
1) Is my general approach correct? Is this the most efficient way to batch process thumbnails of webpages on the fly?
2) I want to be able to render the thumbnail from an offscreen UIWebView, but the webViewDidFinishLoad: doesn't get called unless I add the UIWebView to the view hierarchy. Is there a way I can avoid this?
3) If I attempt to capture an image of the UIWebView in webViewDidFinishLoad:, I get a blank image. I have to put an artificial delay for the capture to work. Any way around this?
Thanks!
1) This seems to be the only way I know of to do this, short of relying on a third-party, server-based API (if one exists).
2) You can draw the UIWebView off-screen by setting its frame to a position that is off-screen, for example 320,568,725,1008. I haven't tested it, but you should even be able to give the view a 1 x 1 px frame. A sketch of this follows below.
3) I imagine this is because when webViewDidFinishLoad: is called, setNeedsDisplay has been called on the web view, but the web view has not yet redrawn itself. I'm not sure about a way around this.
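For point 2, a minimal sketch of keeping the web view loading while out of sight; the frame values are just the example ones from this answer, and request is the NSURLRequest from the question:

// The web view is in the view hierarchy (so webViewDidFinishLoad: still fires),
// but its frame places it entirely off screen.
UIWebView *webView = [[UIWebView alloc] initWithFrame:CGRectMake(320, 568, 725, 1008)];
webView.delegate = self;
[self.view addSubview:webView];
[webView loadRequest:request];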
