Good afternoon,
I am trying to take a snapshot of a PDF, but I am having difficulty accessing the view's content on iOS 12.
To display the PDF's content, I have already tried two different approaches:
UIDocumentInteractionController
WKWebView
On iOS 11 I already couldn't take a snapshot of a UIDocumentInteractionController view, and the best answer I could find was this one: https://stackoverflow.com/a/13332623/2568889. In short, it says that the document viewer runs in an external process and in its own window, which the main app process doesn't have access to.
The web view was the solution at that time, until iOS 12 came along. While testing on real devices running iOS 12, I hit the same issue of not being able to access the viewer's content when taking the snapshot. Inspecting the view hierarchy, it looks like there is a child view controller (PDFHostViewController) that does the actual rendering.
Please take into account that this issue only happens for PDFs; regular webpages work fine!
Code used to take the snapshot:
private extension UIView {
    func takeSnapshot() -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        format.opaque = self.isOpaque
        let renderer = UIGraphicsImageRenderer(size: self.bounds.size, format: format)
        return renderer.image { _ in
            // drawHierarchy(in:) expects a rect in the view's local
            // coordinate space, so use bounds rather than frame.
            self.drawHierarchy(in: self.bounds, afterScreenUpdates: true)
        }
    }
}
Note: I have also tried the web view's native takeSnapshot(with:completionHandler:), but it only works for regular webpages, not PDFs.
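For reference, this is the API in question (iOS 11+); as noted, it works for regular webpages but not for PDFs:
import WebKit

// `webView` is the WKWebView displaying the PDF.
let config = WKSnapshotConfiguration()
webView.takeSnapshot(with: config) { image, error in
    // `image` comes back rendered for regular webpages, but not for
    // PDF content (which PDFHostViewController renders out of process).
}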
Maybe it works with Apple's PDFKit. As far as I know, PDFView is a subclass of UIView.
import PDFKit

@IBOutlet var pdfView: PDFView!

let pdfDocument = PDFDocument(url: url)
pdfView.document = pdfDocument
And then apply your extension as a PDFView extension.
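Alternatively, PDFKit can render a page directly, which sidesteps snapshotting the view hierarchy altogether. A minimal sketch, assuming pdfView.document has already been set:
if let page = pdfView.document?.page(at: 0) {
    // PDFPage draws itself into a UIImage of the requested size.
    let snapshot = page.thumbnail(of: pdfView.bounds.size, for: .cropBox)
}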
Related
First, here is a YouTube link showing the problem: Video stretched. The YouTube video is edited to remove unnecessary parts; I am only showing the important ones. As you can see, after some time the video gets stretched.
The original video was uploaded to Azure Media Services and encoded with the built-in "AdaptiveStreaming" preset.
I am using HLS dynamic packaging with this URL:
https://amswrdev-usso.streaming.media.azure.net/80a2651c-462f-487f-b1a3-87cb72366255/1zIHQ.ism/manifest(format=m3u8-cmaf)
I am testing on an iPhone 12 Pro Max, iOS 15.0.1, Swift 5.0.
I am using AVPlayerViewController; this is the code:
import Foundation
import SwiftUI
import AVKit

struct VideoPlayerView: UIViewControllerRepresentable {
    var player: AVPlayer
    @Binding var gravity: AVLayerVideoGravity

    func makeUIViewController(context: UIViewControllerRepresentableContext<VideoPlayerView>) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = false
        controller.videoGravity = gravity
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: UIViewControllerRepresentableContext<VideoPlayerView>) {
        uiViewController.videoGravity = gravity
    }

    // Note: this protocol requirement is static; as a non-static method it would never be called.
    static func dismantleUIViewController(_ uiViewController: AVPlayerViewController, coordinator: Self.Coordinator) {
        //print("dismantleUIViewController \(uiViewController)")
    }
}
My hypotheses:
AVPlayer is not switching to the correct bandwidth variant.
Azure Media Services is not sending the correct variants in the initial playlist.
Maybe I don't have the correct values for preferredMaximumResolution and preferredForwardBufferDuration, but I don't know what values would be correct (see the sketch below).
Dynamic packaging in Azure Media Services is now on version 7; maybe that is not supported by iOS?
I have been trying to fix it by giving my view fixed height and width values, but that is not working. I have spent two weeks trying to figure this out and nothing has worked. Do you have any idea?
Like I said, the video gets stretched after some time; it is not consistent. Sometimes it happens immediately and sometimes it takes longer, but it always happens.
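For reference, this is how those AVPlayerItem properties can be set. The values below are illustrative assumptions only, not known-good numbers:
// Assumes `player` is the AVPlayer passed into VideoPlayerView.
if let item = player.currentItem {
    item.preferredMaximumResolution = CGSize(width: 1920, height: 1080) // cap variant resolution
    item.preferredForwardBufferDuration = 5   // seconds; 0 means "let AVPlayer decide"
    item.preferredPeakBitRate = 4_000_000     // bits per second; caps variant selection
}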
I'm still on iOS 14 on an iPhone 8, and it plays just fine for me... so this may be an iOS 15.0.1 issue right now.
I'm developing a QLThumbnailProvider extension to display thumbnails for my document type. My extension does not appear to be called: my thumbnails are not appearing, and the logging I've added never shows up in any log files.
I have a UIDocumentBrowserViewController-based app that defines a new document type. It exports a UTI (com.latenightsw.Eureka.form). My app is able to browse, create and open documents, but the thumbnails are blank.
I've added a Thumbnail Extension target to my project. The code looks like this:
class ThumbnailProvider: QLThumbnailProvider {
    override func provideThumbnail(for request: QLFileThumbnailRequest, _ handler: @escaping (QLThumbnailReply?, Error?) -> Void) {
        // Third way: Set an image file URL.
        print("provideThumbnail: \(request)")
        handler(QLThumbnailReply(imageFileURL: Bundle.main.url(forResource: "EurekaForm", withExtension: "png")!), nil)
    }
}
I've confirmed that EurekaForm.png is part of the target and being copied to the extension's bundle (as well as the host app's bundle).
And I've confirmed that my UTI is declared.
Does anyone have any suggestions?
It appears that logging and breakpoints sometimes do not work inside app extensions. Even fatalErrors occur silently.
In my project I could not get the initialiser QLThumbnailReply(imageFileURL:) to work. However, the other initialisers seem to work better.
Drawing the image into a context
When using the context initialiser you have to use a context size which lies between request.minimumSize and request.maximumSize.
Below I've written some code which takes an image and draws it into the context while respecting the above constraints.
override func provideThumbnail(for request: QLFileThumbnailRequest, _ handler: @escaping (QLThumbnailReply?, Error?) -> Void) {
    let imageURL = // ... put your own code here
    let image = UIImage(contentsOfFile: imageURL.path)!

    // Size calculations: find `newImageSize` and `contextSize` such that
    // the image fits perfectly and respects the request's constraints.
    let maximumSize = request.maximumSize
    let imageSize = image.size

    var newImageSize = maximumSize
    var contextSize = maximumSize
    let aspectRatio = imageSize.height / imageSize.width
    let proposedHeight = aspectRatio * maximumSize.width

    if proposedHeight <= maximumSize.height {
        newImageSize.height = proposedHeight
        contextSize.height = max(proposedHeight.rounded(.down), request.minimumSize.height)
    } else {
        newImageSize.width = maximumSize.height / aspectRatio
        contextSize.width = max(newImageSize.width.rounded(.down), request.minimumSize.width)
    }

    handler(QLThumbnailReply(contextSize: contextSize, currentContextDrawing: { () -> Bool in
        // Draw the image in the upper left corner:
        //image.draw(in: CGRect(origin: .zero, size: newImageSize))

        // Draw the image centered:
        image.draw(in: CGRect(x: contextSize.width/2 - newImageSize.width/2,
                              y: contextSize.height/2 - newImageSize.height/2,
                              width: newImageSize.width,
                              height: newImageSize.height))

        // Return true if the thumbnail was successfully drawn inside this block.
        return true
    }), nil)
}
I've gotten the Thumbnail Extension rendering, but as far as I can tell it only displays its renders in the Files app (other apps use the App Icon).
It is important to note this issue with debugging extensions: printing to the console and breakpoints may not fire even though the extension is running.
I see that you have QLSupportedContentTypes set with your UTI, but you may also want to change your UTI to something new, as that is when it started working for me. I think the UTI can get corrupted after some testing. Even while it was working, I had a breakpoint set and it was never hit.
In my case, the extension didn't work in the simulator (Xcode 11.1). Everything works as expected on a real device (iOS 13.1.2).
iOS 11 added a markup option after taking a screenshot. How can I programmatically apply this option after programmatically taking a screenshot? Currently the screenshot gets saved directly to Photos without offering the markup/share option.
I use the code below to take and save the screenshot:
@IBAction func takeScreenshot(_ sender: Any) {
    let layer = UIApplication.shared.keyWindow!.layer
    let scale = UIScreen.main.scale
    UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale)
    layer.render(in: UIGraphicsGetCurrentContext()!)
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenshot!, nil, nil, nil)
}
Instant Markup is not documented anywhere in Apple's reference docs, so I think it's safe to assume this isn't made public through their SDK.
Instead you would have to create your own markup editor.
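Here's a minimal sketch of such an editor using PencilKit (iOS 14+). MarkupViewController and its properties are illustrative names of my own, not an Apple API:
import UIKit
import PencilKit

final class MarkupViewController: UIViewController {
    var screenshot: UIImage?                 // the image captured in takeScreenshot(_:)
    private let imageView = UIImageView()
    private let canvasView = PKCanvasView()
    private let toolPicker = PKToolPicker()  // PKToolPicker() requires iOS 14+

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        imageView.image = screenshot
        view.addSubview(imageView)

        canvasView.frame = view.bounds
        canvasView.backgroundColor = .clear
        canvasView.isOpaque = false
        view.addSubview(canvasView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Show the system pen/marker/eraser palette for the canvas.
        toolPicker.setVisible(true, forFirstResponder: canvasView)
        toolPicker.addObserver(canvasView)
        canvasView.becomeFirstResponder()
    }

    // Flatten the screenshot and the user's drawing into a single image.
    func renderMarkedUpImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
        return renderer.image { _ in
            view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
        }
    }
}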
Note: You may not change the way actual device screenshots are handled (when the user presses Home and Lock together) as per Apple's guidelines.
I am new to iOS development. In my project I am displaying an exerciseFilePath in a table view.
The response is given below:
{
  "exerciseId": 1,
  "exerciseName": "Fitness Exercise",
  "exerciseFilePath": "\/p\/pdf\/exercise_pdf\/fitness_exercise.pdf"
}
I need to display the PDF in another view in didSelectRowAtIndexPath.
I don't know how to display the PDF or what steps to follow to display it.
I hope you understand my problem. Please help me figure out how I can do this.
Why don't you use QLPreviewController or UIDocumentInteractionController?
In your case you can also do it using a web view:
let req = NSURLRequest(url: pdf) // pdf is your PDF URL
let webView = UIWebView(frame: CGRect(x: 0, y: 0,
                                      width: self.view.frame.size.width,
                                      height: self.view.frame.size.height - 40)) // adjust view area here
webView.loadRequest(req as URLRequest)
self.view.addSubview(webView)
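Note that UIWebView has since been deprecated; a WKWebView equivalent of the snippet above, assuming the same pdf URL, looks like this:
import WebKit

let webView = WKWebView(frame: CGRect(x: 0, y: 0,
                                      width: self.view.frame.size.width,
                                      height: self.view.frame.size.height - 40))
webView.load(URLRequest(url: pdf)) // `pdf` is your PDF URL, as above
self.view.addSubview(webView)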
Some tutorials:
QLPreviewController example
UIDocumentInteractionController example
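And since QLPreviewController was suggested above, here is a minimal sketch. It assumes the PDF has already been downloaded to a local file URL (localPDFURL is an illustrative name), because QuickLook previews local files:
import QuickLook

class PDFPreviewSource: NSObject, QLPreviewControllerDataSource {
    let localPDFURL: URL // assumed: a local copy of the downloaded PDF

    init(localPDFURL: URL) {
        self.localPDFURL = localPDFURL
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        return localPDFURL as NSURL
    }
}

// In didSelectRowAtIndexPath, keep a strong reference to the data source:
// let preview = QLPreviewController()
// preview.dataSource = pdfPreviewSource
// present(preview, animated: true)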
I need a Facebook share button on one of my app's view controllers so that when the user taps it, it shares a screenshot of the user's current screen to Facebook.
I have been watching a few tutorials such as this one on how to implement a Facebook share button: https://www.youtube.com/watch?v=774_-cTjnVM
But these only show how to share a message on Facebook, and I'm still a little confused about how to share the whole screen the user is currently interacting with.
Sharing directly to Facebook isn't hard to do. First, import the Social framework:
import Social
Now add this as the action for your button:
let screen = UIScreen.mainScreen()
if let window = UIApplication.sharedApplication().keyWindow {
    UIGraphicsBeginImageContextWithOptions(screen.bounds.size, false, 0)
    window.drawViewHierarchyInRect(window.bounds, afterScreenUpdates: false)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    let composeSheet = SLComposeViewController(forServiceType: SLServiceTypeFacebook)
    composeSheet.setInitialText("Hello, Facebook!")
    composeSheet.addImage(image)
    presentViewController(composeSheet, animated: true, completion: nil)
}
You might be interested to know that UIActivityViewController lets users share not only to Facebook but also to other services. The code above is for your exact question: sharing to Facebook. This code renders the entire visible screen; you can also have individual views render themselves if you want.
Note: As Duncan C points out in a comment below, this rendering code won't include anything outside your app, such as other apps or system controls.
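If you want that route instead, here's a short sketch reusing the `image` rendered above; the user then picks the destination (Facebook, Messages, save to Photos, and so on):
let shareSheet = UIActivityViewController(activityItems: [image], applicationActivities: nil)
presentViewController(shareSheet, animated: true, completion: nil)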
In iOS 8 and earlier there used to be a private framework you could use to capture the entire screen. Using that framework would get your app rejected from the App Store, but at least it worked.
Starting in iOS 9, that API no longer works.
The best you can do is to capture your app's views. That won't include the status bar or other things drawn by the system or other apps.
One way is to create an off-screen context, render the parent view you want to capture into the off-screen context (probably using drawViewHierarchyInRect:afterScreenUpdates:), load the data from the context into a UIImage, and then close the context.
Another way is a newer API that captures a snapshot of a view hierarchy. One of the methods is snapshotViewAfterScreenUpdates. It creates a specialized snapshot view.
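For example (a sketch; someView is any view in your hierarchy):
// Returns a lightweight UIView replica, useful for transitions and
// animations; note that it is a view, not a UIImage you can save or share.
let snapshot = someView.snapshotView(afterScreenUpdates: true)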
Swift 3:
let screen = UIScreen.main
if let window = UIApplication.shared.keyWindow {
    UIGraphicsBeginImageContextWithOptions(screen.bounds.size, false, 0)
    window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    let composeSheet = SLComposeViewController(forServiceType: SLServiceTypeFacebook)
    composeSheet?.setInitialText("Hello, Facebook!")
    composeSheet?.add(image)
    present(composeSheet!, animated: true, completion: nil)
}