Prevent manual screenshot in iOS Objective-C

On iOS, my requirement is to prevent the user from taking a manual screenshot of my application: either disallow it entirely or blur the captured screenshot. How can I do this?

The only solution is to simulate the iOS controls in your view using DRM'ed videos.
For each widget, you need to create a video-based subclass that renders the widget, and apply DRM to the video.
You can try to do it yourself, or use a commercial solution such as the following:
https://screenshieldkit.com

It is possible, but I don't recommend it because of the room for error. It was easier to do in the past; on iOS 13 you will have to do it like this:
Ask for the user's permission to read and edit their photo library, then run a listener that checks the number of photos in their library while they are using your app. If that number changes, they have just taken a screenshot (unless your app allows other ways of saving images, such as tap-and-hold to save). When this happens, read that photo, apply a blur, delete the original from their library, and save the blurred copy.
Warning: there are cases where a new photo can appear while the user is in your app that is not a screenshot (e.g. they received an AirDrop), and you would then be tampering with their photos, which is very bad. To guard against this you may need a key-value pixel encoding on your screen at all times: for example, render the first three pixels of the screen with three very specific RGB values. If a new photo is detected and its first three pixels match those exact values, you know it is a screenshot of your app and not some other photo that happened to be saved while the user was in the app.
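If you go down this road anyway, here is a minimal sketch of the library-watching part using the Photos framework; the class name is illustrative only, and the blur-and-replace step is omitted:

// A minimal sketch of the photo-library approach described above.
// ScreenshotWatcher is an illustrative name, not an existing class.
#import <Photos/Photos.h>

@interface ScreenshotWatcher : NSObject <PHPhotoLibraryChangeObserver>
@property (nonatomic, strong) PHFetchResult<PHAsset *> *lastFetch;
@end

@implementation ScreenshotWatcher

- (void)startWatching {
    // Requires NSPhotoLibraryUsageDescription in Info.plist and user consent.
    [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
        if (status != PHAuthorizationStatusAuthorized) { return; }
        self.lastFetch = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage
                                                   options:nil];
        [[PHPhotoLibrary sharedPhotoLibrary] registerChangeObserver:self];
    }];
}

// Called whenever the photo library changes while the app is running.
- (void)photoLibraryDidChange:(PHChange *)changeInstance {
    PHFetchResultChangeDetails *details =
        [changeInstance changeDetailsForFetchResult:self.lastFetch];
    if (details == nil) { return; }
    self.lastFetch = details.fetchResultAfterChanges;
    if (details.insertedObjects.count > 0) {
        // A new image appeared while the app was active; the answer above
        // treats this as a likely screenshot. Blurring and replacing it
        // would happen here (omitted).
    }
}

@end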

There isn't any regular solution to your problem!
You can use some tricks. For example, if you force the user to keep a finger on the screen for the image to show, they probably can't take a screenshot: as soon as the home and lock buttons are pressed to actually take the screenshot, the screen behaves as if no fingers are touching it.
BUT what if the user takes a screenshot with AssistiveTouch?
OR what do you do if the user records the screen and takes a screenshot from the video?
I think it's better to change your strategy, for example by notifying the owner of the picture when someone else takes a screenshot of it (like Snapchat does).
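For that Snapchat-style strategy, iOS can at least tell you after the fact that a screenshot was taken. A minimal sketch, assuming all you want is a hook from which to notify the owner; the notification name is standard UIKit, while notifyOwnerOfScreenshot is a placeholder you would implement yourself:

#import <UIKit/UIKit.h>

// Call this somewhere early, e.g. in viewDidLoad of the content view controller.
- (void)observeScreenshots {
    [[NSNotificationCenter defaultCenter]
        addObserverForName:UIApplicationUserDidTakeScreenshotNotification
                    object:nil
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
                    // The screenshot has already been saved at this point;
                    // all you can do is react, e.g. tell your server so the
                    // owner of the picture can be notified.
                    [self notifyOwnerOfScreenshot]; // hypothetical helper
                }];
}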

Related

iOS Image Viewer like Photo app

I am developing an app that has an internal gallery with some images.
What I want to achieve is a result that behaves and looks exactly like the image viewer in Apple's Photos app.
I implemented the gallery thumbnails with a collection view, and now I would like to show the image full screen when it is tapped.
The image viewer should behave exactly like the Photos app:
Full screen on single tap,
Delete and Share buttons, etc.,
Pinch to zoom, double-tap to zoom...
My question is: is it really possible that such a common feature is not already provided by iOS? Is there perhaps a built-in view controller that I'm not aware of?
I know there are some libraries around that do this, but I'm wondering if something is already provided.

How do I know when the screen is being drawn on iOS?

I would like to know when the screen is being drawn on iOS. In particular, I'd like to know if there are any visible changes being drawn on screen. This can be handy to know how long a page took to render, for example (assuming that the user is not interacting with the page). I would like to be able to capture this information in a regular production build, not in a developer build. And I'd like this to be a general solution applicable to most any page in my app, not just a specific page.
For example, I have a page that 1) asynchronously queries an API for data, 2) displays that data in a UITableView where some of the entries may be offscreen, and then 3) asynchronously downloads the images for each of the visible items on the screen. I want to get callbacks when the UITableView is rendered and when all of the images are rendered. The total time to render the page can be determined by looking at the timestamp of the last call to the callback (again, assuming no user interaction).
On Android, this is fairly simple. You can use ViewTreeObserver.addOnPreDrawListener to get a callback whenever the screen is about to be drawn. If there's no visible change to the screen, the callback is not called.
On iOS, it looks like CADisplayLink might serve a similar purpose. However, when I hook up my CADisplayLink, it is called over and over forever, whether or not there are visible changes on the screen.
Is there a way to know when there are visible changes to the screen being drawn in iOS?
In iOS 9, Apple made it impossible to access things drawn onto the screen outside of your app. Previously, it was possible to use an API called IOSurface to do this, but Apple closed it down in iOS 9 (to prevent apps from snooping on each other).
So if you're talking about ANYTHING being drawn to the screen, the answer is no. If you're looking for changes within your own app, there's probably a way to do it.
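For completeness, here is the CADisplayLink hookup the question describes, as a minimal sketch. As noted, it fires on every frame regardless of whether anything visible changed, so you would have to pair it with your own dirty flag (an assumption on your side, not an API):

#import <QuartzCore/QuartzCore.h>
#import <UIKit/UIKit.h>

// Starts a display link that ticks on every vsync.
- (void)startDisplayLink {
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(onFrame:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)onFrame:(CADisplayLink *)link {
    // link.timestamp is the time of the last displayed frame; you would need
    // your own dirty flag (set wherever your views change) to decide whether
    // this frame actually contained a visible update.
    NSLog(@"frame at %f", link.timestamp);
}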

How do I invoke iOS photo editing extension?

Let's say I'm writing a word-processing application. Users can embed images using the app. Now since the application is not a photo editor, naturally it would delegate photo editing of embedded images to other applications.
The question is, how to do that? How to invoke photo editing extensions in an iOS app?
Theoretically it should be as simple as passing an image, invoking the extension, and getting another image back as the "edited" image. However, the SDK documentation doesn't seem to provide any hint on how to do this.
It looks like the SDK documentation actually says it doesn't provide such a UI per se:
"When using built-in editing controls, the image picker controller enforces certain options. For still images, the picker enforces a square cropping as well as a maximum pixel dimension. For movies, the picker enforces a maximum movie length and resolution. If you want to let the user edit full-size media, or specify custom cropping, you must provide your own editing UI."
From:
https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/CameraAndPhotoLib_TopicsForIOS/Articles/PickinganItemfromthePhotoLibrary.html
I'm looking at the Adobe Creative SDK Image Editor, which does seem to provide a UI. It might show some of their branding in the photo editor, however. I hope to follow up...
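For reference, a minimal sketch of what those built-in editing controls look like in code (stock UIKit, with the square-crop limitation the docs describe); the surrounding view controller and its delegate conformance are assumed:

#import <UIKit/UIKit.h>

// Presents the stock picker with its built-in (square-crop) editing UI.
// This is not a photo editing extension; it is UIImagePickerController's own editor.
- (void)pickAndEditImage {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.allowsEditing = YES;   // enables the built-in crop UI
    picker.delegate = self;       // requires UIImagePickerControllerDelegate
                                  // and UINavigationControllerDelegate
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary<NSString *, id> *)info {
    // The square-cropped result; fall back to the original if no edit was made.
    UIImage *edited = info[UIImagePickerControllerEditedImage]
                          ?: info[UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
    // Embed `edited` into the document here.
}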

iPad Mirroring - How to mirror a different view?

I don't know if "mirroring" is the correct term for this, but I have an iPad app that takes pictures and keeps them in Core Data. I've googled around, but I can't find code for what I need.
What I want is for the TV to show an image different from the one on the iPad. For example, if I'm not taking any pictures, the TV shows saved images on its screen. When I take a new picture, that stops and the TV shows the new picture.
I found what I wanted in the reference guide:
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AirPlayGuide/EnrichYourAppforAirPlay/EnrichYourAppforAirPlay.html%23//apple_ref/doc/uid/TP40011045-CH6-DontLinkElementID_3
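The key idea from that guide is to stop mirroring and instead attach your own window to the external screen. A minimal sketch, assuming an externalWindow property on your app delegate and a hypothetical PhotoSlideshowViewController for the TV content:

#import <UIKit/UIKit.h>

// Gives the external (AirPlay/TV) screen its own window and root view controller
// instead of mirroring the iPad. self.externalWindow is an assumed strong property.
- (void)setUpExternalScreenIfPresent {
    if ([UIScreen screens].count < 2) { return; }
    UIScreen *external = [UIScreen screens][1];

    self.externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
    self.externalWindow.screen = external;               // attach to the TV
    self.externalWindow.rootViewController =
        [[PhotoSlideshowViewController alloc] init];     // hypothetical VC
    self.externalWindow.hidden = NO;
}

// Also observe UIScreenDidConnectNotification / UIScreenDidDisconnectNotification
// so the window is created or torn down when the user starts or stops AirPlay.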

iPad Photo Gallery Framework

I am looking for a framework that lets me create a photo gallery where the user can swipe through pictures, then select one and be taken to a specific view based on that image. I've seen this idea in another app, and I don't know whether it was an open-source framework or whether there is anything similar.
I wrote a library that should be able to handle this:
https://github.com/nicklockwood/iCarousel
Check out the "custom" or "time machine" carousels in the example app for something similar to what you've shown.
Here's a screenshot. Don't worry that it's not exactly the same as your picture; the angle, panel size, etc. can all be configured.
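For what it's worth, a minimal sketch of wiring up iCarousel for this case (swipe through photos, tap one to navigate); the data source and delegate method names come from the iCarousel protocols, while self.photos and the navigation step are assumptions:

#import "iCarousel.h"

// In a view controller that declares <iCarouselDataSource, iCarouselDelegate>
// and holds an NSArray<UIImage *> *photos property (assumed).
- (void)viewDidLoad {
    [super viewDidLoad];
    iCarousel *carousel = [[iCarousel alloc] initWithFrame:self.view.bounds];
    carousel.type = iCarouselTypeTimeMachine;   // or iCarouselTypeCustom
    carousel.dataSource = self;
    carousel.delegate = self;
    [self.view addSubview:carousel];
}

- (NSInteger)numberOfItemsInCarousel:(iCarousel *)carousel {
    return self.photos.count;
}

- (UIView *)carousel:(iCarousel *)carousel
  viewForItemAtIndex:(NSInteger)index
         reusingView:(UIView *)view {
    UIImageView *imageView = (UIImageView *)view;
    if (imageView == nil) {
        imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
        imageView.contentMode = UIViewContentModeScaleAspectFill;
        imageView.clipsToBounds = YES;
    }
    imageView.image = self.photos[index];
    return imageView;
}

- (void)carousel:(iCarousel *)carousel didSelectItemAtIndex:(NSInteger)index {
    // Push or present the view associated with the selected image here.
}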
