iOS popup/splash screen with external web content

I need to display a popup or something like a splash screen every time I start my app. The content of this popup must be taken from an external web source (something like a JPG, PNG or PDF).
The purpose of this popup is to notify users about news and special offers. The popup should disappear after a certain time (or when a button is pressed).
From what I read on other threads, UIPopoverController seems to be helpful for what I need (as I read in its class reference), but I'm afraid that the main purpose of that class is presenting a choice as the result of a button press.

Why can you not simply add a webview to the screen when the app opens?
Like:
in appDelegate:
UIWebView *popover;

- (void)applicationDidBecomeActive:(UIApplication *)application
{
    UIWindow *win = [[UIApplication sharedApplication] keyWindow];
    popover = [[UIWebView alloc] initWithFrame:win.bounds];
    // ... load content ...
    [win addSubview:popover];
    [self performSelector:@selector(dismissPopover) withObject:nil afterDelay:3];
}

- (void)dismissPopover
{
    [popover removeFromSuperview];
}
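For instance, the "load content" step could fetch a remote image like this (a minimal sketch; the URL is a placeholder, not something from the question):

NSURL *url = [NSURL URLWithString:@"https://example.com/offers/latest.png"]; // placeholder URL
[popover loadRequest:[NSURLRequest requestWithURL:url]];
popover.scalesPageToFit = YES; // scale the remote image to fit the web view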

Related

When is ios auto screen capture taken?

I am trying to change the screen shot that is automatically captured by iOS when the app enters the background.
However I am not entirely sure exactly when this screenshot is taken.
For example:
If you pull down the notification bar while in the app the following method is called:
- (void) applicationWillResignActive:(UIApplication *) application {
}
Also, if you double-tap the home button while in the app, the same method is called. In addition, if an alert view is shown, 'applicationWillResignActive' is called.
But in both of these cases
- (void) applicationDidEnterBackground:(UIApplication *) application {
}
is not called.
So my question is, is there a screenshot captured after the call to applicationWillResignActive even if the application does not enter the background? Or does iOS only capture a screenshot after applicationDidEnterBackground?
Look at the official doc, "Preventing Sensitive Information From Appearing In The Task Switcher".
It says applicationDidEnterBackground: should be used for this purpose.
In short: change the view before returning from applicationDidEnterBackground:.
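For instance, a minimal sketch of that idea (the plain white cover view and the arbitrary tag value are placeholders for illustration, not from the doc):

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    // Cover the window before returning, so the snapshot captures the cover
    // instead of the sensitive content. Tag 9999 is an arbitrary placeholder.
    UIView *cover = [[UIView alloc] initWithFrame:self.window.bounds];
    cover.tag = 9999;
    cover.backgroundColor = [UIColor whiteColor];
    [self.window addSubview:cover];
}

- (void)applicationWillEnterForeground:(UIApplication *)application
{
    // Remove the cover once the app is coming back.
    [[self.window viewWithTag:9999] removeFromSuperview];
}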
Yes.
- (void) applicationWillResignActive:(UIApplication *) application {
}
is called when you pull down the notification bar or even when you double-click the home button. You have to do something here to prevent your sensitive information from being captured by the OS. One workaround might be:
Set a blurry screen overlay before the app goes in the background
Once the app becomes active remove this overlay
Something like this:
// imageView is assumed to be an instance variable on the app delegate
- (void)applicationWillResignActive:(UIApplication *)application
{
    imageView = [[UIImageView alloc] initWithFrame:[self.window frame]];
    [imageView setImage:[UIImage imageNamed:@"blurryImage.png"]];
    [self.window addSubview:imageView];
}
And then remove this overlay before the application enters foreground:
- (void)applicationDidBecomeActive:(UIApplication *)application
{
    if (imageView != nil) {
        [imageView removeFromSuperview];
        imageView = nil;
    }
}

Is it possible to place a static image on top of the webview in a Cordova application

My case is the following: I have a Cordova application that has to run on iOS. I have to prevent sensitive information from being shown in the app switcher while the app is in the background.
Apple provides this solution for native applications which doesn't seem to solve my problem, because it doesn't manipulate the web view in any way.
I wonder if I can natively place some static image covering the webview. As I understand the system takes a screenshot of the view right after the applicationDidEnterBackground: method is invoked.
This way the system will take a screenshot of the image I put on top instead of the actual content of the webview.
I'm not an experienced iOS developer and I will appreciate any suggestion.
Thanks.
It turned out Apple's solution fixed the problem with a small edit on my part.
Instead of implementing
- (void)applicationDidEnterBackground:(UIApplication *)application
I implemented
-(void)applicationWillResignActive:(UIApplication *)application
Then, no matter whether you hit the home button once or twice, it covers the view controller with the newly created blank one.
- (void)applicationWillResignActive:(UIApplication *)application {
    UIViewController *blankViewController = [UIViewController new];

    // this is how to attach the image
    UIImage *splashImage = [UIImage imageNamed:@"some image"];
    blankViewController.view.backgroundColor = [UIColor colorWithPatternImage:splashImage];

    // set some transition style
    blankViewController.modalTransitionStyle = UIModalTransitionStylePartialCurl;

    [self.window.rootViewController presentViewController:blankViewController animated:YES completion:NULL];
}
- (void)applicationDidBecomeActive:(UIApplication *)application
{
    [self.window.rootViewController dismissViewControllerAnimated:YES completion:NULL];
}

How can I present a view controller from the application window's rootViewController when the user presses the home button, but before the application enters the background? [duplicate]

This question already has answers here:
Controlling the screenshot in the iOS 7 multitasking switcher
I have an application that contains some sensitive information. I don't want anyone to see it in the snapshot taken before the app enters the background, so I want to present a pattern-lock view controller on the screen after the user presses the home button. I tried this code:
- (void)applicationWillResignActive:(UIApplication *)application
{
    PatternLockViewController *pvc = [[SMICConfig sharedSMICConfig] patternLockVC];
    if (!(pvc.isViewLoaded && pvc.view.window) && [SMICConfig sharedSMICConfig].isCookie) {
        [self.window.rootViewController presentViewController:pvc animated:NO completion:nil];
    }
}
But the PatternLockViewController is only presented after the app enters the foreground. So, while the app stays in the background, you can double-click the home button and peek at some information.
Tencent QQ's pattern lock works very well; I just want to implement that effect.
Can anyone help me? Thanks.
You can tell iOS 7 to avoid using the recent snapshot image during the next launch cycle by calling ignoreSnapshotOnNextApplicationLaunch (see Apple's documentation).
The following is how I implemented it for my app. Hope it helps.
- (void)applicationWillResignActive:(UIApplication *)application {
    [application ignoreSnapshotOnNextApplicationLaunch];
    self.imageView = [[UIImageView alloc] initWithFrame:[self.window frame]];
    [self.imageView setImage:[UIImage imageNamed:@"Default-568h"]];
    [self.window addSubview:self.imageView];
}

- (void)applicationDidBecomeActive:(UIApplication *)application {
    if (self.imageView != nil) {
        [self.imageView removeFromSuperview];
        self.imageView = nil;
    }
}

iOS AirPlay: my app is only notified of an external display when mirroring is ON?

I'm trying to enable AirPlay support in my app. I'm not doing video; I want to use the external display as a "second display".
Here's my problem: if I choose "AppleTV" from my AirPlay button, my app doesn't get notified. The only time my app -does- get notified is when I leave my app, go to the OS-level AirPlay button, choose "AppleTV" there and turn on Mirroring. If I turn mirroring off, my app is then told that the external display is gone.
So:
Why doesn't my app get notified when I pick an external display from within my app?
Why does my app get notified of the presence of an external display when I turn mirroring on, and not before? I'm obviously misunderstanding something, but it would seem like turning mirroring on should inform my app that the external display is gone (rather than now available), since the OS should now be using that external display for mirroring.
Code sample below. Thanks in advance for any help!
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.

    // Is there already an external screen?
    if (UIScreen.screens.count > 1)
    {
        [self prepareExternalScreen:UIScreen.screens.lastObject];
    }

    // Tell us when an external screen is added or removed.
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(externalScreenDidConnect:) name:UIScreenDidConnectNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(externalScreenDidDisconnect:) name:UIScreenDidDisconnectNotification object:nil];

    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
    self.window.rootViewController = self.viewController;

    // Add AirPlay control to view controller.
    MPVolumeView *airplayButtonView = [[MPVolumeView alloc] init];
    airplayButtonView.frame = CGRectMake(300, 300, 50, 50);
    airplayButtonView.backgroundColor = [UIColor blackColor];
    airplayButtonView.showsVolumeSlider = NO;
    airplayButtonView.showsRouteButton = YES;
    [self.viewController.view addSubview:airplayButtonView];

    [self.window makeKeyAndVisible];
    return YES;
}
#pragma mark - External screen handling

- (void)externalScreenDidConnect:(NSNotification *)notification
{
    [self prepareExternalScreen:notification.object];
}

- (void)externalScreenDidDisconnect:(NSNotification *)notification
{
    // Don't need these anymore.
    self.externalWindow = nil;
}

- (void)prepareExternalScreen:(UIScreen *)externalScreen
{
    NSLog(@"PREPPING EXTERNAL SCREEN.");
    self.viewController.view.backgroundColor = [UIColor blueColor];
    CGRect frame = externalScreen.bounds;
    self.externalWindow = [[UIWindow alloc] initWithFrame:frame];
    self.externalWindow.screen = externalScreen;
    self.externalWindow.hidden = NO;
    self.externalWindow.backgroundColor = [UIColor redColor];
}
That's correct, unfortunately. The secondary display (the airplay screen) is only available with mirroring.
Here is an application that shows how to implement this:
https://github.com/quellish/AirplayDemo
Looking at your code, you SHOULD be getting the UIScreenDidConnectNotification when a user goes to the AirPlay menu and turns on mirroring while your app is active. The AirPlay button, whether an MPVolumeView or a movie controller, does not control mirroring (and thus the external display functionality). Video and audio out are unfortunately separate from mirroring, and mirroring can only be turned on or off using the system-wide mirroring UI.
Bottom line: You can't turn on that AirPlay screen from within your app.
Finally found the answer: you must have mirroring enabled in order to get the new-screen notification, but then you should overwrite the screen with your second-screen content. Very confusing!
See this example:
UIScreen screens always return 1 screen
Now, the worst part. You can add an AirPlay button inside your app using this:
MPVolumeView *volumeView = [[MPVolumeView alloc] init];
[view addSubview:volumeView];
However you cannot enable mirroring from this picker! And there is no programmatic way to turn on mirroring.
How can I turn on AirPlay Screen Mirroring on the iPhone 4S programmatically
So apparently the only way to have a second-screen experience is to instruct your user how to turn on AirPlay from the multitasking bar and make sure they turn mirroring on.
It doesn't seem possible from inside the app, unfortunately. Only AirPlay sound can be turned on from inside the app, as far as I know. Here's an example app using the second screen with OpenGL and sound: http://developer.apple.com/library/ios/samplecode/GLAirplay/Introduction/Intro.html

iOS app which runs on two screens (no mirroring)

I've created an iPad app which contains a slideshow, and when this slideshow is tapped by the user he/she can enter some information.
What I'd like to do now is to display the slideshow contents on a TV when connecting the TV and iPad through AirPlay (or cable if possible, but that only seems to mirror things)
Can this be done? Can we have the slideshow run on the TV and also on iPad and then when the user taps the slideshow on the iPad the credentials input screen will show but on TV still the underlying slideshow will show and not the credentials?
How can this be done in iOS? Is it possible to display a portion of the application on the TV? So not mirroring the entire application.
You can write the app to handle two UIScreens using AirPlay and an Apple TV, then set a separate root view controller for the TV UIScreen and for the iPad UIScreen. Then display the image or slideshow in the TV's view controller and drive it from the events of your iPad's view controller!
AMENDED AFTER CLIF'S COMMENT:
So, first, in your app delegate's didFinishLaunchingWithOptions or didFinishLaunching, set up a notification observer for the screen-did-connect notification.
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(screenDidConnect:) name:UIScreenDidConnectNotification object:nil];
Then you need to keep a reference to your separate window and push controllers to it as you would any other window.
// windowTV is assumed to be an instance variable holding the external (TV) window.
- (void)myScreenInit:(UIScreen *)connectedScreen
{
    // Initialise the TV screen's window.
    if (!windowTV)
    {
        CGRect frame = connectedScreen.bounds;
        windowTV = [[UIWindow alloc] initWithFrame:frame];
        windowTV.backgroundColor = [UIColor clearColor];
        [windowTV setScreen:connectedScreen];
        windowTV.hidden = NO;
    }
}

- (void)setTvController:(UIViewController *)mynewViewController
{
    UIViewController *oldController = windowTV.rootViewController;
    windowTV.rootViewController = mynewViewController;
    // The window releases its old root view controller when a new one is set.
    [oldController removeFromParentViewController];
}

- (void)screenDidConnect:(NSNotification *)notification
{
    [self myScreenInit:[notification object]];
}
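For illustration, here's a rough sketch of how the iPad-side view controller might drive this via the app delegate's setTvController: method (assuming ARC; the AppDelegate, SlideshowViewController and CredentialsViewController class names are hypothetical, not from the answer):

// In the iPad's view controller, when the user taps the slideshow:
- (void)slideshowTapped
{
    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];

    // Keep (or put) the slideshow on the TV window...
    [appDelegate setTvController:[[SlideshowViewController alloc] init]];

    // ...while the iPad shows the credentials input screen on top of its own window.
    CredentialsViewController *credentials = [[CredentialsViewController alloc] init];
    [self presentViewController:credentials animated:YES completion:nil];
}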
There appears to be a bug in iOS 5.0 which makes this tricky. You have to enable mirroring from the running-apps task bar (scrolling all the way to the left) before a second screen is detected via the API. I've posted details in my question here: How to use iOS 5+ AirPlay for a second screen
