UIImagePickerController and external display - iOS

I am writing an iPad kiosk-type application that allows a visitor to record a video using the front-facing camera and to view existing videos from the Camera Roll on an external monitor. I am just learning Xcode, working in Xcode 4.4.1 and targeting iOS 5. It seems like a lot has changed recently, which makes this much harder to learn, so I am trying to keep things as simple as possible; that's why I am using UIImagePickerController.
Everything is working as I wish, with one exception: I am not able to toggle between the external display and the iPad the way I want. When the user records a video, it is full screen on the iPad. That's fine; however, after they stop recording, the video is immediately sent to the external display for approval and a placeholder image is left in the UIPopover. What I would rather do is either keep the video preview full screen on the iPad or show it in the UIPopover.
The reason is that the external display is not easily viewable from where the user is operating the iPad, so they are being asked to approve (tap Use) something they can't really see. It would be much better to keep it on the iPad. The code below is what I have used to allow recording.
Everything else works great: I want the user to select videos from the Library and have them display on the external monitor, and since that's the default behavior it works fine.
The closest answer I could find so far is this: UIImagePickerController in an existing UIPopoverController
Is there a simple way to disable the external display or keep the video preview from being sent?
- (IBAction)useCameraRoll:(id)sender
{
    if ([self.popoverController isPopoverVisible]) {
        [self.popoverController dismissPopoverAnimated:YES];
    } else {
        if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum]) {
            UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
            imagePicker.delegate = self;
            imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
            // kUTTypeMovie requires <MobileCoreServices/MobileCoreServices.h>
            imagePicker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeMovie, nil];
            imagePicker.allowsEditing = NO;

            self.popoverController = [[UIPopoverController alloc] initWithContentViewController:imagePicker];
            self.popoverController.delegate = self;
            [self.popoverController presentPopoverFromBarButtonItem:sender
                                           permittedArrowDirections:UIPopoverArrowDirectionUp
                                                           animated:YES];
            newMedia = NO;
        }
    }
}

Coincidentally, I have been working on a similar kiosk app with an iPad. In my case it utilizes some Augmented Reality to show relevant content on the external display. I use the iPad screen as a configuration panel for the augmented reality experience.
The best way I found to approach this is to use separate windows, one for each of the two UIScreens. This lets you craft the experience on each display independently. I am not sure whether you are already using this approach, but if you aren't, this is the way to go.
To get started, look at the ExternalDisplay sample code in the iOS Developer Library. From that sample's documentation:
To display content on an external display, do the following:
1. Use the screens class method of the UIScreen class to determine if an external display is available.
2. If an external screen is available, get the screen object and look at the values in its availableModes property. This property contains the configurations supported by the screen.
3. Select the UIScreenMode object corresponding to the desired resolution and assign it to the currentMode property of the screen object.
4. Create a new window object (UIWindow) to display your content.
5. Assign the screen object to the screen property of your new window.
6. Configure the window (by adding views or setting up your OpenGL ES rendering context).
7. Show the window.
Also, the UIScreen documentation is quite helpful.
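A minimal sketch of those steps in Objective-C (assuming ARC; the method name and the externalWindow property are illustrative, not taken from the sample):

- (void)setUpExternalScreenIfAvailable
{
    // 1. Is an external display attached?
    if ([UIScreen screens].count < 2) {
        return;
    }
    UIScreen *externalScreen = [[UIScreen screens] objectAtIndex:1];

    // 2./3. Inspect the supported modes and pick one (the first mode is used here for brevity).
    NSArray *modes = externalScreen.availableModes;
    if (modes.count > 0) {
        externalScreen.currentMode = [modes objectAtIndex:0];
    }

    // 4./5. Create a window and attach it to the external screen.
    self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
    self.externalWindow.screen = externalScreen;

    // 6./7. Configure and show the window.
    self.externalWindow.rootViewController = [[UIViewController alloc] init]; // your content view controller here
    self.externalWindow.hidden = NO;
}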

Related

How to reload UIImagePickerController?

I use UIImagePickerController to get an image needed in my app.
// ...
picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
// ...
But I found a problem in the process of getting an image with UIImagePickerController.
While the UIImagePickerController's view is on screen, the user can send my app to the background and take a picture with another app. When the user comes back to my app, it still shows the UIImagePickerController's view, but the contents of the photo library are not updated unless the user reopens the UIImagePickerController.
I need to reload the UIImagePickerController's view. How can I do that?
This question is about reloading UIImagePickerController, not about the app-foreground notification.
Thank you for your attention.
Once you get the foreground notification, you can destroy and recreate the picker. This refreshes the available images but loses any current selection and scroll position. As such, it probably isn't a good idea to always reload (you can use the assets library to check whether the image count changed).
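A rough sketch of that approach (assuming ARC and a strong picker property; the reloadPickerIfNeeded method name is illustrative, not part of UIKit):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Find out when the app returns to the foreground.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(reloadPickerIfNeeded)
                                                 name:UIApplicationDidBecomeActiveNotification
                                               object:nil];
}

- (void)reloadPickerIfNeeded
{
    if (self.picker == nil) {
        return; // picker is not currently shown
    }
    // Throw away the stale picker and build a fresh one so the
    // photo library contents are re-read.
    [self dismissViewControllerAnimated:NO completion:nil];
    self.picker = [[UIImagePickerController alloc] init];
    self.picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    self.picker.delegate = self;
    [self presentViewController:self.picker animated:NO completion:nil];
}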

UIImagePickerController shows last picture taken instead of camera input

I'm having a strange behaviour within my app.
For taking pictures I'm using the following, fairly standard, code to display the UIImagePickerController:
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
picker.allowsEditing = NO;
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
[self presentViewController:picker animated:YES completion:nil];
It works perfectly fine the first time I tap the button that calls this action. The strange behaviour starts when I tap that button again: the UIImagePickerController starts again, but it doesn't show the input from the camera anymore. It shows the last picture I've taken.
More details of this state:
Tapping on the image shows the yellow auto-focus square (which it actually uses to focus the camera correctly).
When I tap the capture button, the correct image is taken and presented on the screen.
If I take a picture and press 'Retake', the regular camera input is presented again.
More weirdness: it has nothing to do with the iPad I'm using. In a new example app that has only a button calling the code above, everything works perfectly fine.
I assume it has something to do with the configuration of the app. I checked everything but could not find any differences that might cause this issue.
Thanks in advance for any advice!
Update:
I do implement the UIImagePickerControllerDelegate in order to dismiss the UIImagePickerController.
Reading the Apple documentation on UIImagePickerController (doc here), it states: "When the user taps a button to pick a newly-captured or saved image or movie, or cancels the operation, dismiss the image picker using your delegate object. For newly-captured media, your delegate can then save it to the Camera Roll on the device. For previously-saved media, your delegate can then use the image data according to the purpose of your app." Maybe you need to implement the UIImagePickerControllerDelegate protocol methods and properly dismiss the existing UIImagePickerController object. See UIImagePickerControllerDelegate.
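For reference, a minimal sketch of those delegate methods; the key point is dismissing the picker yourself in both callbacks (here self is the presenting view controller):

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Grab the captured image, then dismiss the picker.
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // ... save or use the image ...
    [self dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self dismissViewControllerAnimated:YES completion:nil];
}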
I finally found the issue: it was a category on UIViewController used elsewhere in the project. As soon as it was compiled into my project, the UIImagePickerController acted weird. I think I somehow managed to use a method name that is also used internally by UIImagePickerController. It still confuses me a little, since the category wasn't used on UIImagePickerController at all.

Display one thing on iPad and another on Apple Tv?

I have an app idea, but I'm not sure if it's possible.
I was wondering whether I can display one thing on the iPad (or iPhone) screen and something totally different on the Apple TV at the same time.
For example, a quiz app, where the question is displayed on the Apple Tv, and the multiple choices are listed on the iPad for the user to pick.
I'm not sure if this is possible, or if you can only mirror the iPad screen onto the Apple TV.
If there is some "Proof of Concept" example code, I'd love to take a look.
Thank you so much.
Chris
It turns out that it is pretty simple to support two screens: the primary screen of the iOS device and a secondary screen (either an external display or an Apple TV via mirroring).
Based on information from the blog post Creating a Dual-Screen AirPlay Experience for iOS and Apple TV, you don't need to do much.
Basically you need to check the screens property from UIScreen. There are also notifications you should listen for (UIScreenDidConnectNotification and UIScreenDidDisconnectNotification) so you know if the number of screens changes while your app is running.
Once you have a second screen, you need to create a new window for it. Code like the following can be used:
if ([UIScreen screens].count > 1) {
    if (!_secondWin) {
        UIScreen *screen = [UIScreen screens][1];
        _secondWin = [[UIWindow alloc] initWithFrame:screen.bounds];
        _secondWin.screen = screen;
    }
}
where _secondWin is a UIWindow ivar.
Once the window is set up, create a view controller, make it the window's root view controller, and show the window:
SomeViewController *vc = [[SomeViewController alloc] init...];
_secondWin.rootViewController = vc;
_secondWin.hidden = NO;
This is pretty much it, other than proper handling of the notifications. Keep in mind that you can't get any touch events on the second display, so make sure whatever you show there is display-only.
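A rough sketch of that notification handling (assuming ARC; the selector and method names are illustrative):

// Register once, e.g. in -application:didFinishLaunchingWithOptions:.
- (void)registerForScreenNotifications
{
    NSNotificationCenter *nc = [NSNotificationCenter defaultCenter];
    [nc addObserver:self selector:@selector(screenDidConnect:)
               name:UIScreenDidConnectNotification object:nil];
    [nc addObserver:self selector:@selector(screenDidDisconnect:)
               name:UIScreenDidDisconnectNotification object:nil];
}

- (void)screenDidConnect:(NSNotification *)note
{
    // Build _secondWin exactly as shown above, using the newly attached screen.
}

- (void)screenDidDisconnect:(NSNotification *)note
{
    // Drop the second window; the app falls back to plain mirroring.
    _secondWin.hidden = YES;
    _secondWin = nil;
}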
Depending on your app, you might use the second screen/window throughout the lifetime of the app (as long as the second screen is available), or you might only create and use it under certain circumstances. When you don't set up the second window/screen, your app is simply mirrored to the second display or Apple TV.
The last piece is to turn on mirroring to the Apple TV. This is done on the iOS device, not in the app.
The blog post I linked has a few more details worth reviewing.

UIWebView: Media Player Rotates View Controller

I'm encountering an issue while using a webview. My app is currently a portrait application, but when a media player loads in a webview within my app, the user can rotate the player, which rotates my view controller as well. I know I can get a notification for when the player comes up and disappears, but is there a way to prevent it from rotating the controller in the first place?
It doesn't seem like anyone has an answer. I have all of the standard "don't rotate unless I tell you to" methods and plist values, but it's still rotating. It only happens when I load a web view and the media player loads over it. If I rotate my device, the media player rotates along with it, which feels natural, but when I go back to the web view's view controller, it's rotated, which isn't good.
Add the UISupportedInterfaceOrientations key to your Info.plist and set it to UIInterfaceOrientationPortrait only.
OK, this is a blind shot. I've used this trick to force the system to adjust to the required orientation; add these lines when the web view is about to appear, or when the player is about to disappear:
UIViewController *viewController = [[UIViewController alloc] init];
[self presentModalViewController:viewController animated:NO];
[self dismissModalViewControllerAnimated:NO];
Try it in some of those methods, or in didAppear or didDisappear if that doesn't work. I know it looks like rubbish, but it sometimes works.
There doesn't seem to be a correct answer to this question.

How can I make a custom, in-app camera for my iPhone app developed in Xcode?

I am making a photography iPhone app. When the app launches, I want the screen to be a large image with the camera shown in the center through a hole in the picture, similar in look to Hipstamatic. Since the camera would be open upon launch, I would also need a button to take the picture (but that is not a priority at the moment). Is there an easy way to do what I have described? So far, research has pointed me towards UIImagePickerController, but that alone did not give me nearly the amount of customization I am after. Or am I mistaken, and I can do what I described using UIImagePickerController?
Right now, this is the code I'm using for the camera. It's currently an IBAction linked to a button that launches the camera when tapped; however, as I mentioned, I would like the camera to open on its own when the app starts.
self.picker = [[UIImagePickerController alloc] init];
self.picker.allowsEditing = NO;
[self.picker setSourceType:UIImagePickerControllerSourceTypeCamera];
[self presentModalViewController:self.picker animated:NO];
[picker release];
This code not only navigates away from the current view, but it has all the controls (zoom, tap to focus, etc.), is full screen, and plays that silly animation of the lens opening.
Anything you have to offer would be greatly appreciated.
Many thanks in advance,
Zach
Use the showsCameraControls and cameraOverlayView properties of UIImagePickerController to add a custom overlay above the picker: a view with a transparent background to which you add subviews that overlay, frame, or mask parts of the picker as needed.
You can even use the cameraViewTransform to change the size and position (i.e. the transform) of the camera view that is capturing the camera image.
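A rough sketch of putting that together (assuming ARC; the overlay layout and the "Snap" button are illustrative, not part of UIImagePickerController):

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO;   // hide the default controls
picker.allowsEditing = NO;

// Transparent overlay sized to the picker; add your "hole" image and
// any other chrome as subviews.
UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
overlay.backgroundColor = [UIColor clearColor];

// A simple capture button wired to UIImagePickerController's takePicture.
UIButton *shutter = [UIButton buttonWithType:UIButtonTypeRoundedRect];
shutter.frame = CGRectMake(120, 400, 80, 44);
[shutter setTitle:@"Snap" forState:UIControlStateNormal];
[shutter addTarget:picker action:@selector(takePicture)
  forControlEvents:UIControlEventTouchUpInside];
[overlay addSubview:shutter];

picker.cameraOverlayView = overlay;

// Optionally move/scale the live preview behind the overlay.
picker.cameraViewTransform = CGAffineTransformMakeTranslation(0, 50);

[self presentViewController:picker animated:NO completion:nil];

To have the camera open on launch rather than from a button, present the picker from viewDidAppear: of your first view controller instead of an IBAction.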
