How to show all camera controls with UIImagePickerController?

I have trouble showing the camera controls when using UIImagePickerController. Specifically, I need to be able to select between slo-mo, video, photo, square and pano.
The essential part of the code I use is:
// kUTTypeImage / kUTTypeMovie come from <MobileCoreServices/MobileCoreServices.h>
UIImagePickerController *pc = [[UIImagePickerController alloc] init];
[pc setSourceType:UIImagePickerControllerSourceTypeCamera];
pc.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeImage, nil];
But this shows a picker controller that can only take a photo, i.e., there are no square or pano modes either.
Setting pc.mediaTypes to:
pc.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeImage, (NSString *)kUTTypeMovie, nil];
...shows a picker controller with photo and video modes. But how do I get the other camera modes/types to show? E.g., what is the UTType for pano?

UIImagePickerController doesn't give you Apple's whole Camera app to use inside your own app; not all of the functionality of the Camera app is available. You can print the available media types by calling +[UIImagePickerController availableMediaTypesForSourceType:], and you'll find that you only get kUTTypeImage and kUTTypeMovie.
Square, slo-mo, time-lapse, and panorama functionalities are not provided by UIImagePickerController.
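For example, a quick throwaway check (not part of the original answer) confirms what the camera source offers:
// Log what UIImagePickerController actually supports for the camera source;
// on current iOS this prints public.image and public.movie, and nothing else.
NSArray *types = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
NSLog(@"Available media types: %@", types);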

UIImagePickerController came out in iOS 2.0 and was probably modern in its time:
https://developer.apple.com/documentation/uikit/uiimagepickercontroller?language=objc
What you see are "standard controls".
Even today, the current documentation still points to the legacy, Objective-C-era docs.
The styling of the chrome has changed, but as they say: "the song remains the same."
There are hooks to hide the standard controls, and then you can add your own custom controls and operate the camera programmatically. I have a pet project to make a version of time-lapse, etc.
But for serious projects, Apple suggests using the AVFoundation family of frameworks, and although I haven't started using them, I certainly agree.
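For a sense of what the AVFoundation route involves, here is a minimal, hedged sketch (not from this answer; self.view stands in for whatever view hosts your preview). With your own capture session, modes like square, pano and slo-mo become things you build yourself:
// Requires <AVFoundation/AVFoundation.h>.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init]; // iOS 10+
if ([session canAddOutput:photoOutput]) {
    [session addOutput:photoOutput];
}

// Your own preview layer instead of the picker's chrome.
AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
preview.frame = self.view.bounds;
[self.view.layer addSublayer:preview];

[session startRunning];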

Related

Xcode iOS external display mirroring

I'm working at a local newspaper and I have built a simple video-uploading app that we are going to use inside our organization to make it easier to upload videos to our servers.
My issue is that when I demonstrate it for 100+ people I would like to use a projector with an HDMI adapter. So I bought the adapter for my iPhone and it works great, except for one thing. When I edit the clip before uploading it (with the simple editing tool that is part of AVFoundation), the editing tools are not mirrored to the HDMI screen; the projector just shows the video full screen, so I can't demonstrate the buttons and tools for editing. Everything else mirrors perfectly by default. Is there some way to force it to truly mirror instead? I have tried reading Apple's official iOS documentation about it, but only found information about video playback.
The code for the camera and the editing:
// Shows the controls for moving & scaling pictures, or for
// trimming movies. To hide the controls, use NO.
cameraUI.allowsEditing = YES;
cameraUI.cameraFlashMode = UIImagePickerControllerCameraFlashModeAuto;
cameraUI.videoQuality = UIImagePickerControllerQualityTypeHigh;
cameraUI.delegate = delegate;
// Show camera view controller.
[controller presentViewController:cameraUI animated:NO completion:nil];
The best way to go, and also the recommended one, is to use an Apple TV with AirPlay:
http://www.apple.com/airplay/
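If system mirroring will not show what you need, one alternative worth knowing about is to drive the external display yourself via UIScreen. This is a hedged sketch, not from the answer above (ExternalViewController and the externalWindow property are placeholders), and note that it puts your own content on the projector rather than a true capture of the picker's editing UI:
// Attach a second UIWindow to the external screen (HDMI adapter or AirPlay).
if ([UIScreen screens].count > 1) {
    UIScreen *externalScreen = [UIScreen screens][1];
    self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
    self.externalWindow.screen = externalScreen;
    self.externalWindow.rootViewController = [[ExternalViewController alloc] init];
    self.externalWindow.hidden = NO;
}
// Observe UIScreenDidConnectNotification / UIScreenDidDisconnectNotification
// to handle the adapter being plugged in or pulled while the app is running.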

iOS Image Preview Before Using Camera Roll Photo

I have a UIImagePickerController that is used to upload images to a database. It loads photos from two sources, the camera and the camera roll. When I take a photo with the camera, it lets me choose whether to retake the photo or use it. When I load photos from the camera roll, I can select which photo library I want to use, but I only get a tiled preview of all photos in the library. When I tap a tiled photo preview it gets uploaded to the database automatically with no full-size preview. How can I make it so that when I load a photo from the camera roll it lets me preview the image before using it, similar to how it works when loading from the camera? Here is the code that I use to specify the UIImagePickerController type, data source, and initial settings.
UIImagePickerController *imgPicker = [[UIImagePickerController alloc] init];
imgPicker.delegate = self;
imgPicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
[self presentViewController:imgPicker animated:YES completion:nil];
You have two options:
You can use the allowsEditing property: set it to YES and the final step will be a "preview" of the picked photo. Of course the user can move and scale the photo from that preview, so maybe this is not what you want.
Code your own preview (see the sketch below). It will be a simple view controller that you push onto the UIImagePickerController when the user chooses a photo from the gallery. The simplest solution is just a UIImageView covering the whole screen, plus two buttons in the navigation bar to either choose another photo or use the previewed one.
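A rough sketch of that second option (PreviewViewController, its image and onUse properties, and uploadImage: are hypothetical names, not from the question). Since UIImagePickerController is a UINavigationController subclass, you can push your preview onto it from the delegate callback:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *picked = info[UIImagePickerControllerOriginalImage];

    PreviewViewController *preview = [[PreviewViewController alloc] init];
    preview.image = picked;            // shown in a full-screen UIImageView
    preview.onUse = ^{                 // "Use" button confirms the photo
        [self uploadImage:picked];     // hypothetical upload helper
        [picker dismissViewControllerAnimated:YES completion:nil];
    };
    // "Choose another" is simply the back button: popping returns to the grid.
    [picker pushViewController:preview animated:YES];
}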
Thanks for all the help! I ended up using the third-party library described here: https://github.com/gekitz/GKImagePicker. GKImagePicker lets you easily define an editable, scalable crop area in the same way that allowsEditing does.

Adding a "Pick from Photo Library" button to UIImagePickerController's toolbar

I am implementing a flow where the user can either select a photo from their library or take a photo using the camera.
When presenting the UIImagePickerController modally I have to either go:
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
or
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
This means presenting a "scaffolding controller" where the user can pick either their library or the camera.
I see that most other apps with this kind of functionality go directly to the camera controller, but display a button for selecting from the library instead.
All the examples/tutorials I can find, including Apple's PhotoPicker sample, use this empty view controller where you pick between camera and library, but all the apps I check go "direct to camera, with the option of going to the library".
How do I add the photo library button option to the camera tool bar?
(app >= iOS 4.3)
Thanks for any help given.
According to this previous question
Enabling the photo library button on the UIImagePickerController
you can use cameraOverlayView and showsCameraControls to implement this.
-Vik
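A rough sketch of that cameraOverlayView approach (the button geometry and the switchToLibrary: action are placeholder assumptions, not from the linked question): keep the standard camera controls and float an extra library button above them.
UIImagePickerController *camera = [[UIImagePickerController alloc] init];
camera.sourceType = UIImagePickerControllerSourceTypeCamera;
camera.delegate = self;

// Keep the overlay small so it doesn't swallow touches meant for the standard controls.
UIButton *libraryButton = [UIButton buttonWithType:UIButtonTypeSystem]; // UIButtonTypeRoundedRect on iOS < 7
libraryButton.frame = CGRectMake(10.0, 30.0, 140.0, 44.0);
[libraryButton setTitle:@"Photo Library" forState:UIControlStateNormal];
[libraryButton addTarget:self action:@selector(switchToLibrary:)
        forControlEvents:UIControlEventTouchUpInside];
camera.cameraOverlayView = libraryButton; // any UIView works as the overlay

[self presentViewController:camera animated:YES completion:nil];

// switchToLibrary: would dismiss this picker and present a second
// UIImagePickerController with sourceType = UIImagePickerControllerSourceTypePhotoLibrary.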
For this purpose I have used PhotoPicker by Apple.

How to customize showsCameraControls on iPhone?

How can I customize the controls shown over the camera on the iPhone programmatically?
The camera has two modes, one for taking photos and one for recording video. I simply want a view that has no recording controls and some other options of my own instead.
How can I do that?
Help!
picker.showsCameraControls = NO;
picker.cameraOverlayView = someView;
Where someView is your custom UIView in which you design your own camera UI. The UIImagePickerController class reference is a good place to get started.
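A fuller sketch of that idea (the layout values and the button are placeholders of my own, not from the answer): hide the built-in controls, lock capture to photos so no recording UI appears, and wire your own button straight to -takePicture. The result still arrives through the usual delegate callback.
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto; // photo only, no recording controls
picker.showsCameraControls = NO;
picker.delegate = self;

UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
UIButton *shutter = [UIButton buttonWithType:UIButtonTypeSystem];
shutter.frame = CGRectMake((overlay.bounds.size.width - 80.0) / 2.0,
                           overlay.bounds.size.height - 100.0, 80.0, 44.0);
[shutter setTitle:@"Shoot" forState:UIControlStateNormal];
[shutter addTarget:picker action:@selector(takePicture)
  forControlEvents:UIControlEventTouchUpInside];
[overlay addSubview:shutter];
picker.cameraOverlayView = overlay;

[self presentViewController:picker animated:YES completion:nil];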

iOS - How to play a video with transparency?

I recorded a video with a blue screen. We have the software to convert that video to one with a transparent background. What's the best way to play this video overlaid on a custom UIView? Any time I've seen video on the iPhone it always launches that player interface. Is there any way to avoid this?
Don't know if anyone is still interested in this besides me, but I'm using GPUImage and its chroma-key filter to achieve this: https://github.com/BradLarson/GPUImage
EDIT: example code of what I did (it may be dated now):
- (void)AnimationGo:(GPUImageView *)view {
    // movieFile and filter are instance variables so they outlive this method.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"test" withExtension:@"mov"];
    movieFile = [[GPUImageMovie alloc] initWithURL:url];
    filter = [[GPUImageChromaKeyBlendFilter alloc] init];
    [movieFile addTarget:filter];

    GPUImageView *imageView = (GPUImageView *)view;
    [imageView setBackgroundColorRed:0.0 green:0.0 blue:0.0 alpha:0.0];
    imageView.layer.opaque = NO;
    [filter addTarget:imageView];
    [movieFile startProcessing];

    // To loop, restart processing when the movie finishes.
    [imageView setCompletionBlock:^{
        [movieFile removeAllTargets];
        [self AnimationGo:view];
    }];
}
I may have had to modify GPUImage a bit, and it may not work with the latest version of GPUImage, but that's what we used.
You'll need to build a custom player using AVFoundation.framework and then use a video with an alpha channel. The AVFoundation framework allows much more robust handling of video without many of the limitations of the Media Player framework. Building a custom player isn't as hard as people make it out to be. I've written a tutorial on it here:
http://www.sdkboy.com/?p=66
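To illustrate the custom-player direction, here is a minimal sketch (not the tutorial's code; "overlay.mov" and overlayView are placeholders, and the transparency actually showing depends on the asset using a codec iOS can decode with alpha, such as HEVC with alpha on newer versions):
// Requires <AVFoundation/AVFoundation.h>.
AVPlayer *player = [AVPlayer playerWithURL:
    [[NSBundle mainBundle] URLForResource:@"overlay" withExtension:@"mov"]];

AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = overlayView.bounds;
playerLayer.backgroundColor = [UIColor clearColor].CGColor; // no opaque backing
[overlayView.layer addSublayer:playerLayer];                // sits on top of your custom UIView

[player play];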
I'm assuming what you're trying to do is actually remove the blue screen from your video in real time: you'll need to play the video through OpenGL, run pixel shaders on the frames, and finally render everything using an OpenGL layer with a transparent background.
See the "Capturing from the Camera using AV Foundation on iOS 5" session from WWDC 2011, which explains techniques to do exactly that (watch the chroma-key demo at 9:00). Presumably the sample source can be downloaded, but I can't find the link right now.
GPUImage would work, but it is not ideal, because the iOS device is not the place to do your video processing. You should do all your video processing on the desktop using a professional video tool that handles chroma keying, then export a video with an alpha channel. Then import the video into your iOS application bundle as described at playing-movies-with-an-alpha-channel-on-the-ipad. There are a lot of quality and load-time issues you can avoid by making sure your video is properly turned into an alpha-channel video before it is loaded onto the iOS device.
The only way to avoid the player interface is to roll your own video player, which is pretty difficult to do right. You can insert a custom overlay on top of the player interface to make it look like the user is still in your app, but you don't actually have control of the view. You might want to try playing your transparent video in the player interface and see whether it shows up as transparent. Check whether the player has a background color property; you would want to set that to be transparent too.
--Mike
I'm not sure the iPhone APIs will let you have a movie view over the top of another view and still have transparency.
You can't avoid launching the player interface if you want to use the built-in player.
Here's what I would try:
1. Get the new window that is created by MPMoviePlayerController (see here).
2. Explore the view hierarchy in that window, using [window subviews] and [view subviews].
3. Try to figure out which one of those views is the actual player view.
4. Try to insert a view BEHIND the player view (sendSubviewToBack).
5. Find out whether the player supports transparency or not.
I don't think you can do much better than this without writing your own player, and I have no idea if this method would work or not.
Depending on the size of your videos and what you're trying to do, you could also try messing around with animated GIFs.
If you can extract the frames from your video and save them as images, then your video can be reproduced by swapping the images in a loop. For example, if you name your images frame1, frame2, frame3, and so on, you can load them inside a loop (the screenshot I originally uploaded used different file names, but the idea is the same).
I have never tried it myself; I just know it works for simple animations. Hope it works.
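A small sketch of that frame-by-frame approach (the frame names and count are placeholders), using UIImageView's built-in animation support; transparent PNG frames keep their transparency over whatever is underneath:
// Build the frame list frame1 ... frame30 and let UIImageView loop it.
NSMutableArray *frames = [NSMutableArray array];
for (NSInteger i = 1; i <= 30; i++) {
    UIImage *frame = [UIImage imageNamed:[NSString stringWithFormat:@"frame%ld", (long)i]];
    if (frame) {
        [frames addObject:frame];
    }
}

UIImageView *animationView = [[UIImageView alloc] initWithFrame:self.view.bounds];
animationView.animationImages = frames;
animationView.animationDuration = 1.0;  // seconds per full cycle
animationView.animationRepeatCount = 0; // 0 = loop forever
[self.view addSubview:animationView];
[animationView startAnimating];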
