iPhone: taking photos programmatically in burst mode - ios

In one of my projects I have to use the iPhone camera in "burst mode". How can this be done, if it is possible at all?

There isn't a direct way to do it. However, you can still achieve what you want with a small hack.
Make sure the AVCaptureDevice has its activeFormat property set to the best resolution. Then process, say, 10 frames per second, accessing them through the delegate method captureOutput:didOutputSampleBuffer:fromConnection:.
For the sound, you can just play a shutter sound with every processed frame.
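A minimal sketch of that pipeline, assuming camera permission is already granted; the class name BurstCapture and the 10 fps cap are illustrative, not part of any framework:

```objc
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface BurstCapture : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation BurstCapture

- (void)start {
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetHigh;

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("burst.frames", NULL)];
    [self.session addOutput:output];

    // Cap frame delivery at roughly 10 fps; alternatively, pick the
    // highest-resolution entry in camera.formats and set activeFormat.
    if ([camera lockForConfiguration:nil]) {
        camera.activeVideoMinFrameDuration = CMTimeMake(1, 10);
        [camera unlockForConfiguration];
    }
    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Convert/store the frame here (e.g. as JPEG), then play the
    // stock shutter click (sound ID 1108 on most iOS versions).
    AudioServicesPlaySystemSound(1108);
}

@end
```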

Related

How to implement an Apple-like camera app using AVCaptureSession?

In the iOS 9 camera app, the controls overlay does not rotate, but when 'Record' is selected the video is output with the correct orientation. Has anyone got any ideas how Apple implemented this?
You've got a few different options to achieve the same effect:
You can rotate the camera but keep the UI the same when the device itself is rotated, so that the capture matches UIDevice.currentDevice().orientation.
You could store the device orientation at the time of capture and then rotate the video afterwards.
You could even set the capture connection of the capture session to match the device's orientation while keeping the preview connection the same (see the sketch below). Check this out for a similar question.
I implemented exactly what you're looking for using 2 different UIWindow instances, one for the video preview layer and one for the UI to be displayed on top of it. The UIWindow with the UI needs to have a transparent background in order for the preview layer to remain visible.
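As a hedged sketch of the third option above (orienting only the recording connection while the preview stays fixed), something like this should work; movieOutput is assumed to be an AVCaptureMovieFileOutput already attached to your session, and the method belongs in a class that imports AVFoundation and UIKit:

```objc
// Call this from a rotation/orientation callback.
- (void)updateCaptureOrientation:(AVCaptureMovieFileOutput *)movieOutput {
    AVCaptureVideoOrientation video;
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationLandscapeLeft:
            video = AVCaptureVideoOrientationLandscapeRight; break; // mirrored
        case UIDeviceOrientationLandscapeRight:
            video = AVCaptureVideoOrientationLandscapeLeft;  break;
        case UIDeviceOrientationPortraitUpsideDown:
            video = AVCaptureVideoOrientationPortraitUpsideDown; break;
        default:
            video = AVCaptureVideoOrientationPortrait; break;
    }
    // Only the recording connection is touched; the preview layer's
    // connection keeps its fixed orientation, so the UI does not rotate.
    AVCaptureConnection *connection =
        [movieOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = video;
    }
}
```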

How to record a time-limited video with Adobe AIR for iOS

I am trying to record a time-limited video with Adobe AIR for iOS.
For example, I want to implement the following function. Start a one-minute timer before launching CameraUI to record video. When the timeout event happens after one minute, stop recording video, close the CameraUI view and obtain the video data so far.
I have several questions related to that.
1. How can I stop recording video from outside the CameraUI view (in this case, from the timeout event handler) and then close the CameraUI view? As far as I know, the only way to close the CameraUI view is to press the [Use Video] or [Cancel] button inside it. Is it possible to close it from outside?
2. Even if the first problem is solved, how can I get the video data recorded so far (in this case, the data before the timeout)? I know that normally we can get a MediaPromise object from the MediaEvent parameter of the complete handler and read the video data from it. But obviously in this case we cannot access the MediaPromise object, because the complete handler itself will never execute since the [Use Video] button is not pressed.
3. Is it possible to add a stopwatch showing the remaining recording time while the CameraUI view is open? It seems that CameraUI automatically uses the full screen of the iOS device (in my case, an iPad), leaving no extra space to show a stopwatch.
Are there any solutions or workarounds for the three problems above? I'd really appreciate any ideas. Thanks in advance.
I've never worked with video on iOS specifically, so I'm just putting down my thoughts on this issue; sorry if you find them useless.
1. I suppose it's impossible to control video recording from outside CameraUI (unless you write your own ANE for that), and I think it's bad design anyway. Why do you need that?
2. Same answer as 1.
3. It's impossible to add display objects to native windows (once again, unless you write your own ANE).
In general, if you want more freedom to work with video in AIR, you can do it in three ways:
1. Write your own ANE.
2. Stream your video data to your own server and do whatever you want with it.
3. The least reliable way, but you can try it: there is the FLVRecorder library. I've never tried it and don't even know whether it works at all. Or you can try your own approach (save your stage to bitmaps at some frame rate and then encode them to video). It's just a suggestion; I don't know whether it will work.
Hope my thoughts help.

Trying to create an Xcode Objective-C function that records a video capture of my UIView contents and saves to phone

I'm trying to create an Xcode Objective-C function, called from a button tap, that will record the contents of a UIView and its subviews (or a fixed section of the screen, e.g. 320x320 in the center) and then allow the user to save the video to their iPhone camera roll. I also want to include the audio being played by the app at the time of the recording, e.g. background music and sound effects.
I'm having trouble finding any info on this, as there seem to be a lot of people trying to record their running app for external purposes like the App Store video preview. I need the video captured within the app, to be used as an app feature.
Does anyone know if this can be done or know a website or tutorial where I can learn what's needed? Thanks
I know this post is two years old, but for anybody who comes along who might need to record their iOS app's screens and save them to the phone's camera roll or even a specific URL, take a look at https://github.com/alskipp/ASScreenRecorder
I've tried it and it works! The frame rate isn't 60 fps, so I don't know how well it would work for recording an action game, but it's still pretty awesome.
You can't do that with just one function; check out this project:
https://github.com/coolstar/RecordMyScreen
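For reference, here is a rough, hedged sketch of the core technique such projects use: periodically render the view's layer into a pixel buffer and append it to an AVAssetWriter through a pixel buffer adaptor. The method name is illustrative, the adaptor is assumed to come from an AVAssetWriterInput whose writer has already started a session, and audio and error handling are omitted:

```objc
// Inside a recorder class (imports AVFoundation and UIKit).
- (void)appendFrameOfView:(UIView *)targetView
                  adaptor:(AVAssetWriterInputPixelBufferAdaptor *)adaptor
                     time:(CMTime)time {
    CVPixelBufferRef buffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
    CVPixelBufferLockBaseAddress(buffer, 0);

    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(buffer),
        CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer),
        8, CVPixelBufferGetBytesPerRow(buffer), space,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    // Core Graphics' origin is bottom-left; flip so the view isn't upside down.
    CGContextTranslateCTM(ctx, 0, CVPixelBufferGetHeight(buffer));
    CGContextScaleCTM(ctx, 1, -1);
    [targetView.layer renderInContext:ctx];

    [adaptor appendPixelBuffer:buffer withPresentationTime:time];

    CGContextRelease(ctx);
    CGColorSpaceRelease(space);
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    CVPixelBufferRelease(buffer);
}
```

Driving this from a CADisplayLink gives a steady frame clock; recording only a fixed 320x320 region is then a matter of sizing the pixel buffers and translating the context accordingly.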

UIImagePicker Adjusting Focus Observer?

Is there a way to find out if the camera is currently focusing with UIImagePicker? Similar to the way the adjustingFocus observer works with AVFoundation.
I'm currently using AVFoundation, but I would like to be able to have the image quality that is achieved with UIImagePicker.
Thanks!
You need to define your exact requirement here: what do you want the adjusting-focus observer for? As the name suggests, it is a way to signal that an AVCaptureDevice (a reference to a "real" device, one of these devices) is adjusting its focus. UIImagePicker is nothing but a file-open dialog, not a real device; it works on top of either the camera or a static list of image files. While in camera mode, you can observe adjustingFocus.
In case that is your requirement, here is the nearest thing you can refer to.
Refer to the UIImagePickerController documentation to work out how to implement this for your own needs.
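For the AVFoundation route the answer refers to, a minimal KVO sketch for adjustingFocus might look like this (self.camera is an assumed strong property holding the device, in a class that imports AVFoundation):

```objc
- (void)startObservingFocus {
    self.camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [self.camera addObserver:self
                  forKeyPath:@"adjustingFocus"
                     options:NSKeyValueObservingOptionNew
                     context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"adjustingFocus"]) {
        BOOL focusing = [change[NSKeyValueChangeNewKey] boolValue];
        NSLog(@"Camera %@ focusing", focusing ? @"is" : @"finished");
    }
}
```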

iOS Concurrent Camera Usage

I am developing a background app that periodically makes use of the camera. The app is for my jailbroken device, so there is no problem with public SDK restrictions. I want to take a photo of the different places I go during the day, in an automatic manner.
I am using AVCaptureSession, grabbing the frames from a video connection so that no shutter sound is produced.
The problem is that if another application is using the camera (the Camera app, for instance) and my app tries to take a photo, the Camera interface gets frozen, and I need to reopen the Camera app for it to work again. I guess this is because the startRunning method of AVCaptureSession blocks the usage of previous camera sessions.
Is there any way to use the camera in a shared mode, or concurrently?
I don't know if there is some kind of mixWithOthers property, as there is for sound compatibility.
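To my knowledge there is no shared mode for the camera. As a hedged workaround sketch, you can at least detect when another client takes the camera by listening for the session interruption notifications, and back off and retry instead of leaving the other client frozen (self.session is the app's AVCaptureSession, in code that imports AVFoundation):

```objc
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVCaptureSessionWasInterruptedNotification
                object:self.session
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                // Another client owns the camera; yield and retry later.
                NSLog(@"Capture session interrupted");
            }];

[[NSNotificationCenter defaultCenter]
    addObserverForName:AVCaptureSessionInterruptionEndedNotification
                object:self.session
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                [self.session startRunning]; // camera is free again
            }];
```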
