iOS Camera wobble from video recording

I'm having one single maddening problem with recorded video from the iPhone.
I'm building a lap timer app that records HD video on track days. I've eliminated the possibility of vibration from wind, suspension and, crucially, the iPhone mount, so it must be the camera settings, right? I simply don't know what to change.
Here's an example of the offending video 'wobble' at a standstill (with iOS stabilization enabled); watch from 2:07:
https://www.youtube.com/watch?v=beair-rOMgw
Is it the autofocus?
Focus
enum {
    AVCaptureFocusModeLocked = 0,
    AVCaptureFocusModeAutoFocus = 1,
    AVCaptureFocusModeContinuousAutoFocus = 2
};
I used the following setting for the video above:
AVCaptureVideoStabilizationModeAuto
It seems like there's software compensation for the engine's vibration frequency at a standstill; the lens (I know it's physically fixed) is somehow moving. Am I crazy, or what?
Can someone please guide me?
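For isolating the cause, a minimal sketch (not a verified fix) is to lock the focus and explicitly turn software stabilization off on the output connection, then compare recordings. The `movieOutput` variable is an assumption for whatever capture output you use:

```objc
// Sketch: rule out autofocus hunting and software stabilization as the
// wobble source. "movieOutput" is assumed to be your configured output.
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
if ([camera lockForConfiguration:&error]) {
    if ([camera isFocusModeSupported:AVCaptureFocusModeLocked]) {
        camera.focusMode = AVCaptureFocusModeLocked; // stop focus hunting
    }
    [camera unlockForConfiguration];
}

// Stabilization is set per connection, not per device (iOS 8+).
AVCaptureConnection *connection =
    [movieOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.supportsVideoStabilization) {
    // Try Off first to see whether software stabilization causes the
    // wobble, then compare against Cinematic and Auto.
    connection.preferredVideoStabilizationMode =
        AVCaptureVideoStabilizationModeOff;
}
```

If the wobble disappears with stabilization off, the artifact is the stabilizer fighting the engine vibration rather than anything in the mount.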

Related

Sound not working for some AVCaptureDeviceFormat... is this a bug?

I saw this question on SO about the same problem, but in my case it is slightly different.
On that question, the poster says he cannot record audio when he sets up the app to shoot video at the highest resolutions a camera can provide under AVFoundation.
On the original question the poster mentions that his AVCaptureConnection has no audio. I believe he is talking about captureOutput:didOutputSampleBuffer:fromConnection:, but in my case the problem is slightly different: this method is never called for audio. Every time it is called, the connection is always a video one; in other words, no audio data output delegate is ever called here.
I have checked the captureSession and the microphone is there, so captureSession contains an AVCaptureDeviceInput for audio.
(lldb) po _captureSession.inputs
<__NSArrayI 0x170227e00>(
<AVCaptureDeviceInput: 0x17422e2e0 [Back Camera]>,
<AVCaptureDeviceInput: 0x17422e8e0 [iPad Microphone]>
)
I am testing this on an iPad Pro 9.7. I have checked all resolutions of the front and back camera of this device and I have no audio for these:
FRONT CAMERA: 960p @ 30 or 60 fps
BACK CAMERA: 4032x3024 @ 30 fps
I have tried removing and re-adding the audio device after changing the resolution, but the captureSession hangs and the preview freezes; the app itself continues to run without crashing.
Is this a bug? I don't see any mention on any documentation saying I cannot record audio with the highest resolutions a camera can provide.
NOTE: To demo the problem, I have uploaded a modified version of Apple's CIFunHouse here. I have adjusted line 459 of FHViewController.m with 4032x3024 that is the maximum resolution of my iPad. You should adjust that for the maximum resolution of your device's rear camera.
For some strange reason, when you do that, the app crashes when it tries to initialize the audio. My code, which is based on that, initializes OK but does not record sound. I left the code crashing because perhaps it can help more that way. You will see that channelLayout and basicDescription are both NULL for that video format. Reduce the resolution and the audio will initialize OK.
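For reference, this is the shape of the delegate in question; the `audioDataOutput`/`videoDataOutput` properties and the append helpers are hypothetical names for illustration. In the failing formats described above, the audio branch is simply never reached:

```objc
// Minimal sketch of distinguishing audio vs. video sample buffers.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (output == self.audioDataOutput) {
        // Never hit at 960p (front) or 4032x3024 (back) on the iPad Pro 9.7".
        [self appendAudioSampleBuffer:sampleBuffer]; // hypothetical helper
    } else if (output == self.videoDataOutput) {
        [self appendVideoSampleBuffer:sampleBuffer]; // hypothetical helper
    }
}
```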
Here is a hand-waving answer: 4032x3024 is not a commonly encountered video resolution. 480p, 720p and 1080p are, though. And if you read about 4K resolution video you'll see that 3840x2160 is too.
In fact "2160p" does capture both audio and video on my iPhone 6s, so why not try that?
Will AVAssetWriter be able to encode 2160p? Who knows? Maybe.
But don't be too harsh on AVFoundation - it does a valiant job of putting a sane face on the craziness of hardware. If anything you should log functionality and documentation bugs.
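A sketch of the suggestion above: rather than selecting the sensor's maximum photo-sized format, request the standard 2160p session preset (available on devices that support it; `session` is assumed to be your configured AVCaptureSession):

```objc
// Prefer the standard 4K video preset over the maximum sensor format.
if ([session canSetSessionPreset:AVCaptureSessionPreset3840x2160]) {
    session.sessionPreset = AVCaptureSessionPreset3840x2160;
} else {
    session.sessionPreset = AVCaptureSessionPreset1920x1080; // fallback
}
```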

Can iPhone 5, 6 or 6+ take a PICTURE with both cameras at the same time?

I found some answers about using both the front and back cameras at the same time for AUDIO/VIDEO recording, which is impossible.
In detail here:
Can the iPhone4 record from both front and rear-facing camera at the same time?
However, is it possible to use both cameras at the same time to take pictures for iOS?
No this is definitely not possible I'm afraid.
Only one camera session can be used at a time when using AVCaptureSession (the lower level API for camera interaction on iOS).
If you try to run multiple sessions (one per camera), then as soon as one session begins, the other will stop.
You could quickly alternate between sessions, but the images will not be taken in synchronicity.
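One way to do the "alternating" with a single session is to swap the device input between captures; a rough sketch, where `session` and `currentInput` are assumed properties of your controller. The two stills will still not be simultaneous:

```objc
// Swap the active camera on one AVCaptureSession, then capture a still.
- (void)switchToDevice:(AVCaptureDevice *)device
{
    [self.session beginConfiguration];
    [self.session removeInput:self.currentInput];

    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input && [self.session canAddInput:input]) {
        [self.session addInput:input];
        self.currentInput = input;
    }
    [self.session commitConfiguration];
}
```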

Removing low frequency (hiss) noise from video in iOS

I am recording videos and playing them back using AVFoundation. Everything is perfect except the hissing, which is there throughout the whole video. You can hear this hissing in every video captured from any iPad; even videos captured with Apple's built-in camera app have it.
To hear it clearly, record a video in as quiet a place as possible without speaking. It can be detected very easily through headphones with the volume at maximum.
After researching, I found out that this hissing is produced by the device's preamplifier and cannot be avoided while recording.
The only possible solution is to remove it during post-processing of the audio. Low-frequency noise can be removed by applying a high-pass filter and a noise gate. There are applications like Adobe Audition which can perform this operation. This video shows how it is achieved using Adobe Audition.
I have searched the Apple docs and found nothing which can achieve this directly. So I want to know if there exists any library, API or open-source project which can perform this operation. If not, how can I start going in the right direction? It does look like a complex task.
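A possible starting point (iOS 8+), not a full solution: route playback through AVAudioEngine with an AVAudioUnitEQ band configured as a high-pass filter. The 80 Hz cutoff below is an assumption to tune by ear, and a noise gate would still have to be implemented separately:

```objc
// Play audio through a high-pass EQ band to attenuate low-frequency noise.
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
AVAudioUnitEQ *eq = [[AVAudioUnitEQ alloc] initWithNumberOfBands:1];

AVAudioUnitEQFilterParameters *band = eq.bands.firstObject;
band.filterType = AVAudioUnitEQFilterTypeHighPass;
band.frequency = 80.0;   // cutoff in Hz -- an assumption, tune by ear
band.bypass = NO;        // bands are bypassed by default

[engine attachNode:player];
[engine attachNode:eq];
[engine connect:player to:eq format:nil];
[engine connect:eq to:engine.mainMixerNode format:nil];
```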

Play sound when silence in the room; stop sound when voices heard

I need some guidance as I may have to shelve development until a later time.
I want to play a sound once the lights are switched off and the room goes dark, then stop the sound once the light is switched back on. I've discovered that Apple doesn't currently provide a way to access the ambient light sensor (not in any way that will get App Store approval).
The alternative I've been working on is to try to detect sound levels (using AVAudioPlayer/Recorder and example code from http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/). That is, when I detect that the voices in the room have dropped to a specific level (i.e. silence, allowing for background noise), I play my sounds.
However, if the people in the room start talking again and I detect the voices, I need to stop playing the sounds.
Q: is this self-defeating? That is, will the sound generated by the iPhone essentially be picked up by the iPhone microphone and be indistinguishable from any voices in the room? Methinks yes, and unless there's an alternative approach, I'm at an impasse until the light sensor API is opened up by Apple.
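For the level-detection half, a minimal metering sketch along the lines of the linked tutorial: record to /dev/null and poll the average power. The -35 dB threshold is an assumption to calibrate for the room:

```objc
// Estimate room loudness with AVAudioRecorder metering.
NSURL *url = [NSURL fileURLWithPath:@"/dev/null"]; // discard the audio data
NSDictionary *settings = @{
    AVFormatIDKey:         @(kAudioFormatAppleLossless),
    AVSampleRateKey:       @44100.0,
    AVNumberOfChannelsKey: @1
};

NSError *error = nil;
AVAudioRecorder *recorder =
    [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
recorder.meteringEnabled = YES;
[recorder record];

// Call periodically, e.g. from an NSTimer:
[recorder updateMeters];
float level = [recorder averagePowerForChannel:0]; // dBFS, 0 = full scale
BOOL roomIsQuiet = (level < -35.0f);               // threshold to tune
```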
I don't think the noise made by the iPhone speaker will be picked up by the mic. The phone cancels sounds generated by the speaker. I read this once, and if I find the source I'll post it. Empirically, though, you can tell this is the case when you use speaker phone. If the mic picked up sound from the speaker that's an inch away from it, the feedback would be terrible.
Having said that, the only sure way to see if it will work for your situation is to try it out.
I agree with woz: the phone should cancel the sound it's emitting. About the ambient light sensor, the only alternative I see is using the camera, but it would be very energy inefficient, and would require the app to be launched.

How to screen record the iOS simulator at 60 fps?

It turned out that capturing video from the screen is a hard task on the Mac. I have a small game running in the simulator and want to make a screencast of the gameplay for youtube. Since it's a fast-paced scroller game, video must be recorded at 60 fps to look good.
I know that actual video on YouTube, for example, is just 24 to 30 fps, but each of those slower frames is blended with the next.
When capturing the simulator at a lower frame rate than 60 fps, the result looks very jagged, since every frame is razor sharp with no blending.
I tried a couple of Mac screen recorders, but none of them were able to capture 60 fps video from the simulator; the frames in the resulting video looked as if the app had taken plenty of screenshots and stitched them together into a video container.
But since there are great demo videos on youtube showing fast-paced gameplay of iOS apps without just recording the screen with a video camera, I wonder what kind of application they use to get a smooth screen capture.
Hopefully someone who already went through this problem can point out some solutions.
I've had good results screen recording from the simulator using SnapZ Pro X from Ambrosia software:
http://www.ambrosiasw.com/utilities/snapzprox/
One problem that you're likely to have is that the simulator only simulates iOS's OpenGL graphics in software, so unless you have a really powerful Mac, it's likely that the simulator won't be able to run your game at 60fps anyway.
It's possible that the videos you've seen used the HDMI video out on the iPhone to mirror the screen from the device into a video capture card on the computer. That would likely perform much better because the Mac wouldn't have to both generate and record the graphics simultaneously.
I remember watching a video of the Aquaria guys talking about how they recorded their gameplay videos. Essentially the game recorded the input from the controller/keyboard while the game was played normally. Then they could play back the game they had just played but one frame at a time, with each frame being rendered out to a file as it went. Then all those frames are composited together and bam, a full 60fps video with perfectly rendered graphics. Bit overkill but it's a nice solution.
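The replay idea above can be sketched as follows, assuming a deterministic game loop; every name here (`RecordedInput`, `updateWithInput:deltaTime:`, `renderCurrentFrame`) is hypothetical:

```objc
// Replay recorded inputs one fixed timestep at a time, writing each frame
// to disk; rendering can run slower than real time without affecting output.
const NSTimeInterval step = 1.0 / 60.0; // fixed 60 fps timestep
NSUInteger frameIndex = 0;

for (RecordedInput *input in recordedInputs) {
    [game updateWithInput:input deltaTime:step]; // advance exactly one frame
    UIImage *frame = [game renderCurrentFrame];  // offline render

    NSString *path = [NSString stringWithFormat:@"frames/%06lu.png",
                      (unsigned long)frameIndex++];
    [UIImagePNGRepresentation(frame) writeToFile:path atomically:YES];
}
// The PNG sequence is then muxed into a 60 fps video (e.g. with ffmpeg).
```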
A program that is able to record at 60 fps is Screenflick.