Panning sound between top and bottom speaker on iPhone 7 - ios

Is it possible to pan sound to either the top or bottom speaker on the iPhone 7 and newer models? I don't have one of these phones, but my understanding is that iOS mixes stereo sound and plays it back from both speakers when the phone is in portrait mode. I know it routes left and right channels to their respective speakers in landscape, but I can't find documentation about the behavior in portrait mode.
Is it possible to limit playback to just one speaker or the other, or to pan between top and bottom? My library cannot operate with the destructive interference of both speakers playing at the same time.

It turns out my question was misguided. It's hard to get credible information when you can't test on the device yourself.
On iPhone 7 and newer, the stereo channels are actually routed to the individual speakers, even though there is little audible stereo separation in portrait. The left channel routes to the bottom speaker, and the right channel routes to the top/headset speaker. Setting the pan property accomplishes the same routing.
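The pan approach can be sketched as follows. This is an untested-on-device illustration; the asset name "tone.caf" and the helper function are placeholders, and the left/bottom, right/top mapping is taken from the answer above.

```swift
import AVFoundation

// Maps a speaker choice to an AVAudioPlayer pan value.
// Per the answer above: full left (-1.0) reaches the bottom speaker,
// full right (1.0) the top/headset speaker.
func panValue(forBottomSpeaker bottom: Bool) -> Float {
    return bottom ? -1.0 : 1.0
}

func playOnBottomSpeaker() throws {
    // "tone.caf" is a placeholder asset name.
    guard let url = Bundle.main.url(forResource: "tone", withExtension: "caf") else { return }
    let player = try AVAudioPlayer(contentsOf: url)
    player.pan = panValue(forBottomSpeaker: true)  // bottom speaker only
    player.play()  // real code must keep a strong reference to the player
}
```

Note that `AVAudioPlayer` must be retained (for example in a property) for playback to continue; the local variable here is only for illustration.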
Finally, there's one more option with channel assignments. Using AVAudioSession.sharedInstance().currentRoute.outputs, the two speakers combined show up as a single output (outputs[0]). Inside this output are two channels, outputs[0].channels[0] and outputs[0].channels[1]. Mapping to either of these with channel assignments works as well, with the first channel mapping to the bottom speaker and the second to the top.
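The channel-assignment approach can be sketched like this, assuming the channel ordering described above (channels[0] = bottom, channels[1] = top); the asset name is a placeholder:

```swift
import AVFoundation

// Sketch: bind an AVAudioPlayer to a single hardware channel of the
// built-in speaker route via channelAssignments.
func playOnBottomSpeakerViaChannelAssignment() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback)
    try session.setActive(true)

    // "tone.caf" is a placeholder asset name.
    guard let url = Bundle.main.url(forResource: "tone", withExtension: "caf") else { return }
    let player = try AVAudioPlayer(contentsOf: url)

    if let channels = session.currentRoute.outputs.first?.channels, channels.count >= 2 {
        player.channelAssignments = [channels[0]]  // channels[1] would be the top speaker
    }
    player.play()  // real code must keep a strong reference to the player
}
```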
Any of these methods works fine as a way to route sound output to the new stereo speakers, even when the phone is in portrait orientation.
For anyone who wants to try on their own device, I put together a test application that exercises the various approaches: https://github.com/brian-armstrong/speaker-tester

Related

Simultaneous pictures with iPhone 7 Plus cameras

Is there a way to take a picture with the telephoto lens and the wide-angle lens of the iPhone 7 Plus?
I explored the different methods, but the best I can come up with is to switch cameras by removing the AVCaptureDeviceTypeBuiltInTelephotoCamera input and adding the AVCaptureDeviceTypeBuiltInWideAngleCamera input. However, this takes about 0.5 seconds, and I would like to capture simultaneously. From a hardware point of view it should be possible, since Apple does the same when using AVCaptureDeviceTypeBuiltInDuoCamera.
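The input-swap approach described above can be sketched as follows (this is not simultaneous capture; the session reconfiguration is what costs the ~0.5 seconds):

```swift
import AVFoundation

// Sketch: reconfigure a running capture session from the telephoto
// camera to the wide-angle camera.
func switchToWideAngle(session: AVCaptureSession,
                       currentInput: AVCaptureDeviceInput) throws {
    guard let wide = AVCaptureDevice.default(.builtInWideAngleCamera,
                                             for: .video,
                                             position: .back) else { return }
    let wideInput = try AVCaptureDeviceInput(device: wide)
    session.beginConfiguration()
    session.removeInput(currentInput)
    if session.canAddInput(wideInput) {
        session.addInput(wideInput)
    }
    session.commitConfiguration()  // this reconfiguration is the slow step
}
```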
Does anybody know other methods to capture a photo from both cameras at (almost) the same time?
Thanks!
I wanted to capture from both cameras too, but what I've found is this:
When you are using the AVCaptureDeviceTypeBuiltInDualCamera that automatically switches between wide and tele, they are synchronized to the same clock. Simultaneous running of the AVCaptureDeviceTypeBuiltInTelephotoCamera and AVCaptureDeviceTypeBuiltInWideAngleCamera cameras is not supported.
Source - https://forums.developer.apple.com/thread/63347

How to switch audio output between the two buds of headphones

My application requires that sound come through only one side of the headphones, based on the user's choice: it can play from either the left earbud or the right earbud, but not from both at once.
I want to know how to switch the audio output of an iOS device between the two sides/buds of the connected headphones. How can I achieve this? Please share your suggestions and ideas.
Thanks in advance.
If you're using AVAudioPlayer to play the audio, you can use its pan property to adjust the balance between the two channels.
pan
The audio player’s stereo pan position.
@property float pan
Discussion
By setting this property you can position a sound in the stereo field. A value of –1.0 is full left, 0.0 is center, and 1.0 is full right.
Availability
Available in iOS 4.0 and later.
Declared In
AVAudioPlayer.h
Or if you want more control over the audio playback, you can use Audio Queue Services, described here, along with the sample code.

Play sound when silence in the room; stop sound when voices heard

I need some guidance as I may have to shelve development until a later time.
I want to play a sound once the lights are switched off and the room goes dark, then stop the sound once the light is switched back on. I've discovered that Apple doesn't currently provide a way to access the ambient light sensor (not in any way that will get App Store approval).
The alternative I've been working on is to detect sound levels (using AVAudioPlayer/Recorder and example code from http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/). I.e., when I detect that the voices of people in the room have dropped to a specific level (i.e., silence, compensating for background noise), I play my sounds.
However, if the people in the room start talking again and I detect the voices, I need to stop playing the sounds.
Q: is this self-defeating, i.e., will the sound generated by the iPhone essentially be picked up by the iPhone's microphone and be indistinguishable from any voices in the room? Methinks yes, and unless there's an alternative approach, I'm at an impasse until the light sensor API is opened up by Apple.
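The level-detection idea from the question can be sketched with AVAudioRecorder's metering API. The -35 dBFS threshold is an illustrative guess that would need tuning per environment, and the helper names are placeholders:

```swift
import AVFoundation

// dBFS threshold below which we treat the room as silent (illustrative).
let silenceThreshold: Float = -35.0

func isSilent(averagePower: Float, threshold: Float = silenceThreshold) -> Bool {
    return averagePower < threshold
}

// Meter ambient input without keeping the audio: record to /dev/null.
func makeMeteringRecorder() throws -> AVAudioRecorder {
    let recorder = try AVAudioRecorder(
        url: URL(fileURLWithPath: "/dev/null"),
        settings: [AVFormatIDKey: kAudioFormatAppleLossless,
                   AVSampleRateKey: 44100.0,
                   AVNumberOfChannelsKey: 1])
    recorder.isMeteringEnabled = true
    recorder.record()
    return recorder
}

// Call periodically (e.g. from a Timer) and react to the level.
func checkRoom(with recorder: AVAudioRecorder) {
    recorder.updateMeters()
    if isSilent(averagePower: recorder.averagePower(forChannel: 0)) {
        // room is quiet: start playing the sound
    } else {
        // voices detected: stop the sound
    }
}
```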
I don't think the noise made by the iPhone speaker will be picked up by the mic. The phone cancels sounds generated by the speaker. I read this once, and if I find the source I'll post it. Empirically, though, you can tell this is the case when you use speakerphone. If the mic picked up sound from the speaker an inch away from it, the feedback would be terrible.
Having said that, the only sure way to see if it will work for your situation is to try it out.
I agree with woz: the phone should cancel the sound it's emitting. About the ambient light sensor, the only alternative I see is using the camera, but it would be very energy inefficient, and would require the app to be launched.

Show an application on top of a DirectX game?

Is it possible to build an application that displays itself on top (TopMost) even when a game is running (Quake, Far Cry, Black Ops, or any DirectX-driven game)?
I would like to be able to record my key presses while I play a game for video recording.
It must be possible, because FRAPS displays the FPS on top of everything that uses DirectX, including video players.
Any thoughts?
First of all, it isn't that easy. FRAPS works with API injection, inserting code into the drawing steps of the target programs, and it has to take the many different versions of DirectX and OpenGL into account. I found a link where it is explained in a little more detail: Case Study Fraps.
Maybe running the game in windowed mode and capturing the input with global hooks would be an easier solution, but I've never tried anything in this direction. If you want to work with API hooks, maybe this link will be useful: Direct3DHook

Recording the screen of an iPad 2

First of all: This question is not directly programming related. However, the problem only exists for developers, so I'm trying to find an answer here anyways since there are maybe other people on this community who already solved the problem.
I want to record the screen of the iPad 2 to be able to create demo videos of an app.
Since I'm using motion data, I cannot use the simulator to create the video and have to use the actual iPad itself.
I've seen various websites where different methods were discussed.
iPad 2 <==> Apple Digital AV Adapter <==> Blackmagic Design Intensity Pro <==> Playback software <==> TechSmith Camtasia screen recorder on the playback software to circumvent the HDCP flag
iPad 2 <==> Apple VGA Adapter <==> VGA2USB <==> Recording software
...
Everyone seems to have their own hacky solution to this problem.
My setup is the following:
iPad 2 (without Jailbreak)
Apple Mac mini with Lion Server
PC with non-HDCP compliant main board
Non-HDCP compliant displays
It doesn't matter whether the recording has to be on the mac or on the PC.
My questions:
Is it possible to disable the HDCP flag programmatically from within the application itself?
HDMI offers better quality than VGA. Will the first method I've listed work with my setup, even though I don't have a full HDCP chain?
What about the Intensity Extreme box? Can I use it and then connect to the Thunderbolt port of the mac mini and record from there?
Is the Thunderbolt port of the mac mini bidirectional and is also suited for capturing? Is the mac mini HDCP compliant? If it does not work due to my screens not being HDCP compliant, will it work if I start the recording software, then disconnect the screens? Will it work if I use an iPad 2 over VNC as a screen since it has to be HDCP compliant if it sends HDCP streams?
If I have to fall back to the VGA solution: will the VGA adapter mirror everything that's showing on the iPad 2 screen, or do I have to program a special presentation mode which sends everything to the VGA cable instead of the iPad screen? Will the proposed VGA2USB setup deliver high quality, or would you recommend other tools?
What about the Apple Composite AV Cable? Maybe another approach?
I decided to use the Blackmagic Design Intensity Pro together with the Apple Digital AV adapter on a machine with a working HDCP chain.
This approach worked well; capture is possible at 720p with the iPad's screen centered within the captured video. Capture therefore does not happen at the full native resolution of the iPad's screen, and black borders are introduced to fill out the 720p frames.
I posted info about displaying HDMI HDTV output on a non-HDCP monitor. Look for my posts; perhaps they will be of use. Or, how about just using a cell phone to record a video of your tablet's screen, with proper lighting, low reflectance, etc.? It won't be 100%, but it might be sufficient.
