How to switch audio output between the two buds of headphones - iOS

My application requires that sound come through only one side of the headphones, based on the user's choice: audio can play from either the left bud or the right bud, but never from both at once.
I want to know how to switch the audio output of an iOS device between the two buds of the headphones connected to the device. How can I achieve this? Please share your suggestions and ideas.
Thanks in advance.

If you're using AVAudioPlayer to play the audio, you can use its pan property to adjust the volume of each channel.
pan
The audio player’s stereo pan position.
@property float pan;
Discussion
By setting this property you can position a sound in the stereo field. A value of –1.0 is full left, 0.0 is center, and 1.0 is full right.
Availability
Available in iOS 4.0 and later.
Declared In
AVAudioPlayer.h
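A minimal sketch of this approach; the file name and function are hypothetical:

```swift
import AVFoundation

// Minimal sketch: play a sound through the left earbud only.
// "beep.caf" is a hypothetical bundled file name.
func playLeftOnly() throws -> AVAudioPlayer {
    let url = Bundle.main.url(forResource: "beep", withExtension: "caf")!
    let player = try AVAudioPlayer(contentsOf: url)
    player.pan = -1.0   // -1.0 = full left, 0.0 = center, 1.0 = full right
    player.prepareToPlay()
    player.play()
    return player       // caller must keep a strong reference or playback stops
}
```

Set pan to 1.0 instead to route the sound to the right bud.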
Or if you want more control over the audio playback, you can use Audio Queue Services described here, along with the sample code.

Related

How to detect main sound direction from AVAudioSession

I want to detect the main direction of the sound recorded from an iPhone. For example, I want to detect whether the sound comes from the "front" or the "rear" camera's side.
https://developer.apple.com/documentation/avfoundation/avaudiosessiondatasourcedescription
This link describes how to set the data source, but not how to detect it in real time.
UPDATE:
Example use:
I start recording with the front and back cameras at the same time. I want to detect whether the audio comes from the front or the rear so I can switch cameras automatically.
Is there any way?
Thanks!
You can iterate over AVAudioSession.inputDataSources to see the available sources and pick the one you want, then apply it with AVAudioSession.setPreferredDataSource(_:). If you don't need to set the input but just check it, use AVAudioSession.currentRoute.inputs.
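A minimal sketch of this, assuming the built-in mic exposes front- and back-oriented data sources:

```swift
import AVFoundation

// Minimal sketch: list the available input data sources and prefer the
// front-oriented one (the mic facing the user).
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)

if let sources = session.inputDataSources {
    for source in sources {
        print(source.dataSourceName, source.orientation?.rawValue ?? "unknown")
    }
    if let front = sources.first(where: { $0.orientation == .front }) {
        try session.setPreferredDataSource(front)
    }
}

// If you only need to inspect the current route:
for input in session.currentRoute.inputs {
    print("Active input:", input.portName)
}
```

Note this tells you which physical mic is selected, not the direction of an incoming sound; real-time direction detection would require comparing levels across sources yourself.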

Panning sound between top and bottom speaker on iPhone 7

Is it possible to pan sound to either the top or bottom speaker on the iPhone 7 and newer models? I don't have one of these phones, but my understanding is that iOS mixes stereo sound and plays it back from both speakers when the phone is in portrait mode. I know it routes left and right channels to their respective speakers in landscape, but I can't find documentation about the behavior in portrait mode.
Is it possible to limit playback to just one speaker or the other, or to pan between top and bottom? My library cannot operate with the destructive interference of both speakers playing at the same time.
It turns out my question was misguided. It's hard to get credible information when you can't test on the device yourself.
On iPhone 7 and newer, the stereo channels are actually routed to the individual speakers, even though there is no stereo separation. The left channel routes to the bottom speaker, and the right channel routes to the top/headset speaker. Using the pan property can also accomplish the same thing.
Finally, there's one more option with channel assignments. Using AVAudioSession.sharedInstance.currentRoute.outputs, the two speakers combined show up as a single output (outputs[0]). Inside this output are two channels, outputs[0].channels[0] and outputs[0].channels[1]. Mapping to either of these with channel assignments works as well, with the first channel mapping to the bottom speaker and the second to the top.
Any of these methods works fine as a way to route sound output to the new stereo speakers, even when the phone is in portrait orientation.
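As an illustration, a minimal sketch of the channel-assignment approach; the file name is hypothetical:

```swift
import AVFoundation

// Minimal sketch: play only through the bottom speaker using channel
// assignments. "chirp.caf" is a hypothetical bundled file name.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback, mode: .default)
try session.setActive(true)

let url = Bundle.main.url(forResource: "chirp", withExtension: "caf")!
let player = try AVAudioPlayer(contentsOf: url)

if let speaker = session.currentRoute.outputs.first(where: { $0.portType == .builtInSpeaker }),
   let channels = speaker.channels, channels.count >= 2 {
    // Per the answer above: channels[0] is the bottom speaker,
    // channels[1] the top/headset speaker.
    player.channelAssignments = [channels[0]]
}
player.play()
```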
For anyone who wants to try this on their own device, I put together a test application that exercises the various approaches: https://github.com/brian-armstrong/speaker-tester

Prevent recording vertical videos in app

I am creating an iOS app that allows users to take photos and record videos. I would like to block the recording of "vertical" videos, i.e. videos recorded in portrait orientation. I couldn't find any software library that implements this functionality, so I guess I will have to implement it myself.
I am using UIImagePickerController, and I tried to achieve this with cameraOverlayView, but I don't believe it can be done that way.
So is there any way to solve this?
Thanks
Actually, videos are always recorded in landscape-right regardless of the device orientation, because that's how the sensor is oriented in the hardware (although you can request rotated buffers in AVFoundation). However, a flag stored in the video's metadata describes the device's orientation during recording, and it is used during playback to rotate the content. See AVAssetTrack's preferredTransform.
If you don't want your video to appear rotated, just discard this information during playback.
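As an illustration, a minimal sketch (mine, not from the answer above) that reads this flag to decide whether a clip was shot in portrait; the function name is hypothetical:

```swift
import AVFoundation

// Minimal sketch: inspect a recorded movie's orientation flag.
func isPortraitVideo(at movieURL: URL) -> Bool {
    let asset = AVAsset(url: movieURL)
    guard let track = asset.tracks(withMediaType: .video).first else { return false }
    let t = track.preferredTransform
    // Identity transform = landscape-right (the sensor's native orientation);
    // a 90-degree rotation component means the device was held in portrait.
    return abs(t.b) == 1 && abs(t.c) == 1
}
```

You could call this after recording and reject (or warn about) portrait clips.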

iOS - Audio level metering of app's audio output

I want to show audio level meters in my app for the audio that my app sends to the speaker. I can read the audio level from AVAudioPlayer, but that only works for audio files.
I have tried to achieve this using The Amazing Audio Engine, as described in its documentation here, but I am unable to figure out how to do it.
Is it possible to achieve this in iOS? Can anyone suggest a library, audio engine, or method?
Thanks in advance
If you are using the remote I/O audio unit for handling audio input and output, this is a possibility:
https://developer.apple.com/library/ios/samplecode/aurioTouch2/Listings/ReadMe_txt.html
"… , and a sonogram view (a view displaying the frequency content of a signal over time, with the color signaling relative power, the y axis being frequency and the x as time). Tap the sonogram button to switch to a sonogram view, tap anywhere on the screen to return to the oscilloscope. Tap the FFT button to perform and display the input data after an FFT transform. Pinch in the oscilloscope view to expand and contract the scale for the x axis." …

Play sound when silence in the room; stop sound when voices heard

I need some guidance as I may have to shelve development until a later time.
I want to play a sound once the lights are switched off and the room goes dark, then stop the sound once the light is switched back on. I've discovered that Apple doesn't currently provide a way to access the ambient light sensor (not in any way that will get App Store approval).
The alternative I've been working on is to detect sound levels (using AVAudioPlayer/Recorder and example code from http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/). That is, when the voices of people in the room have dropped to a specific level (i.e. silence, compensating for background noise), I play my sounds.
However, if the people in the room start talking again and I detect the voices, I need to stop playing the sounds.
Q: is this self-defeating? That is, will the sound generated by the iPhone be picked up by the iPhone's microphone and be indistinguishable from any voices in the room? Methinks yes, and unless there's an alternative approach, I'm at an impasse until the light sensor API is opened up by Apple.
I don't think the noise made by the iPhone speaker will be picked up by the mic. The phone cancels sounds generated by the speaker. I read this once, and if I find the source I'll post it. Empirically, though, you can tell this is the case when you use speakerphone: if the mic picked up sound from a speaker an inch away, the feedback would be terrible.
Having said that, the only sure way to see if it will work for your situation is to try it out.
I agree with woz: the phone should cancel the sound it's emitting. About the ambient light sensor, the only alternative I see is using the camera, but it would be very energy inefficient, and would require the app to be launched.
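For the level-detection half of the question, a minimal sketch of the metering approach from the tutorial linked above; the -35 dB threshold is illustrative, not a tuned value:

```swift
import AVFoundation

// Minimal sketch: record to /dev/null with metering enabled and poll the
// average power to decide whether the room is quiet.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)

let settings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatAppleLossless),
    AVSampleRateKey: 44_100.0,
    AVNumberOfChannelsKey: 1
]
let recorder = try AVAudioRecorder(url: URL(fileURLWithPath: "/dev/null"),
                                   settings: settings)
recorder.isMeteringEnabled = true
recorder.record()

Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
    recorder.updateMeters()
    let level = recorder.averagePower(forChannel: 0)   // dBFS, 0 is max
    if level < -35 {
        // Room is quiet: start (or keep) playing the sound.
    } else {
        // Voices detected: stop the sound.
    }
}
```

Whether your own playback trips the threshold is exactly the open question above, so test on the device before committing to this design.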
