I want to detect the main direction of the sound recorded on an iPhone, for example whether the sound comes from the "front" or the "rear" camera side.
https://developer.apple.com/documentation/avfoundation/avaudiosessiondatasourcedescription
This link describes how to set the data source, but not how to detect the direction in real time.
UPDATE:
Example use:
I start recording with the front and back cameras at the same time. I want to detect whether the audio comes from the front or the rear so I can switch cameras automatically.
Is there any way?
Thanks!
You can iterate over AVAudioSession.inputDataSources to see the available data sources and pick the one you want, then apply it with AVAudioSession.setInputDataSource(_:); whole input ports (e.g. built-in mic vs. headset mic) are selected with AVAudioSession.setPreferredInput(_:). If you don't need to set the input but just check it, use AVAudioSession.currentRoute.inputs.
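Something like this, as a minimal sketch (the audio-session category and the front-orientation check are assumptions; which data sources exist depends on the device):

```swift
import AVFoundation

// Minimal sketch: list the built-in microphone's data sources and prefer the
// front-oriented one if the device exposes it.
func selectFrontMicrophone() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setActive(true)

    // Inspect what is currently routed.
    for port in session.currentRoute.inputs {
        print("input port:", port.portName, port.portType.rawValue)
    }

    // Iterate the available data sources (front / back / bottom mic, etc.).
    guard let sources = session.inputDataSources else { return }
    for source in sources {
        print("data source:", source.dataSourceName,
              source.orientation?.rawValue ?? "unknown")
    }

    // Prefer a front-facing data source if there is one.
    if let front = sources.first(where: { $0.orientation == .front }) {
        try session.setInputDataSource(front)
    }
}
```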
Well, here is the thing. I'm working on a project: an HTML5 player with real-time GPS positions shown on a map, meaning I need to show the current position on the map while the video is playing. I already have the video file and the GPX file that goes with it. I have implemented the player part and successfully added a map below the player, and the map already shows the track of the video. What I need to do next is show the position on the map (maybe a marker or icon, so you can see that you are moving) while the video plays, and the two should stay synchronized. Is there any function or method in OpenLayers 3 (OL3) that can do this? What I have in mind is to parse the GPX file to extract time and position data and then match the video's current time against it, but that involves a fair amount of calculation. I would appreciate it if you could help me out with this!
You can look at this sample:
http://openlayers.org/en/v3.14.2/examples/feature-move-animation.html
If you have calculated the path of your point, I suggest you use the map's postcompose event to keep the rendering smooth.
This is kind of what a barcode scanner does, except I do not wish to detect a barcode (I will write the code for what I want to detect). How do I even set up the camera so it acts as a continuous scanner? Like the user just presses a play button and the camera automatically scans for stuff? Just as an example, say I wish to run the scanner until the camera hits a frame where the whole screen is pure black, at which point it will display the message "detected all black".
There is an older Apple Technical Q&A that details how to use AVFoundation to continuously generate low-resolution UIImages from a video capture session, which you could then sample and use for your detection:
https://developer.apple.com/library/ios/qa/qa1702/_index.html
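A rough Swift sketch of the same idea, using AVCaptureVideoDataOutput and checking the pixel buffer directly instead of converting to UIImage (the BGRA pixel format and the brightness threshold are illustrative assumptions):

```swift
import Foundation
import AVFoundation

// Rough sketch: run a capture session and inspect every frame in the
// sample-buffer delegate. The "all black" test samples a sparse grid of BGRA
// pixels and averages one channel as a cheap brightness proxy.
final class FrameScanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "frame.scanner.queue")

    // Request camera permission before calling this in a real app.
    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(buffer) else { return }
        let pixels = base.assumingMemoryBound(to: UInt8.self)
        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)

        var total = 0, count = 0
        for y in stride(from: 0, to: height, by: 32) {
            for x in stride(from: 0, to: width, by: 32) {
                total += Int(pixels[y * bytesPerRow + x * 4])   // blue channel
                count += 1
            }
        }
        if count > 0, total / count < 10 {                      // arbitrary threshold
            print("detected all black")
        }
    }
}
```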
I am trying to write an app in Swift that monitors audio from the microphone and displays the volume level on a VU-meter-style graph. I know how to do this with AVAudioRecorder, but I don't want to record and save the audio; I just want to monitor and observe the volume levels, since I will be monitoring the audio for multiple hours and saving it to the phone would take up tons of space.
Can anybody lead me in the right direction as to how I can do this?
Thanks!
I do not have any code to show, as I am just looking for the right direction to go, not debugging help.
You can use AVCaptureSession:
add an input device (the microphone) using AVCaptureDeviceInput;
add an output AVCaptureAudioDataOutput, setting the sample buffer delegate
Once you start the session, the delegate will receive audio samples that you can process however you wish.
Don't forget to ask permission before using the AVCaptureDevice!
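A minimal Swift sketch of that setup, assuming microphone permission has already been granted and using each audio channel's averagePowerLevel as the meter value:

```swift
import Foundation
import AVFoundation

// Minimal sketch: feed microphone audio into an AVCaptureAudioDataOutput and
// read a level for each buffer. Nothing is written to disk.
final class MicLevelMonitor: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "mic.level.queue")

    func start() throws {
        guard let mic = AVCaptureDevice.default(for: .audio) else { return }
        session.addInput(try AVCaptureDeviceInput(device: mic))

        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each channel exposes a running average power in decibels; push this
        // to the VU-meter view (hop back to the main queue for UI updates).
        for channel in connection.audioChannels {
            print("average power (dB):", channel.averagePowerLevel)
        }
    }
}
```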
My application requires that sound comes through only one side of the headphones, based on the user's choice: sound can play from either the left side or the right side, but not from both sides at the same time.
I want to know how to switch the audio output of an iOS device between the two sides/buds of the headphones connected to it. How can I achieve this? Please share your suggestions and ideas.
Thanks in advance.
If you're using AVAudioPlayer to play the audio, you can use its pan property to control how the sound is balanced between the left and right channels.
pan
The audio player’s stereo pan position.
@property float pan
Discussion
By setting this property you can position a sound in the stereo field. A value of –1.0 is full left, 0.0 is center, and 1.0 is full right.
Availability
Available in iOS 4.0 and later.
Declared In
AVAudioPlayer.h
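For example, a minimal sketch (the resource name and the left/right choice are placeholders):

```swift
import AVFoundation

// Minimal sketch: hard-pan playback to one side based on a user choice.
// "track.mp3" is a placeholder resource name.
func play(onLeftSide: Bool) {
    guard let url = Bundle.main.url(forResource: "track", withExtension: "mp3") else { return }
    do {
        let player = try AVAudioPlayer(contentsOf: url)
        player.pan = onLeftSide ? -1.0 : 1.0   // -1.0 = left only, 1.0 = right only
        player.play()
        // Keep a strong reference to `player` (e.g. in a property) so playback continues.
    } catch {
        print("playback failed:", error)
    }
}
```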
Or, if you want more control over the audio playback, you can use Audio Queue Services, described here along with sample code.
I want to show audio meter levels in my app for the audio that my app sends to the speaker. I can get the audio level from AVAudioPlayer, but that only works with audio files.
I have tried to achieve this using The Amazing Audio Engine, as described in its documentation here, but I was unable to figure out how to do it.
Is it possible to achieve this in iOS? Can anyone suggest a library, audio engine, or method?
Thanks in advance
If you are using the remote I/O audio unit for handling audio input and output, this sample is a possibility:
https://developer.apple.com/library/ios/samplecode/aurioTouch2/Listings/ReadMe_txt.html
"… , and a sonogram view (a view displaying the frequency content of a signal over time, with the color signaling relative power, the y axis being frequency and the x as time). Tap the sonogram button to switch to a sonogram view, tap anywhere on the screen to return to the oscilloscope. Tap the FFT button to perform and display the input data after an FFT transform. Pinch in the oscilloscope view to expand and contract the scale for the x axis." …