I want to show audio meter levels in my app from the audio that my app sends to the speaker. I can get the audio level from AVAudioPlayer, but that only works with audio files.
I have tried to achieve this using The Amazing Audio Engine, as described in its documentation here, but I have not been able to work out how to do it.
Is it possible to achieve this in iOS? Can anyone suggest a library, audio engine, or method?
Thanks in advance
If you are using a "remote i/o audio unit for handling audio input and output", this is a possibility:
https://developer.apple.com/library/ios/samplecode/aurioTouch2/Listings/ReadMe_txt.html
"… , and a sonogram view (a view displaying the frequency content of a signal over time, with the color signaling relative power, the y axis being frequency and the x as time). Tap the sonogram button to switch to a sonogram view, tap anywhere on the screen to return to the oscilloscope. Tap the FFT button to perform and display the input data after an FFT transform. Pinch in the oscilloscope view to expand and contract the scale for the x axis." …
I want to detect the main direction of the sound recorded on an iPhone. For example, I want to detect whether the sound comes from the "front" or the "rear" camera.
https://developer.apple.com/documentation/avfoundation/avaudiosessiondatasourcedescription
This link describes how to set the data source, but not how to detect it in real time.
UPDATE:
Example use:
I start recording with the front and back cameras at the same time. I want to detect whether the audio comes from the front or the rear so I can switch cameras automatically.
Is there any way?
Thanks!
You can iterate over AVAudioSession.inputDataSources to check the available data sources (e.g. front/back microphones) and pick the one you want, then apply it with AVAudioSession.setInputDataSource(_:) (setPreferredInput(_:) is the analogous call for choosing a whole input port). If you don't need to set the input but just check it, use AVAudioSession.currentRoute.inputs.
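For example, a minimal Swift sketch. The assumption that the front-facing source reports orientation == .front may not hold on every device, so check what inputDataSources actually returns.

```swift
import AVFoundation

// Minimal sketch: list the current input's data sources and pick the front-facing one.
func selectFrontMicrophone() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setActive(true)

    // Available data sources for the current input (e.g. front/back/bottom mics).
    if let front = session.inputDataSources?.first(where: { $0.orientation == .front }) {
        try session.setInputDataSource(front)
    }

    // To inspect rather than set: what is routed right now?
    for input in session.currentRoute.inputs {
        print(input.portName, input.selectedDataSource?.dataSourceName ?? "default")
    }
}
```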
I am trying to write an app in Apple Swift that monitors audio from the microphone and displays the volume level on a VU meter style graph. I know how to do it using AVAudioRecorder, but I don't want to record and save the audio, I just want to monitor and observe the volume levels, as I will be monitoring the audio for multiple hours and saving this to the phone would take up tons of space.
Can anybody lead me in the right direction as to how I can do this?
Thanks!
I do not have any code to show, as I am just looking for the right direction to go, not debugging help.
You can use AVCaptureSession:
add an input device (the microphone) using AVCaptureDeviceInput;
add an output AVCaptureAudioDataOutput, setting its sample buffer delegate.
Once you start the session, the delegate will receive audio samples that you can process however you wish.
Don't forget to ask permission before using the AVCaptureDevice!
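A minimal Swift sketch of those steps (class and queue names are just illustrative; reading AVCaptureAudioChannel's averagePowerLevel/peakHoldLevel in the delegate is one convenient way to turn the callbacks into meter values):

```swift
import AVFoundation

// Minimal sketch of the steps above; class and queue names are illustrative.
// Requires microphone permission (NSMicrophoneUsageDescription in Info.plist).
final class MicLevelMonitor: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "mic-level-monitor")

    func start() throws {
        // 1. Add the microphone as an input device.
        guard let mic = AVCaptureDevice.default(for: .audio) else { return }
        let input = try AVCaptureDeviceInput(device: mic)
        if session.canAddInput(input) { session.addInput(input) }

        // 2. Add an audio data output and become its sample buffer delegate.
        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // 3. Samples arrive here; read per-channel power to drive the VU meter.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        for channel in connection.audioChannels {
            print("avg \(channel.averagePowerLevel) dB, peak \(channel.peakHoldLevel) dB")
        }
    }
}
```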
My application requires that sound come through only one side of the headphones, based on the user's choice. That is, sound can play from either the left side or the right side of the headphones, but not from both sides at the same time.
I want to know how to switch the audio output of an iOS device between the two sides/buds of the headphones connected to the device. How can I achieve this? Please share your suggestions and ideas.
Thanks in advance.
If you're using AVAudioPlayer to play the audio, you can use its pan property to adjust the volume of each channel.
pan
The audio player’s stereo pan position.
@property float pan
Discussion
By setting this property you can position a sound in the stereo field. A value of –1.0 is full left, 0.0 is center, and 1.0 is full right.
Availability
Available in iOS 4.0 and later.
Declared In
AVAudioPlayer.h
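For example, a minimal Swift sketch ("song.mp3" is a hypothetical bundled file, and the asset has to be stereo for panning to silence one earbud completely):

```swift
import AVFoundation

func playOnOneSide(left: Bool) throws -> AVAudioPlayer {
    // "song.mp3" is a placeholder for whatever your app actually plays.
    let url = Bundle.main.url(forResource: "song", withExtension: "mp3")!
    let player = try AVAudioPlayer(contentsOf: url)
    player.pan = left ? -1.0 : 1.0   // -1.0 = full left, 0.0 = center, 1.0 = full right
    player.play()
    return player                    // keep a strong reference, or playback stops
}
```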
Or, if you want more control over the audio playback, you can use Audio Queue Services, described here along with sample code.
I want to create an application for measuring the sound input on an iPad.
Users will have to scream into the iPad mic and a gauge will display the scream level. So I will have to get the mic volume and display the value on the gauge.
You can use the AVAudioRecorder for that. Have a look at this tutorial: Tutorial: Detecting When A User Blows Into The Mic
In Apple's documentation, focus on:
Using Audio Level Metering
meteringEnabled property
updateMeters
peakPowerForChannel:
averagePowerForChannel:
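Putting those together, a rough Swift sketch (the file URL, recorder settings, and timer interval are illustrative; the recorder still needs somewhere to write even if you discard the file):

```swift
import AVFoundation

// Rough sketch of AVAudioRecorder metering; requires microphone permission.
final class ScreamMeter {
    private var recorder: AVAudioRecorder!
    private var timer: Timer?

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default)
        try session.setActive(true)

        // The recorder needs a file URL; the settings here are just one working example.
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("meter.caf")
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatAppleIMA4,
            AVSampleRateKey: 44100.0,
            AVNumberOfChannelsKey: 1
        ]
        recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.isMeteringEnabled = true      // meteringEnabled
        recorder.record()

        // Poll the meters and drive the gauge from the dBFS values.
        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            guard let recorder = self?.recorder else { return }
            recorder.updateMeters()                            // updateMeters
            let avg = recorder.averagePower(forChannel: 0)     // averagePowerForChannel:
            let peak = recorder.peakPower(forChannel: 0)       // peakPowerForChannel:
            print("avg \(avg) dBFS, peak \(peak) dBFS")
        }
    }
}
```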
In my BB application I need to play/record sound and simultaneously show a sound graphic analyser (as shown in the attached image) within the application. I have searched forums but have found nothing significant.
I want to show graphics as shown while playing or recording music, depending upon the pitch of the sound. Is this possible?
As far as I know, there is no API in the RIM SDK to obtain such data while playing a media file. But you can analyze the sound file contents yourself, draw the diagram, and implement a "cursor" (the vertical green line in your image) whose position is based on the time elapsed since playback started.
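The per-window level computation itself is language-agnostic; it is sketched below in Swift for brevity (the rest of this page is iOS), assuming you already have the decoded PCM samples as normalized floats, and you would port the same idea to BlackBerry Java.

```swift
import Foundation

// Language-agnostic idea, sketched in Swift: split the decoded PCM samples into
// fixed-size windows and compute one RMS level per window to drive the bars.
func barLevels(for samples: [Float], windowSize: Int = 1024) -> [Float] {
    stride(from: 0, to: samples.count, by: windowSize).map { start in
        let window = samples[start..<min(start + windowSize, samples.count)]
        let meanSquare = window.reduce(0) { $0 + $1 * $1 } / Float(window.count)
        return sqrt(meanSquare)   // 0.0...1.0 for normalized samples
    }
}

// Playback cursor position: windowIndex = elapsedSeconds * sampleRate / windowSize
```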