Capture WKWebView audio for metering - iOS

I am currently working on an app that contains a WKWebView in which I have loaded an iFrame with video streaming from YouTube.
I would like to be able to create an audio visualizer that is shown alongside this iFrame and moves in reaction to the audio from the stream.
I have been following along with this Ray Wenderlich tutorial for creating a music visualizer, but the tutorial uses the setMeteringEnabled and updateMeters functions built into AVAudioPlayer.
Is there any way to meter audio coming from a WKWebView? I just want an average volume level, not the actual audio stream itself.
I have attempted to look at libraries like The Amazing Audio Engine, but none of them seem to allow you to capture the channel coming from the WKWebView at all, let alone for metering purposes.
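For context, the metering the tutorial relies on looks roughly like the sketch below with AVAudioPlayer (the resource URL is just a placeholder); this is exactly what I can't find an equivalent of for audio coming out of a WKWebView.

    import AVFoundation

    // Minimal sketch of the metering approach from the tutorial, using AVAudioPlayer.
    final class LevelMeter {
        private let player: AVAudioPlayer

        init(fileURL: URL) throws {
            player = try AVAudioPlayer(contentsOf: fileURL)
            player.isMeteringEnabled = true   // setMeteringEnabled in Objective-C
            player.prepareToPlay()
        }

        func play() { player.play() }

        /// Poll this from a Timer or CADisplayLink to drive the visualizer.
        func averageLevel() -> Float {
            player.updateMeters()
            return player.averagePower(forChannel: 0)  // decibels, roughly -160...0
        }
    }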

Related

Manual bit rate control of video in Swift

I want to implement video quality control in a video player in Swift. I have used preferredPeakBitRate of AVPlayerItem, but I am not able to change the quality of the video while it is playing. I have the manifest file URL that contains the different bit rates of the video. Please suggest a third-party video player that supports manual video quality control, or tell me how I can achieve this using AVPlayer in Swift.
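For reference, what I have tried looks roughly like the following sketch (the manifest URL is a placeholder):

    import AVFoundation

    // Sketch of what I have tried: capping the HLS variant via preferredPeakBitRate.
    let manifestURL = URL(string: "https://example.com/stream/master.m3u8")!
    let item = AVPlayerItem(url: manifestURL)
    let player = AVPlayer(playerItem: item)
    player.play()

    // Later, when the user picks a quality setting (in bits per second):
    func setMaximumBitRate(_ bitsPerSecond: Double) {
        item.preferredPeakBitRate = bitsPerSecond   // e.g. 1_500_000 for ~1.5 Mbps
    }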

How can I make Apple's mixer audio unit on iOS not do an audio fade?

I am using The Amazing Audio Engine simply to play an audio file, but I find that when the channel starts playing, there is some automatic fade-in happening.
You can see the top waveform is the output of my iPad, and the bottom waveform is the actual raw audio file. There is definitely a 30ms microfade being done.
There is nothing within The Amazing Audio Engine library that does this, so it must be happening internally in Apple's mixer audio unit. Is there any way to turn off this behavior?
I suspect that the AudioFilePlayer (used by TAAE) uses Extended Audio File Services under the hood. ExtAudioFileRef will do that on the first read after a seek if there is any decoding or sample rate conversion. I had to use Audio File Services directly to get rid of the implicit fading.
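A rough Swift sketch of the kind of direct Audio File Services read I mean is below (error handling omitted, and it assumes a constant bytes-per-packet format); the point is only that no ExtAudioFile conversion step is involved, so no implicit fade gets applied.

    import AudioToolbox

    // Read raw packets with Audio File Services instead of ExtAudioFile.
    func readPackets(from url: URL) {
        var fileID: AudioFileID?
        AudioFileOpenURL(url as CFURL, .readPermission, 0, &fileID)
        guard let file = fileID else { return }

        // Fetch the data format so the caller knows what the packets contain.
        var format = AudioStreamBasicDescription()
        var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
        AudioFileGetProperty(file, kAudioFilePropertyDataFormat, &size, &format)

        // Read a block of packets starting at packet 0; no implicit conversion happens here.
        var numPackets: UInt32 = 1024
        var numBytes = UInt32(format.mBytesPerPacket) * numPackets
        var buffer = [UInt8](repeating: 0, count: Int(numBytes))
        AudioFileReadPacketData(file, false, &numBytes, nil, 0, &numPackets, &buffer)

        AudioFileClose(file)
    }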

Losing audio after processing in AVComposition

I am using AVMutableComposition to combine two pieces of media (one video one graphic overlay). The video was captured using UIImagePickerController.
Video records fine and audio is there when previewing the recording.
After processing with the composition and export session, the video saves fine (with the overlay), but there is no audio.
This is on iOS 7.
I'm not specifically doing anything with audio in the composition. I just assumed it would "come along" with the video file. Is that accurate, or do I need to create a dedicated audio track in the composition?
_mike
After much research, I found the solution to this in another Stack Overflow question (and answer):
iOS AVFoundation Export Session is missing audio.
Many thanks to that user.
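In case it saves someone else the search: as I understand that answer, the fix is to add a dedicated audio track to the composition rather than assuming it comes along with the video. A rough Swift sketch of the idea (error handling omitted):

    import AVFoundation

    // Build a composition with both tracks added explicitly; the audio does not
    // come along automatically with the video track.
    func makeComposition(from asset: AVAsset) -> AVMutableComposition {
        let composition = AVMutableComposition()
        let range = CMTimeRange(start: .zero, duration: asset.duration)

        if let videoTrack = asset.tracks(withMediaType: .video).first,
           let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
            try? compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
        }

        // This is the part that was missing: a dedicated audio track.
        if let audioTrack = asset.tracks(withMediaType: .audio).first,
           let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
            try? compAudio.insertTimeRange(range, of: audioTrack, at: .zero)
        }
        return composition
    }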

With html5 audio limitations on iOS, is it possible to play background music and sound effects at the same time?

I've been reading about the limitations of HTML5 audio on iOS.
Currently, all devices running iOS are limited to playback of a single audio or video stream at any time. Playing more than one video—side by side, partly overlapping, or completely overlaid—is not currently supported on iOS devices. Playing multiple simultaneous audio streams is also not supported. You can change the audio or video source dynamically, however. See “Replacing a Media Source Sequentially” for details.
Apparently I can only play one file at a time. A common technique is to combine all of the sounds you need into a single file and seek to the parts you want to play. This is called an audio sprite.
But here's what's not clear to me: If I use an audio sprite, can I overlap it with itself? For example, can I have the sound of a bullet while I'm playing background music? Or, can I have the sound of two bullets firing simultaneously?
Recent versions of Mobile Safari (http://caniuse.com/audio-api) support the Web Audio API, which allows simultaneous playback.
Check this demo on an iOS device: https://webaudiodemos.appspot.com/TouchPad/index.html
Shameless plug for a simple wrapper: https://github.com/endemic/sona

iOS: analysing audio while recording video to apply image filters

I'm desperate to find a solution to the following problem: I have an iPhone application that:
records video and audio from the camera and microphone to a video file
performs some audio-processing algorithms in real time (while the video is being recorded)
applies filters to the video (while it is recording) that are driven by those algorithms
I've accomplished each of these tasks separately using existing libraries (GPUImage for the filters, and AVFoundation for the basic audio processing), but I haven't been able to combine the audio analysis and the video recording: the video file records perfectly and the filters are applied correctly, but the audio processing simply stops when I start recording the video.
I've tried AVAudioSession and AVAudioRecorder, and have looked all around Google and this site, but I couldn't find anything. I suspect it has to do with concurrent access to the audio data (the video recording process stops the audio processing because of that concurrency), but either way I don't know how to fix it.
Any ideas, anyone? Thanks in advance.
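To make the setup concrete, the pipeline I'm trying to build looks roughly like the sketch below: a single AVCaptureSession feeding both a video data output and an audio data output, so the audio sample buffers can be analysed in the delegate while the video is recorded. The processing bodies are placeholders, not working code.

    import AVFoundation

    // One capture session feeding both a video output (for recording/filtering)
    // and an audio output (for the real-time analysis).
    final class CaptureController: NSObject,
                                   AVCaptureVideoDataOutputSampleBufferDelegate,
                                   AVCaptureAudioDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let videoOutput = AVCaptureVideoDataOutput()
        private let audioOutput = AVCaptureAudioDataOutput()
        private let queue = DispatchQueue(label: "capture.queue")

        func configure() {
            session.beginConfiguration()
            if let camera = AVCaptureDevice.default(for: .video),
               let videoInput = try? AVCaptureDeviceInput(device: camera),
               session.canAddInput(videoInput) {
                session.addInput(videoInput)
            }
            if let mic = AVCaptureDevice.default(for: .audio),
               let audioInput = try? AVCaptureDeviceInput(device: mic),
               session.canAddInput(audioInput) {
                session.addInput(audioInput)
            }
            videoOutput.setSampleBufferDelegate(self, queue: queue)
            audioOutput.setSampleBufferDelegate(self, queue: queue)
            if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
            if session.canAddOutput(audioOutput) { session.addOutput(audioOutput) }
            session.commitConfiguration()
        }

        // Both delegate protocols share this callback.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            if output === audioOutput {
                // Placeholder: run the audio analysis on this buffer.
            } else {
                // Placeholder: apply the GPUImage filter and append to the writer.
            }
        }
    }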
