Streaming the microphone in an iOS app

I can send files, messages and even video, but no microphone-streaming code seems to work the way it should.
Files, messages and video all have some minimal structure that I can send and rebuild on the other side, but audio doesn't seem to have a simple structure to work with.
At this point I've already tried to use this project as a base and send the AudioBuffer with MCSession, but the code just crashes when I try to modify the buffer on the receiving side.
I also tried to modify this streaming program to use the microphone, but the file streaming seems different from what real-time audio input expects.
Has anyone tried something like streaming real-time microphone data to another iOS device? Or do you have any advice or clue I can work from?
Thanks in advance!
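
One approach that may be worth trying, instead of packing audio buffers into sendData, is to open an MCSession output stream to the peer and write the raw PCM from an AVAudioEngine tap into it. Below is a minimal, untested sketch of the sending side (the class name and the "mic" stream name are my own; the receiver has to rebuild AVAudioPCMBuffers from the bytes using the same sample rate and channel count):

```swift
import AVFoundation
import MultipeerConnectivity

/// Sender side only: taps the microphone and writes raw Float32 PCM into an
/// MCSession output stream. The receiver has to rebuild AVAudioPCMBuffers
/// from the bytes using the same sample rate / channel count (agree on it
/// out-of-band, or send a small header first).
final class MicStreamer {
    private let engine = AVAudioEngine()
    private var stream: OutputStream?

    func start(to peer: MCPeerID, over session: MCSession) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)   // usually 44.1/48 kHz Float32

        let out = try session.startStream(withName: "mic", toPeer: peer)
        out.schedule(in: .main, forMode: .default)
        out.open()
        stream = out

        input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
            // Mono assumption: only channel 0 is sent.
            guard let channel = buffer.floatChannelData?[0] else { return }
            let byteCount = Int(buffer.frameLength) * MemoryLayout<Float32>.size
            channel.withMemoryRebound(to: UInt8.self, capacity: byteCount) { bytes in
                // No back-pressure handling here; real code should check
                // hasSpaceAvailable and queue whatever it can't write yet.
                _ = out.write(bytes, maxLength: byteCount)
            }
        }
        try engine.start()
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
        stream?.close()
    }
}
```

On the receiving side you would implement session(_:didReceive:withName:fromPeer:), schedule and open the InputStream, and copy the incoming bytes into AVAudioPCMBuffers that you schedule on an AVAudioPlayerNode.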

Related

How to access audio stream in real time when using Tokbox

I'm building a voice-only iOS (Swift) app and Tokbox is my VoIP provider.
My app is simple: user1 talks to user2. However, I would like to get access to the audio stream in real time. I'm OK with either option: 1. the audio stream goes through my piece of code and I stream it back to Tokbox, or 2. the audio stream is forked to Tokbox and to my code in parallel.
The only way I've been able to get my hands on the audio stream is through their archiving capabilities, but that is too late (it's only available after the session ends).
Any ideas? Or maybe other providers that offer this option?
Option 1 can be done using the external/custom audio driver; take a look at this example of how to use/implement it: https://github.com/opentok/opentok-ios-sdk-samples/tree/master/Custom-Audio-Driver

Inputs on methods for developing a VoIP application in Swift 3

I'm developing a VoIP app where I want to stream audio captured from the microphone as Data to the server. From the server side I want to send the Data out to all the participants in the conversation.
My first thought is to use AVAudioEngine, install a tap on the microphone, convert the AVAudioPCMBuffer to Data, and then use the Alamofire framework's
Alamofire.upload(data: Data, to: URLConvertible) method to upload each audio frame and send it out to all the clients. This comes with some complications on the client side, though, like establishing the connection and so on. It feels like with this solution I would need to send out thousands of messages from the server all the time saying "more packets are now available".
Is anyone out there able to shed some light or offer input on good solutions for this? Or is this actually a good way to go?
Any help, input or thought is very much appreciated.
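
For the capture side, the tap-and-convert step mentioned above is fairly small. Here is a rough sketch (the helper name is mine; it assumes the non-interleaved Float32 buffers that AVAudioEngine's input tap normally delivers):

```swift
import AVFoundation

// Hypothetical helper: flatten a tap buffer's PCM samples into Data so they can
// be handed to a networking layer. Assumes non-interleaved Float32, which is
// what AVAudioEngine's input tap normally delivers.
func pcmData(from buffer: AVAudioPCMBuffer) -> Data? {
    guard let channels = buffer.floatChannelData else { return nil }
    let frames = Int(buffer.frameLength)
    let channelCount = Int(buffer.format.channelCount)

    var data = Data(capacity: channelCount * frames * MemoryLayout<Float32>.size)
    for channel in 0..<channelCount {
        data.append(UnsafeBufferPointer(start: channels[channel], count: frames))
    }
    return data
}
```

Two things to keep in mind: raw Float32 PCM at 44.1 kHz is roughly 170 kB/s per channel, so you would normally compress it (e.g. to AAC via AVAudioConverter) before sending; and rather than one HTTP upload per buffer, a persistent connection (a plain socket or a WebSocket) avoids the "thousands of messages" problem, since clients simply read from the connection as data arrives.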

iOS RTP live audio receiving

I'm trying to receive a live RTP audio stream on my iPhone, but I don't know how to start. I'm looking for samples but I can't find them anywhere.
I have a Windows desktop app which captures audio from the selected audio interface and streams it as µ-law or a-law. This app works as an audio server that serves any incoming connection with that stream. I should say that I've developed an Android app that receives the stream and it works, so I want to replicate this functionality on iOS. On Android we have the "android.net.rtp" package to manage this and transmit or receive data streams over the network.
Is there any kind of equivalent package for iOS to implement this? Could you give me any kind of reference/sample for doing this, or just tell me where to start?
You can look at this library, HTTPLiveStreaming, but its protocol may not be a standard one. You can check my fork, aelam/HTTPLiveStreaming-1; I'm still working on it, but its output can be played by ffplay. Give it a try.
Check the file rtp.c in FFmpeg; I think it will help out.
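
There is no public equivalent of android.net.rtp in the iOS SDK, so the usual options are a third-party library or parsing the packets yourself; the RTP/PCMU case is small enough to hand-roll. A bare-bones receive sketch using the Network framework (assuming µ-law payloads and a fixed 12-byte header with no CSRC entries or extensions; jitter buffering, reordering and playback are left out) might look like this:

```swift
import Foundation
import Network

// Decode one G.711 µ-law byte to a 16-bit PCM sample.
func mulawDecode(_ byte: UInt8) -> Int16 {
    let u = ~byte
    let exponent = Int((u >> 4) & 0x07)
    let mantissa = Int(u & 0x0F)
    let magnitude = ((mantissa << 3) + 0x84) << exponent
    let sample = Int16(magnitude - 0x84)
    return (u & 0x80) != 0 ? -sample : sample
}

// Listen for RTP datagrams on a UDP port.
func startRTPListener(port: UInt16) throws -> NWListener {
    let listener = try NWListener(using: .udp, on: NWEndpoint.Port(rawValue: port)!)
    listener.newConnectionHandler = { connection in
        connection.start(queue: .global())
        receiveNext(on: connection)
    }
    listener.start(queue: .global())
    return listener
}

func receiveNext(on connection: NWConnection) {
    connection.receiveMessage { data, _, _, error in
        if let packet = data, packet.count > 12 {
            let payload = packet.dropFirst(12)          // skip the fixed RTP header
            let pcm = payload.map { mulawDecode($0) }   // [Int16] samples at 8 kHz
            _ = pcm // hand off to a jitter buffer / player here
        }
        if error == nil { receiveNext(on: connection) }
    }
}
```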

iOS audio input and output via RTP stream

I want to create an app that can play sound received via an RTP stream and capture sound from the microphone to send it out via an RTP stream. The problem is that I have no idea how to start. Can someone suggest a library for RTP streaming on iOS devices? I took a look at Live555, but I didn't understand how it works.
Thanks
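
I'm not aware of a first-party RTP framework on iOS either; Live555, or a hand-rolled packetizer over a plain UDP socket, are the usual routes. To give a feel for the sending side, here is an illustrative sketch of wrapping an already-encoded audio frame (e.g. µ-law bytes) in a minimal RFC 3550 header; the function and its defaults are mine, not from any library:

```swift
import Foundation

// Illustration only: wrap one frame of already-encoded audio (e.g. 160 bytes of
// µ-law = 20 ms at 8 kHz) in a minimal 12-byte RTP header, ready to send over a
// UDP socket. Sequence number and timestamp must increase monotonically from
// packet to packet; ssrc is a random per-stream identifier.
func rtpPacket(payload: Data,
               payloadType: UInt8 = 0,   // 0 = PCMU (µ-law)
               sequence: UInt16,
               timestamp: UInt32,
               ssrc: UInt32) -> Data {
    var packet = Data(capacity: 12 + payload.count)
    packet.append(0x80)                  // V=2, no padding, no extension, 0 CSRCs
    packet.append(payloadType & 0x7F)    // marker bit clear
    packet.append(contentsOf: withUnsafeBytes(of: sequence.bigEndian) { Array($0) })
    packet.append(contentsOf: withUnsafeBytes(of: timestamp.bigEndian) { Array($0) })
    packet.append(contentsOf: withUnsafeBytes(of: ssrc.bigEndian) { Array($0) })
    packet.append(payload)
    return packet
}
```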

Stream live video from ipad? [duplicate]

I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
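
For reference, getting the camera feed boils down to roughly the following (a sketch only; permission prompts, session preset and the usual canAdd checks are omitted):

```swift
import AVFoundation

// Minimal capture setup: an AVCaptureSession whose video data output hands you
// one CMSampleBuffer per captured frame via the delegate callback.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame arrives here as a CMSampleBuffer; encode/stream it from here.
    }
}
```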
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming using FTP.
If you are only building this for yourself, then you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute this to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clock apps in the App Store work. Here's a tutorial. =)
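
(For completeness, if you control the app you can also keep the device awake programmatically instead of relying on the muted-sound trick:)

```swift
import UIKit

// Stops the device from auto-locking while the app is in the foreground.
UIApplication.shared.isIdleTimerDisabled = true
```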
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
The last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend if you'd like to take a look.
