I'm developing a VoIP app where I want to stream audio captured from the microphone as Data to the server. From the server side I want to send the Data out to all the participants in the conversation.
My first thought is to use AVAudioEngine, install a tap on the microphone, convert each AVAudioPCMBuffer to Data, and then use the Alamofire method
Alamofire.upload(data: Data, to: URLConvertible) to upload each audio frame and send it out to all the clients. This comes with some complications on the client side though, like establishing the connection and so on. It feels like with this solution the server would constantly need to send out thousands of messages saying "more packets are now available".
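The capture-and-convert step described above can be sketched roughly like this. This is a minimal sketch, assuming an iOS app that already has microphone permission; `uploadChunk` is a hypothetical stand-in for the real Alamofire upload call, and the endpoint URL is made up:

```swift
import AVFoundation

// Convert an AVAudioPCMBuffer to Data by copying the raw bytes of the
// underlying audio buffer. The receiver must know the format (sample rate,
// channel count, etc.) to rebuild a playable buffer.
func pcmBufferToData(_ buffer: AVAudioPCMBuffer) -> Data {
    let audioBuffer = buffer.audioBufferList.pointee.mBuffers
    return Data(bytes: audioBuffer.mData!, count: Int(audioBuffer.mDataByteSize))
}

// Hypothetical upload helper; swap in the actual Alamofire call.
func uploadChunk(_ data: Data) {
    // Alamofire.upload(data, to: "https://example.com/audio")  // placeholder URL
}

// Tap the microphone and ship each buffer to the server.
func startCapture(using engine: AVAudioEngine) throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        uploadChunk(pcmBufferToData(buffer))
    }
    try engine.start()
}
```

Note that one HTTP upload per 4096-frame buffer is a lot of requests; a persistent socket would likely suit this better, which is part of the complication mentioned above.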
Anyone out there that can shed some light or input on good solutions around this? Or is this actually a good way to go?
Any help, input or thought is very much appreciated.
I have read several posts here about live streaming video/audio from an iOS device while the user is recording. Unfortunately, it seems there is no "good" solution.
I understand that I must have access to the files while I am recording and then send them to a server, from which other users can watch my stream live (with a small time lag).
Working with iOS is not a problem for me; I am struggling more with the part where the data is handed off to the server and with the whole server-side processing.
I have several questions:
Saying just "server" is very vague; what kind of server should it be?
I understand that I must use some protocol to send data TO the server and then to get data FROM the server so users can watch live video. What protocol should I use?
I feel very lost with the whole server-side processing. What should be done with the files that are sent to the server?
All this seems very nontrivial. Is there any third-party solution? For example, what technology do apps like Periscope, Ustream, or Meerkat use to provide a live-stream feature for their users?
I would also really appreciate it if possible answers were more than one word long for each question.
Please find my answers to your questions:
There is a class of software called "media servers". E.g. Wowza, Red5, Nimble Streamer, nginx-rtmp-module and a few others.
The most common protocols for sending data TO a media server are RTMP and RTSP. Watching the video is done via several protocols: RTMP (requires Flash installed), HLS (native on iOS, supported by Android 4+, and working in some web players), and DASH (supported by some players).
No files are needed; the media server can process the incoming live stream and handle connections from viewers.
Basically they use a combination of the technologies mentioned above plus their own "know-how".
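Since HLS is the native option on iOS, the viewer side can be very small. A sketch, assuming the media server exposes an HLS playlist; the URL below is a placeholder:

```swift
import AVFoundation
import AVKit

// AVPlayer handles HLS natively: point it at the .m3u8 playlist the media
// server publishes and it deals with segment fetching and buffering.
let streamURL = URL(string: "https://example.com/live/stream.m3u8")!
let player = AVPlayer(url: streamURL)

let controller = AVPlayerViewController()
controller.player = player

// From a UIViewController:
// present(controller, animated: true) { player.play() }
```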
I can send files, messages, and even video, but no microphone-streaming code seems to work as it should.
Files, messages, and video have some kind of minimal structure that I can send and rebuild on the other side, but audio seems to have no structure that is simple to work with.
At this point I have already tried to use this project as a base and send the AudioBuffer with MCSession, but the code just crashes when I try to modify the buffer on the receiving side.
I also tried to modify this streaming program to use the microphone, but the stream of files seems different from what is expected for real-time audio input.
Has anyone tried something like streaming real-time microphone data to another iOS device? Or do you have any advice or clue that I can work on?
Thank you in advance!
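One way to approach the MCSession attempt described above is to avoid sending whole AudioBuffer structs and instead write only the raw PCM bytes into an MCSession output stream, rebuilding the buffer on the other side from a format both peers agree on. A sketch, assuming `session` is already connected and `peer` is one of its `connectedPeers`; error handling and the receiving side are omitted:

```swift
import AVFoundation
import MultipeerConnectivity

// Stream raw microphone PCM to a connected peer over an MCSession stream.
// Only the bytes go over the wire; the receiver must know the sample rate
// and channel layout (e.g. agreed upon beforehand) to rebuild a buffer.
func startStreamingMic(over session: MCSession, to peer: MCPeerID) throws {
    let output = try session.startStream(withName: "mic", toPeer: peer)
    output.schedule(in: .main, forMode: .default)
    output.open()

    let engine = AVAudioEngine()
    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        let audioBuffer = buffer.audioBufferList.pointee.mBuffers
        guard let data = audioBuffer.mData else { return }
        let bytes = data.assumingMemoryBound(to: UInt8.self)
        output.write(bytes, maxLength: Int(audioBuffer.mDataByteSize))
    }
    try engine.start()
}
```

The crash on the receiving side is often exactly this kind of structure mismatch: pointers inside AudioBuffer are only valid in the sending process, so only the pointed-to bytes can cross the network.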
I want to build an iOS app that streams audio and some additional custom data between two users in real time. This is possible using GameKit if people are on the same network, but I haven't been able to find an SDK that can do this across the world.
Does anyone know if there is an existing service that does this?
If not, what services do you recommend for doing these two things (streaming audio and streaming data) separately?
Thanks.
WebRTC (www.webrtc.org/) has support for iOS (for audio streams, with video promised later). But in any case, in order to support communication between peers behind NATs, you will need your own signaling and STUN (and possibly TURN) servers...
After looking into it, it seems that QuickBlox is able to do everything I need.
I'm working on an app where I need to implement simple voice functionality between two iOS devices on the same network.
My question is how to get audio units from the master device and send them over a Wi-Fi or Bluetooth network directly to the slave device in real time.
I have already done some of the network communication: I can transfer any NSData between devices using TCP.
It is very important not to use the GameKit framework, because I need to connect two clients without any notification. For example, when I use GameKit to connect two devices, iOS displays an alert with a connection request, and I need to avoid that request.
Currently I can do this with video, which is simpler than audio.
Any ideas about how to implement this are welcome.
Thanks.
First of all, you need to learn how to capture the audio using Audio Queue Services.
Then transmit the audio data over Wi-Fi or Bluetooth and play it on the other side.
The most obvious choice for you is to use Bonjour.
The GameKit framework is built on top of it. You don't have to build much on top of it for your application, though; yours is a straightforward application of Bonjour.
Please refer to the code for Chatty if needed. In case you need some background theory, please refer to http://mobileorchard.com/tutorial-networking-and-bonjour-on-iphone/.
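The Bonjour part boils down to advertising a service on one device and browsing for it on the other, with no system connection prompt. A minimal sketch; the service type "_voicechat._tcp." and port are made-up values for illustration:

```swift
import Foundation

// Device A: advertise a TCP service on the local network via Bonjour.
// Once discovered, the audio bytes flow over a plain TCP socket on this port.
let service = NetService(domain: "local.",
                         type: "_voicechat._tcp.",
                         name: "VoiceChat",
                         port: 5555)
service.publish()

// Device B: browse for that service. A delegate implementing
// netServiceBrowser(_:didFind:moreComing:) then resolves the service
// to a host/port and opens the socket.
let browser = NetServiceBrowser()
// browser.delegate = self
browser.searchForServices(ofType: "_voicechat._tcp.", inDomain: "local.")
```

Unlike GameKit's peer picker, none of this shows any UI, which addresses the connection-request alert mentioned in the question above.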
I have used Audio Queue Services for the same sort of project.
For networking I am using Bonjour, and it has really solved the problem of transmitting text and video.
I did a lot of workarounds to make voice chat over Wi-Fi work using Audio Queue Services but have not succeeded. I will update this once I find a solution.
So I am trying to write a simple app that will connect to another iPhone and send messages over Bluetooth, and it seems like the best way to do this is using GameKit. If I am wrong, please point me in the right direction now before you read this whole question, haha.
The two requirements are:
iPhone to iPhone, same application (easy)
I can get the time it took to send the unreliable message (not easy)
I am going to assume this is not possible with GameKit based on the (little) research I have done, and I have not yet found a good guide to the CoreBluetooth framework. Once again, if you could point me in the right direction, that would be appreciated.
TL;DR: Is there a way using GameKit to get how long it took the small unreliable message to be sent? If not, can I do this with CoreBluetooth?
-Jake
This feels like a strange question: you want to know how long it took to send the message between the two phones?
The way to do this is to set up a ping between client and server: the server pings the client, the client immediately sends a response back to the server, and the server divides the total round-trip time by 2.
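The arithmetic of that ping scheme can be sketched as follows. This is just the timing logic; the timestamps would come from `Date().timeIntervalSince1970` taken around GameKit's send call and receive handler, which are assumptions here rather than anything GameKit provides directly:

```swift
import Foundation

// One-way latency approximated as half the round-trip time: the server
// records when it sent the ping and when the echo came back, and splits
// the difference. This assumes the two directions are roughly symmetric.
func oneWayLatency(sentAt: TimeInterval, echoReceivedAt: TimeInterval) -> TimeInterval {
    return (echoReceivedAt - sentAt) / 2
}

let latency = oneWayLatency(sentAt: 10.000, echoReceivedAt: 10.080)
// latency ≈ 0.04 seconds
```

Note this measures latency, not a per-message timestamp: the client never needs a clock synchronized with the server, which is what makes the echo trick work.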