WebRTC Audio in Electron

I'm making a desktop app using Electron and want to build voice chat inside the app. I understand that WebRTC is the principal solution for this, and in my research I read about signaling, STUN, and TURN servers. But my real question is: if I have a client using Electron, can I make the app send streaming voice from the microphone to a server (Elixir, Python, or Node) and have the server just broadcast it to everyone in the room? Or do I just need the signaling, STUN, and TURN?
If someone has a tip on material where I can learn about the solution, I'd be really grateful.
Thanks.

For this, you can use WebRTC just as you would in the browser. I tried to develop a voice chat using PeerJS and socket.io, but it seems like there is a problem with Electron that prevents voice connections outside of your network. For one-to-many broadcasting you can use mediasoup, for example. More information can be found here.
If you manage to get WebRTC media streams working somehow, I'd be happy if you let me know.
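In case it helps, here is a minimal renderer-side sketch of the "client pushes its microphone to a server, the server broadcasts" idea. The WebSocket endpoint and message shapes are made up for illustration; if you go with mediasoup, its own client library (mediasoup-client) handles this plumbing for you.

```typescript
// Renderer-process sketch: capture the microphone and publish it to an SFU-style
// server over one RTCPeerConnection. The ws:// endpoint and the message shapes are
// hypothetical; a real SFU such as mediasoup ships its own client library instead.
const signaling = new WebSocket('ws://localhost:3000'); // hypothetical signaling server

async function publishMicrophone(): Promise<void> {
  // Electron's renderer exposes the same WebRTC/getUserMedia APIs as Chromium.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });

  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }], // public STUN; add TURN for strict NATs
  });

  // Hand every microphone track to the peer connection; the server re-broadcasts it to the room.
  stream.getAudioTracks().forEach((track) => pc.addTrack(track, stream));

  // Trickle ICE candidates to the server as they are gathered.
  pc.onicecandidate = (e) => {
    if (e.candidate) signaling.send(JSON.stringify({ type: 'candidate', candidate: e.candidate }));
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send(JSON.stringify({ type: 'offer', sdp: offer.sdp }));

  // Apply the server's answer when it arrives.
  signaling.onmessage = async (event) => {
    const msg = JSON.parse(event.data as string);
    if (msg.type === 'answer') await pc.setRemoteDescription({ type: 'answer', sdp: msg.sdp });
  };
}

publishMicrophone().catch(console.error);
```

The part that answers the offer is exactly what the media server (mediasoup, Janus, etc.) does; signaling plus STUN/TURN alone only help peers connect directly to each other, they don't broadcast anything.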

Related

Video Streaming and Broadcasting using WebRTC

I am very new to real-time protocols and I had some questions about how WebRTC works and how I can implement it. I am trying to create a one-to-many livestream like Facebook or Periscope, where one user broadcasts and other users join and stream the video. I am using Swift on the client end.
My questions are:
1. How do I broadcast a video using WebRTC?
2. Is there an SDK for WebRTC in Swift/iOS?
I know the questions are very vague, but guidance in the right direction would be great because I am not sure where to start.
You will need to use backend servers for that.
If you plan on broadcasting to multiple users directly from your mobile app then stop...
You need to connect your mobile app to a backend media server, which can then broadcast the video to a larger audience.
There are several commercial and open source alternatives that enable you to do that. I'd check Red5Pro, Wowza, SwitchRTC, Jitsi, Janus and Kurento for this task.
For the client side, look at react-native-webrtc
You can find more tools for WebRTC developers here.
Regarding your question (2), there's also an SDK for iOS here and a neat getting-started page here (although it's 2.5 years old, I haven't found anything better so far).
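To make the "backend media server" architecture above a bit more concrete, here is a rough sketch of the signaling half in TypeScript (Node plus the ws package). It only relays session descriptions and ICE candidates between the broadcaster and its viewers; the heavy lifting of receiving one stream and fanning it out to many is what the media servers listed above do. The port and all message shapes are assumptions, not part of any of those products.

```typescript
// Minimal Node signaling relay (using the "ws" package): broadcaster and viewers only
// exchange SDP/ICE messages through it, while the media itself flows through whichever
// media server you pick (Janus, Kurento, Red5 Pro, ...). Message shapes are hypothetical.
import { WebSocketServer, WebSocket } from 'ws';

interface Room { broadcaster?: WebSocket; viewers: Set<WebSocket>; }
const rooms = new Map<string, Room>();

const wss = new WebSocketServer({ port: 8443 }); // hypothetical port

wss.on('connection', (socket: WebSocket) => {
  socket.on('message', (raw) => {
    const msg = JSON.parse(raw.toString());
    const room = rooms.get(msg.roomId) ?? { viewers: new Set<WebSocket>() };
    rooms.set(msg.roomId, room);

    if (msg.type === 'broadcast') {
      room.broadcaster = socket;          // the broadcasting client announces itself
    } else if (msg.type === 'watch') {
      room.viewers.add(socket);           // a viewer joins the room
    } else {
      // Relay offers/answers/ICE candidates to the other side of the room.
      const targets: Array<WebSocket | undefined> =
        socket === room.broadcaster ? [...room.viewers] : [room.broadcaster];
      targets.forEach((t) => t?.send(JSON.stringify(msg)));
    }
  });
});
```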

iOS XMPP vs WebRTC, which should I use?

I want to build an iOS application where people can video call or audio call each other. Stable calling is my goal, which means I need as few connection interruptions as possible. I also need a lightweight application (not too large an app size because of video libraries).
I've been googling "iOS video chat" for the last few days. I researched and found that the most popular frameworks (technologies, libraries) for video/audio calling are XMPP and WebRTC (am I right, or do you have something better?).
XMPP - Client/server TCP communication
WebRTC - P2P Connection
The information about these libraries confuses me, so which one should I use for better performance, a lighter application, and stability?
Any idea?
XMPP is about signaling (reaching from A to B, indicating the desire to have a "call", disconnecting, etc).
WebRTC is about media (actually sending voice and video).
You need both signaling and media in your app.
For media use WebRTC. There's nothing else that will make sense. On iOS, it is kind of tricky at the moment, as iOS 11 incorporates WebRTC already, so how this will apply and help you in your development is yet to be seen (see here).
My suggestion is to aim for a web app and then figure out if you need to go for a fully native implementation and port WebRTC to iOS - or just use a webview inside an app (Cordova or Crosswalk should do).
For signaling, you can use XMPP. Or anything else for that matter. My own personal preference is a proprietary protocol. Look at Matrix or SimpleWebRTC for that.
Also, don't forget that you will need to deal with STUN and TURN for NAT traversal, but that's a simpler thing to handle.
XMPP Framework: https://github.com/robbiehanson/XMPPFramework/wiki/IntroToFramework
WebRTC Native Code: https://webrtc.org/native-code/ios/
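To illustrate the split described above (under the suggested web-app approach), here is a rough TypeScript sketch where WebRTC carries the media and handles NAT traversal, while the signaling channel is kept deliberately abstract. The ICE server URLs and both helper functions are placeholders for whatever you choose (XMPP, Matrix, a plain WebSocket), not a real API.

```typescript
// Sketch of the signaling/media split: WebRTC only moves voice/video, while the SDP
// offer/answer and ICE candidates travel over any signaling channel you like.
declare function sendViaSignaling(msg: object): void;           // hypothetical: XMPP stanza, WS frame, ...
declare function onSignalingMessage(cb: (msg: any) => void): void; // hypothetical receive hook

const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.com:3478' },                                  // STUN: discover public address
    { urls: 'turn:turn.example.com:3478', username: 'u', credential: 'p' },  // TURN: relay when P2P fails
  ],
});

async function startCall(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach((t) => pc.addTrack(t, stream));

  pc.onicecandidate = (e) => {
    if (e.candidate) sendViaSignaling({ type: 'candidate', candidate: e.candidate });
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendViaSignaling({ type: 'offer', sdp: pc.localDescription });
}

onSignalingMessage(async (msg) => {
  if (msg.type === 'answer') await pc.setRemoteDescription(msg.sdp);
  else if (msg.type === 'candidate') await pc.addIceCandidate(msg.candidate);
});
```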
It's not about which is best; it's about what fulfills your requirements.

WebRTC for iOS for VoIP communication

Is there any free WebRTC solution for iOS with an easy setup?
I tried to use http://www.webrtc.org/native-code/ios because our web end is already done with its web API, and I thought I might not have another way to let calls go between web and iOS. But the iOS API's setup is very tedious and time-consuming (downloading the WebRTC checkout takes ages with nothing to show for it).
I searched around and found a few like TokBox and QuickBlox, but they are not free.
Did you look at the RestComm iOS SDK? It supports WebRTC audio only right now, but we are working on adding video in the next few weeks. It also uses SIP as a signaling protocol.
https://github.com/Mobicents/restcomm-ios-sdk
http://www.telestax.com/how-to-integrate-the-restcomm-ios-client-sdk-in-your-app/
http://docs.telestax.com/restcomm-client-ios-sdk-quick-start/
Take a look at https://github.com/oney/RCTWebRTCDemo.
This is a React Native WebRTC project which works on iOS and Android and also has a signaling server example (but you can also use the online version for quick tests!).
WebRTC requires DTLS-SRTP, RTCP feedback, ICE, and a lot of other recent standards, while the traditional VoIP standards are 10+ years old, so you need to set up a gateway to convert the signaling and transcode the RTP.
With a WebRTC gateway, on the browser side you can create an HTML5 application that connects to the gateway; the gateway communicates with your PBX, your iOS client connects to your PBX, and then the call can be established between the browser and the iOS client app.
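As an illustration of the browser side of that gateway setup, here is a rough sketch using JsSIP (SIP over WebSocket), which is one common way an HTML5 app talks to such a gateway/PBX. The hostname, account, and extension are invented for the example; your gateway's configuration will differ.

```typescript
// Browser-side sketch of the gateway approach: the HTML5 app speaks SIP over a
// WebSocket to the WebRTC gateway, which bridges signaling and media toward the PBX.
// Hostname, credentials and extensions are hypothetical. Uses the JsSIP library.
import * as JsSIP from 'jssip';

const socket = new JsSIP.WebSocketInterface('wss://gateway.example.com:7443'); // hypothetical gateway
const ua = new JsSIP.UA({
  sockets: [socket],
  uri: 'sip:1001@example.com',   // hypothetical SIP account on your PBX
  password: 'secret',
});

ua.on('registered', () => {
  // Audio-only call toward an extension registered on the PBX (e.g. the iOS client).
  ua.call('sip:1002@example.com', {
    mediaConstraints: { audio: true, video: false },
  });
});

ua.start();
```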

Live Streaming App iOS

I am trying to develop a live-streaming application like the Meerkat app, where user A can broadcast a live stream while other users are able to watch it. I am having trouble understanding the architecture and mechanisms used to upload video to a server. Currently, I am using a dedicated server with FFmpeg installed on it. I also know FFserver can be used for RTSP communication, but I am still unclear on how to do this. Can anyone guide me?
I would like to know how to upload videos to a server or whether there is another way to perform a live stream. Open source frameworks are welcome.
For live streaming video/audio, http://www.wowza.com/ gives you the best functionality. You have to set up your server in Wowza (you also can't test without it).
For iOS you can broadcast and receive with the demo you can download here.
I think it will be helpful to you :)
Well, I was in search of something open source that could be implemented without any additional cost. Luckily I found the Red5 server (open source): https://github.com/Red5/red5-server
I configured it on my dedicated server and it is running perfectly fine. Now that the server-side issue was solved, I needed something to work on the iOS side. For that I found https://github.com/slavavdovichenko/MediaLibDemos3x
So with the combination of these two repos I was able to make a live-streaming app like Meerkat.
Thanks

Audio Chat + Data Stream Library for iOS

I want to build an iOS app that streams audio and some additional custom data between two users in real time. This is possible using GameKit if people are on the same network, but I haven't been able to find an SDK that can do this across the world.
Does anyone know if there is an existing service that does this?
If not, what services do you recommend for doing these two things (streaming audio and streaming data) separately?
Thanks.
WebRTC (www.webrtc.org/) has support for iOS (for audio streams; video is promised later). But in order to support communication between peers behind NATs, you will need your own STUN and TURN servers...
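For what it's worth, WebRTC covers both requirements on a single connection: an audio track for the voice stream plus an RTCDataChannel for the custom data. A rough sketch using the browser-style API for illustration (the ICE server, channel name, and omitted signaling are assumptions; the same concepts apply to the iOS build):

```typescript
// Sketch: one RTCPeerConnection carrying both the microphone audio and a data channel
// for custom application data. Exchanging the offer/answer and running STUN/TURN
// servers are assumed to happen elsewhere and are not shown.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.example.com:3478' }], // hypothetical; add TURN for NAT traversal
});

// Custom application data rides alongside the audio on the same connection.
const dataChannel = pc.createDataChannel('game-state'); // hypothetical channel name
dataChannel.onopen = () => dataChannel.send(JSON.stringify({ hello: 'world' }));
dataChannel.onmessage = (e) => console.log('peer data:', e.data);

async function addAudioAndOffer(): Promise<RTCSessionDescriptionInit> {
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
  mic.getTracks().forEach((t) => pc.addTrack(t, mic));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  return offer; // hand this to your signaling channel of choice
}
```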
After looking into it, it seems that QuickBlox is able to do everything I need.
