We are using Twilio Video on iOS and Android (through a react-native extension). We would like to warn users about lousy connections while they are on a call. Is there something buried in Twilio's SDKs to do this? Or do people have suggestions for a great library to measure connection quality? (For instance: https://www.npmjs.com/package/network-js)
Twilio developer evangelist here.
In WebRTC implementations there is normally a way to get the connection statistics. In iOS using Twilio Video, you need to call the getStatsWithBlock method on the room. In Android you want the getStats method on the room. I don't know how you've made your react-native wrapper, but those are the bits you are looking for.
Once you've got the stats, you can start looking for the track stats, which will give you information about the frames per second, frames received, frames dropped, etc. This means you can judge the quality of the call not on the network, but on the data being sent and received.
Sorry I can't be more specific, hopefully this gets you started.
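To get you a little further anyway, here is a minimal Swift sketch of that flow on iOS. It assumes the Swift mapping of getStatsWithBlock is room.getStats { ... } and that each report exposes remoteVideoTrackStats with frameRate and packetsLost properties; check those names against the TwilioVideo SDK version you've wrapped, since they have shifted between releases.

```swift
import TwilioVideo

// Sketch: poll the room's stats and flag remote video tracks that look
// unhealthy. Property names are from the TVI-prefixed (1.x/2.x) SDK and
// may differ in your version.
func warnOnPoorConnection(room: TVIRoom) {
    room.getStats { statsReports in
        for report in statsReports {
            for videoStats in report.remoteVideoTrackStats {
                // A sagging frame rate or growing packet loss is a decent
                // proxy for a lousy connection.
                if videoStats.frameRate < 10 || videoStats.packetsLost > 0 {
                    print("Poor connection on track \(videoStats.trackSid)")
                }
            }
        }
    }
}
```

Call it on a repeating timer and debounce the warning so a single bad sample doesn't spook the user.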
So the idea is pretty simple: to let the caller listen to some nice melody instead of the default ring beeps.
I have already looked at the Twilio tutorials, but the only thing I found is that it is possible with queued calls.
Is such a feature available with a TwiML verb, without using queues?
Twilio developer evangelist here.
If you dropped your caller into a conference you could set the wait music while you dialled another party into the call. That's not much different to using a queue though. There's no way to just set the dialling tone while the phone is ringing, so I would recommend one of these options.
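For completeness, here is a rough sketch of the TwiML that approach needs, built as a Swift string so it slots into whatever server you have. The room name and wait music URL are placeholders; waitUrl is the Conference attribute that controls what the caller hears until the other party joins.

```swift
// Sketch: TwiML for dropping the caller into a conference with custom
// wait music. Point waitUrl at your own audio file, or at TwiML that
// <Play>s your melody. Both arguments here are placeholders.
func conferenceTwiML(roomName: String, waitMusicURL: String) -> String {
    return """
    <?xml version="1.0" encoding="UTF-8"?>
    <Response>
      <Dial>
        <Conference waitUrl="\(waitMusicURL)">\(roomName)</Conference>
      </Dial>
    </Response>
    """
}
```

You'd then use the REST API to dial the other party into the same conference; the wait music stops once they join.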
Also note, the phone may ring before Twilio is able to deal with it. That is out of our control, so you may not get your music for the entire ringing period.
I am very new to real-time protocols and I had some questions about how WebRTC works and how I can implement it. I am trying to create a one-to-many livestream like Facebook or Periscope, where one user broadcasts and other users join and stream the video. I am using Swift on my client end.
My questions are:
1. How do I broadcast a video using WebRTC?
2. Is there an SDK for WebRTC in Swift/iOS?
I know the questions are very vague, but guidance in the right direction would be great because I am not sure where to start.
You will need to use backend servers for that.
If you plan on broadcasting to multiple users directly from your mobile app then stop...
You need to connect your mobile app to a backend media server, which can then be used to broadcast the video to a larger audience.
There are several commercial and open source alternatives that enable you to do that. I'd check Red5Pro, Wowza, SwitchRTC, Jitsi, Janus and Kurento for this task.
For the client side, look at react-native-webrtc.
You can find more tools for WebRTC developers here.
Regarding your question (2), there's also an SDK for iOS here and a neat getting-started page here (although it's 2.5 years old, I haven't found anything better so far).
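Since question (2) comes up a lot, here's a bare-bones Swift sketch of standing up a peer connection with the GoogleWebRTC pod. The STUN URL is a public placeholder, and the signalling with your media server (exchanging SDP and ICE candidates) is left out because every server does it differently.

```swift
import WebRTC

// Sketch: create a peer connection with the GoogleWebRTC pod. Signalling
// is omitted; your media server dictates how SDP/ICE get exchanged.
let factory = RTCPeerConnectionFactory()

let config = RTCConfiguration()
config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]

let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                      optionalConstraints: nil)

// Pass a real RTCPeerConnectionDelegate here; it receives ICE candidates
// and connection-state changes that you forward to your signalling layer.
let peerConnection = factory.peerConnection(with: config,
                                            constraints: constraints,
                                            delegate: nil)
```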
I'm developing a VoIP app where I want to stream audio captured from the microphone as Data to the server. From the server side I want to send the Data out to all the participants in the conversation.
My first thought is to use AVAudioEngine, install a tap on the microphone, convert each AVAudioPCMBuffer to Data, and then use the Alamofire framework's Alamofire.upload(data: Data, to: URLConvertible) method to upload each audio frame and send it out to all the clients. This comes with some complications on the client side, though, like establishing the connection and so on. It feels like with this solution I would need to send out thousands of messages from the server all the time saying "more packets are now available".
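For what it's worth, the capture side of that idea is only a few lines. A minimal sketch, assuming raw float samples are shipped as-is; the send closure is a placeholder for whatever transport you pick, and a persistent socket (or WebRTC itself) will behave far better than one HTTP upload per buffer.

```swift
import AVFoundation

// Sketch: tap the microphone, convert each AVAudioPCMBuffer to Data, and
// hand it to a caller-supplied send closure. Audio-session setup and
// encoding (you'd want Opus or similar, not raw floats) are omitted.
func startStreaming(engine: AVAudioEngine,
                    send: @escaping (Data) -> Void) throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Copy the first channel's raw float samples into a Data blob.
        guard let channelData = buffer.floatChannelData?[0] else { return }
        let byteCount = Int(buffer.frameLength) * MemoryLayout<Float>.size
        send(Data(bytes: channelData, count: byteCount))
    }

    try engine.start()
}
```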
Anyone out there that can shed some light or input on good solutions around this? Or is this actually a good way to go?
Any help, input or thought is very much appreciated.
For Android, several apps exist that show this kind of Wi-Fi signal information, as https://play.google.com/store/apps/details?id=com.farproc.wifi.analyzer does.
I need similar functionality in my iOS app but can't find any information about that. Is this forbidden by Apple?
Apple doesn't allow apps to access the hardware like that. A fairly common workaround is to measure network throughput instead. You could take periodic samples of the current network speed, and then display that data over time.
Network Multimeter, for example, is an app that will give you info on your network as you walk around. It does that by sending data across the network and displaying the throughput.
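A rough sketch of that sampling approach; the URL is a placeholder for a file you host, and in practice you'd average several samples rather than trust one.

```swift
import Foundation

// Sketch: download a known payload and report the observed throughput in
// bytes per second. Call this periodically and plot the results over time.
func sampleThroughput(from url: URL,
                      completion: @escaping (Double?) -> Void) {
    let start = Date()
    URLSession.shared.dataTask(with: url) { data, _, error in
        guard let data = data, error == nil else {
            completion(nil)
            return
        }
        let seconds = Date().timeIntervalSince(start)
        completion(Double(data.count) / seconds)
    }.resume()
}
```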
I need help making an app that works over Bluetooth/Wi-Fi. I would prefer using Bluetooth since there won't be any lag.
Basically, what I want the app to do is stream audio from one device to another device. It has to be accurate.
Any help? (I am willing to pay whoever helps me.)
Thanks in advance!
All of the above is just the beginning. The API allows your iPhone to stream data from the service once it is discovered and connected to, etc. Just like the older Bluetooth spec, but simpler. In the spec's simplicity lies its power.
For details, please have a look at LINK here.
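To make the discover-and-connect step concrete, here is a minimal Core Bluetooth sketch. Bear in mind that Core Bluetooth is BLE, whose bandwidth is tight for audio, so for actual audio streaming the classic Bluetooth audio profiles or Wi-Fi are usually the better fit.

```swift
import CoreBluetooth

// Sketch: scan for BLE peripherals and connect to the first one found.
// Service/characteristic discovery and data transfer come after this.
final class PeripheralScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // Keep a strong reference, or the peripheral is deallocated
        // before the connection completes.
        self.peripheral = peripheral
        central.connect(peripheral, options: nil)
    }
}
```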