Houseparty app streaming technology - iOS

Anyone know what streaming technology or methodology the Houseparty app uses to ensure that there's no lag between guests' streams?
https://itunes.apple.com/us/app/houseparty-group-video-chat/id1065781769?mt=8&_branch_match_id=328927036209051924

Houseparty uses WebRTC as its base technology and currently uses TokBox, a provider of WebRTC video/audio services.
Latency is typically around 300 ms between parties, plus any network latency. Anything under roughly 500 ms of delay is generally not perceptible in a small-group conversation.
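Houseparty's exact stack isn't public beyond the WebRTC/TokBox detail above, but the offer/answer handshake those latency figures refer to is standard WebRTC. A minimal sketch using the aiortc Python library (a generic illustration, not Houseparty's or TokBox's actual code; signaling is done in-process here, whereas a real app exchanges the SDP via a signaling service):

```python
# Minimal WebRTC offer/answer sketch using aiortc (illustration only).
# Real apps exchange the SDP blobs through a signaling server or a service
# such as TokBox/OpenTok; here both peers live in the same process.
import asyncio
from aiortc import RTCPeerConnection

async def main():
    caller = RTCPeerConnection()
    callee = RTCPeerConnection()

    # A data channel is enough to trigger ICE/DTLS negotiation for the demo.
    caller.createDataChannel("chat")

    # Caller creates an offer and "sends" it to the callee.
    offer = await caller.createOffer()
    await caller.setLocalDescription(offer)
    await callee.setRemoteDescription(caller.localDescription)

    # Callee answers; the caller applies the answer and media/data can flow.
    answer = await callee.createAnswer()
    await callee.setLocalDescription(answer)
    await caller.setRemoteDescription(callee.localDescription)

    print("negotiated:", caller.signalingState, callee.signalingState)

    await caller.close()
    await callee.close()

asyncio.run(main())
```

Once the peers are connected, media flows directly (or via a media server such as TokBox's), which is why the per-hop delay stays in the few-hundred-millisecond range.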

Related

Will Google Cloud Run support GPU/TPU some day?

So far Google Cloud Run supports only CPUs. Is there any plan to support GPUs? It would be super cool if GPUs were available; then I could demo the DL project without actually running a super expensive GPU instance.
I seriously doubt it. GPUs/TPUs are specialized hardware. Cloud Run is a managed container service that:
Enables you to run stateless containers that are invokable via HTTP requests. This means CPU-intensive background processing is not supported: between HTTP request and response the CPU is throttled to near zero, so your expensive GPUs/TPUs would sit idle.
Autoscales based on the number of requests per second; launching 10,000 instances in seconds is easy to achieve. Imagine the billing-support nightmare for Google if customers could launch that many GPUs/TPUs, and the size of the resulting bills.
Is billed in 100 ms intervals. Most requests fit into a few hundred milliseconds of execution. This is not a good execution or business model for GPU/TPU integration.
Provides a billing model that reduces the cost of web services to near zero when they are not in use. You pay only for storing your container images. When an HTTP request arrives at the service URL, the container image is loaded into an execution unit and request processing resumes. Once requests stop, billing and resource usage stop as well.
GPU/TPU types of data processing are best delivered by backend instances that protect and manage the processing power and costs that these processor devices provide.
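For context, the contract Cloud Run expects is simply a stateless HTTP server listening on the port given in the PORT environment variable. A minimal sketch (Flask is just one convenient choice here, not a Cloud Run requirement):

```python
# Minimal stateless HTTP service of the kind Cloud Run runs in a container.
# Cloud Run injects the PORT environment variable; work only happens while a
# request is in flight, which is why permanently attached GPUs/TPUs would sit idle.
import os
from flask import Flask

app = Flask(__name__)

@app.route("/")
def handle():
    # Do all of the (CPU-bound) work inside the request/response cycle.
    return "hello from a stateless container\n"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```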
You can, however, use GPUs with Cloud Run for Anthos:
https://cloud.google.com/anthos/run/docs/configuring/compute-power-gpu

Streaming live desktop video to a web application

I'm looking to find a way to stream a user's desktop LIVE (through some piece of software, such as Open Broadcaster Software) to a web application.
I'm assuming I should use a CDN to deliver the live-streamed video to my web application, but how (and with what software) do I get the user's desktop to a streaming service? Should I use a service such as Red5 or an AWS service? Or, if only a few viewers are watching, should I host the service myself?
Although I have built my share of web applications, I have never dealt with live media streaming before, and I would appreciate any assistance anyone could lend.
By far the best resource for video on Rails is OpenTok
Our own demo here: http://bvc-video.herokuapp.com/broadcasts/1
--
Streaming
Video streaming is a tough one
The problem really depends on what you're trying to stream. If it's "live" video, i.e. captured and sent directly to the viewers, you'll have to use some sort of server to process the video.
Although I don't have huge experience with this, the main issue we've found is the compression / distribution of the feed. It's actually very simple to achieve video streaming on iOS: all the software / hardware is the same (just use the same API / drivers).
This often negates the requirement for a central server, although one is highly recommended (almost required) in many cases. Problems arise when you try to beam to multiple clients on multiple systems, as you'll run into compatibility issues.
--
Solutions
The solutions we've found are as follows:
The most stable part of the app is to take the stream and send it to a server (see the sketch after this list).
The wizardry is then to beam that stream to multiple clients.
The way to do this is typically to use a Flash widget and pull the stream from the server.
WebRTC is becoming the standard (OpenTok is built on this).
I'm not sure about video compression / distribution. Akamai is an industry heavyweight, but I've never used it; Brightcove too.
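As a concrete starting point for the "send the stream to a server" step, here is a rough sketch that pushes a Linux desktop capture to an RTMP ingest point with ffmpeg; the ingest URL and stream key are placeholders, and OBS does essentially the same thing with a nicer UI:

```python
# Rough sketch: capture a Linux desktop with ffmpeg and push it to an RTMP
# ingest endpoint (e.g. Red5, nginx-rtmp, or a cloud ingest URL; placeholder here).
# On macOS/Windows the capture input would be avfoundation/gdigrab instead of x11grab.
import subprocess

INGEST_URL = "rtmp://ingest.example.com/live/STREAM_KEY"  # placeholder

cmd = [
    "ffmpeg",
    "-f", "x11grab",              # X11 screen capture (Linux)
    "-framerate", "30",
    "-video_size", "1280x720",
    "-i", ":0.0",                 # display to capture
    "-c:v", "libx264",
    "-preset", "veryfast",
    "-tune", "zerolatency",
    "-pix_fmt", "yuv420p",
    "-f", "flv",                  # RTMP expects an FLV container
    INGEST_URL,
]

subprocess.run(cmd, check=True)
```

The server (or a CDN in front of it) then handles the fan-out to viewers, which is the part that is hard to do well yourself once the audience grows.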

How to convert RTSP streaming to HTTP Live Streaming using lighttpd?

I'm having a problem here. I want to play RTSP streaming on iPad and iPhone, but I've found that it would be much easier if I used HTTP Live Streaming. I want to convert my RTSP stream to HTTP Live Streaming using lighttpd, but I really have no idea how to do that. Does lighttpd accept an RTSP streaming URL as input? Can anyone help? Thanks!
You have two choices:
1) Run a server on your network that re-streams RTSP as HLS.
a) Wowza - popular, expensive
b) Live555 - free, lots of work
c) ffserver - free and as basic as it gets; tons of work to make it work
Advantages:
No bandwidth restrictions over cellular or Wi-Fi
Plays with native Apple players
Disadvantages:
High server bandwidth - if you're paying for server time you may want to watch this
High latency - forget any kind of live video
2) Run an FFmpeg-based player on the device
Advantages:
a) A lot easier than it used to be; we do this all the time
b) Dealing with the LGPL license is not a huge hassle; there are clear guidelines at ffmpeg.org
c) All on device, no server load issues
Disadvantages:
Limited bandwidth over cellular (about 10 min intervals), unlimited over Wi-Fi
lighttpd doesn't accept RTSP as an input. You will need some sort of translator program to read the RTSP stream and output the files to the web server's storage. I think you could do it with the avconv/ffmpeg program.
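A hedged sketch of that translator step: pull the RTSP feed with ffmpeg and write HLS segments plus a playlist into a directory that lighttpd already serves. The RTSP URL and document root below are placeholders:

```python
# Sketch: restream an RTSP source as HLS by writing segments and a playlist
# into the web server's document root. URLs and paths below are placeholders.
import subprocess

RTSP_URL = "rtsp://camera.example.com/stream1"      # placeholder source
HLS_DIR = "/var/www/html/hls"                       # directory served by lighttpd

cmd = [
    "ffmpeg",
    "-rtsp_transport", "tcp",        # often more reliable than UDP over lossy links
    "-i", RTSP_URL,
    "-c:v", "copy",                  # avoid re-encoding if the source is already H.264
    "-c:a", "aac",
    "-f", "hls",
    "-hls_time", "4",                # segment length in seconds
    "-hls_list_size", "6",           # keep a rolling window in the playlist
    "-hls_flags", "delete_segments", # drop old segments from disk
    f"{HLS_DIR}/stream.m3u8",
]

subprocess.run(cmd, check=True)
```

iOS devices can then play http://your-server/hls/stream.m3u8 natively; you may also need to map the .m3u8 and .ts MIME types in lighttpd's mimetype.assign.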

PubNub long polling vs sockets - mobile battery life

I recently began using PubNub in my iOS app and am very happy with it. However, I have been looking at the other options available, such as Pusher and Realtime.co, which use WebSockets. PubNub, on the other hand, uses long polling. I have done my own little speed comparisons, and for my purposes I find that they are all fast enough.
PubNub offers some nice features like message history and a list of everyone in the channel, so barring everything else I am leaning toward them. My question is: should I be concerned about battery life under heavy usage with a long-polling solution like PubNub? Would a WebSockets solution be significantly more power efficient?
PubNub on Mobile with Battery Saving
As a preface to battery performance and efficiency: compared to alternative or self-hosted WebSocket solutions, PubNub is a service optimized for mobile devices on the go. PubNub offers a catch-up feature on mobile phones that automatically redelivers missed messages, which matters especially for devices moving between cell towers and switching between 3G/4G and Wi-Fi. WebSockets tend not to be recommended for mobile because of reliability issues in common scenarios, which is why PubNub selects the best transport for your device automatically, so you don't have to decide what makes the most sense for a phone in transit.
Battery Savings Pattern with PubNub
PubNub has a keep-alive connection that is uncommonly long, set to one hour. A ping is sent every 300 seconds (300,000 ms). This is long enough to provide the best mix of mobile performance and battery saving.
Battery Saving Tips on Mobile
Keep messages as small as possible.
Send fewer messages, less frequently.
Connect to only one channel rather than two or more.
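To illustrate the first two tips, a small sketch with the PubNub Python SDK that batches several readings into one compact message on a single channel (the keys, channel name, and payload are placeholders, not part of any PubNub recommendation):

```python
# Sketch: batch several small readings into one compact publish on one channel,
# rather than publishing each reading separately. Keys and channel are placeholders.
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

pnconfig = PNConfiguration()
pnconfig.publish_key = "pub-key-placeholder"
pnconfig.subscribe_key = "sub-key-placeholder"
pnconfig.uuid = "battery-demo-client"
pubnub = PubNub(pnconfig)

readings = [{"t": 1, "v": 21.4}, {"t": 2, "v": 21.5}, {"t": 3, "v": 21.5}]

# One small message instead of three: fewer radio wake-ups, less battery drain.
pubnub.publish().channel("sensor_updates").message({"batch": readings}).sync()
```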
Automatic Transport Detection
PubNub will automatically select the best transport for you when needed, especially on mobile devices. An interesting conversation about WebSockets took place at KRTConf in Portland, Oregon in October 2012, which I recommend: https://speakerdeck.com/3rdeden/realtimeconf-dot-oct-dot-2012
Let me know if this was helpful.
I don't think this is correct. See http://eon.businesswire.com/news/eon/20120927005429/en/Kaazing/HTML/HTML5
I am the one who actually did the testing for Kaazing comparing WebSocket and regular HTTP-based message transfers. I saw a drastic decrease in battery consumption with WebSocket. Kaazing has additional technology above and beyond WebSocket to reduce battery consumption, but even if you don't use Kaazing, you will still see some battery-consumption efficiencies with WebSocket. I tried this out myself by running actual tests, even for basic WebSocket versus HTTP without any special battery-optimization algorithms.

Real time audio conversation iOS

I am designing an iOS app for a customer who wants to allow real-time (with minimum lag, max 50ms) conversations between users (a sort of Teamspeak). The lag must be low because the audio can also be live music, played with instruments, so all the users need to synchronize. I need a server, which will request audio recordings to every client and send to others (and make them hear the same sound at the same time).
HTTP is easy to manage/implement and easy to scale, but it performs poorly because an average HTTP request takes > 50 ms (with mid-level hardware), so I was thinking of TCP or UDP connections kept open between clients and the server.
But I have some questions:
If I develop the server in Python (using Twisted, for example), how will it perform?
I can't develop the server in C++ because it is hard to develop and hard to scale.
Has anyone used Node.js (which is easy to scale) to manage TCP/UDP connections?
If I use HTTP, will it be fast enough with keep-alive? Usually an HTTP request takes > 50 ms (because opening and closing connections is expensive), and I want the whole round trip to take less than that.
The server will be running on a Linux machine.
And finally: which type of compression can you suggest? I thought Ogg Vorbis would be nice, but if there's anything better (and usable on iOS), I am open to changes.
Thank you,
Umar.
First off, you are not going to get sub-50 ms latency. Others have tried this. See, for example, http://ejamming.com/, a service that attempts to do what you are doing but has a musically noticeable delay over the line and is therefore, in the ears of many, completely unusable. They use special routing techniques to get the latency as low as possible, and last I heard their service doesn't work with some router configurations.
Secondly, what language you use on the server probably doesn't make much difference, as the delay from client to server will be worse than any delay caused by your service. But if I understand your service correctly, you are going to need a lot of servers (or server threads) just relaying audio data between clients or doing some sort of minimal mixing. This is a small amount of work per connection, but a lot of connections, so you need something that can handle that. I would lean towards something like Java, Scala, or maybe Go. I could be wrong, but I don't think this is a good use case for Node, which, as I understand it, does not do multithreading well at this time. Also, don't poo-poo C++; scalable services have been built in C++. You could also build the relay part of the service in C++ and the rest in whatever you like.
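To make the "relay audio between clients" idea concrete, here is a minimal sketch of such a relay in Python (plain UDP, no mixing, authentication, or client timeout handling; the port and packet format are placeholders):

```python
# Minimal UDP audio relay sketch: every datagram received from one client is
# forwarded to all other clients seen so far. No mixing, auth, or timeout
# handling -- just the shape of the relay loop. The port is a placeholder.
import socket

RELAY_PORT = 9000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", RELAY_PORT))

clients = set()  # addresses that have sent us at least one packet

while True:
    data, addr = sock.recvfrom(2048)   # one compressed audio frame per packet
    clients.add(addr)
    for peer in clients:
        if peer != addr:
            sock.sendto(data, peer)
```

A production relay would shard connections across threads or processes, which is where the choice of server language actually starts to matter.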
Third, when choosing a compression format, you'll have to choose one that can survive packet loss if you plan to use UDP, and I think UDP is the only way to go for this. I don't think Vorbis is up to this task, but I could be wrong. Off the top of my head, I'm not sure of anything that works on the iPhone and is UDP-friendly, but I'm sure there are lots of options. Speex is one example and is open source. I'm not sure whether its latency and quality meet your needs.
Finally, to be blunt, I think there are some other things you should research a bit more. E.g., DNS is usually cached locally and not looked up on every HTTP call (though this may depend on the system/library; at least most systems cache DNS locally). Also, there is no such protocol as TCP/UDP. There is TCP/IP (sometimes just called TCP) and UDP/IP (sometimes just called UDP). You seem to refer to the two as if they were one. The difference is very important for what you are doing. For example, HTTP runs on top of TCP, not UDP, and UDP is considered "unreliable", but it has less overhead, so it's good for streaming.
Edit: speex
As far as the server is concerned, the request itself is not a bottleneck. I assume you have sufficient time to set up the connection, as that happens only at the beginning of the session. Therefore the protocol is not of much relevance.
But consider that HTTP is a stateless protocol and not suitable for audio streaming. There are a couple of real-time streaming protocols you can choose from. All of them will work over TCP or UDP (e.g. using raw sockets), and there are plenty of implementations.
In your case, the bottleneck for latency is not the server but the network itself. The connection between an iOS device and a wireless access point (AP) eats up about 40 ms if the AP is not misconfigured and the connection is good (ping your iPhone to check). In total, you'd have a minimum of 80 ms for the path iOS -> AP -> server -> AP -> iOS. But it is difficult to keep that latency stable. (The typical latency of AirPlay on my local network is about 300 ms.)
I think live music over iOS devices is not practicable today. Try Skype between two iOS devices and see how close you can get to 50 ms. I'd bet no one can do significantly better as far as latency is concerned.
Update: New research result!
I have to revise my claims regarding the latency of the iDevice's Wi-Fi connection. Apparently, when you first ping your device, latency will be bad. But if I ping again no later than 200 ms after that, I see an average latency of 2-3 ms between the AP and the iDevice.
My interpretation is that if there is no communication between the AP and the iDevice for more than 200 ms, the network adapter of the iDevice goes into a less responsive sleep mode, probably to save battery power.
So it seems, live music is within reach again... :-)
Update 2
The ping interval required to keep latency low apparently differs from device to device. The reported 200 ms is for a 3rd-generation iPad. For my iPhone 4 it's more like 50 ms.
While streaming audio you probably don't need to bother with this, as data is exchanged on a more frequent basis. In my own context, I have sparse communication between an iDevice and a server, but low latency is crucial. A keep-alive is therefore the way to go.
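For illustration, a tiny keep-alive sender might look like this in Python (the host, port, and 50 ms interval are placeholders; on iOS you would do the equivalent with the platform's socket APIs):

```python
# Tiny keep-alive sketch: send a 1-byte UDP datagram every 50 ms so the Wi-Fi
# radio never drops into its high-latency power-save mode. Host/port and the
# 50 ms interval are placeholders -- tune the interval per device.
import socket
import time

SERVER = ("server.example.com", 9001)   # placeholder endpoint
INTERVAL_S = 0.05                       # ~50 ms, per the iPhone 4 observation

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    sock.sendto(b"\x00", SERVER)
    time.sleep(INTERVAL_S)
```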
Best, Peter
