We have a Wowza server to provide video recording.
Now we need to perform load testing of the recording process.
I spent a number of hours trying to find an appropriate SaaS, but without success.
What tools (cloud services preferred) do you know of?
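In case it is useful while evaluating services: a common do-it-yourself fallback is to generate the load yourself by launching many concurrent RTMP publishers with ffmpeg against the application that has recording enabled. Below is a rough Swift sketch for a macOS command-line tool; ffmpeg being installed at that path, the sample file, and the Wowza URL pattern are all assumptions.

```swift
import Foundation

// Rough load-generator sketch: simulate N concurrent publishers by launching
// ffmpeg processes that push a looped sample file to the Wowza application
// that has recording enabled. Paths and URLs below are placeholders.
let publisherCount = 20
let sampleFile = "/tmp/sample.mp4"
let wowzaApp = "rtmp://wowza.example.com:1935/recordapp"

var processes: [Process] = []
for i in 0..<publisherCount {
    let p = Process()
    p.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
    p.arguments = [
        "-re",                       // read input at its native frame rate
        "-stream_loop", "-1",        // loop the sample file indefinitely
        "-i", sampleFile,
        "-c", "copy",                // no re-encoding, keeps the generator cheap
        "-f", "flv",
        "\(wowzaApp)/loadtest_\(i)"  // one stream name per simulated publisher
    ]
    try? p.run()
    processes.append(p)
}

// Keep the tool alive while the publishers run; stop with Ctrl-C.
RunLoop.main.run()
```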
I'm creating an iOS app where I want a user to be able to live stream a video; however, users who join the live stream after it starts should start watching the stream from the beginning instead of live (I will also add functionality that allows a viewer to skip ahead and then watch live).
I have looked at many third party streaming options such as Agora, Twilio, Vimeo, etc, however, I don't believe they meet my needs as I need users who join the live stream to start watching from the beginning and not live.
I have explored continuously uploading small video chunks to something like Firebase Storage, and then continuously reading those chunks for users watching the stream. However, as explained here: https://stackoverflow.com/a/37870706/13731318 , this is not very efficient and leads to substantial lag.
Does anyone have any idea how to go about doing this in a way that leverages third parties?
I think you can use the HLS protocol to implement this.
HLS allows a viewer to start watching from the beginning or at the live edge; that is controlled by the playlist settings (an EVENT-type playlist keeps all segments available, so a late joiner can seek back to the start).
I am not sure about the uploading side, because I think that has to be implemented mostly on the server.
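As a rough illustration only (the playlist URL is a placeholder, and the server is assumed to publish an EVENT-type HLS playlist), a late-joining viewer on iOS could seek back to the beginning like this:

```swift
import AVFoundation

// Placeholder URL; the server is assumed to publish an EVENT-type HLS playlist
// (#EXT-X-PLAYLIST-TYPE:EVENT), which keeps every segment available so a late
// joiner can seek back to the very beginning of the broadcast.
let url = URL(string: "https://example.com/live/stream/playlist.m3u8")!
let player = AVPlayer(url: url)

// AVPlayer normally joins a live HLS stream near the live edge.
// Seeking to .zero starts playback from the first segment instead.
player.seek(to: .zero) { _ in
    player.play()
}

// When the viewer taps "Go Live", jump to the end of the seekable range.
func goLive() {
    if let live = player.currentItem?.seekableTimeRanges.last?.timeRangeValue {
        player.seek(to: live.end)
    }
}
```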
I'm working on a live streaming app like Periscope and doing research on the requirements and restrictions on iOS.
I found out that Apple requires HLS (HTTP Live Streaming) under certain conditions. I found the conditions below on Apple's site.
If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming.(https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW5)
But I'm not sure whether HLS has to be used for both publishing and watching video, or whether using it only for watching is acceptable, because I'm thinking of using RTMP for publishing and HLS for watching.
I wrote an app similar to Periscope that is on the App Store now; it can use 2 Mbps and connects via the RTMP protocol to send the data. So my guess is they no longer enforce it. I also believe that at the time that guideline was written, cellular network load was possibly too high and they were hoping that HLS would help with that. Now with 4G LTE the networks can handle the load a little better. Again, that is just a guess. My app went up with no problem or mention of that, and the review team was well aware of what the app did.
I use AVPlayer to play .MOV videos stored on my dedicated server.
When a user wants to play a video, I load it into AVPlayer using a direct link like this: "www.myserver.com/videos/video.mov".
(I don't use a PHP file; maybe I should.)
Generally videos take 1-2 seconds to start playing, but sometimes it can be very slow (up to 1 minute).
However, once the video has started, loading is very quick; it is only the start that can be long, very long (even with a fast connection).
Videos are small (maximum 6 MB); I compressed them using SDAVAssetExportSession.
Also I've disabled App Transport Security.
Could the issue be server-side? I really don't know how to solve this problem.
Any help is appreciated
Edit 1: link to a video on my server
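One possible explanation, offered only as a guess: with a plain .mov served over HTTP, playback cannot begin until AVPlayer has the movie's metadata (the moov atom), and if that atom sits at the end of the file the player may have to fetch a large part of the download before starting. A minimal Swift sketch, assuming AVAssetExportSession and placeholder file URLs, that re-exports with the metadata moved to the front ("fast start"):

```swift
import AVFoundation

// Re-export a local movie so its metadata (moov atom) is written at the
// front of the file, which lets AVPlayer start progressive playback sooner.
// Input/output URLs are placeholders.
func exportForFastStart(input: URL, output: URL,
                        completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: input)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetMediumQuality) else {
        completion(false)
        return
    }
    export.outputURL = output
    export.outputFileType = .mov
    export.shouldOptimizeForNetworkUse = true   // "fast start" layout
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```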
A Little Background On Why I Have To Do This
I am currently optimising an app in order to improve the transfer of media files to the WiFi speakers that our team developed. Our previous solution used the iPhone as an HTTP server and let the speakers connect and download music from it. Unfortunately a lot of problems occurred, such as frequently slow transfer speeds, file read failures, and the fact that when the user issued a "seek" command, the speakers had to download the whole file before they could seek to that position and start playing. This is a very bad experience for our users.
What I Need
In order to solve the problem I mentioned above, we thought of changing the HTTP server to an RTP server that would run on an iPhone and allow the WiFi speakers to stream music from it. However, from what I read on other Q&A platforms, it was mentioned that the iPhone does not support transferring data using RTP. I also tried searching here on Stack Overflow but was not able to find an answer that solves my problem.
My Question
Is it possible to run an RTP server on an iPhone, and is there any demo of this that I can refer to?
Any suggestions would be highly appreciated.
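For what it's worth, RTP itself is just a small (12-byte) header carried over UDP, so nothing in iOS prevents an app from producing RTP packets directly; in this setup the iPhone is the media source, so "server" really means the sender of the stream. A minimal Swift sketch of the sending side using the Network framework follows; the host, port, and payload type are assumptions, and packetising the actual audio codec output is left out.

```swift
import Foundation
import Network

// Minimal RTP sender sketch (not production code): RTP is a 12-byte header
// followed by the payload, sent over UDP. Host, port, and payload type are
// placeholders; timing, codec packetisation, and RTCP are omitted.
final class TinyRTPSender {
    private let connection: NWConnection
    private var sequenceNumber: UInt16 = 0
    private var timestamp: UInt32 = 0
    private let ssrc = UInt32.random(in: 0...UInt32.max)

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
        connection.start(queue: .global())
    }

    /// Wraps one chunk of encoded audio in an RTP header and sends it.
    func send(payload: Data, samplesInPacket: UInt32, payloadType: UInt8 = 96) {
        var header = Data(count: 12)
        header[0] = 0x80                  // version 2, no padding/extension/CSRC
        header[1] = payloadType & 0x7F    // marker bit clear
        header.replaceSubrange(2..<4, with: withUnsafeBytes(of: sequenceNumber.bigEndian) { Data($0) })
        header.replaceSubrange(4..<8, with: withUnsafeBytes(of: timestamp.bigEndian) { Data($0) })
        header.replaceSubrange(8..<12, with: withUnsafeBytes(of: ssrc.bigEndian) { Data($0) })

        connection.send(content: header + payload,
                        completion: .contentProcessed({ _ in }))

        sequenceNumber &+= 1
        timestamp &+= samplesInPacket
    }
}
```

The missing pieces are pacing the sends to match the audio clock and, if the speakers expect it, RTCP reports alongside the RTP stream.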
Please have a look at this link: http://dss.macosforge.org/
It is the Darwin Streaming Server, Apple's official one.
However, I'm not sure it can work on iOS.
Best regards,
I am using the Wowza media server to publish a stream via RTMP.
Now I would like to go further.
I want to play the video while I'm publishing it, rather than starting playback again afterwards.
How can I perform this?
I guess you need to outline what technologies you are using.
It is not a big deal to create a Flash app that will publish and play simultaneously.
If it is related to mobile applications, then it is harder to accomplish that goal.
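For the mobile case, here is a hedged sketch of the playback half only: if Wowza is also packaging the published RTMP stream as HLS, an iOS client can simply keep playing the live playlist while publishing continues. The URL pattern below (application "live", stream "myStream") is an assumption about the configuration.

```swift
import AVFoundation

// Playback half of the setup, assuming Wowza also packages the published RTMP
// stream as HLS. The URL pattern below is a placeholder for your configuration.
let hlsURL = URL(string: "http://wowza.example.com:1935/live/myStream/playlist.m3u8")!
let player = AVPlayer(url: hlsURL)

// For a live (sliding-window) HLS playlist, AVPlayer joins near the live edge
// and keeps following it, so playback does not restart while publishing continues.
player.play()
```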