I am using a Cordova iOS app for live streaming. I am able to stream video from the iPhone, but when I try to join a stream it never calls the remoteStreamAddedHandler function that should display the streamed video.
I am using the cordova-plugin-iosrtc plugin. It also shows the status "someone has joined the room", but it never calls remoteStreamAddedHandler, where I would append the video tag. It works fine on an Android phone.
Thanks
The remoteStreamAddedHandler event is fired when a remote user has already joined the room and published their stream.
In apiRTC tutorial 11-VIDEO CALL STREAMING, the user only subscribes to available streams (there is no video publish).
You need a user connected with tutorial 10-GROUP CALL; that user will publish their stream and subscribe to remote streams.
The tutorial 12-GROUP CALL - ADVANCED shows an advanced sample where the user can choose to publish/subscribe to streams.
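In the handler itself you typically just create a video element and attach the incoming stream to it. Here is a minimal, version-agnostic sketch; the callback payload (event.stream as a MediaStream, event.streamId) is an assumption, so adapt it to whatever your apiRTC version actually passes:

// Sketch: append a <video> tag when a remote stream is added.
// Assumes the callback exposes a standard MediaStream as event.stream
// and an id as event.streamId -- both placeholders for the real apiRTC payload.
function remoteStreamAddedHandler(event) {
  const video = document.createElement('video');
  video.id = 'remote-' + (event.streamId || Date.now());
  video.autoplay = true;
  video.playsInline = true;       // needed for inline playback on iOS
  video.srcObject = event.stream; // attach the remote MediaStream
  document.getElementById('remoteVideos').appendChild(video);
}

With cordova-plugin-iosrtc, depending on the plugin version you may also need to call cordova.plugins.iosrtc.refreshVideos() after inserting the element so the native video layer picks up the layout change.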
I am streaming video from an IP camera to YouTube using the Restreamer software (https://datarhei.github.io/restreamer/) running on a Raspberry Pi 4.
Everything works well so far, but when the stream is interrupted (for whatever reason), a new livestream is only created after I click the "Dismiss" button in YouTube Studio.
Since I want the new stream to start automatically, I'm looking for a way to trigger that button click (or the associated command) via the YouTube API.
How can I do this?
I will answer my own question because I found the problem description and a solution in another forum: https://obsproject.com/forum/threads/stream-didnt-show-error-or-load-to-youtube.130897/
Something has changed on YouTube in the last few days, but the way to get a real streaming reconnection is this: in the YouTube control room, from the Manage icon, create a scheduled stream. Start the encoder on this stream and everything will work fine. If the connection drops, YouTube reports the stream as offline; when the connection is re-established, streaming resumes.
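If you prefer to set this up programmatically rather than through the control room UI, a scheduled broadcast can also be created with the YouTube Live Streaming API. Here is a sketch using the googleapis Node client, assuming you already have an OAuth2 client authorized for the https://www.googleapis.com/auth/youtube scope; title, privacy and start time are placeholders:

// Sketch: create a scheduled broadcast via the YouTube Live Streaming API.
const { google } = require('googleapis');

async function createScheduledBroadcast(auth) {
  const youtube = google.youtube({ version: 'v3', auth });
  const res = await youtube.liveBroadcasts.insert({
    part: ['snippet', 'contentDetails', 'status'],
    requestBody: {
      snippet: {
        title: 'IP camera restream',                  // placeholder title
        scheduledStartTime: new Date().toISOString()  // start "now"
      },
      contentDetails: {
        enableAutoStart: true,   // go live as soon as the encoder sends data
        enableAutoStop: false    // keep the broadcast when the encoder drops
      },
      status: { privacyStatus: 'unlisted' }           // placeholder privacy setting
    }
  });
  return res.data; // contains the broadcast id
}

The broadcast still has to be bound to a live stream (the stream key Restreamer pushes to), e.g. via liveBroadcasts.bind, so treat this as a starting point rather than a drop-in replacement for the control room workflow.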
I need to be able to record an incoming video call to a file. The recording must be done in the desktop application, built with Electron. I'm using OpenVidu as the streaming platform. Is there any way to do that?
@Vasniktel Technically it could be possible to record the video client-side, as there are a number of WebRTC examples that record locally on the client; however, this is not native to OpenVidu. Recording in Electron, on the other hand, is:
github.com/hokein/electron-screen-recorder
tutorialspoint.com/electron/… You could integrate recording separately alongside your OpenVidu app.
The main difference here is that you want to record an incoming call, and while you likely won't be able to just write the incoming WebRTC data, you should be able to record the area of the app (canvas) where the video player is rendered. You will be re-encoding the decoded, rendered video stream, but it shouldn't be too much of a performance hit.
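One way to do that from the Electron renderer is to capture the video element that renders the remote OpenVidu stream and feed it to a MediaRecorder. This is a sketch of the general approach, not an OpenVidu API; the element id 'remote-video', the output path and the assumption that Node integration is available in the renderer are all placeholders:

// Sketch (Electron renderer): record the rendered remote <video> element to a WebM file.
const fs = require('fs'); // assumes nodeIntegration; otherwise hand the data to the main process via IPC

function recordVideoElement(videoEl, outPath, durationMs) {
  // captureStream() re-captures the decoded frames (and audio) being rendered by the element.
  const stream = videoEl.captureStream();
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8' });
  const chunks = [];

  recorder.ondataavailable = (e) => { if (e.data.size > 0) chunks.push(e.data); };
  recorder.onstop = async () => {
    const buffer = Buffer.from(await new Blob(chunks).arrayBuffer());
    fs.writeFileSync(outPath, buffer);
  };

  recorder.start(1000);                        // emit a chunk every second
  setTimeout(() => recorder.stop(), durationMs);
}

// Usage: record 30 seconds of the incoming call.
recordVideoElement(document.getElementById('remote-video'), 'call.webm', 30000);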
I want to broadcast existing videos to multiple users through Wowza.
Suppose I want to broadcast one uploaded video (on the Wowza server) to multiple users; how can I do that? Can Wowza call any API to start streaming on other users' devices? That is, when I start streaming a video from my application, it should also start on the other devices through the Wowza API.
Are you talking about broadcasting (streaming) an MP4 file as a simulated Live stream (playout) or as Video On Demand (VOD)?
Obviously you cannot force devices to start playing a stream. That'd only work if you develop an App that can listen for commands and trigger playback accordingly. Wowza doesn't have such an App, nor any built-in features that can do this.
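For completeness, such a push-to-play mechanism would live entirely in your own app and backend, not in Wowza. A minimal sketch, assuming a WebSocket control channel (wss://example.com/control is hypothetical) that tells the client when to start playing a Wowza URL:

// Sketch: client-side "push to play". Your own backend (not Wowza) sends
// { type: "play", url: "..." } when streaming should start on this device.
const socket = new WebSocket('wss://example.com/control');  // hypothetical endpoint

socket.onmessage = (msg) => {
  const cmd = JSON.parse(msg.data);
  if (cmd.type === 'play') {
    const video = document.getElementById('player');
    video.src = cmd.url;  // e.g. an HLS URL served by Wowza; non-Safari browsers need hls.js or similar
    video.play();
  }
};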
If you want devices to access a stream on-demand you can simply upload the file to Wowza's content folder. If you want to have a programmed playout, like a TV channel, then you can check out this article: https://www.wowza.com/docs/how-to-schedule-streaming-with-wowza-streaming-engine-streampublisher
(the source code of the plug-in that is used in that article is available from https://github.com/WowzaMediaSystems/wse-plugin-streampublisher)
From your question it sounds like you can already broadcast a stream successfully, and you may have used the Wowza GoCoder SDK to do it.
When you broadcast a live stream, the broadcast videos are stored in the CONTENT directory inside the Streaming Engine installation directory.
You can find all the streamed videos there.
Now, if you want to use a particular stored video, you can do so by loading the specific URL for that video. Broadcasting that video again as a live stream is not possible, but you can play it as below, and it will be accessible to all your application users.
In iOS, the URL for a video stored in the CONTENT directory is:
http://[Host Address]:[PORT]/vod/mp4:sample.mp4/playlist.m3u8
In Android:
rtsp://[Host Address]:[PORT]/vod/sample.mp4
Here, sample is the name of your stream. Broadcast each live stream with a different stream name so that all videos remain accessible.
In this way, you can play stored live-stream videos.
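If those URLs are played in a browser or web view rather than a native player, the HLS one can be handed to a library such as hls.js on browsers without native HLS support (hls.js is an assumption here, not part of Wowza; host and port are placeholders, 1935 being Wowza's default):

// Sketch: play the Wowza VOD HLS URL in a browser using hls.js.
import Hls from 'hls.js';

const url = 'http://HOST:1935/vod/mp4:sample.mp4/playlist.m3u8'; // placeholder host/port
const video = document.getElementById('player');

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(url);
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = url;  // native HLS (Safari / iOS web views)
  video.play();
}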
I am using the Skype for Business iOS SDK, and the following are my concerns:
Latency issue: it takes a lot of time to connect the call, even on a good network. I am also keeping the video service on demand; the default connection is for audio feeds only.
After the call is connected, the audio feed defaults to muted and the didChangeIsMuted delegate returns true (muted). The user has to manually press the button to unmute it.
The latest SDK demo at https://github.com/OfficeDev/skype-ios-app-sdk-samples/tree/master/BankingAppSwift does not compile successfully; a few resource files (the helper files) are missing.
Meeting join only connects IM and audio; it does not enable video by default. You have to explicitly start the video service once the meeting join is complete (the conversation moves to the Established state).
For the latency, can you give an estimate of how long it took to join the meeting?
We will look at this one and reply back.
Please use the Helper files from the SDK Zip package. They are not provided by default in the GitHub samples.
I am developing a video conference application using Licode with multiple users (say, 4).
I want every user to be able to view his own webcam's video, but to publish his video into the conference room only when he gets permission.
I get access to the camera using the following:
localStream.init();
localStream.show("myVideo");
This is working fine.
A script decides which user gets permission to publish a stream; in that script I am using the following code to publish the user's stream:
room.publish(localStream);
But this does not publish the user's stream into the room. Please tell me what I am doing wrong.
Also, is there any way to check how many streams are in the room?
Thanks
The localStream is always available and can be used to publish at any time. Just recheck your code. I would suggest using setTimeout and publishing the stream 30 seconds after your localStream is generated. I am sure this will work.
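Rather than a fixed setTimeout, you can also wait for the Licode 'access-accepted' event before connecting and publishing; it serves the same purpose of making sure the local stream is ready. Here is a sketch with the Licode (Erizo) client API, assuming a token already obtained from your backend and a hypothetical userHasPublishPermission flag set by your permission script; it also counts streams by tracking room events (verify event names against your Licode version):

// Sketch: publish only when the camera is ready and permission has been granted,
// and keep a simple count of the streams in the room.
const localStream = Erizo.Stream({ audio: true, video: true, data: true });
const room = Erizo.Room({ token: token });  // token obtained from your backend
let streamCount = 0;

localStream.addEventListener('access-accepted', () => {
  localStream.show('myVideo');  // local preview, as in the question
  room.connect();
});

room.addEventListener('room-connected', (roomEvent) => {
  streamCount = roomEvent.streams.length;  // streams already published in the room
  if (userHasPublishPermission) {          // hypothetical flag from your permission script
    room.publish(localStream);
  }
});

room.addEventListener('stream-added', () => { streamCount += 1; });
room.addEventListener('stream-removed', () => { streamCount -= 1; });

localStream.init();  // triggers the camera prompt, then 'access-accepted'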