AWS Elemental MediaLive broadcast channel - iOS

I have created a channel and an input in AWS MediaLive for publishing a live stream using RTP push, following the documentation at http://docs.aws.amazon.com/medialive/latest/ug/getting-started.html. After creating and starting the channel, it runs successfully, but I do not know where to publish the stream or how to publish it.
I'm trying to integrate it with an iOS app to publish the stream and receive it on the other end.
Any help would be appreciated.

Since no one has answered this: MediaLive merely receives an input, transforms it, and pushes it somewhere else. You need to give it a destination to push to, which a client can then pull from. AWS provides MediaPackage for exactly this purpose.

When you created your channel in MediaLive, you specified an Input, which contains two destinations (A and B); A is the primary and B the backup.
You have to publish your stream to those destinations (e.g. rtp://ip:5000) with an encoder that can push MPEG-TS over RTP, such as ffmpeg. Also double-check the Security Group you created for the Input to make sure it allows your server to push the stream.
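If you want to look the destination URLs up programmatically, something like the following should work with the AWS SDK for JavaScript (the input ID is a placeholder; verify the field names against the DescribeInput API for your SDK version):

// Fetch the RTP push endpoints for an existing MediaLive input.
const AWS = require('aws-sdk');
const medialive = new AWS.MediaLive({ region: 'us-west-2' });

medialive.describeInput({ InputId: 'YOUR_INPUT_ID' }, (err, data) => {
  if (err) return console.error(err);
  // Each destination Url looks like rtp://ip:5000; push your MPEG-TS
  // stream to the primary (A) and, if you use one, the backup (B).
  data.Destinations.forEach((d) => console.log(d.Url));
});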

Related

Unity Chat application

I am currently working on a real-time chat app in Unity and I found these platforms to work with:
Firebase: can we send videos efficiently?
MatriX: https://www.ag-software.net/matrix-xmpp-sdk/ (but I am not sure we can send videos with MatriX?)
I wanted to know from your experience: what is the best way to implement real-time chat (with support for sending photos and videos) in Unity?
Thanks in advance
You need to find or create services where your clients can connect and:
1. upload files (photos, videos, etc.) and get a public, downloadable URL;
2. send messages to other connected clients that, apart from the text, also contain media metadata (e.g. a list of file attachments, which are actually URLs uploaded via service (1)); a sketch of such a message is shown below.
Now, if you cannot find a single service that supports both of those, you could try to find two different ones.
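For illustration, a message of type (2) might look like this (a hypothetical wire format, not any particular service's API):

// Hypothetical chat message: text plus attachment metadata pointing at
// URLs previously returned by the upload service (1).
const message = {
  from: 'user-42',
  text: 'Check out this clip!',
  attachments: [
    { type: 'video/mp4', url: 'https://cdn.example.com/uploads/abc123.mp4' }
  ],
  sentAt: '2018-01-15T12:34:56Z'
};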
Here is an example of a chat console application in C#. It contains a web service and a client library that the console app uses; instead of a console app, it could be used in a Unity app. It does not support file uploading, but it can send messages between clients over WebSockets.
If you were to build something yourself instead of using a third-party service, I would recommend node.js/express with socket.io for the server, since it's quite beginner-friendly.
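A minimal sketch of such a server (the 'chat' event name is just an example, not a fixed protocol):

// Minimal chat relay: clients emit 'chat' events and the server broadcasts
// each message (text plus attachment metadata) to everyone else.
const express = require('express');
const http = require('http');

const app = express();
const server = http.createServer(app);
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  socket.on('chat', (message) => {
    socket.broadcast.emit('chat', message);
  });
});

server.listen(3000);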
Here is a C# client library that can listen for socket.io events from the server; it is the same one used in the console application I shared above.

Twilio video: recording rooms server-side

Context: we're building a HIPAA-compliant video chat and evaluating Twilio as a potential supplier of video streams. Part of the requirement is that we need to record each video, and the recording needs to be stored encrypted in HIPAA-compliant storage.
Having set up Twilio's excellent quickstart example, I've started a server and was able to connect two clients to it, with video. However, looking around Twilio's room configuration, server-side recording appears to use Twilio-hosted storage, which is not HIPAA-compliant.
Question: how can we configure the Node server we started so that it saves a local copy of all streams participating in a room?
Thank you!
Twilio developer evangelist here.
When you set up a group-room-based video chat using Twilio Video, all participants in the chat make WebRTC connections to a Twilio server in order to transmit and receive data via the room. When you turn on recording, the video that passes through the server is written to disk. As far as I'm aware, this is not HIPAA-compliant.
We do have a page on building HIPAA-compliant video applications with Twilio Video, but the advice is to use peer-to-peer rooms, so that the only media that potentially goes through Twilio (via the TURN relay) is encrypted and can't be read or saved by Twilio.
You can't record the video on the Node server from the quickstart, because it isn't used to stream the media at all; it only exists to generate an access token.
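For context, the quickstart server does little more than this (a sketch with placeholder credentials; check the twilio helper library for the exact API in your version):

// All the quickstart server does: mint an Access Token with a Video grant.
const AccessToken = require('twilio').jwt.AccessToken;
const VideoGrant = AccessToken.VideoGrant;

const ACCOUNT_SID = 'ACxxxxxxxx';   // placeholder credentials
const API_KEY_SID = 'SKxxxxxxxx';
const API_KEY_SECRET = 'your-api-key-secret';

const token = new AccessToken(ACCOUNT_SID, API_KEY_SID, API_KEY_SECRET);
token.identity = 'alice';           // who is connecting
token.addGrant(new VideoGrant());   // allow them to join Video rooms

// The client exchanges this JWT for a WebRTC session; no media ever
// touches this server.
console.log(token.toJwt());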
You could build a server that also joins the peer-to-peer room of the chat and saves the video that way. I have no experience building WebRTC server applications though, so I can't guide you through that. It's certainly not just a case of configuring the quickstart server differently.
Your other option would be to record the video on the client and somehow transfer it to your server. That might be unwieldy for long chats, though: it causes extra work on the client and results in a potentially large video file to upload to the server.

How to use the comment update service in iOS?

I have created an app in Objective-C. In this app there is a page where users can comment live, and to update the comments I hit the web service every 5 minutes. I have no way of knowing when the data changes on the server.
I want to hit the service only when the data on the server has changed.
Is this possible? Or can we use some other approach for the web services?
Thanks; please answer if you have a correct way to solve it.
Go to the PubNub web site and download the SDKs for both your server and Objective-C. PubNub is a popular streaming service with publish/subscribe messaging. After integrating the SDKs, make your client a subscriber and your server a publisher. Simply put, subscribers listen on channels for data; when you have a new comment, publish that comment from the server to the channel your client has already subscribed to. Keep in mind that free accounts are for demo purposes and have limitations.
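On the server (publisher) side, the flow looks roughly like this with PubNub's JavaScript SDK (keys and channel name are placeholders; check the SDK docs for the exact signature in your version):

// Server-side publisher sketch: push each new comment to the channel
// that the iOS client is subscribed to.
const PubNub = require('pubnub');

const pubnub = new PubNub({
  publishKey: 'YOUR_PUBLISH_KEY',
  subscribeKey: 'YOUR_SUBSCRIBE_KEY'
});

function onNewComment(comment) {
  pubnub.publish({ channel: 'comments', message: comment }, (status) => {
    if (status.error) console.error('publish failed', status);
  });
}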

iOS broadcasting live to Azure Media Services

I am trying to make a Periscope-like app (not in practice, but the technical requirements are alike) where users can quickly start streaming from their iPhone to an unknown number of users, also on mobile. I am trying to use Azure Media Services for live video streaming, but even after reading pages of documentation I'm stuck.
I'm using VideoCore (https://github.com/jgh-/VideoCore) to publish from the iOS device to the RTMP server. Locally (using Wowza) I can just connect to the local server with my chosen username and password, as shown:
vcSession = [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720)
                                             frameRate:30
                                               bitrate:1000000
                               useInterfaceOrientation:NO];
[self.view addSubview:vcSession.previewView];
vcSession.previewView.frame = self.view.bounds;
vcSession.delegate = self;
// rtmpauth carries username:password; note the @"" string literals.
[vcSession startRtmpSessionWithURL:@"rtmp://172.20.10.2:1935/live?rtmpauth=test:test"
                      andStreamKey:@"test"];
where the rtmpauth parameter has the username:password format, both of which I've set to test on my local server. It works. In Azure, I've created a channel named test, and I've got the following ingest URL:
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string
In Wirecast, I'm able to stream to that URL (though it is EXTREMELY slow and the connection is frequently lost, I don't know why) by selecting Azure Media Services in Output Settings and typing in that ingest URL. In iOS, I have no idea how to connect to Azure Media Services.
In the startRtmpSessionWithURL:andStreamKey: method, I've tried all possible combinations of URL and stream key, but no luck. I have no idea what my username/password is (nothing is given on the Azure side), what the stream key is (I've tried test, live, and an empty string), or what that long hexadecimal string is (some sources say it's called a locator, though).
What is the correct format of RTMP URL and stream key when connecting to Azure Media Services for streaming?
I'll find someone to help you. I think you are just missing a stream name after the long hex string in the URL:
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string/[YOUR-CUSTOM-STREAM-NAME-Anything Really!]
Also, do you have any control over the encoding settings? It's possible that some encoding settings are not right. We have not tested with that VideoCore library, so it may also be that there is a slight variation in its RTMP implementation (the protocol is very poorly documented and there is a lot of missing information out there).
I'm also curious why your Wirecast setup is having trouble; that doesn't sound good to start with. A network issue? Are you setting it to the proper encoder preset with H.264, and NOT x264?
Review your settings in Wirecast against Cenk's blog post here: http://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/

licode: publishing a stream not working

I am developing a video conference application using licode with multiple users (say 4).
I want every user to be able to view his own webcam video, but to publish his video into the conference room only when he gets permission.
I get access to the camera using the following:
localStream.init();
localStream.show("myVideo");
This works fine.
A script decides which user gets permission to publish a stream; in that script I use the following code to publish the user's stream:
room.publish(localStream);
but the user's stream is not being published to the room. Please tell me what I am doing wrong.
Also, is there any way to check how many streams are in the room?
Thanks
The localStream is always available and can be used to publish at any time, so recheck your code. I would suggest using setTimeout and publishing the stream 30 seconds after your localStream is generated; I am sure this will work. A more robust pattern from licode's basic example is sketched below.
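For reference, licode's basic example publishes only after camera access is accepted and the room connection is established (event names as in the licode client API; they may differ across versions):

// Sketch based on licode's basic example: wait for 'access-accepted'
// and 'room-connected' instead of publishing on a fixed timeout.
var localStream = Erizo.Stream({ audio: true, video: true, data: true });
var room = Erizo.Room({ token: token });  // token comes from your own server

localStream.addEventListener('access-accepted', function () {
  localStream.show('myVideo');
  room.connect();
});

room.addEventListener('room-connected', function (roomEvent) {
  // roomEvent.streams lists the streams already in the room, so
  // roomEvent.streams.length tells you how many there are.
  room.publish(localStream);
});

localStream.init();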
