I was asked recently if it's possible to sniff the S3 playback URL of a video that is shown in a Java application. So I loaded up Wireshark, launched the Java app, and started recording.
But a few questions came up:
How would I filter for the video URL?
How do I tell whether a sniffed packet belongs to a video stream?
How can I rebuild the AWS S3 URL that was picked up? I assume it's a private URL; would the query parameters providing the access key be picked up as well?
Or is this possible at all?
Thanks for any feedback or suggestions.
I am trying to build a web app that lets users easily add text (as an open caption) and other assets as overlays, in real time, to their YouTube live stream video.
They will use their camera to record their video, and select from my app which text should be added to the video.
Then, the video will be sent to YouTube Live through their API.
Here are my questions:
First of all, I was wondering if mixing video + subtitles and sending the result to YouTube's RTMP URL can be done from the client side, so it's simple and lightweight.
Second, should I encode the output being sent to YouTube? Can this be done from the browser too?
I'm only seeing a few Node.js frameworks, and even those are not very mature (or is WebCodecs meant for this purpose?). Is a web app a poor choice for this task?
Lastly, if I do need a server to process the video, where should the encoding happen (on the user's machine, on the server, or both)? Is my server likely to be the bottleneck (rather than YouTube's infrastructure), since video files are huge and my server is limited?
I am new to video streaming, so please excuse my lack of understanding of the subject. Also, if there are any good resources for my problem, please share them with me.
First of all, I was wondering if mixing video + subtitles and sending the result to YouTube's RTMP URL can be done from the client side, so it's simple and lightweight.
You can do the video compositing and audio mixing and whatnot, but browsers don't support RTMP. To get the data to an RTMP server, you need to send it to your own server, where it is proxied off to the final URL.
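A minimal sketch of that proxy, assuming Node.js with the ws package and an ffmpeg binary on the PATH (the port, the ingest URL, and the codec choices below are placeholders, not anything YouTube-specific):

// WebSocket-to-RTMP proxy sketch: the browser ships encoded chunks over
// a WebSocket, and one ffmpeg process per connection pushes them to RTMP.
import { WebSocketServer } from "ws";
import { spawn } from "child_process";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket) => {
  const ffmpeg = spawn("ffmpeg", [
    "-i", "pipe:0",   // browser chunks arrive on stdin
    "-c:v", "copy",   // works only if the browser already produced H.264
    "-c:a", "aac",    // FLV/RTMP expects AAC audio
    "-f", "flv",
    "rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY", // placeholder
  ]);

  socket.on("message", (chunk) => ffmpeg.stdin.write(chunk as Buffer));
  socket.on("close", () => ffmpeg.stdin.end());
});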
They will use their camera to record their video, and select from my app which text should be added to the video.
Yeah, that's no problem at all. Draw everything to a canvas every frame.
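As a rough sketch of that render loop (the element IDs and caption text are made up for illustration):

// Composite the camera feed plus the user's caption onto a canvas each frame.
const video = document.querySelector("#camera") as HTMLVideoElement;
const canvas = document.querySelector("#compositor") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

let caption = "Hello, world!"; // whatever text the user picked in your UI

function drawFrame(): void {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height); // camera frame
  ctx.font = "48px sans-serif";
  ctx.fillStyle = "white";
  ctx.fillText(caption, 40, canvas.height - 40); // the open caption
  requestAnimationFrame(drawFrame);
}
requestAnimationFrame(drawFrame);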
Second, should I encode the output being sent to YouTube?
Yes, you must. Check out the MediaRecorder API.
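For instance, something along these lines, reusing the canvas from above and a WebSocket to the hypothetical proxy server (the mime type and chunk interval are assumptions to tune per browser):

// Capture the composited canvas plus microphone audio, encode it with
// MediaRecorder, and ship each encoded chunk to the proxy server.
async function startStreaming(canvas: HTMLCanvasElement): Promise<void> {
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
  const stream = canvas.captureStream(30); // 30 fps video track
  mic.getAudioTracks().forEach((track) => stream.addTrack(track));

  const ws = new WebSocket("wss://example.com/ingest"); // placeholder URL
  const recorder = new MediaRecorder(stream, {
    mimeType: "video/webm;codecs=h264,opus", // check isTypeSupported() first
  });
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) ws.send(e.data);
  };
  recorder.start(1000); // emit a chunk roughly every second
}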
Lastly, if I do need a server to process the video, where should the encoding happen (on the user's machine, on the server, or both)?
The video has to be encoded client-side to get to the server in the first place. The server can then hopefully just repackage it into FLV and send it along. If the browser doesn't support H.264 in its MediaRecorder API, then you'll be left with an intermediate codec like VP8, and you'll have to transcode server-side.
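In ffmpeg terms, the difference looks roughly like this (a sketch; INGEST_URL is a placeholder for the RTMP ingest address):

// If the browser produced H.264, just remux into FLV (cheap).
const INGEST_URL = "rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY"; // placeholder
const remuxArgs = ["-i", "pipe:0", "-c:v", "copy", "-c:a", "aac", "-f", "flv", INGEST_URL];

// If it produced VP8, transcode to H.264 first (much more CPU).
const transcodeArgs = ["-i", "pipe:0", "-c:v", "libx264", "-preset", "veryfast",
  "-c:a", "aac", "-f", "flv", INGEST_URL];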
A few years ago, I wrote a tutorial covering all of these steps: https://github.com/fbsamples/Canvas-Streaming-Example. The tutorial is in the context of Facebook, but it should teach you the concepts.
I'm building an iOS app and Android app that will display a series of private videos. Someone will purchase the app for x amount and then have access to the videos through that app only.
I already know a couple of ways to do that part. But the real trick is hiding the video URLs from traffic sniffers and the like. I don't want anyone to be able to detect the video URLs, or at the very least I want the endpoint to reject any request without an auth token.
So I could build my own Node/Express server and incorporate Wowza, maybe with Amazon for file storage, but that is a lot of work.
So what is the simplest solution to stream my videos to mobile without people being able to load up the videos outside of the app?
It looks like you'll need to implement some sort of authentication system, so that even if someone gets the video URL somehow, they will be unable to view it without the authentication key.
Your videos should be hosted in a directory on your server that is inaccessible from the web. Then use some sort of index page which takes a parameter for the video ID and does the authentication before serving up the video file contents.
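A minimal sketch of that gatekeeper, assuming Node.js with Express and a JWT-style token (the directory, secret handling, and token scheme here are illustrative, not a complete auth design):

// Serve video files from a directory outside the web root, but only after
// validating the caller's auth token.
import express from "express";
import jwt from "jsonwebtoken";
import path from "path";

const app = express();
const VIDEO_DIR = "/srv/private-videos";        // not reachable from the web
const SECRET = process.env.JWT_SECRET ?? "dev"; // placeholder secret

app.get("/videos/:id", (req, res) => {
  const token = req.header("Authorization")?.replace("Bearer ", "");
  try {
    jwt.verify(token ?? "", SECRET); // reject requests without a valid token
  } catch {
    return res.status(401).send("Unauthorized");
  }
  // basename() blocks path traversal; sendFile handles the Range requests
  // that video players rely on for seeking.
  const file = path.join(VIDEO_DIR, path.basename(req.params.id) + ".mp4");
  res.sendFile(file);
});

app.listen(3000);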
I want to know the details of how Mobicents Media Server plays audio from a URL. Which method does it use to stream from a URL? I understand the local-storage case, but I don't understand how it works for remote storage (a URL). I didn't find anything in the user guide or on the website. It would be very helpful if you could give me the details or suggest a web link.
Thanks.
In order for the Media Server to play a file, you need to send a Play signal in which you specify the URL of the file to be played. If the file is local, the URL format will be file://path_to_file/filename.wav; if the file is hosted remotely, the URL format will be http://something/filename.wav.
Upon receiving the Play request, the Media Server will ask the underlying AudioPlayer to process the URL, first guaranteeing that it is not malformed and that the file type is supported: .wav, .gsm, .tone, .mov, .mp4, .3gp. If all is fine, the player will open a stream to the file right away.
Finally, the AudioPlayer is activated so it can start processing the file and transmitting audio to the remote peer.
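For illustration, the Play signal is sent over MGCP and looks roughly like this (the transaction ID, endpoint name, and URL are made up, and the exact audio-package name should be checked against the Mobicents docs):

RQNT 1201 mobicents/ivr/1@mediaserver.example.com MGCP 1.0
X: 1A
S: AU/pa(an=http://example.com/prompts/welcome.wav it=1)

Here AU/pa is the PlayAnnouncement signal, and the an parameter carries the file URL that the AudioPlayer will validate and stream.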
You can find a fully detailed discussion of this topic in the Mobicents public forum as well (including links to the GitHub code):
https://groups.google.com/d/msg/mobicents-public/4zuUOM3zHsM/fQM6o80JEXwJ
Let me know if this helps.
Regards.
I am trying to make a Periscope-like app (not literally, but the technical requirements are alike) where users can start streaming quickly from their iPhone to an unknown number of users, with both ends on mobile. I am trying to use Azure Media Services for live video streaming, but even after reading pages of documentation I'm stuck.
I'm using VideoCore (https://github.com/jgh-/VideoCore) to publish from an iOS device to the RTMP server. Locally (using Wowza), I can just connect to the local server with the username and password I set, as shown:
vcSession = [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720) frameRate:30 bitrate:1000000 useInterfaceOrientation:NO];
[self.view addSubview:vcSession.previewView];
vcSession.previewView.frame = self.view.bounds;
vcSession.delegate = self;
[vcSession startRtmpSessionWithURL:@"rtmp://172.20.10.2:1935/live?rtmpauth=test:test" andStreamKey:@"test"];
Here the rtmpauth parameter uses the username:password format; I've set both to test on my local server, and it works. In Azure, I've created a channel named test, and I've got the following Ingest URL:
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string
In Wirecast, I'm able to stream to that URL (though it is EXTREMELY slow and the connection is frequently lost; I don't know why) by selecting Azure Media Services in Output Settings and typing in that Ingest URL. In iOS, I have no idea how to connect to Azure Media Services.
In the startRtmpSessionWithURL:andStreamKey: method, I've tried all possible combinations of URL and stream key, but no luck. I have no idea what my username/password is (nothing is given on the Azure side), what the stream key is (I've tried test, live, and an empty string), or what that long hexadecimal string is (some sources say it's called a locator, though).
What is the correct format of RTMP URL and stream key when connecting to Azure Media Services for streaming?
I'll find someone to help you. I think you are just missing a stream name after the long hex string in the URL.
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string/[YOUR-CUSTOM-STREAM-NAME-Anything Really!]
Also, do you have any control over the encoding settings? It's possible that some encoding settings are not right. We have not tested with that VideoCore library, so it may also be that there is a slight variation in the RTMP protocol (it is very poorly documented, and there is a lot of missing information out there).
I'm curious why your Wirecast setup is having trouble as well; that doesn't sound good to start with. A network issue? Are you using the proper encoder preset, with H.264 and NOT x264 selected?
Review your settings in Wirecast against Cenk's blog post here: http://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/
I'm trying to play a live stream that is being sent out by a Wowza server; we are using RTMP to handle the streams. We have an equivalent app that works on Android, and the way it works is by connecting to the server via the URL plus two parameters: one to identify the actual stream to play, and one to determine whether you are allowed to see the live video. After the connection is attempted, the server does a callback, sending an integer used to check whether the user is logged in. Once the check has passed, the video is played.
I have no idea how to handle the callback or how to properly set up the connection so that it takes both parameters and the URL.
One big issue is that the Wowza server was set up by a third party that we are no longer in contact with, so I have no idea how the actual server is configured.
Any suggestions would be greatly appreciated.
I recommend a few steps to start with:
Determine how the Android app actually works. Is the server you speak of, to which it sends the two parameters, the Wowza server? If so, it is probably a custom plugin.
Get access to the server, so that you can configure it for iOS streaming.
You'll also need to check out some documentation: http://www.wowza.com/forums/content.php?3-quick-start-guide and http://www.wowza.com/forums/content.php?217#cupertinostreaming
Once you have a better understanding of the problem, the Wowza folks are very helpful at http://www.wowza.com/forums.
Good luck!
There is no need to dig into the Android app. Wowza supports output that can be played on iOS: since iOS can't play Flash, it can't play RTMP or RTSP. However, iOS can play an HLS stream with a URL like http://myWowzaServer/myApplication/myStream/playlist.m3u8
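For example, if the client is a web page (a native app would hand the same URL to AVPlayer), a minimal sketch, with the host and stream names mirroring the example above:

// iOS Safari plays HLS natively, so the Wowza playlist URL can be assigned
// directly to a video element (desktop Chrome/Firefox would need a helper
// library such as hls.js instead).
const player = document.createElement("video");
player.src = "http://myWowzaServer/myApplication/myStream/playlist.m3u8";
player.controls = true;
player.playsInline = true; // keep playback inline on iPhone
document.body.appendChild(player);
player.play().catch(() => {
  // Autoplay may be blocked until the user interacts with the page.
});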
For more configuration detail, please visit http://emrekaratasoglu.com/php-freelance-watch-wowza-live-stream-apple-ios-mobile-phone/