I have finished video encoding using the AVFoundation framework in iOS.
Now I want to stream this video to an RTMP server using FFmpeg.
It would be a great help if any of you could post a link or sample code for achieving this.
Any other solution is also welcome.
Thanks in advance.
Here's some sample code to get you started.
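This is only a minimal sketch of the muxing side, using FFmpeg's libavformat C API from an iOS project. It assumes FFmpeg has already been built for iOS, that your AVFoundation/VideoToolbox pipeline hands you already-encoded H.264 packets with valid timestamps, and that the RTMP URL, resolution, and timebase shown are placeholders you will replace.

#include <libavformat/avformat.h>

// Open an FLV-over-RTMP output and prepare a single H.264 video stream.
// The URL, resolution, and timebase are placeholders.
static AVFormatContext *open_rtmp_output(const char *url) {
    avformat_network_init();                       // once per process; needed for network protocols

    AVFormatContext *ofmt_ctx = NULL;
    avformat_alloc_output_context2(&ofmt_ctx, NULL, "flv", url);
    if (!ofmt_ctx) return NULL;

    AVStream *video_stream = avformat_new_stream(ofmt_ctx, NULL);
    video_stream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
    video_stream->codecpar->codec_id   = AV_CODEC_ID_H264;
    video_stream->codecpar->width      = 1280;     // placeholder
    video_stream->codecpar->height     = 720;      // placeholder
    video_stream->time_base            = (AVRational){1, 90000};
    // The SPS/PPS extradata from your encoder also needs to go into
    // video_stream->codecpar->extradata for FLV/RTMP output.

    if (avio_open(&ofmt_ctx->pb, url, AVIO_FLAG_WRITE) < 0) return NULL;
    if (avformat_write_header(ofmt_ctx, NULL) < 0) return NULL;
    return ofmt_ctx;
}

// Call this for every encoded packet coming out of your encoder.
static void send_packet(AVFormatContext *ofmt_ctx, AVPacket *pkt) {
    pkt->stream_index = 0;                         // the single video stream
    av_interleaved_write_frame(ofmt_ctx, pkt);     // pkt->pts/dts must already be set
}

// When you are done streaming:
static void close_rtmp_output(AVFormatContext *ofmt_ctx) {
    av_write_trailer(ofmt_ctx);
    avio_closep(&ofmt_ctx->pb);
    avformat_free_context(ofmt_ctx);
}

Note that this targets a recent FFmpeg (3.1+, which uses codecpar); older releases configure the stream through st->codec instead, and also require av_register_all() at startup.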
I saw this announcement today about YouTube streaming 360 video: https://youtube.googleblog.com/2016/04/one-step-closer-to-reality-introducing.html
Does GCSVideoView loadFromUrl: work? The code below, modified from the VideoWidget iOS sample, doesn't show the 360 video...
NSString *videoPath = @"https://www.youtube.com/watch?v=Db-uq08ydI4";
[_videoView loadFromUrl:[[NSURL alloc] initFileURLWithPath:videoPath]];
Excited to see this working! Thanks!
Before anything, check this question's answers.
I'm really looking forward to seeing a working answer to this... I'm working toward a possible solution myself.
What I found out is that GCSCardboardView is an extension of GLSurfaceView. All the Cardboard viewports and projections are on top of OpenGL. I'm no expert, but the way to go (for me) is 'How to show videos through an OpenGL view'.
The second step would be to create a pixel buffer in OpenGL to support the video stream... that's where I'm stuck.
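For what it's worth, one common way to get video frames into OpenGL ES on iOS is to pull CVPixelBuffers from an AVPlayerItemVideoOutput and wrap them in textures with a CVOpenGLESTextureCache. The sketch below assumes _videoOutput and _textureCache instance variables, skips error handling, and is only a starting point, not the GVR SDK's own plumbing.

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// Assumed instance variables:
//   AVPlayerItemVideoOutput *_videoOutput;
//   CVOpenGLESTextureCacheRef _textureCache;

- (void)setUpVideoOutputForItem:(AVPlayerItem *)item context:(EAGLContext *)context {
    NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    _videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
    [item addOutput:_videoOutput];
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, &_textureCache);
}

// Called once per display-link tick with the current item time.
- (void)uploadFrameForTime:(CMTime)time {
    if (![_videoOutput hasNewPixelBufferForItemTime:time]) return;
    CVPixelBufferRef pixelBuffer = [_videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    if (!pixelBuffer) return;

    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL,
                                                 GL_TEXTURE_2D, GL_RGBA,
                                                 (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                                                 (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                                                 GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ...draw your textured sphere (for 360 video) or quad here...
    CFRelease(texture);
    CVPixelBufferRelease(pixelBuffer);
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}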
The Google VR SDK is made to turn a 360° video file into the VR environment. A YouTube watch page is an HTML page, not a video stream. If you could access the stream directly, you would be able to use GVRVideoView.
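In other words, something along these lines would only be expected to work with a direct link to a 360 video file (the URL below is just a placeholder), not with a youtube.com watch page:

// Placeholder URL: it must point at an actual video file/stream, not an HTML page.
NSString *videoUrlString = @"https://example.com/some-360-video.mp4";
[_videoView loadFromUrl:[NSURL URLWithString:videoUrlString]];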
Does anybody know how to use FFmpeg for live streaming on iOS? Where do I download the FFmpeg library from? Can somebody give a hint about where to start? I don't have any code in this question because I have no idea what to ask.
Any link I can start with would be a great help. All the answers to similar questions are outdated, and there are no proper tutorials either.
Is there any way to get PCM frames from a song that is playing in Deezer or Spotify, and if there is, could you briefly explain how?
I looked through both APIs for a way to do that, but I'm not having much luck tonight and haven't found an answer yet... :(
Any kind of help would be very useful, thanks a lot.
Kind Regards,
Sébastien.
Disclaimer: I work for Spotify
libspotify delivers raw PCM frames in the music_delivery callback; see the API documentation for more details. This is the default delivery mechanism for libspotify, so you don't need to do anything special to get raw PCM; that's the format the library speaks.
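For reference, a rough sketch of that callback and how it is registered in sp_session_callbacks; what you do with the audio (the ring buffer, playback, analysis) is up to you and is only hinted at here.

#include <libspotify/api.h>

// Called by libspotify whenever it has decoded audio ready.
// 'frames' is interleaved 16-bit native-endian PCM as described by 'format';
// return how many frames you consumed (return 0 to ask libspotify to back off).
static int music_delivery(sp_session *session, const sp_audioformat *format,
                          const void *frames, int num_frames) {
    size_t bytes = (size_t)num_frames * format->channels * sizeof(int16_t);
    // ...copy 'bytes' bytes from 'frames' into your own buffer here...
    (void)session; (void)bytes;
    return num_frames;
}

// Registered alongside the other callbacks when creating the session:
static sp_session_callbacks session_callbacks = {
    .music_delivery = music_delivery,
    // .logged_in, .notify_main_thread, etc. go here as well
};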
I'm not sure about the Spotify Web Apps platform, though; I'm not a JavaScript guy at all...
I have been searching for a way to stream audio and video for a while. I could find some explanations, but not a full tutorial on how to do it. Can anyone please suggest a way to do it? Tutorials or sample code would be very helpful...
Here's a fairly recent blog post on the BlackBerry Developer's Blog about the Streaming Media API, including sample code.
I am developing a Silverlight application in which I want to play YouTube videos.
Any suggestions, please? Any samples or links to refer to would be appreciated.
Thanks in advance.
Here's an interesting thread about this issue, with some samples in Silverlight 3.0 beta:
http://silverlight.net/forums/t/53008.aspx
Hope it helps.
http://slyi.silverlight.googlepages.com/youtube2.html
was working. The link it uses is now wrong, but if you change it, that example still works!