I saw this announcement today about YouTube streaming 360 video: https://youtube.googleblog.com/2016/04/one-step-closer-to-reality-introducing.html
Does GCSVideoView's loadFromUrl: work with it? The code below, modified from the VideoWidget iOS sample, doesn't show the 360 video...
// Note: the string literal needs @ (not #), and a remote URL must be
// built with URLWithString:, not initFileURLWithPath:.
NSString *videoPath = @"https://www.youtube.com/watch?v=Db-uq08ydI4";
[_videoView loadFromUrl:[NSURL URLWithString:videoPath]];
Excited to see this working! Thanks!
Before anything else, check this question's answers.
I'm really looking forward to seeing a working answer to this... I'm working toward a possible solution myself.
What I found out is that GCSCardboardView is an extension of GLSurfaceView. All the Cardboard viewports and projections sit on top of OpenGL. I'm no expert, but the way to go (for me) is "how to show video through an OpenGL view".
The second step would be to create a pixel buffer in OpenGL that the video stream can be rendered into... that's where I'm stuck.
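The direction I'm exploring looks roughly like this (untested sketch; playerItem and eaglContext stand in for your own objects): AVPlayerItemVideoOutput hands you each decoded frame as a CVPixelBuffer, and CVOpenGLESTextureCache wraps that buffer in an OpenGL ES texture you can draw with.

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// One-time setup: ask the player item for BGRA pixel buffers.
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[playerItem addOutput:videoOutput]; // playerItem: your AVPlayerItem

// One-time setup: a texture cache tied to your EAGLContext.
CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext, NULL, &textureCache);

// Per frame, e.g. from a CADisplayLink callback:
CMTime now = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:now itemTimeForDisplay:NULL];
    CVOpenGLESTextureRef texture;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGBA,
        (GLsizei)CVPixelBufferGetWidth(pixelBuffer), (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
        GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ...draw the textured sphere/quad for the 360 projection here...
    CFRelease(texture);
    CVPixelBufferRelease(pixelBuffer);
}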
The Google VR SDK is made to turn a 360º video file into a VR environment. A YouTube watch page is HTML, not a video stream. If you could access the stream directly, you would be able to use GVRVideoView.
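To make the distinction concrete: the same call works once it is given a playable resource. The URL below is a placeholder for any directly accessible 360 video file, not a real link.

// A direct URL to an actual video file (placeholder address).
NSString *streamPath = @"https://example.com/videos/sample-360.mp4";
[_videoView loadFromUrl:[NSURL URLWithString:streamPath]];
// A YouTube watch URL returns an HTML page, so the same call cannot play it.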
Is there any possibility of buffering a live stream?
I searched a lot but didn't get any official answer. Different people have different views on this.
Many people said that it is possible from iOS 10, but I did not get how.
Some have answered to use a caching proxy, but I did not understand that either.
Thanks for your valuable time
I'll share a good answer I found:
https://stackoverflow.com/a/36538295/2037169
To sum it up: this is a WWDC 2016 session on how to work with HLS from iOS 10.
I did a POC today using this example and it works great.
I'm currently looking into how to use HLS in tvOS.
The AVFoundation class they used in the example seems to apply to iOS only:
NS_CLASS_AVAILABLE_IOS(9_0) __TVOS_PROHIBITED
@interface AVAssetDownloadTask : NSURLSessionTask
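Roughly how that class is driven on iOS 10, as far as I can tell from the session (the playlist URL and identifiers are placeholders, and self must implement AVAssetDownloadDelegate):

#import <AVFoundation/AVFoundation.h>

// A background session dedicated to asset downloads.
NSURLSessionConfiguration *config = [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"hls-download-session"];
AVAssetDownloadURLSession *session = [AVAssetDownloadURLSession sessionWithConfiguration:config
                                                                   assetDownloadDelegate:self
                                                                           delegateQueue:[NSOperationQueue mainQueue]];

// Placeholder HLS master playlist URL.
AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL URLWithString:@"https://example.com/stream/master.m3u8"]];

// iOS 10+: create and start the download task (nil options = default media selections).
AVAssetDownloadTask *task = [session assetDownloadTaskWithURLAsset:asset
                                                        assetTitle:@"My stream"
                                                  assetArtworkData:nil
                                                           options:nil];
[task resume];

// In the class implementing AVAssetDownloadDelegate: called with the
// on-disk location of the downloaded asset.
- (void)URLSession:(NSURLSession *)session
    assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask
    didFinishDownloadingToURL:(NSURL *)location {
    // Persist `location` so the asset can be played back offline later.
}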
I am wondering if there is a possibility to show only a fragment of a video stream. Let's say we have a 1920x1024 video and I want to show only the rect from (0,0) to (600,400).
Can I achieve this? What kind of libraries can do that? I tried to find something in VLC but I did not find anything.
Something similar to this:
https://www.youtube.com/watch?v=8nNoUDH2k3Y
Thanks
You should be able to achieve this, yes; however, I am not quite sure whether you can get exactly the same dynamic cropping effect shown in the video you linked to.
For VLC, the term crop is what you're looking for. The wiki is quite minimal, but it should get you started. Otherwise check out the command line help (vlc -H) and look for the crop and padd parameters.
A crop such as yours, (0,0) to (600,400), would be achieved through the video filter named croppadd or the vout filter crop (depending on your VLC version and platform). croppadd takes the number of pixels to cut from each edge, so for a 1920x1024 source that is 1024-400=624 off the bottom and 1920-600=1320 off the right:
vlc $video_input --sout='#transcode{vcodec=XXX,vb=XXX,fps=XX.X,width=1920,height=1024,vfilter=croppadd{croptop=0,cropbottom=624,cropleft=0,cropright=1320}}:standard{access=XXX,mux=XX,dst=XXX.XXX.XXX.XXX}'
The relevant part is the vfilter option:
vfilter=croppadd{croptop=0,cropbottom=624,cropleft=0,cropright=1320}
Another way to use it, as stated in the wiki:
vlc $video_input --video-filter=croppadd --croppadd-croptop=0 --croppadd-cropbottom=624 --croppadd-cropleft=0 --croppadd-cropright=1320
Library-wise it would be libvlc and the corresponding LibVLC video controls libvlc_video_get_crop_geometry and libvlc_video_set_crop_geometry - see the LibVLC SDK documentation.
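In case it helps, a rough sketch against the LibVLC C API (the MRL is a placeholder; I believe the geometry string is of the form WxH+left+top, but double-check it against your libvlc version):

#include <vlc/vlc.h>
#include <unistd.h>

int main(void) {
    libvlc_instance_t *vlc = libvlc_new(0, NULL);
    // Placeholder MRL: anything VLC can open (file, HTTP stream, ...).
    libvlc_media_t *media = libvlc_media_new_location(vlc, "http://example.com/stream.mp4");
    libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
    libvlc_media_release(media);

    libvlc_media_player_play(player);
    sleep(1); /* crude: give the video output a moment to appear */

    /* Show only the 600x400 rect anchored at (0,0). */
    libvlc_video_set_crop_geometry(player, "600x400+0+0");

    sleep(30); /* play the cropped stream for a while */
    libvlc_media_player_stop(player);
    libvlc_media_player_release(player);
    libvlc_release(vlc);
    return 0;
}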
Maybe this leads you in the right direction. It certainly depends on your setup: are you scripting and re-transcoding a video stream, or are you trying to integrate a library into your code?
I am developing an application for playing video from a URL in BB10 Cascades. I know it's very simple to play it in the media player. Please help me play a video in the default player.
I strongly recommend that you review the documentation provided on the BB10 Native micro-site that you will find here:
http://developer.blackberry.com/native/
I know there is a lot to look at, but it does seem to me to be quite logically presented. I just looked around and fairly quickly found this:
http://developer.blackberry.com/native/documentation/cascades/graphics_multimedia/audio_video/playing_audio_or_video.html
which seems to answer your question. Alternatively, you may want to just invoke the native player, in which case you need to look here:
http://developer.blackberry.com/native/documentation/cascades/device_platform/invocation/invocation_framework.html
I have a website that has a variety of embedded YouTube videos. When a user pauses a given video, I want a screenshot to be taken of the playing video. I've taken many approaches to this problem, such as copying the video frame to a canvas (this doesn't work because the videos are external to my site), and also using FFmpeg and FFmpeg-PHP. The latter two, although very powerful, also do not work, as the given piece of media has to be hosted on my server.
I'm at my wits' end about what to do; I've spent countless hours trying to do this, and I'm ready to accept defeat.
Any ideas?
Regards,
Andre.
There's no supported method in the YouTube Player or Data API to take a screenshot of an arbitrary frame of a video.
I used the img.youtube.com/vi path to get the image. The function getScreen basically parses the YouTube URL and grabs the &v= argument to get the video id.
Since I use the youtube.com/embed/ URL format, I had to rework the function a little to get the video id.
http://mistonline.in/wp/get-youtube-video-screenshot-using-simple-php-and-javascript/#
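The linked post does it in PHP/JavaScript, but the core of it is only string work: take the id after /embed/ and drop it into the img.youtube.com/vi/<id>/0.jpg pattern. A hypothetical equivalent (my names, not the post's getScreen), sketched in Objective-C like the other snippets in this thread:

// Hypothetical helper: build a thumbnail URL from a youtube.com/embed/<id> URL.
- (NSURL *)thumbnailURLForEmbedURL:(NSURL *)embedURL {
    // e.g. https://www.youtube.com/embed/VIDEO_ID -> @"VIDEO_ID"
    NSString *videoId = [embedURL lastPathComponent];
    // 0.jpg is the large default thumbnail; 1.jpg, 2.jpg, 3.jpg are smaller frames.
    NSString *thumb = [NSString stringWithFormat:@"https://img.youtube.com/vi/%@/0.jpg", videoId];
    return [NSURL URLWithString:thumb];
}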
I am trying to create a video player for iOS, but with some additional audio track reading. I have been checking out MPMoviePlayerController, and also AVPlayer in AV Foundation, but it's all kinda vague.
What I am trying to do is play a video (from a local .mp4), and while the movie is playing get the current audio buffer/frames, so I can do some calculations and other (not video/audio related) actions that depend on the currently playing audio. This means that the video should keep on playing, with its audio track, but I also want the live raw audio data for calculations (e.g. getting the amplitude at certain frequencies).
Does anyone have an example or hints on how to do this? Of course I checked out Apple's AV Foundation documentation, but it was not clear enough for me.
After a really (really) long time Googling, I found a blog post that describes MTAudioProcessingTap. Introduced in iOS 6.0, it solves my problem perfectly.
The how-to/blogpost can be found here : http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
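In outline, the wiring looks like this (condensed from what the post describes; an untested sketch with the optional callbacks omitted, and asset/playerItem standing in for your own objects):

#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>

// Process callback: runs on the audio render thread with the raw PCM about to be played.
static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio into bufferListInOut...
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    // ...then inspect it here (amplitude, FFT, etc.); playback continues untouched.
}

// Create the tap and attach it to the audio track through an AVAudioMix.
MTAudioProcessingTapCallbacks callbacks = {
    .version = kMTAudioProcessingTapCallbacksVersion_0,
    .process = tapProcess,
    // .init/.finalize/.prepare/.unprepare left NULL in this sketch
};
MTAudioProcessingTapRef tap;
MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap);

AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
AVMutableAudioMixInputParameters *params = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
params.audioTapProcessor = tap;

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[ params ];
playerItem.audioMix = audioMix; // the video keeps playing; the tap sees its audio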
I hope it helps someone else now... The only thing popping up for me when Googling (with a lot of different terms) was my own post here. And as long as you don't know MTAudioProcessingTap exists, you don't know how to Google for it :-)