Is there any way to shut down a YouTube video that you've embedded on your page? It seems like they play to the end, no matter what. What I'd like is some way to use JavaScript to send a STOP signal to the YouTube player so it completely disengages from the video and shows whatever was there before the video was started. An END signal from the video player would also be nice, one that calls my JavaScript when the video finishes.
By the way, I notice that when I right-click the playing video, one option is "About the HTML5 Player", so apparently YouTube is using the HTML5 player. That might make the task of communicating with the player via JavaScript a bit easier.
Thanks for any ideas.
Yes, there is a way. Use what the official documentation says: player.stopVideo(). If you have a problem with it, see "Stop a youtube video with jquery?" and "How can I stop a video with Javascript in Youtube?", as they not only contain solutions to common problems but also discuss alternative ways of achieving the same goal.
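In case a concrete example helps, here is a minimal sketch using the YouTube IFrame Player API. The div id, video ID, and the "close" button wiring are placeholders; stopVideo() is the STOP signal you asked about, and the ENDED state change is the END signal.

```javascript
// Minimal sketch using the YouTube IFrame Player API.
// Assumes a <div id="player"></div> on the page; the video ID is a placeholder.

// Load the IFrame API script asynchronously.
var tag = document.createElement('script');
tag.src = 'https://www.youtube.com/iframe_api';
document.head.appendChild(tag);

var player;

// Called automatically by the API once it has loaded.
function onYouTubeIframeAPIReady() {
  player = new YT.Player('player', {
    videoId: 'M7lc1UVf-VE', // placeholder video ID
    events: {
      onStateChange: function (event) {
        // This is the "END signal": fires when playback reaches the end.
        if (event.data === YT.PlayerState.ENDED) {
          console.log('Video finished');
          // e.g. restore whatever content the player replaced.
        }
      }
    }
  });
}

// Call this from your own button to send the "STOP signal" and tear the player down.
function stopAndRemoveVideo() {
  player.stopVideo();  // stops playback
  player.destroy();    // removes the player iframe from the page
}
```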
I hope you are doing well!
I am working on an eLearning website and came across the topic of video loading. Since videos come in various sizes, it would be impossible to make the user wait for the entire video to download before they start watching, so it must be delivered as a stream where the video keeps loading content as the user watches (similar to YouTube, I guess). However, I am failing to find out how this works. I've been recommended SCORM and xAPI to help with this, but I am only finding help on how to upload SCORM files or how to write xAPI code, not on how to set them up on our website.
How can we make our videos download as the User watches? Are SCORM and xAPI actually what we should be looking for?
For context, we will be using React JS for our Frontend and will be saving the videos on a server.
I would greatly appreciate any advice you have and thank you for your time!
We tried using xAPI and SCORM; however, we don't understand how they might help.
SCORM and xAPI by themselves are not going to assist you with this in general. To stream video via an eLearning course, you will need to use a video player (such as the HTML5 video player or video.js) that understands streaming video protocols, and to encode the video files in a format supported by that player. I would suggest reading about HLS, for instance; I didn't read the entire page, but this is a good place to start: https://www.dacast.com/blog/hls-streaming-protocol/
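As a rough sketch of what playback could look like once the videos are encoded as HLS: the snippet below uses the hls.js library against a placeholder playlist URL, with a native fallback for Safari. In a React app you would typically run this inside a useEffect hook against a video element ref.

```javascript
// Minimal sketch, assuming the video has been encoded as HLS and the playlist
// lives at a hypothetical URL; uses the hls.js library (npm install hls.js).
import Hls from 'hls.js';

const video = document.getElementById('video');                    // plain <video> element
const src = 'https://example.com/videos/lesson1/master.m3u8';      // placeholder URL

if (Hls.isSupported()) {
  // Most browsers: hls.js fetches the playlist and feeds segments to the video
  // element via Media Source Extensions, so playback starts before the full download.
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari/iOS play HLS natively.
  video.src = src;
}
```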
A traditional eLearning course, such as you would have with SCORM, is going to provide a reasonable way to wrap the playing of video such that it can be launched for a learner via an LMS and may capture data such as completion. xAPI is probably suggested because it provides a more robust way of enabling the capture of interaction data such as when the learner plays, pauses, or seeks in a video. My preferred approach for doing this is to leverage cmi5, and there is an example of xAPI video profile usage within a cmi5 course in the Project CATAPULT sample content, see https://github.com/adlnet/CATAPULT/tree/main/course_examples. It could be adapted to leverage something like HLS and get streaming capability. Confirm with your LMS of choice ahead of time whether it supports cmi5 as adoption is still lower than for SCORM.
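To make the xAPI part more concrete, here is a rough sketch of what a single statement for a pause event could look like under the xAPI video profile. The LRS endpoint, credentials, actor, and activity IDs are all placeholders, and the exact verb and extension IRIs should be checked against the video profile (and the cmi5 example linked above).

```javascript
// Rough sketch: send one "paused" xAPI statement to an LRS.
// Endpoint, credentials, actor, and activity id are placeholders.
const statement = {
  actor: { mbox: 'mailto:learner@example.com', name: 'Example Learner' },
  verb: {
    id: 'https://w3id.org/xapi/video/verbs/paused',   // verb from the xAPI video profile
    display: { 'en-US': 'paused' }
  },
  object: {
    id: 'https://example.com/courses/lesson1/video',
    definition: { type: 'https://w3id.org/xapi/video/activity-type/video' }
  },
  result: {
    extensions: {
      // playback position in seconds at the moment of the pause
      'https://w3id.org/xapi/video/extensions/time': 42.5
    }
  }
};

fetch('https://lrs.example.com/xapi/statements', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-Experience-API-Version': '1.0.3',
    'Authorization': 'Basic ' + btoa('key:secret')    // placeholder credentials
  },
  body: JSON.stringify(statement)
});
```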
SCORM Cloud (a bit of a misnomer, https://cloud.scorm.com/) provides built-in video handling via the cmi5 mechanism and will soon support streaming video from sources beyond just YouTube, without the need to author a course separately.
Is there any possibility to create a YouTube playlist that works in real time?
All viewers would watch the same content at the same time, as with traditional television.
https://viloud.tv offers this feature, but I wonder whether there is some script or way of embedding to achieve the same thing.
You will probably have to start a live stream.
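Once a live stream is running, every viewer who embeds it sees the same moment of the broadcast, which gives you the TV-like behaviour. A minimal sketch, assuming you already have a live broadcast and know its video ID (the ID and container element below are placeholders):

```javascript
// Embed a running YouTube live stream so every viewer is watching the same moment.
// "LIVE_VIDEO_ID" and "tv-container" are placeholders.
const iframe = document.createElement('iframe');
iframe.width = 640;
iframe.height = 360;
iframe.src = 'https://www.youtube.com/embed/LIVE_VIDEO_ID?autoplay=1';
iframe.allow = 'autoplay; encrypted-media';
iframe.allowFullscreen = true;
document.getElementById('tv-container').appendChild(iframe);
```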
I am building a web application where I will have multiple videos. Besides that, there are plenty of other things I want to be able to do, like clicking on a video and saving a video tag so that it shows up next time for other users who watch the video (like YouTube), or pausing the video, getting the time at which it was paused, then adding a comment and saving the time and the comment in my database.
Is it possible to do this with just Ruby on Rails, or do I need to use an API or other tools? I will also want to do some more complex video manipulations, but for now these are enough.
I am citing an example for the general HTML5 video tag, which supports a few of the popular video formats. But this will be applicable across other popular video players, like Flowplayer, as well.
Have a look at link
You can send an AJAX request to your controller on every play/pause button press, which will save the time at which the video was paused, so you can record that in your database. This link will give you an example of most of the properties that you can play with :)
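For example, here is a sketch of the client side: listen for the HTML5 video's pause event, grab currentTime, and POST it with a comment to a hypothetical Rails endpoint (the /video_comments route, data attribute, and comment field are assumptions).

```javascript
// Sketch: capture the pause position and send it, plus a comment, to a
// hypothetical Rails endpoint (POST /video_comments).
const video = document.getElementById('my-video');   // plain HTML5 <video> element

video.addEventListener('pause', () => {
  const pausedAt = video.currentTime;                 // seconds into the video

  fetch('/video_comments', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Rails expects the CSRF token for non-GET requests
      'X-CSRF-Token': document.querySelector('meta[name="csrf-token"]').content
    },
    body: JSON.stringify({
      video_id: video.dataset.videoId,                       // e.g. <video data-video-id="123">
      paused_at: pausedAt,
      comment: document.getElementById('comment-box').value  // hypothetical comment field
    })
  });
});
```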
I would like to build a new player with the following requirements:
Call to action: Something like showing suggested videos at the end of the video (like YouTube).
Playlist: The player should have a playlist (like YouTube).
Playback speed: Playback speed should be adjustable, as in VLC media player.
Closed captioning: Something like the closed captions that appear during video playback on YouTube.
Should play from various service providers' CDNs.
Will YouTube's documentation at https://developers.google.com/youtube help us in any way with these requirements?
There are lots of open source and/or free players that already do this, so there would be no reason to build your own; have a look at Flowplayer, JW Player, videoJS, and others that a Google search pulls up. If you're insistent on building your own, the YouTube documentation may give you some ideas as to what to look for in your own player, but the function calls will be different, as you'll have to use the JavaScript media APIs (for HTML5 video).
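To illustrate, here is a sketch of how the plain HTML5 media API already covers several items on your list: playback speed via playbackRate, captions via a <track> element, and end-of-video suggestions via the ended event. The element IDs and the suggestions panel are placeholders.

```javascript
// Sketch using the plain HTML5 media API; element IDs are placeholders.
const video = document.getElementById('player');   // <video> with <source> and <track kind="captions">

// Playback speed, as in VLC (0.5x, 1x, 1.5x, 2x, ...), driven by a <select> element.
document.getElementById('speed-select').addEventListener('change', (e) => {
  video.playbackRate = parseFloat(e.target.value);
});

// Toggle closed captions coming from a <track kind="captions" src="captions.vtt"> child.
function toggleCaptions(on) {
  for (const track of video.textTracks) {
    if (track.kind === 'captions') track.mode = on ? 'showing' : 'hidden';
  }
}

// "Call to action": show suggested videos when playback finishes.
video.addEventListener('ended', () => {
  document.getElementById('suggestions').style.display = 'block';
});
```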
I am trying to create a video player for iOS, but with some additional audio track reading. I have been checking out MPMoviePlayerController and also AVPlayer in AV Foundation, but it's all kinda vague.
What I am trying to do is play a video (from a local .mp4) and, while the movie is playing, get the current audio buffer/frames, so I can do some calculations and other (not video/audio related) actions that depend on the currently playing audio. This means that the video should keep playing, with its audio tracks, but I also want the live raw audio data for calculations (e.g., getting the amplitude at certain frequencies).
Does anyone have an example or hints on how to do this? Of course I checked out Apple's AV Foundation documentation, but it was not clear enough for me.
After a really (really) long time Googling, I found a blog post that describes MTAudioProcessingTap. Introduced in iOS 6.0, it solves my problem perfectly.
The how-to/blogpost can be found here : http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I hope it helps someone else now. The only thing popping up for me when Googling (with a lot of different terms) was my own post here, and as long as you don't know MTAudioProcessingTap exists, you don't know how to Google for it :-)