Manipulating video in Ruby on Rails

I am building a web application that will have multiple videos. Besides that, there are plenty of other things I want to be able to do, like click on a video and save a tag for it, so that it shows up next time for other users who see the video (like YouTube); or pause the video, get the time at which it was paused, add a comment, and save the time and the comment in my database.
Is this possible with just Ruby on Rails, or do I need to use an API or other tools? I will also want to do some more complex video manipulation, but for now these are enough.

I am citing an example for the generic HTML5 video tag, which supports a few of the popular video formats, but this is applicable to other popular video players like Flowplayer as well.
Have a look at the link.
You can send an AJAX request on every button (play/pause) press to your controller, which can save the time at which the video was paused, so you can record that in your database. The link will give you an example of most of the properties that you can play with :)
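For instance, a minimal sketch of that idea might look like the following (the /video_events route, the JSON payload, and the data-video-id attribute are placeholders made up for illustration, not anything from the original answer):

```javascript
// Minimal sketch: report where an HTML5 <video> was paused to a Rails endpoint.
const video = document.querySelector('video');
const csrfToken = document.querySelector('meta[name="csrf-token"]').content; // Rails CSRF meta tag

video.addEventListener('pause', () => {
  fetch('/video_events', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-CSRF-Token': csrfToken
    },
    body: JSON.stringify({
      video_id: video.dataset.videoId, // e.g. <video data-video-id="42" ...>
      paused_at: video.currentTime     // seconds into the video
    })
  });
});
```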

Related

How are videos on Youtube and such sites loaded and how is progress saved?

I hope you are doing well!
I am working on an eLearning website and came across the topic of video loading. Since videos vary in size, it would be impossible to make the user wait for the entire video to download before they start watching, so it must be served as a stream where the video keeps loading content as the user watches (similar to YouTube, I guess). However, I am failing to find out how this works. I've been recommended SCORM and xAPI to help with this, but I am only finding help on how to upload SCORM files or how to write xAPI code, not how to set them up on our website.
How can we make our videos download as the User watches? Are SCORM and xAPI actually what we should be looking for?
For context, we will be using React JS for our Frontend and will be saving the videos on a server.
I would greatly appreciate any advice you have and thank you for your time!
We tried using xAPI and SCORM; however, we aren't understanding how they might help.
SCORM and xAPI by themselves are not going to assist you with this in general. To stream video via an eLearning course you will need to use a video player (such as the HTML5 video player or video.js) that understands streaming video protocols, and to encode the video files in a format supported by that player. I would suggest reading about HLS, for instance; though I didn't read the entire page, this is a good place to start: https://www.dacast.com/blog/hls-streaming-protocol/
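As a rough illustration of what the player side of HLS looks like (the manifest URL is a placeholder, and hls.js is just one of several MSE-based players you could use):

```javascript
// Sketch of HLS playback in the browser. Safari/iOS can play HLS natively;
// other browsers need a Media Source Extensions player such as hls.js.
import Hls from 'hls.js';

const video = document.querySelector('video');
const src = 'https://example.com/videos/lesson-1/master.m3u8'; // placeholder manifest URL

if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = src;            // native HLS support
} else if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(src);        // fetch and parse the .m3u8 manifest
  hls.attachMedia(video);     // feed segments to the <video> element via MSE
}
```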
A traditional eLearning course, such as you would have with SCORM, is going to provide a reasonable way to wrap the playing of video such that it can be launched for a learner via an LMS and may capture data such as completion. xAPI is probably suggested because it provides a more robust way of enabling the capture of interaction data such as when the learner plays, pauses, or seeks in a video. My preferred approach for doing this is to leverage cmi5, and there is an example of xAPI video profile usage within a cmi5 course in the Project CATAPULT sample content, see https://github.com/adlnet/CATAPULT/tree/main/course_examples. It could be adapted to leverage something like HLS and get streaming capability. Confirm with your LMS of choice ahead of time whether it supports cmi5 as adoption is still lower than for SCORM.
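To give a feel for the interaction data the xAPI video profile captures, here is a rough sketch of a "paused" statement. The verb, activity-type, and extension IRIs are written from memory of the video profile and should be verified against the published profile; the actor and activity IDs are made up:

```javascript
// Rough sketch of an xAPI statement a player might send when the learner pauses.
const statement = {
  actor: { mbox: 'mailto:learner@example.com', name: 'Example Learner' },
  verb: {
    id: 'https://w3id.org/xapi/video/verbs/paused',
    display: { 'en-US': 'paused' }
  },
  object: {
    id: 'https://example.com/courses/intro/video-1', // hypothetical activity id
    definition: { type: 'https://w3id.org/xapi/video/activity-type/video' }
  },
  result: {
    extensions: {
      'https://w3id.org/xapi/video/extensions/time': 47.5 // playhead position in seconds
    }
  }
};

// The statement would then be POSTed to the LRS's statements resource,
// typically via an xAPI wrapper library rather than by hand.
```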
SCORM Cloud (a bit of a misnomer, https://cloud.scorm.com/) provides built-in video handling via the cmi5 mechanism and will soon support streaming video from sources beyond just YouTube without the need to author a course separately.

Select a local video, shorten its length, then upload the video in a Rails application

I need ideas on how to do this with JavaScript in a Ruby on Rails application.
I want the user to select a video; then the file has to be shortened from either the start or the finish. The purpose of that is to make the video smaller in size.
Then the video should be uploaded to the server.
I have looked at https://github.com/danielcebrian/rangeslider-videojs, but maybe someone knows a better solution.
For cropping the video server-side, I have looked at FFmpeg.
First off, I should point out that if the editing process needs user interaction in the browser, this is not a simple problem; it's pretty complex, and there are a lot of ways you could do it.
Uploading is a pretty easy part nowadays, as the built-in Rails ActiveStorage module works pretty well.
Now, if you don't need to edit videos in the browser and just want to clip them down to a specific size, that is not too bad. You can indeed just have Rails call ffmpeg with system. The -ss option (start offset), used together with -t (duration) or -to (end time), is what you need for trimming video from the start or the end; a sketch of the invocation is below.
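Purely to illustrate the flags (paths and durations are placeholders; from Rails you would pass the same argument list to system, the sketch just uses Node so the examples here stay in one language):

```javascript
// Sketch of an ffmpeg trim: -ss sets the start offset, -t the duration to keep
// (-to would set an absolute end time instead), -c copy avoids re-encoding.
const { execFile } = require('child_process');

execFile('ffmpeg', [
  '-ss', '5',                // drop the first 5 seconds
  '-i', 'uploads/input.mp4',
  '-t', '30',                // keep the next 30 seconds
  '-c', 'copy',              // stream copy: fast, no quality loss
  'uploads/output.mp4'
], (error) => {
  if (error) console.error('ffmpeg failed:', error);
});
```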
If you do need user editing of videos in the browser, then you'll need to investigate a good JavaScript plugin for this, because it's not something that is going to be quick to write by hand.

PHP get screenshot from YouTube video

I have a website that has a variety of embedded YouTube videos. When a user pauses a given video, I want a screenshot to be taken of the playing video. I've taken many approaches to tackling this problem, such as copying the video frame to a canvas (this doesn't work because the videos are external to my site), and also using FFmpeg and FFmpeg-PHP. The latter two - although very powerful - also do not work, as the given piece of media has to be hosted on my server.
I'm at my wits' end about what to do, as I've spent countless hours trying to do this, and I'm ready to accept defeat.
Any ideas?
Regards,
Andre.
There's no supported method in the YouTube Player or Data API to take a screenshot of an arbitrary frame of a video.
I used the img.youtube.com/vi path to get the image. The getScreen function basically parses the YouTube URL and grabs the &v= parameter to get the video ID.
Since I use the youtube.com/embed/ URL format, I had to rework the function a little to get the video ID; a sketch is below.
http://mistonline.in/wp/get-youtube-video-screenshot-using-simple-php-and-javascript/#
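Something along these lines, reworked for the embed URL format (getScreen is the function name from the write-up above; the regex and the element ID here are just one way to do it):

```javascript
// Sketch: pull the video id out of a youtube.com/embed/ URL and build the
// thumbnail URL served from img.youtube.com/vi. Note this returns a stock
// thumbnail, not the exact paused frame.
function getScreen(embedUrl) {
  // e.g. "https://www.youtube.com/embed/dQw4w9WgXcQ?rel=0" -> "dQw4w9WgXcQ"
  const match = embedUrl.match(/\/embed\/([^?&/]+)/);
  if (!match) return null;
  return 'https://img.youtube.com/vi/' + match[1] + '/0.jpg'; // hqdefault.jpg also works
}

// Usage (the #screenshot element is hypothetical):
document.getElementById('screenshot').src =
  getScreen('https://www.youtube.com/embed/dQw4w9WgXcQ');
```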

How does YouTube count its views?

I'm trying to implement a similar approach to view counts in our web app.
Reading this article: http://mashable.com/2012/06/25/why-do-youtube-videos-freeze-at-301-views/
And watching: http://www.youtube.com/watch?v=oIkhgagvrjI
What approaches does YouTube take to determine whether a view is valid - for example, that it's not coming from bots, view-count services, or a user refreshing the page several times? I know they probably have several approaches to this, but I'm looking to get started.
This is slightly difficult to answer, and I am not from YouTube, but I can take a stab at a few things to help you think about this.
When should you declare a view? There are several options:
The moment the user clicks on a video link and data starts flowing. [Simplest engineering solution but not really a valid measure]
If the user has watched the first 25% (or 30%, or enter your number here) of the video. This could also be changed to, say, the user watched 40% of the video including scrubbing. (See the sketch after this list.)
If the user has watched the entire video through to the end. Too conservative; someone may stop watching in the last 5 seconds because, say, the credits are rolling.
The user has watched the entire pre-roll advert (perhaps an advertisement client is only interested in this!) before going on to watch the video
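A sketch of what the watched-25% rule might look like on the client (the /views endpoint, the 0.25 threshold, and the data-video-id attribute are placeholders):

```javascript
// Accumulate actual playback time (so seeking ahead doesn't count) and report
// a view once, when the watched-time threshold is crossed.
const video = document.querySelector('video');
let watchedSeconds = 0;
let lastTime = 0;
let viewCounted = false;

video.addEventListener('timeupdate', () => {
  const delta = video.currentTime - lastTime;
  lastTime = video.currentTime;
  // timeupdate fires a few times per second during playback;
  // a large delta means the user seeked rather than watched.
  if (delta > 0 && delta < 1) watchedSeconds += delta;

  if (!viewCounted && watchedSeconds >= 0.25 * video.duration) {
    viewCounted = true;
    navigator.sendBeacon('/views', JSON.stringify({ videoId: video.dataset.videoId }));
  }
});
```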
There are also aspects of whether the video view is human or automated:
Are you getting too many views from the same location at a rate that is not humanly possible?
Are your video views showing a very unlikely pattern [say all views stop at exactly 45 seconds or at 50% of the video, or all views always run to the end], even if the rate of arrival is not very fast? A human will have variations in the viewing pattern.
What are the sources that link to your views? Are you getting views from different sources?
Some rules about what the previous video view was and what the next video view is can also add to the detection of bots. [Say videos are being watched in alphabetical order, or in the order they were presented in a search; you know this is most likely an automated program just going through a list.]
Then you can combine these rules with the location, OS, browser, device, etc. that is trying to stream. It gets more complicated than that after the initial set of rules, but I think you will get the gist of it.

Multiple HTML5 media elements on one page in iOS (iPad)

My research has led me to learn that Apple's media element handler is a singleton, meaning I can't have a video playing while an audio is playing in the background. I'm tasked to build a slideshow presentation framework and the client wants a background audio track, timed audio voice-overs which match bullet points, and variable media which can either be an image or video - or a timed cycle of multiple media elements.
Of course, none of the media works on iOS; each media element cancels out the previous one.
My initial thought is to embed the voice-over audio into the video when there's a video present, but there's an existing Flash version of this setup which depends on existing assets so I pretty much have to use what's delivered.
Is there ANY work-around for this? I'm testing on iOS 4.3.5. The smartest devs in the world are on this site - we've got to be able to come up with something.
EDIT: Updated my iPad to iOS 5.0.1 and the issue remains.
How about using CSS to do the trick?
Maybe you know about a company called Vdopia that distributes video ads on mobile.
http://mobile.vdopia.com/index.php?page=mobilewebsolutions
They claim to have developed a so-called vdo video format, which is actually just a CSS sprite animation running in place of a real video :D
I mean, you could have your "video" as a frame-by-frame image sprite, then attach an HTML5 audio tag to it.
I would like to know your response.
Are you working on a Web App or on a Native Application?
If you are working on a Web App you're in a world of hurt, because you simply do not have much control beyond what Mobile Safari provides out of the box.
If this is the case I would come forth and be honest with the stakeholders.
If you are working on a Native Application you can resort to a mechanism that involves some back and forth communication between UIWebView and ObjC. It's actually doable.
The idea is the following:
Insert special <object> elements in your HTML5 documents, handcrafted according to your needs, taking special care to follow the data-* naming convention for non-standard attributes.
Here you could insert IDs, paths and other control variables in the multimedia artifacts that you want to play.
Then you could actually build some JavaScript (on top of jQuery, for example) that communicates with ObjC through the delegation mechanism on the UIWebView or through HTTP. I'll go over this choice down below.
Say that on $(document).ready() you go through all the objects that have a special class, one that you carefully choose to identify all of the special <object> elements (a sketch of this follows below).
You build a list of such objects and pass them on to the ObjC part of your application. You could easily serialize such list using JSON.
Then in ObjC you can do what you want with them. Play them through AVPlayer or some other framework whenever you want them played (again you would resort to a JS - ObjC bridge to actually signal the native part to play a particular element).
You can "communicate" with ObjC through the delegation pattern in UIWebView or through HTTP.
You would then have a JS - ObjC bridge in place.
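To make the JS half of that concrete, here is a sketch. The class name, the data-* attributes, and the myapp:// scheme are all invented for illustration; on the native side you would intercept and cancel the location change in webView:shouldStartLoadWithRequest:navigationType: and hand the payload to AVPlayer.

```javascript
// Collect the special <object> elements on page load and hand their control
// variables to the native side; later, ask the native side to play one of them.
$(document).ready(function () {
  var mediaItems = $('object.native-media').map(function () {
    return {
      id:   $(this).data('media-id'),
      path: $(this).data('media-path'),
      kind: $(this).data('media-kind') // e.g. "audio" or "video"
    };
  }).get();

  // Serialize the list and pass it to ObjC via a custom URL scheme that the
  // UIWebView delegate intercepts in shouldStartLoadWithRequest.
  window.location = 'myapp://registerMedia?payload=' +
    encodeURIComponent(JSON.stringify(mediaItems));
});

// Ask the native side to play a particular element.
function playNative(id) {
  window.location = 'myapp://play?id=' + encodeURIComponent(id);
}
```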
The HTTP approach makes sense in some cases but it involves a lot of extra code and is resource hungry.
If you are building an ObjC application and want further details on how to actually build an ObjC - JS bridge that fits these needs get back to us :)
I'm halting this post as of now because it would be nice to know if it is in fact a Native App.
Cheers.
This is currently not possible. As you'll notice, when a video plays it takes over the full screen in QuickTime and moves the browser to the background. The only solution at this time is to merge the audio and video together into a single MP4 file and play that one item.
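If you do merge the tracks offline, the ffmpeg muxing step is straightforward; here is a sketch (file names are placeholders, and it's shown from Node only to keep the examples here in one language):

```javascript
// Mux a voice-over audio track onto an existing video into a single MP4:
// -map 0:v takes the video from the first input, -map 1:a the audio from the
// second; -shortest stops at the end of the shorter input.
const { execFile } = require('child_process');

execFile('ffmpeg', [
  '-i', 'slide-video.mp4',
  '-i', 'voiceover.mp3',
  '-map', '0:v', '-map', '1:a',
  '-c:v', 'copy', '-c:a', 'aac',
  '-shortest',
  'merged.mp4'
], (error) => {
  if (error) console.error('ffmpeg failed:', error);
});
```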
If I understand you correctly, you are not able to merge the audio and video together because the existing setup relies on Flash? Since iOS can't play Flash, you should merge the audio and video together and use Flash as a backup. There are numerous HTML5 players that use JavaScript to try to play the HTML5 video first and then fall back to Flash.
You mention there is an existing Flash setup of the video - is it an SWF file? Could you import it into video/audio editing software and add an audio track on top?
Something like this: http://www.youtube.com/watch?v=J2vvH7oi8m8&feature=youtube_gdata_player
Also, if it is a Flash file, will you be converting it to an AVI or the like for iOS? If you have to do that anyway, there is your chance to add an audio track.
Could you use a web service to merge the streams in real time with FFmpeg and then stream a single output to QuickTime?
To elaborate, maybe a library like http://directshownet.sourceforge.net/about.html could also work. It looks like they have the following:
DESCombine – A class library that uses DirectShow Editing Services to combine video and audio files (or pieces of files) into a single output file. A help file (DESCombine.chm) is provided for using the class.
This could then be used to return the resulting data as the response to the call, to be loaded via the HTML5 player.
