Thumbnail upload via YouTube API: Why does it show a placeholder image?

I am using a small, self-written Java program that uploads videos and thumbnails to YouTube (I communicate with the API directly, not via a dedicated YouTube client library).
Since roughly last week, uploading thumbnails no longer works, although I have not changed anything in my program. The API does not report an error, but when I log in to my YouTube account, the following image is shown instead of the thumbnail:
This is different from what happens when no thumbnail is uploaded at all; in that case an automatically chosen snapshot from the video is shown.
I can upload the same thumbnail via the web frontend and it works perfectly, which is why I believe the problem is not with my image.
What could be the cause of the problem?
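For context, "communicating with the API directly" here just means issuing the raw HTTP request yourself. Assuming the YouTube Data API v3 thumbnails.set endpoint, the call looks roughly like the sketch below. The asker's program is in Java; this illustration uses TypeScript (Node 18+ with the built-in fetch), and the video ID, token, and file path are placeholders, not values from the question.

```typescript
import { readFile } from "node:fs/promises";

// Sketch of the raw thumbnails.set call (YouTube Data API v3).
// videoId, accessToken and imagePath are placeholders for illustration only.
async function setThumbnail(videoId: string, accessToken: string, imagePath: string) {
  const imageBytes = await readFile(imagePath);

  const response = await fetch(
    `https://www.googleapis.com/upload/youtube/v3/thumbnails/set?videoId=${videoId}`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "image/jpeg", // or image/png, matching the uploaded file
      },
      body: imageBytes,
    }
  );

  // As described in the question, the API can report success here
  // even though the thumbnail later shows up as a placeholder.
  if (!response.ok) {
    throw new Error(`Thumbnail upload failed: HTTP ${response.status}`);
  }
  return response.json();
}
```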

The problem was probably a bug on the server side. Without my changing anything at all in my application, it suddenly started working again at some point (which was already quite a while ago; I had forgotten about this question).

Related

Videos failing to completely download

I have a Rails app in production that serves some video content to the user. All of these videos are stored in a DigitalOcean Space.
I'm using Video.js to show the videos. However, it's very common that I get this error:
A network error caused the media download to fail part-way.
After looking at a GitHub issue, the consensus seemed to be that the problem was with the server, not Video.js. What do you think I should try?
Note: this was also happening in development, where storage was on my hard disk. I figured that once I was storing the videos in a "bucket" it would work better, but I'm still getting this error.
Has anyone else had this problem?
Update
So the videos are loading, and I'm able to watch them. However, when I'm about halfway through a video, I'll get this error and the video will just stop playing.

HEIC/HEIF changed to jpeg without metadata at upload

I have a small webapp that runs on a server intended for use in a setting without reliable internet access (i.e., the app can't depend on outside resources during production). The purpose of the app is simply to upload image files, read the metadata, and categorize them in the right location on the disk. Up until recently there was no problem with this process, then I noticed that some of the files did not have all of the metadata attached (specifically the creation date). Upon further inspection, it appears that these are files that were shot on my iPhone as HEIC/HEIF photos and uploaded directly to the webpage from the phone.
Due to the design of the webapp, the filename of the uploaded file is shown on the page. Every time an HEIC photo is uploaded it displays the filename as ending in .jpeg.
I've had a hard time finding good documentation on this, but it sounds like the default for the iPhone at this point is to convert HEIC files to jpeg if it looks like they are transferring to a location that may not be able to read them. I guess a website form falls into this category. It also appears that as part of this conversion some of the EXIF data disappears.
So, does anyone know a way to retain the EXIF data? My primary limitation here is that the upload needs to happen through the webapp and that multiple users will be using this. As a result, I can't simply have everyone change their iPhone settings to only shoot jpegs.
In case it matters, the webapp is running on node.js and expressjs.

How to download and store images so they are not cleared by the device later?

I have an application which downloads its product data (descriptions and images) from a server and stores it locally so it is available offline. For the product images I'm using forge.file.cacheURL, which works great, but on my iPad this cache gets cleared during the day while I work with other apps. My application is then left with only the description texts and no images, and the user must connect to the internet and synchronize again, which is quite annoying. Is there a better way to implement this scenario?
Have you tried the forge.prefs module for this purpose? It allows you to persistently save key-value pairs locally (much like HTML5 localStorage). Read more about the exact syntax in Trigger.io's official API documentation.
I'm not quite sure whether it's possible to save images with this method directly, but you could easily transform your images into strings and back. Check out the second chapter of this post by Robert Nyman on how to save images in localStorage.
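To illustrate the string-conversion idea: the image bytes can be turned into a base64 data URL and stored alongside the rest of the product data. A minimal browser-side sketch using plain localStorage (forge.prefs would be used the same way, with its own get/set calls per the Trigger.io docs); the key names and URL are invented for the example:

```typescript
// Fetch a product image, convert it to a base64 data URL and cache it locally.
// Keys and URLs are illustrative only.
async function cacheImage(productId: string, imageUrl: string): Promise<void> {
  const blob = await (await fetch(imageUrl)).blob();

  const dataUrl = await new Promise<string>((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(reader.result as string); // "data:image/png;base64,..."
    reader.onerror = () => reject(reader.error);
    reader.readAsDataURL(blob);
  });

  localStorage.setItem(`product-image-${productId}`, dataUrl);
}

// Later, offline: read the string back and use it directly as an <img> src.
function showCachedImage(productId: string, img: HTMLImageElement): boolean {
  const dataUrl = localStorage.getItem(`product-image-${productId}`);
  if (!dataUrl) return false;
  img.src = dataUrl;
  return true;
}
```

Keep in mind that localStorage-style stores are typically limited to a few megabytes per origin, so this works best for small thumbnails rather than full-size photos.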

Video distribution to iPhone with Amazon Cloudfront, with fixed content programming

I would like to set up a scalable video distribution server/infrastructure for streaming video to iOS devices. The client will have a fixed programme of pre-produced content, e.g. 6 hours that will be played and then repeated from the beginning. It should be possible to specify the exact schedule at which the video starts, and also to have it start at different times on different days.
I've been pointed to the Live Smooth Streaming offering from Amazon, using Amazon CloudFront.
So my question to you: does this support the features I need, and how do I set it up properly? I've already taken a look at the documentation at http://awsdocs.s3.amazonaws.com/CF/latest/cf_dg.pdf, but it doesn't cover the use case I want, namely setting up such a programming scheme. I've seen references to CloudFormation templates for live streaming, but is there something similar for doing the fixed programming, or can those perhaps be used for that too?
Thanks for your time!
Flo
Your question is a bit mixed up. iOS devices need HLS content. You simply need to create your content in HLS form (.ts segments with a .m3u8 playlist), store it in an S3 bucket and point your CloudFront distribution at it.
Since you mention pre-produced content, I am guessing it is available beforehand and not generated live.
Your program should then point to the right .m3u8 file and can update that .m3u8 appropriately. The program that controls access to the .m3u8 (when it's available, what should be playable, etc.) is independent of the storage in S3/CloudFront.
You can also generate content live; nothing changes except that the content is created on the fly. The program controlling the .m3u8 still decides what the client gets access to.
The same applies if you are targeting not only iOS but other devices as well: keep your content in an S3 bucket, link it to CloudFront, provide it in whatever format each device needs, and let your web server program control access to it. Remember that CloudFront is not a player. CloudFront also supports Flash Media Server streaming, and you can use that as well.
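To make the "your program points to the right .m3u8" part concrete, here is a small sketch of a web endpoint that picks a playlist from a fixed daily schedule and redirects the player to the corresponding file on CloudFront. It is written in TypeScript/Express purely as an illustration (the question doesn't name a server-side stack), and the distribution domain, file names, and schedule are invented:

```typescript
import express from "express";

// Invented example schedule: which pre-produced block starts at which hour (UTC).
const schedule: { startHour: number; playlist: string }[] = [
  { startHour: 0, playlist: "block-night.m3u8" },
  { startHour: 6, playlist: "block-morning.m3u8" },
  { startHour: 12, playlist: "block-afternoon.m3u8" },
  { startHour: 18, playlist: "block-evening.m3u8" },
];

const CLOUDFRONT_BASE = "https://d1234example.cloudfront.net/hls"; // placeholder domain

const app = express();

// The iOS player always requests /live.m3u8; the server decides which
// pre-produced playlist is "on air" right now and redirects to it.
app.get("/live.m3u8", (_req, res) => {
  const hour = new Date().getUTCHours();
  const current = [...schedule].reverse().find((s) => hour >= s.startHour)!;
  res.redirect(`${CLOUDFRONT_BASE}/${current.playlist}`);
});

app.listen(8080);
```

Instead of redirecting, the same program could also rewrite the playlist file itself (for example, to start playback partway into a block), which is what makes the scheduling logic independent of S3/CloudFront.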

Watch videos as they are uploaded

Is it possible to implement a feature that allows users to watch videos while they are being uploaded to the server by others? Is HTML5 suitable for this task? What about Flash? Are there any ready-to-go solutions? I don't want to reinvent the wheel. The application will be hosted on a dedicated server.
Thanks.
Of course it is possible; the data is there, isn't it?
However, it will be very hard to implement.
Also, I am not that into Python and I am not aware of a library or service suiting your requirements, but I can cover the basics of video streaming.
I assume you are talking about video files that are uploaded, not live streams. For streams there are obviously thousands of solutions out there...
In the simplest case, the video being uploaded is already ready to be served to your clients and has a so-called "faststart atom". These are container-format specific and there are sometimes a bunch of them. The most common is the moov atom. It contains a lot of data and is very complex, but in our use case, in a nutshell, it holds the data that enables the client to begin playing the video right away using only the data available at the beginning of the file.
You need that for progressive-download video (YouTube-style), meaning a file served from a web server: the client obviously has not downloaded the full file, yet the player can already start playing.
If the faststart atom were not present, that would not be possible.
Sometimes playback still works without it, but the player cannot, for example, display a progress bar, because it doesn't know how long the file is.
With that covered, the file can be uploaded. You will need an upload solution that writes the data directly to a buffer or a file (a file will be easier). This is almost always the case; for example, PHP creates a file in its tmp_dir. You can also specify that location if you want to find the video while it is being uploaded.
Now you can start reading that file byte by byte and write the data out to a connection to another client. Just be sure not to go ahead of what has already been received and written. You would probably track each upload with a metadata record in memory that holds the current received byte position and the location of the file.
Anyone who requests the file after the upload has finished can simply receive the entire file; if the upload is not yet finished, they get it from your application instead.
You will have to throttle the data delivery, or pause it, whenever you run out of received data. To the viewer this will simply look like a slow connection. You will have to send some data from time to time to prevent the connection from closing, but if your upload doesn't stall (and why should it?), that shouldn't be a problem.
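A minimal sketch of that "read the growing file and wait when you catch up" loop, written in Node/TypeScript rather than the PHP mentioned above; the file path and the finished flag are placeholders for whatever bookkeeping your upload handler actually maintains:

```typescript
import { createServer } from "node:http";
import { open, stat } from "node:fs/promises";

// Placeholder bookkeeping: your upload handler would update this record.
const upload = { path: "/tmp/incoming-video.mp4", finished: false };

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

createServer(async (_req, res) => {
  res.writeHead(200, { "Content-Type": "video/mp4" });

  const file = await open(upload.path, "r");
  const chunk = Buffer.alloc(64 * 1024);
  let position = 0;

  try {
    while (true) {
      const { bytesRead } = await file.read(chunk, 0, chunk.length, position);
      if (bytesRead > 0) {
        position += bytesRead;
        // Respect backpressure from the viewer's connection.
        if (!res.write(chunk.subarray(0, bytesRead))) {
          await new Promise((resolve) => res.once("drain", resolve));
        }
      } else if (upload.finished && position >= (await stat(upload.path)).size) {
        break; // we have sent everything and the upload is complete
      } else {
        await sleep(250); // caught up with the uploader: wait for more data
      }
    }
  } finally {
    await file.close();
    res.end();
  }
}).listen(8080);
```

A real implementation would also need to handle multiple concurrent uploads, aborted connections, and ideally HTTP Range requests, but the core idea is exactly the loop above.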
Now, if you want something like on-the-fly transcoding of various input formats into your desired output format, things get interesting.
AFAIK ffmpeg has neat APIs that let you deal with data streams directly.
HandBrake is also a very good tool, but with it you would need to take the long road of calling external executables.
I am not really aware of your exact requirements, but if your clients are already tuned in, for example, to a Red5 streaming server, feeding data into a stream should also work fine.
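For completeness, the "external executable" road is often the simplest in practice: shell out to the ffmpeg binary and, while remuxing, move the moov atom to the front so the result is progressive-download friendly. A rough sketch of spawning it from Node/TypeScript (the flags are standard ffmpeg options; the paths are placeholders):

```typescript
import { spawn } from "node:child_process";

// Remux an uploaded file and move the moov atom to the beginning so players
// can start playback before the whole file has been downloaded.
// Input/output paths are placeholders.
function makeProgressive(input: string, output: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const ff = spawn("ffmpeg", [
      "-i", input,
      "-c", "copy",              // no re-encode, just remux the streams
      "-movflags", "+faststart", // write the moov atom at the front of the file
      output,
    ]);
    ff.on("error", reject);
    ff.on("close", (code) =>
      code === 0 ? resolve() : reject(new Error(`ffmpeg exited with code ${code}`))
    );
  });
}
```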
Yes, take a look at Qik, http://qik.com/
"Instant Video Sharing ... Videos can be viewed live (right as they are being recorded) or anytime later."
Qik provides developer APIs, including ones like these:
qik.stream.subscribe_public_recent -- Subscribe to the videos (live and recorded)
qik.user.following -- Provides the list of people the user is following
qik.stream.public_info -- Get public information for a specific video
It is most certainly possible to do this, but it won't be trivial. And no, I don't think you will find an "out of the box" solution that will require little effort on your behalf.
You say you want to let:
users watch videos as they are uploaded to server by others
Well, this could be interpreted two different ways:
Do you mean that you don't want a user to have to refresh the page before seeing new videos that other users have just finished uploading?
Or do you mean that you want one user to be able to watch a partially uploaded video (aka another user is still in the process of uploading it and right now the server only contains a partial upload of the video)?
Implementing #1 wouldn't be hard at all. You would just need an AJAX script to check for newly uploaded videos, and those videos could then be served to the user in whatever way you choose. HTML5 vs. Flash isn't really a consideration here.
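For scenario #1, something as small as the polling sketch below would do (or a WebSocket push, if you prefer). The /videos endpoint and its JSON shape are invented for the example:

```typescript
// Poll the server every few seconds for videos uploaded since the last check.
// The endpoint and response shape are assumptions for this illustration.
let lastChecked = new Date(0).toISOString();

async function pollForNewVideos(): Promise<void> {
  const res = await fetch(`/videos?since=${encodeURIComponent(lastChecked)}`);
  const newVideos: { id: string; url: string; title: string }[] = await res.json();
  lastChecked = new Date().toISOString();

  for (const video of newVideos) {
    addVideoToPage(video);
  }
}

// Append a plain <video> element for each new upload; a Flash or Video.js
// player could be dropped in here instead.
function addVideoToPage(video: { url: string; title: string }): void {
  const el = document.createElement("video");
  el.src = video.url;
  el.controls = true;
  el.title = video.title;
  document.querySelector("#video-list")?.appendChild(el);
}

setInterval(pollForNewVideos, 5000);
```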
The second scenario, on the other hand, would require quite a bit of effort. I am guessing that HTML5 might not be mature enough to handle this type of situation. If you are not looking to reinvent the wheel and don't have a lot of time to dedicate to this feature, then I would say you are out of luck. You may be able to use ffmpeg to parse partial video files and feed them to a Flash player, but I would consider that a large task.
