How to minimize YouTube API credit usage? - youtube-api

We have an app that calculates metrics from our users' videos on YouTube (about 400k videos at present, rising steadily). Fetching the initial data about each user's uploads is manageable, since users arrive piecemeal every day; however, keeping that data updated every time people log in is absolutely killing our API quota usage, especially for very large channels with thousands of videos. How can I determine just those videos whose snippet (title, description, tags) has CHANGED since we last asked? It's important to the app that changes to videos are reflected reasonably quickly in our users' metrics.
We already fetch the snippets for users' channels in pages of 50, which cut our quota usage considerably. I have also tried ETags, but they had no effect on quota usage even when all the videos returned 304. I searched the docs for a "modifiedDate" and other relevant terms, but so far I have found nothing. I also checked the ETag when retrieving just the "id" part, but that ETag doesn't change when I modify the title, description, or tags. Lastly, we have a request in to increase our quota, but surely there is a better solution than that?

Please read my answer to a somewhat similar question. Note that the respective solution is by no means straightforward to implement, yet it is most likely what you're looking for. Also, I suppose that the activities endpoint may be of help as well.
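For what it's worth, YouTube also offers WebSub (PubSubHubbub) push notifications: Google's hub calls a URL you host whenever a channel uploads a video or changes a video's title or description, at zero quota cost, which may well be the kind of not-straightforward solution meant above. A minimal sketch of the subscription call in Python, assuming you host a publicly reachable callback (the callback URL and channel ID below are placeholders):

    import requests

    HUB_URL = "https://pubsubhubbub.appspot.com/subscribe"

    def subscribe_to_channel(channel_id, callback_url):
        """Ask Google's WebSub hub to push Atom notifications for a channel.

        The hub first sends a GET challenge to callback_url to verify it,
        then POSTs an Atom entry whenever a video is uploaded or its
        title/description is updated. No API quota is consumed.
        """
        resp = requests.post(HUB_URL, data={
            "hub.mode": "subscribe",
            "hub.topic": "https://www.youtube.com/xml/feeds/videos.xml"
                         "?channel_id=" + channel_id,
            "hub.callback": callback_url,  # placeholder; must be publicly reachable
            "hub.lease_seconds": "864000",  # subscriptions expire; renew in time
        })
        resp.raise_for_status()  # the hub answers 202 Accepted on success

    subscribe_to_channel("UC_x5XG1OV2P6uZZ5FSM9Ttw", "https://example.com/yt-callback")

Failing that, polling the activities endpoint with publishedAfter costs 1 unit per call, which is still far cheaper than re-fetching every video's snippet.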

Related

Regarding YouTube API Daily Quota Extension

I have reached the daily quota limit and have submitted the quota increase form.
After seeing the confirmation notice of my submission, I have not heard back or received an email from them.
Is there any other solution to this issue? How long does it usually take for them to get back?
All things considered, we may need to increase the daily quota to 100,000.
Is there a way to collect multiple pieces of data with a single quota unit?
My website mainly involves collecting view counts of videos through video IDs.
I have submitted the YouTube API Services - Audit and Quota Extension Form.
Thank you in advance.
The time to get the quota increase varies greatly; it depends on how backlogged the team is.
In the beginning, when they reduced the default to 10k and I applied for mine, it took more than three months.
These days I think you should get something in less than two weeks, but don't hold me to that; I don't work for YouTube, this is just my experience.
Also, check it now and then: they may apply the increase before they actually send you an email saying they are going to.
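On the "multiple data from a single quota" part of the question: videos.list accepts up to 50 comma-separated video IDs per call, and the call costs 1 quota unit no matter how many IDs it carries, so view counts for 400 videos cost 8 units instead of 400. A minimal sketch with google-api-python-client (the API key is a placeholder):

    from googleapiclient.discovery import build

    youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")  # placeholder

    def fetch_view_counts(video_ids):
        """Return {video_id: view_count}, spending 1 quota unit per 50 IDs."""
        counts = {}
        for i in range(0, len(video_ids), 50):  # videos.list caps at 50 IDs/call
            response = youtube.videos().list(
                part="statistics",
                id=",".join(video_ids[i:i + 50]),
            ).execute()
            for item in response.get("items", []):
                counts[item["id"]] = int(item["statistics"]["viewCount"])
        return counts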

YouTube Data API Quota Wrongly Calculated, Quota Exceeded

I have a very simple app that uses the v3 YouTube Data API to get the list of comments. I am just fetching the list of videos and then fetching the comments (at a frequency of 5 seconds) to get updated messages, using the page token as needed to minimize the load and computation.
Today, after some time of internally testing the application, I started getting the quota exceeded exception. I know YouTube provides 10,000 units per day by default, and since reading the comments (and videos as well) costs just 1 unit, I should expect to see similar numbers.
However, the usage is wrongly calculated.
Following are the request details:
As you can see, there are 2,895 total requests to LiveChatMessages -> List.
However, when I go to IAM -> Quotas, it showed 14k earlier, then 12.6k in quota usage.
There seems to be some problem either with the computation or with the documentation that defines the units for queries. Can someone help, please?
PS: I'm just using the two APIs mentioned in the screenshot above. Both are list calls.
"If you see, there are 2895 total requests LiveChatMessages -> List. However, when I go to IAM -> Quotas, it showed 14k earlier, then 12.6k in quota usage."
Yes, I can see that there are 2,895 requests, but how do you know what the quota costs are for those requests? You are using the YouTube Live Streaming API for those requests, not the YouTube Data API.
There is no documentation of the quota cost for the YouTube Live Streaming API calls. If Google says you used all your quota, then you probably have.
I would post an issue over on the issue forum asking them to document the quota cost for these calls.
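One thing worth double-checking in your code regardless: the liveChatMessages.list response carries a pollingIntervalMillis field, and polling faster than that just burns quota. A minimal sketch of a loop that honors it (a sketch only; as said, the per-call cost for this endpoint is not documented):

    import time

    def poll_chat(youtube, live_chat_id):
        """Poll live chat messages no faster than the API asks us to."""
        page_token = None
        while True:
            kwargs = {"liveChatId": live_chat_id, "part": "snippet"}
            if page_token:
                kwargs["pageToken"] = page_token
            response = youtube.liveChatMessages().list(**kwargs).execute()
            for item in response.get("items", []):
                print(item["snippet"].get("displayMessage", ""))
            page_token = response.get("nextPageToken")
            # The response tells us how long to wait before the next poll.
            time.sleep(response["pollingIntervalMillis"] / 1000.0)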

Applying for Additional Quota for YouTube API as an Individual (without business info)

I recently began using the YouTube Data API v3 for a program that I'm writing, purely for personal use. To give a brief summary of what it does: it checks the live chat from my most recent (usually ongoing) livestream and performs actions based on certain keywords entered in chat (essentially commands for people to use from live chat). In order to do that, however, I have to constantly send requests to get a refreshed live chat. As it is now, it sends requests at 1-second intervals. I recently did a livestream to test my program, and it only took about 25 minutes for me to reach the daily quota limit of 10,000 units/day.
The request is: youtube.liveChatMessages().list(liveChatId=liveChatId, part="snippet")
It seems like every request I make costs 6 units, according to the math. I want to be able to host livestreams at lengths of up to 3 hours, which would require a significant quota increase. I'm aware that there is an option to fill out a form to request additional quota. However, it asks for business information such as a business name, business website, business mailing address, etc. Like I said before, I'm doing this for my own use only. I'm in no way part of a business, and just made my program as a personal project. Does anyone know if there's any way to apply for additional quota as an individual/hobbyist? If not, do you think just putting n/a in those fields would be acceptable? I did find another post where someone else had the exact same problem, but no one was able to give a helpful answer. Any advice would be greatly appreciated.
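For reference, here is the math behind those 25 minutes, and what it implies for a 3-hour stream (the 6-units-per-call figure is just my observation, not a documented cost):

    DAILY_QUOTA = 10_000   # default units/day
    COST_PER_CALL = 6      # observed cost of liveChatMessages.list, not documented

    calls_available = DAILY_QUOTA // COST_PER_CALL       # ~1666 calls/day
    print(calls_available / 60)                          # ~27.8 min at 1 call/sec

    target_seconds = 3 * 3600
    print(target_seconds / calls_available)              # need ~6.5 s between polls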
Unfortunately, and although this is only tangentially related, it seems Google is after the money here. I tried to do something similar myself (a very basic chat bot that just reads the chat messages). Although other users on the net report different results, they all have one thing in common: following the documentation, they all poll at an interval of about once per second (that's the timeout you get as part of the response to a poll for new messages). I, along with a few others, got at most about 5 minutes out of polling once per second; some others, like you, got a few more minutes out of it. I increased the interval by hand in increments of 5 seconds: 5, 10, 15, and so on. I can't remember which value I finally settled on, but even with a rather long polling interval of once every 10 seconds or so I was only able to get about 2.5 hours' worth, which is still plenty for a simple chat bot that only reads the chat. Replying as well would have at least doubled the usage and hence halved the time.
It's already a pain to get this working as an individual, as just setting up the required OAuth authentication requires you to provide at least basic information, such as a fixed callback URL and some legal and policy information. I always ended up having it rejected with the standard reply "Your project seems to be for internal use only." I even got G Suite working (back before it required payment) to set up an "internal" project (only possible if the account belongs to a G Suite organization account), but after I set up the OAuth login I got an error that the private account I wanted to run the bot on was not part of the organization and hence couldn't be used. TL;DR: a useless waste of time.
Having been at this for several months now, I'd say there's just no way to get it done as a private individual for personal use. Yes, you can set it up and have the required review rejected (since it uses the YouTube Data API scopes), but you're still stuck with that 10,000 units/day quota. Building your own tool capable of doing more than just polling once every 10 to 30 seconds with a minimum of interaction doesn't get you any further than a few minutes, maybe one or two hours if you're lucky. If you want more, you have to set up a business and pay for it. Short and simple: Google wants you to pay for that service.
As Mixer has been officially announced to shut down on July 22nd, you have exactly these two options:
Use one of the publicly available services like Streamlabs, Nightbot, etc. They're backed by their respective "businesses" and as such don't seem to have those quota limits (although I just found some complaints about Streamlabs from April, about one month before you posted this question, where they admitted to having reached their limits; I don't know whether they've solved it yet).
Don't use YouTube for streaming; use Twitch instead. Twitch doesn't have these limits, and anybody is free to set up an API token either on the main account or on a second bot account (which is also explicitly explained in their docs). The downside is, of course, the objective sacrifices you have to make: a) viewers only get the streamer's original quality until the streamer reaches at least affiliate status, b) capped at 1080p60 max with only 6,000 kbit/s, c) only short-term VOD storage.
I myself wanted to use YouTube as my main platform (and currently do, though without my own tooling at the moment), including my own bot and such, since streaming on YouTube has some advantages over Twitch. But as YouTube wants me to pay for what others (namely Twitch) offer for free (although overall not at the same quality), it's an easy decision to make. Mixer looked promising, as it offered quite a few neat features (overall better quality than Twitch, lower latency), but the requirements for partner status were very high (2,000 followers along with another insanely high number to reach), and Mixer itself was just a small platform (I took the trouble to count all the streamers and viewers: only a few hundred streamers and just a few tens of thousands of viewers; the whole platform had fewer than some big Twitch channels on their own). And now it's been announced it will soon be dead anyway.
I hope this gives you some idea of what a small streamer has to consider and put up with when choosing a platform. After everything I've experienced, my advice is: either do it like all the others and stream on Twitch, using YouTube as an archive to export to from Twitch (although Twitch STILL doesn't have auto-export of the latest VOD implemented, I guess that could be done with a small script), or, if you want to stay on YouTube, use an existing bot like Nightbot or one of the other services like Streamlabs.
If you find any other way to convince Google to increase the limit as an individual, please let us know.

Why does YouTube show 301+ views?

I am just curious why YouTube shows 301+ views. I think there must be some logic behind it. What could it be?
I have seen exact view counts lower than 300, as well as counts in the thousands and millions.
It can stay stuck for a while. This is a control procedure to prevent the use of bots: any video getting more than 301 views in a short period gets verified in terms of the source of its traffic. However, views are still being counted (logged) in the back end and will appear once YouTube unlocks the view count.
Answering myself: this is no longer the case; it has been resolved.
Whenever a new video was uploaded, it would receive many views, including from bots. So, to verify the legitimacy of the views, the YouTube counter used to stop at 300 while the sources were checked.
Official announcement: https://mobile.twitter.com/YTCreators/status/628958720953819136

Twitter app development best practices?

Let's imagine an app which is not just another way to post tweets, but something like an aggregator, which needs to store and have access to tweets posted through it.
Since Twitter added a limit on API calls, the app should (or may) use a cache, and then it should periodically check whether a tweet has been deleted, etc.
How do you manage the limits? How do you think high-traffic apps survive without being whitelisted?
To name a few:
Aggressive caching. Don't call out to the API unless you have to (see the sketch after this list).
I generally pull down as much data as I can upfront and store it somewhere, then operate off the local store until it runs out and needs to be refreshed.
Avoid doing things in real time. Queue up requests and make them on a timer.
If you're on Linux, cron jobs are the easiest way to do this.
Combine requests as much as possible.
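A minimal sketch of that cache-first pattern, backed by SQLite (fetch_from_api is a hypothetical stand-in for your actual Twitter call):

    import sqlite3
    import time

    REFRESH_AFTER = 15 * 60  # seconds; hit the API at most every 15 minutes

    db = sqlite3.connect("tweets.db")
    db.execute("""CREATE TABLE IF NOT EXISTS tweets
                  (id TEXT PRIMARY KEY, body TEXT, fetched_at REAL)""")

    def get_tweets(user, fetch_from_api):
        """Serve from the local store; call the API only when the cache is stale."""
        last = db.execute("SELECT MAX(fetched_at) FROM tweets").fetchone()[0] or 0
        if time.time() - last > REFRESH_AFTER:
            for tweet_id, body in fetch_from_api(user):  # hypothetical API call
                db.execute("INSERT OR REPLACE INTO tweets VALUES (?, ?, ?)",
                           (tweet_id, body, time.time()))
            db.commit()
        return db.execute("SELECT id, body FROM tweets").fetchall()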
Well, you have 100 requests per hour, so the question is how to balance them between the various types of requests. I think the best option is the way TweetDeck does it: it lets you set the percentage of quota used for each request type and saves the rest for posting (because that is important too).
For the caching, a database would be good, and I would ignore deletions: once you have downloaded a tweet, it doesn't matter if it was later deleted. If you really wanted to know, you could in theory just try to open the tweet's public page; if you get a 404, it's been deleted. That means no cost against the API.
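A minimal sketch of that 404 check (the public tweet URL shape is an assumption here, and no API quota is spent):

    import requests

    def tweet_deleted(user, tweet_id):
        """True if the tweet's public page 404s; costs nothing against the API."""
        url = "https://twitter.com/%s/status/%s" % (user, tweet_id)  # assumed shape
        resp = requests.head(url, allow_redirects=True, timeout=10)
        return resp.status_code == 404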
