We're using the YouTube Data API v3 and have been for quite some time without any problems. Recently, we've been getting this 403 exception:
The request cannot be completed because you have exceeded your quota.
In the Google Developers Console, it says that we are still under the quota (currently it states "units/day 163,817 of 50,000,000").
Am I missing something about how quotas work?
You can create more API keys and use them at random. It's a good approach; I'm also using it without any issue and have never hit the quota exceeded error. You need to create a separate project for each API key. In PHP you can use it like this:
// Pool of API keys, each belonging to its own project.
$api = array("API key # 1", "API key # 2", "API key # 3");
// Pick one key index at random for this request.
$rand_keys = array_rand($api, 1);
$usage = $api[$rand_keys]; // the key to send with the request
A new key is used for each request, which is a better way to avoid any downtime.
The quota was reduced yesterday (2016-04-21) from 50 million units to just 1 million ...
YouTube also has a quota of 3,000 requests per second. Perhaps you're hitting that.
Need a clarification on this:
The docs say "By default, a search result set identifies matching video, channel, and playlist resources". How does this matching take place? Do they also search comments? Any idea on this?
Thanks!
The YouTube API operates under the same rate limits as the other Google APIs.
There are project-based limits and user-based limits.
You can see the limits in the Google Developer Console.
My project can make a maximum of 1,800,000 requests per minute.
It also has a quota cost limit of 10,000, which is not really what it sounds like.
Each user can then make a maximum of 180,000 requests per minute.
This is not related to the amount of data a user has on their account. It's strictly about the number of requests, or the cost of those requests, that your application or a user can make over a period of time.
You can request additional daily quota beyond the default 10k if you want. Just submit the form over in the Google Cloud Console.
I am not aware of any increased limits with the YouTube APIs for big YouTube channels.
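To make the cost-based quota concrete, here is a rough Python sketch using approximate unit costs from the quota calculator (the cost numbers and method names below are assumptions and may have changed):
# Approximate YouTube Data API v3 unit costs (check the quota calculator for
# current values): most read calls cost about 1 unit, search.list about 100,
# and a video upload about 1600.
APPROX_COST = {"videos.list": 1, "search.list": 100, "videos.insert": 1600}
DAILY_QUOTA = 10000  # default daily project quota

planned_calls = {"search.list": 50, "videos.list": 2000}
spend = sum(APPROX_COST[m] * n for m, n in planned_calls.items())
print(spend, "of", DAILY_QUOTA, "units")  # 7000 of 10000 units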
I am trying to use the YouTube API to download the caption of a video. None of my requests were successful, due to a quotaExceeded error. However, I have not spent any quota other than requesting the caption list to get the id of the caption.
import io

# `youtube` is the authorized API client object built earlier in the script.
# Request the caption track by its id (obtained from the captions.list call).
request = youtube.captions().download(
    id="O-jAeIynN9yCRz1el0-7JaFewbFekv8NUbhAZBwVajw="
)
# Open the local file that the downloaded caption track will be written to.
fh = io.FileIO("/Users/joehuangx/Desktop/test", "wb")
Based on the documentation, downloading a caption requires 200 units of quota, which is well within the daily limit.
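If useful, here is a hedged sketch of actually performing the download and surfacing the exact 403 reason with google-api-python-client (it reuses the request and fh objects from the snippet above, so those are assumed):
from googleapiclient.errors import HttpError
from googleapiclient.http import MediaIoBaseDownload

try:
    # Stream the caption track into the open file handle.
    downloader = MediaIoBaseDownload(fh, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()  # raises HttpError on a 403
except HttpError as e:
    # The JSON error body names the failure reason (e.g. quotaExceeded),
    # which separates a genuine quota problem from a permissions problem.
    print(e.resp.status, e.content.decode("utf-8"))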
The default quota limit for this API when you first create a project is 10,000. From time to time someone like yourself will start getting the quota exceeded error before ever making any requests. This always turns out to be that their project quota is set to 0.
As you can see from your quota picture, your current quota is 0.
I have posted this as an issue a number of times and there has never been a solution. You have two options:
Request a quota extension.
Delete the project you just created, create a new one, enable the YouTube Data API again, and see if it gives you the default quota then.
I have yet to find any way of knowing what causes this to happen, and YouTube isn't telling.
The only information I have is this issue #211012781
Hi. If you're seeing the "Queries per day" quota set to 0 and the API is indeed enabled, then this means that your project’s access to the YouTube Data API Service has been disabled.
You should’ve received a notice via email regarding this action, which also contains the steps that need to be taken to regain the project’s access. But just in case you missed it, please fill out and submit the exceptions form below:
https://support.google.com/youtube/contact/yt_api_form?hl=en
I'm building an alternative client for browsing YouTube subscriptions (folder-based subscriptions, each with a corresponding generated feed), and I'm making a lot of requests to YouTube to aggregate that data.
I'm caching a lot of responses, since anything fetched on a previous day doesn't need to be refreshed.
The fact is, current-day refreshes consume a lot, and I reach my quota pretty fast even though those requests are read-only.
I submitted that YouTube quota increase request form, but still, I'm quite afraid.
Am I missing something with the userIp & quotaUser parameters?
Shouldn't those requests, which are pretty much the same as what a normal user would make on the regular YouTube client, be counted under "Queries per 100 seconds per user"?
My main quota, "Queries per day", currently seems to absorb ALL the requests coming from my app, even though I added the quotaUser parameter to every request made by a user on the frontend.
I think I am missing something, as my app should not be considered "data consuming": it sends almost nothing to YouTube and only reads data that is already available in the main YouTube client, just not in the same format.
Thanks for your help.
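For what it's worth, as far as I understand, quotaUser (and the older userIp) ride along as ordinary query parameters and only feed the per-user-per-100-seconds buckets, not the daily units quota. A minimal Python sketch of attaching it to a read-only call, using plain HTTPS rather than a client library; the key, playlist id, and user id below are hypothetical placeholders:
import requests

API_KEY = "YOUR_API_KEY"          # hypothetical placeholder
QUOTA_USER = "frontend-user-42"   # arbitrary per-user identifier, not a YouTube account

# quotaUser is just another query parameter on the read-only request.
resp = requests.get(
    "https://www.googleapis.com/youtube/v3/playlistItems",
    params={
        "part": "snippet",
        "playlistId": "UUxxxxxxxxxxxxxxxxxxxxxx",  # hypothetical uploads playlist
        "maxResults": 50,
        "quotaUser": QUOTA_USER,
        "key": API_KEY,
    },
)
print(resp.status_code, resp.json().get("pageInfo"))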
I need to do a keyword based data fetching on Twitter. I looked up the documentation and "POST statuses/filter" seemed like the best option. However, I do not understand how the rate limiting works. Does this mean that I can fire this request repeatedly? If yes, at what rate should I do so? Or do I have to fire the request only once and keep on getting data continuously? They have given clear explanations for the REST API. There's even a page showing the number of requests permissible in a 15 minute window for each REST API method. I was unable to find something similar for "POST statuses/filter".
From what I've been researching about the Streaming API, there aren't any rate limits, because you make the request just once to open the connection; then you keep it open and you are sent a stream (hence the name) of tweets.
Once applications establish a connection to a streaming endpoint, they are delivered a feed of Tweets, without needing to worry about polling or REST API rate limits.
https://dev.twitter.com/docs/streaming-apis/streams/public
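To make the "one long-lived connection" point concrete, here is a rough Python sketch (using requests and requests_oauthlib, with placeholder credentials) that opens statuses/filter once and then just reads from the stream:
import requests
from requests_oauthlib import OAuth1  # placeholder credentials below

auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")

# Open the connection once; matching tweets are pushed to us as they happen,
# so there is no polling and no per-window request budget to manage.
resp = requests.post(
    "https://stream.twitter.com/1.1/statuses/filter.json",
    data={"track": "keyword1,keyword2"},
    auth=auth,
    stream=True,
)
for line in resp.iter_lines():
    if line:  # skip the keep-alive newlines
        print(line.decode("utf-8"))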
I have a Twitter app that works fantastically locally: it searches for keywords, then for each user it grabs their info using Hpricot to parse the XML, e.g.
Hpricot(open("http://twitter.com/users/show/"+myuser+".xml"))
Works fine locally, but when I go live it fails. Looking at my log I get this error:
OpenURI::HTTPError (400 Bad Request):
The weird thing is though, sometimes it works.
This has been a recurring problem for a few days now and driving me nuts. Will hug anyone with a solution :)
It's almost definitely rate limiting: http://apiwiki.twitter.com/HTTP-Response-Codes-and-Errors . I haven't seen 400s returned for anything other than the rate limit before, though the docs say there could be an accompanying message that tells you more exactly what's wrong.
You might be able to get whitelisted for more queries; see http://twitter.com/help/request_whitelisting .
You are probably making too many requests. You are allowed to make 150 REST API calls per hour unauthenticated.
REST API Rate Limiting
Unauthenticated calls are permitted 150 requests per hour. Unauthenticated calls are measured against the public facing IP of the server or device making the request.
OAuth calls are permitted 350 requests per hour and are measured against the oauth_token used in the request.
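For illustration (in Python rather than the question's Ruby), here is a sketch of the same users/show lookup signed with OAuth, which is metered against the 350/hour per-token allowance instead of the 150/hour per-IP one; the credentials are placeholders and the rate-limit header name is from memory:
import requests
from requests_oauthlib import OAuth1  # placeholder credentials below

auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")

# Signed with OAuth, this call is counted per token (350/hour) rather than
# per IP (150/hour for unauthenticated calls).
resp = requests.get(
    "https://api.twitter.com/1/users/show.json",
    params={"screen_name": "twitterapi"},
    auth=auth,
)
# The old v1 responses also reported the remaining allowance in headers
# (X-RateLimit-Remaining, if memory serves).
print(resp.status_code, resp.headers.get("X-RateLimit-Remaining"))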