We are using the YouTube Data API v3 in our production environment. All of a sudden, our quota limit dropped to 0 today, and we are no longer able to make any API calls.
Is there any way to get at least the default limit of 1,000 back? We have submitted the form to request a quota increase, but we are not sure how long that may take, and in the meantime our current users cannot use our service.
You did not mention whether your application has gone through the verification process yet. Assuming it has not: you are given a small amount of developer quota to use during the development process. If you do not verify your application in a timely manner, your quota will often be removed and set to 0.
I have also seen developers whose quota gets set to 0 because they were doing something that was not part of their original verification request.
You will need to request additional quota. It can take months; in my experience, about three months to get a quota extension.
Related
I got this error:
The request cannot be completed because you have exceeded your quota.
I cannot understand: does YouTube limit the number of requests? That is, can I not build my project on the API using my own channel? If so, what is the point of the YouTube Data API? If I am already limited at the development stage, what will happen when users come in? My project would hit the limit within 5 minutes.
I also cannot understand how I was able to make 10,000 requests per day, given that I only worked on localhost for about 3 hours. Is that possible?
Indeed, the Google Developers Console shows text like Queries per day, but that is quite misleading (and may well be worth reporting to Google as a Web UI bug).
You have to understand that the YouTube Data API's quota system does not count the number of endpoint calls you make during a day; it counts the cumulative number of quota units corresponding to each of your endpoint calls.
For example, if you have 10,000 units of quota allocated for daily usage, you can very easily exceed this upper bound after only 100 calls to the Search.list API endpoint, since each such call costs 100 units.
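To make the unit accounting concrete, here is a small sketch; the per-call costs are the ones Google's quota calculator published at the time of writing, so treat them as values to verify rather than constants:

```python
# Per-call unit costs from Google's quota calculator (verify before relying on them).
UNIT_COST = {
    "search.list": 100,     # each search costs 100 units
    "videos.list": 1,       # simple reads cost 1 unit
    "videos.insert": 1600,  # uploads are by far the most expensive
}

DAILY_QUOTA = 10_000

calls = {"search.list": 100}  # just 100 searches...
used = sum(UNIT_COST[endpoint] * n for endpoint, n in calls.items())
print(f"{used} of {DAILY_QUOTA} units used")  # 10000 of 10000, quota exhausted
```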
Many API users find the default quota allocation of 10,000 units quite constraining, even during the development stage of their apps. To tackle this issue, I recommend two things:
Develop your app to cache the API responses it receives from the endpoints it calls. That way, during the development stage (and, with a different logic, even in production), repeated calls to endpoints will not result in actual API requests but will be served from the app's local cache. A minimal sketch of this follows after these two points.
Apply for a quota extension using Google's official form. Be aware that, as per the experience of users of this forum, Google's answer usually does not arrive quickly.
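For illustration, a minimal file-based cache sketch in Python; the endpoint choice, the YT_API_KEY environment variable, and the helper name are mine for the example, not a prescribed design:

```python
import hashlib
import json
import os
import pathlib

import requests

CACHE = pathlib.Path(".yt_cache")
CACHE.mkdir(exist_ok=True)

def cached_get(url, params):
    # Key the cache on the full request, so distinct calls get distinct entries.
    key = hashlib.sha256(json.dumps([url, sorted(params.items())]).encode()).hexdigest()
    path = CACHE / f"{key}.json"
    if path.exists():                            # cache hit: zero quota spent
        return json.loads(path.read_text())
    resp = requests.get(url, params=params)      # cache miss: one real API call
    resp.raise_for_status()
    path.write_text(resp.text)
    return resp.json()

data = cached_get(
    "https://www.googleapis.com/youtube/v3/videos",
    {"part": "snippet", "id": "dQw4w9WgXcQ", "key": os.environ["YT_API_KEY"]},
)
```

With this in place, each distinct request spends quota exactly once, no matter how many times you re-run your code during development.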
If you set up OAuth for YouTube within your app so that users can upload videos, does each video count against your 10,000-point quota?
I run a personal uploading bot that does about three uploads per day within the 10,000-point quota, but if I were to scale out as an app, this would not work, since five users would max it out.
So if a user approves your app for upload permissions, does each upload count toward your project's 10,000 points, or is it 10,000 points per user per day?
Also, how easy is YouTube's quota-extension form process, if it is the former?
https://support.google.com/youtube/contact/yt_api_form?hl=en
By checking the quota calculator you will be able to see what each call costs. The videos.insert call, for example, costs 1,600 quota units, so a 10,000-unit daily quota allows at most six uploads per day.
If you check the Google developer console and look at your quota, it might look something like this.
As you can see, one of them states "per user" while the other does not.
Queries per day (10,000) is a project-based quota, while Queries per minute per user is a user-based quota.
It sounds like you should apply for a quota extension if the 10,000-unit limit is not enough for your needs.
Also, how easy is YouTube's quota-extension form process, if it is the former?
It is a long process. Google says it takes twenty days; in my experience it averages three to six months. You need to be prepared to get a no. You also need to be prepared to have your quota shut down suddenly if they detect something they identify as spam or a violation. In the event of a shutdown, you will need to apply for a new extension, which again will take time.
I went through the approval process of having my YouTube Data API quota (units/day) increased over 2 months ago, yet I am not seeing the increase within Google Cloud Platform. Here is a picture of the display I am talking about.
I am wondering if this is a bug within GCP, or potentially my quota was never even increased upon approval?
The same thing happened on a couple of my projects, but I could not find any public documentation. When I opened a ticket with the GCP team, they told me the same thing.
Quota increases work as follows:
Google verifies your need for a quota increase
Justification is needed for the quota increase
If Google increases your quota, the increase is not permanent
If your usage decreases for a certain period of time, the increased quota reverts to the default quota
To verify your situation, open an issue tracker case with the GCP team.
This question is regarding the Office 365 Management Activity API.
We are using the API to retrieve audit log notifications from multiple channels (AzureAD, Outlook, SharePoint, etc.) for very large tenants, meaning that we need to retrieve potentially millions of notifications over a relatively short timespan.
O365 gathers audit notifications into a series of "blobs", each containing a number of individual notifications (JSON messages). To my understanding, which comes in part from correspondence with the API's dev team and from reading the docs, these blobs should contain a "considerable" number of notifications, so as to function as a sort of batch approach for the actual web requests.
In our approach, we request blob URLs for an interval of one hour and then issue a request for each individual blob.
However, we have tested with a number of different tenants and different PublisherIdentifiers, but only seem to get around 2.5 messages per blob on average, no matter the total number of notifications "waiting" to be fetched.
This becomes a major issue for the larger tenants, as it puts a strain on the SIEM solution running the fetcher logic (a Python service) due to the number of requests needed, and it also gives us throttling issues with the API itself.
In effect, we simply cannot fetch the audit notifications fast enough to keep up within the retention period. Had the blobs contained more notifications each, we would be fine, as the total amount of data (in MBs) is not that large.
A "funny" thing is, that if we use the visual query tool within the Admin Center of the tenant, it searches and retrieves the notifications very fast.
My questions
Has anyone had any experience with this issue, or perhaps had a better "batch performance"?
Does anyone have any ideas as to what we could try to get a better performance?
As mentioned we have been in direct contact with the dev team and the program manager in Redmond. They have been very helpful with other issues we had, but they referred us to support for this specific issue - who in turn referred us to the forums / community. We currently do not have access to premium support...
Example request for content blobs for an hour
https://manage.office.com/api/v1.0/{tenantid}/activity/feed/subscriptions/content?contentType=Audit.Exchange&PublisherIdentifier={pub.id}&startTime=2017-12-03T10:31:24&endTime=2017-12-03T11:31:24
When retrieving the individual blobs, we just use the URLs given to us by the above request.
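For reference, here is a condensed sketch of the fetch loop described above (token acquisition is omitted; TOKEN is a placeholder for a valid bearer token):

```python
# Sketch: list the content blobs for a one-hour window, then fetch each blob.
import requests

TOKEN = "..."  # acquired via the usual AAD client-credentials flow
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = "https://manage.office.com/api/v1.0/{tenantid}/activity/feed"

# Step 1: list the available content blobs for the interval.
listing = requests.get(
    BASE + "/subscriptions/content",
    headers=HEADERS,
    params={
        "contentType": "Audit.Exchange",
        "PublisherIdentifier": "{pub.id}",
        "startTime": "2017-12-03T10:31:24",
        "endTime": "2017-12-03T11:31:24",
    },
)
listing.raise_for_status()

# Step 2: fetch each blob; each one is a JSON array of individual events.
for blob in listing.json():
    events = requests.get(blob["contentUri"], headers=HEADERS).json()
    print(len(events))  # averages only ~2.5 per blob in our tests
```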
You can avoid throttling by appending "?PublisherIdentifier={Tenant ID}" to the contentUri in the content retrieval GET request.
How can I add a PublisherId to a GetBlob call to the Office365 Rest API to avoid throttling?
I have been working with the Office 365 Management Activity API for the past 6 months, and I have faced this kind of issue before. It occurs if you try to get all of the audit log content from your Office 365 tenant at a given interval; that results in throttling. For your information, it is not possible to avoid throttling (resource overuse) entirely for large, active tenants.
To overcome these issues, you can create and deploy a web application in the cloud and register it with an Office 365 Management Activity API webhook.
Whenever the Office 365 tenant wraps activity logs into an Azure blob, it will immediately send the blob details to your registered web application. You can refer to this link to learn how to enable a webhook for a web application. Once you receive the blob details from the Office 365 tenant, extract the logs from the Azure blob and save them in your own blob storage, or store them in SQL / NoSQL databases.
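Here is a minimal sketch of such a receiver; Flask is used purely as an example framework, and the route, TOKEN, and store() helper are illustrative placeholders rather than anything mandated by the API:

```python
# Sketch of a webhook receiver for blob notifications (Flask, for illustration).
from flask import Flask, request
import requests

app = Flask(__name__)
TOKEN = "..."  # acquire via the AAD client-credentials flow in real code

@app.route("/o365/notifications", methods=["POST"])
def notifications():
    body = request.get_json(silent=True)
    # Subscription handshake: the API POSTs a body containing "validationCode"
    # once when the webhook is registered; replying 200 completes it.
    if isinstance(body, dict) and "validationCode" in body:
        return "", 200
    # Regular notifications arrive as a JSON array of blob descriptors.
    for blob in body or []:
        events = requests.get(
            blob["contentUri"],
            headers={"Authorization": f"Bearer {TOKEN}"},
        ).json()
        store(events)
    return "", 200

def store(events):
    """Placeholder: persist to blob storage / SQL / NoSQL, as suggested above."""

if __name__ == "__main__":
    app.run(port=8080)
```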
I had a similar issue. Pulling down the logs would take longer than the interval of time allotted to the Python script, so the script would start overlapping itself, or it would fall behind when trying to pull logs for a SIEM implementation.
https://github.com/IntegralDefense/o365_log_fetch
I'm a little late to this post, but by using asyncio in Python 3.5+ together with aiohttp, you can make concurrent calls to the O365 Management API and pull down the logs much faster. I performed some testing and retrieved logs for a 13-hour window (Audit.Exchange, Audit.AzureActiveDirectory, and Audit.Sharepoint). Using 'requests' and making the API calls sequentially took around 20 minutes; after implementing asyncio/aiohttp, the same time frame took just under 2 minutes (500,000+ individual events pulled from several thousand content blobs/locations).
I've been running the script in 10 minute intervals and usually the script completes in < 10 seconds.
The script I linked above also supports pagination, so if the content list in Microsoft's response is truncated, the script will keep reaching out and pulling down more content locations.
At this time, the documentation isn't up to speed, but hopefully that will be caught up soon.
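For illustration, here is a condensed sketch of the concurrent pattern (not the linked script itself; uris and TOKEN are assumed inputs, and asyncio.run requires Python 3.7+):

```python
# Fetch many content blobs concurrently instead of one at a time.
import asyncio

import aiohttp

TOKEN = "..."  # an already-acquired bearer token
uris = []      # fill with contentUri values from the content listing

async def fetch_blob(session, uri):
    async with session.get(uri, headers={"Authorization": f"Bearer {TOKEN}"}) as resp:
        return await resp.json()  # one blob = a JSON array of events

async def fetch_all(uris):
    async with aiohttp.ClientSession() as session:
        # All blob requests are in flight at once; gather preserves order.
        return await asyncio.gather(*(fetch_blob(session, u) for u in uris))

blobs = asyncio.run(fetch_all(uris))
events = [event for blob in blobs for event in blob]
```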
A few programmers and I want to create a new plugin for the YouTube add-on in Kodi (XBMC). For this project we need a huge increase of the quota, because we have at least 4 million users (per the download counter).
The link above points to the current plugin, which we would like to replace. With this plugin we always have problems with the quota (the plugin is blocked for at least 4 hours).
Which requirements are essential for the plugin to get a quota increase?
You'll need to fill out the form to apply for additional quota for the YouTube Data API v3.
Here's the full statement from the docs:
If you reach the quota limit, you can request additional quota on the Quotas tab in the Developer's Console.
Note that projects that had enabled the YouTube Data API before April 20, 2016, have a different default quota for that API.