Although there are some existing questions on this topic, I have yet to understand the math behind quota limits.
The Google Developer Console states the following:
Queries per day - 10,000
Queries per 100 seconds per user - 300,000
Queries per 100 seconds - 3,000,000
So:
A user can burn 300,000 / 100 = 3,000 units per second, which means the 10K daily units can be exhausted after ~3.3 seconds. What if the client has 4 users? Can they burn all the units in less than a second?
A client is allowed to burn 3M / 100 = 30,000 units per second, which already exceeds the daily 10K limit.
How can a client burn 3M units in a second if she only has 10K units per day?
Can someone help me understand all this magic, ideally someone from the #YouTubeDev team?
They reduced the daily quota limit from ~400,000 to 10,000 on 5 March 2020. It was 1,000,000 a few months ago and 10,000,000 a few years ago.
You can neither buy nor manually upgrade your daily limit. The only way I know of is requesting more quota through their forms.
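The apparent contradiction resolves once you treat the per-100-seconds numbers as short-term rate caps and the daily figure as the total budget: whichever limit is hit first applies. A quick sketch of that arithmetic, using the figures quoted in the question:

```python
# Quota figures quoted in the question above.
DAILY_LIMIT = 10_000
PER_USER_PER_100S = 300_000
PER_PROJECT_PER_100S = 3_000_000

# The per-100-seconds figures are burst/rate caps, not extra allowances:
per_user_rate = PER_USER_PER_100S / 100       # 3,000 units/second
per_project_rate = PER_PROJECT_PER_100S / 100 # 30,000 units/second

# A single user bursting at full rate exhausts the daily budget quickly;
# after that, every further call fails until the daily quota resets.
seconds_to_exhaust = DAILY_LIMIT / per_user_rate
print(round(seconds_to_exhaust, 2))  # 3.33
```

So yes, a client can in principle burn the whole daily budget in a few seconds (or faster with several users in parallel); the per-100-seconds caps only stop it from happening even faster than that.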
Related
If you set up OAuth for YouTube within your app so that users can upload videos, does each video count towards your 10,000-point quota?
I run a personal uploading bot and it does about three uploads per day within the 10,000-point quota, but if I were to scale out as an app this wouldn't work, since 5 users would max it out.
So if a user approves your app for upload permissions, does this count towards your client's 10,000 points, or is it 10,000 points per user per day?
Also, how easy is YouTube's quota expansion form process, if it is the former?
https://support.google.com/youtube/contact/yt_api_form?hl=en
By checking the quota calculator you will be able to see what each call costs. The videos.insert call, for example, costs 1,600 quota units.
If you open the Google Developer Console and check your quota, it might look something like this.
As you can see, one of them states "per user" while the other does not.
"Queries per day: 10,000" is a project-based quota, while "Queries per minute per user" is a user-based quota.
It sounds like you should be applying for a quota extension if the 10,000 limit is not enough for your needs.
Also how easy is YouTube's quota expansion form process if it is the former?
It's a long process. Google says it takes twenty days; in my experience it is three to six months on average. You need to be prepared to get a no. You also need to be prepared to have your quota shut down suddenly because they detect something they identify as spam or a violation. In the event of a shutdown you will need to apply for a new extension, which again will take time.
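To see why the project-based quota runs out so fast, here is a back-of-the-envelope sketch using the videos.insert cost from the quota calculator and the ~3 uploads per user per day figure from the question:

```python
# Quota costs as quoted from the YouTube API quota calculator above.
DAILY_QUOTA = 10_000        # project-wide daily quota
VIDEO_INSERT_COST = 1_600   # cost of one videos.insert call

# Uploads the whole project can make per day (ignoring other API calls):
max_uploads_per_day = DAILY_QUOTA // VIDEO_INSERT_COST
print(max_uploads_per_day)  # 6 uploads/day for the entire project, not per user

# With ~3 uploads per user per day, the project quota supports very few users:
uploads_per_user = 3
max_users = DAILY_QUOTA // (uploads_per_user * VIDEO_INSERT_COST)
print(max_users)  # 2
```

Since the 10,000 points are shared by every user of the project, scaling an upload app without a quota extension is effectively impossible.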
Expedia Hotel pricing queries are rate limited. Is the Amadeus API /shopping/hotel-offers - FIND HOTELS also rate limited in terms of number of API queries per hour?
In the Amadeus Self-Service APIs, the test environment has a limited number of free monthly calls per API.
The platform has the following rate limits: in the test environment you cannot do more than 1 request per 100ms (10 transactions per second per user) and in production no more than 1 request per 50ms (20 transactions per second per user).
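If you want to stay under those caps client-side, one simple approach (a sketch of my own, not part of any Amadeus SDK) is to enforce a minimum delay between your own requests:

```python
import time

class Throttler:
    """Minimal client-side rate limiter: enforces a minimum gap between calls."""
    def __init__(self, max_tps):
        self.min_interval = 1.0 / max_tps  # e.g. 10 TPS -> 0.1 s between calls
        self._last = None

    def wait(self):
        """Block until it is safe to issue the next request."""
        now = time.monotonic()
        if self._last is not None:
            remaining = self.min_interval - (now - self._last)
            if remaining > 0:
                time.sleep(remaining)
        self._last = time.monotonic()

# Limits quoted above: 10 TPS per user in test, 20 TPS per user in production.
test_env = Throttler(max_tps=10)
prod_env = Throttler(max_tps=20)
```

Before each API call you would invoke test_env.wait() (or prod_env.wait()), which sleeps just long enough to keep you under the per-user transaction rate.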
There is a monthly usage cap of 1 million minutes per month. Does that mean that if my users' (of my project/app which is using this API) average usage is 60 minutes (1 hour) per day => 1,800 minutes per month, then I can have a maximum of ~555 (1,000,000 / 1,800) users for my app? If this is right, then my user base can never grow beyond ~555, right?
Is my calculation right, or did I misread something in the document?
https://cloud.google.com/speech/
In the doc there is:
Processing per day - 480 hours of audio
"These limits apply to each Cloud Speech API developer project, and are shared across all applications and IP addresses using a given developer project."
So you might have a problem here.
Do all your users really use it for 1 hour every day?
If yes, you still have the possibility to send a quota request, as described on this page: https://cloud.google.com/speech/pricing
Good luck!
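As a sanity check on the arithmetic (assuming 60 minutes of audio per user per day and a 30-day month), both the monthly cap and the daily 480-hour cap bound the user base, and the smaller bound wins:

```python
# Figures from the Cloud Speech docs quoted above.
MONTHLY_CAP_MINUTES = 1_000_000
DAILY_CAP_MINUTES = 480 * 60          # "Processing per day - 480 hours of audio"

minutes_per_user_per_day = 60         # assumption from the question
minutes_per_user_per_month = minutes_per_user_per_day * 30  # 1,800

max_users_monthly = MONTHLY_CAP_MINUTES // minutes_per_user_per_month  # 555
max_users_daily = DAILY_CAP_MINUTES // minutes_per_user_per_day        # 480

# The daily cap is the tighter constraint here:
print(min(max_users_monthly, max_users_daily))  # 480
```

So under these assumptions the daily 480-hour limit, not the monthly cap, is what actually caps the user base, unless a quota increase is granted.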
My Ruby on Rails application offers users to write short text messages and upload images.
I set a daily upload limit of 7 images per user, and I created a background job that runs once a month and deletes all images older than 30 days.
I also set a limit of 300 users for my application.
Even though an image rarely exceeds 1 MB in size (and is often far smaller), I would take 1 MB as the average image size, if you think this is a sensible approximation.
7 images per user means 7 × 300 = 2,100 images uploaded daily to AWS by all users.
2,100 daily images means 2,100 × 30 = 63,000 images uploaded to AWS per month.
63,000 images per month of 1 MB each require about 62 GB. I would take 100 GB as a safety margin.
I entered the following values in the Simple Monthly Calculator, in the standard storage section for US-East (Virginia):
Storage: 100 GB
PUT/COPY/POST/LIST Requests: 70,000 (instead of 63,000)
GET and Other Requests: 2,000,000 (instead of 63,000 × 30 = 1,890,000)
NOTE: as far as I understand, 1,890,000 represents the monthly GET requests if all users open the application once a day for 30 days, each time asking AWS to serve their images. Is that correct?
What I get is an estimated monthly bill of $3.31: is this a sensible estimate?
If instead my estimate is wrong, because I missed some point, and considering that my monthly budget can be between $5 and $10, how would you suggest I configure my application (number of users, number of images per day, and their size limit)?
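The calculator's result can be reproduced by hand. This sketch uses the question's padded inputs and assumed S3 Standard (us-east-1) prices of roughly $0.023 per GB-month, $0.005 per 1,000 PUT-class requests, and $0.0004 per 1,000 GET requests; check the current AWS price list, as these change:

```python
# Assumed S3 Standard (us-east-1) prices - verify against the current
# AWS price list before relying on these numbers.
STORAGE_PER_GB = 0.023   # $ per GB-month
PUT_PER_1000 = 0.005     # $ per 1,000 PUT/COPY/POST/LIST requests
GET_PER_1000 = 0.0004    # $ per 1,000 GET requests

# Padded estimates from the question above.
storage_gb = 100
put_requests = 70_000
get_requests = 2_000_000

total = (storage_gb * STORAGE_PER_GB
         + put_requests / 1000 * PUT_PER_1000
         + get_requests / 1000 * GET_PER_1000)
print(round(total, 2))  # 3.45
```

With the padded inputs this lands around $3.45 per month, in the same ballpark as the calculator's $3.31, so the estimate looks sensible and well within a $5-$10 budget.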
I would like to use a spreadsheet to show data to 50,000 to 100,000 people at a time, so can anyone tell me how many people can download the JSON file of a spreadsheet at a time?
The Sheets API has a default limit of 40,000 queries per day.
You also have:
Write/Read requests per 100 seconds - 500
Write/Read requests per 100 seconds per user - 100
Write/Read requests per day - 40,000
As long as you don't exceed those, you'll be fine. However, if you go past the limit, you need to create a billing account so you can ask for additional quota.
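A back-of-the-envelope sketch, assuming each viewer triggers one read request, shows whether the 50,000-viewer scenario fits within those quotas:

```python
# Sheets API default quotas quoted above.
READS_PER_DAY = 40_000
READS_PER_100S = 500

SECONDS_PER_DAY = 86_400
# Maximum reads the per-100-seconds cap would allow over a full day:
rate_bound = READS_PER_100S * (SECONDS_PER_DAY // 100)  # 500 * 864 = 432,000
# The binding limit is whichever cap is smaller:
effective_daily = min(READS_PER_DAY, rate_bound)        # 40,000

viewers = 50_000  # one read request per viewer, per the question
print(effective_daily >= viewers)  # False: the daily cap is exceeded
```

So 50,000 direct API reads per day would already blow past the default daily quota; serving that many viewers would mean requesting more quota or caching the JSON behind your own server so viewers don't each hit the API.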