OpenSearch document upload limit - quota

I can see that the API call for batch uploading documents in CloudSearch is limited to once every 10 seconds, which means data becomes searchable only after at least 10 seconds.
Is there any such limit on OpenSearch uploads as well?

Related

Using Google Sheets API v4 above the quota

I'm trying to find the pricing for the Google Sheets API. The limit is 300 requests/min per project, and there's also a per-user limit.
I've read that you can request a quota increase, but I'd rather pay under some pricing model so there's no cap on the number of requests. Is there any information on that?
Take sheetdb.io as an example: an app like that will need to send thousands of requests per minute, since it will be used by many users at the same time.
PS: I'm not looking for a BigQuery, optimization, or DBMS solution. The app is built around Google Sheets.
As per the documentation:
This version of the Google Sheets API has a limit of 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user. Limits for reads and writes are tracked separately. There is no daily usage limit.
To view or change usage limits for your project, or to request an increase to your quota, do the following:
If you don't already have a billing account for your project, then create one.
Visit the Enabled APIs page of the API library in the API Console, and select an API from the list.
To view and change quota-related settings, select Quotas. To view usage statistics, select Usage.
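For illustration, here is a minimal sketch of client-side throttling that stays under the 100-requests-per-100-seconds-per-user quota quoted above. It is written in Python; the `throttled` helper and the wrapped call are assumptions for the example, not part of the Sheets API itself.
import time
from collections import deque

# Quota quoted in the docs: 100 requests per 100 seconds per user.
MAX_REQUESTS = 100
WINDOW_SECONDS = 100

_timestamps = deque()  # send times of recent requests

def throttled(call, *args, **kwargs):
    """Block until a request can be sent without exceeding the per-user quota,
    then invoke `call` (e.g. a Sheets API request's execute())."""
    while True:
        now = time.monotonic()
        # Drop timestamps that have fallen out of the 100-second window.
        while _timestamps and now - _timestamps[0] > WINDOW_SECONDS:
            _timestamps.popleft()
        if len(_timestamps) < MAX_REQUESTS:
            break
        # Wait until the oldest request leaves the window.
        time.sleep(WINDOW_SECONDS - (now - _timestamps[0]) + 0.01)
    _timestamps.append(time.monotonic())
    return call(*args, **kwargs)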

Google Spreadsheet API: limit per user per 100 sec not reached but getting rate limited

I'm trying to work with the Google Spreadsheet API (v4), which states the following limitation:
This version of the Google Sheets API has a limit of 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user. Limits for reads and writes are tracked separately. There is no daily usage limit.
Somehow my code gets rate limited after 21-34 requests.
I am able to see the requests sent to the API in the Google API dashboard, and the numbers match 1 to 1. So I am sending fewer than 100 requests and I am still getting rate limited.
The endpoint that is failing is v4/spreadsheets/{id}. Maybe this has an influence, as I read that there was, for example, an undocumented rate limit on the creation of calendars via the Calendar API.
What could be the reason for that?
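Whatever the cause turns out to be, the usual way to cope with 429 responses from the Sheets API is to retry with exponential backoff, which is what Google's guidance generally recommends for quota errors. A minimal sketch, assuming the google-api-python-client library; the wrapped request and spreadsheet ID are placeholders:
import random
import time

from googleapiclient.errors import HttpError

def execute_with_backoff(request, max_retries=5):
    """Execute a googleapiclient request, retrying 429/5xx responses with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return request.execute()
        except HttpError as err:
            status = err.resp.status
            if status not in (429, 500, 503) or attempt == max_retries - 1:
                raise
            # Exponential backoff with jitter: ~1s, ~2s, ~4s, ...
            time.sleep(2 ** attempt + random.random())

# Hypothetical usage; `service` would come from googleapiclient.discovery.build("sheets", "v4", ...):
# sheet = execute_with_backoff(service.spreadsheets().get(spreadsheetId="SPREADSHEET_ID"))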

How to configure my application for a suitable AWS S3 storage plan?

My Ruby on Rails application lets users write short text messages and upload images.
I set a daily upload limit of 7 images per user, and I created a background job that runs once a month and deletes all images older than 30 days.
I also set a limit of 300 users for my application.
Even though an image rarely exceeds 1 MB in size (and is often far smaller), I would take 1 MB as the average image size, if you think this is a sensible approximation.
7 images per user means 7 x 300 = 2,100 images uploaded to AWS daily by all users.
2,100 daily images mean 2,100 x 30 = 63,000 images uploaded to AWS monthly.
63,000 images per month of 1 MB each require about 62 GB. I would take 100 GB as a safety margin.
I entered the following values in the Simple Monthly Calculator, in the Standard Storage section for US East (N. Virginia):
Storage: 100 GB
PUT/COPY/POST/LIST Requests: 70000 (instead of 63000)
GET and Other Requests: 2000000 (instead of 63000x30=1890000)
NOTE: as far as I understand, 1,890,000 corresponds to all users opening the application once a day over 30 days, each time asking AWS to serve their images. Is that correct?
What I get is an estimated monthly bill of $3.31: is this a sensible estimate?
If instead my estimate is wrong because I missed something, and considering that my monthly budget is between $5 and $10, how would you suggest I configure my application (number of users, number of images per day, and their size limit)?
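As a rough sanity check of the numbers above, here is a back-of-the-envelope calculation in Python. The per-unit prices are assumptions based on typical published S3 Standard rates for us-east-1 and will drift over time, so treat the result as an order-of-magnitude check, not a quote:
# Rough S3 Standard cost estimate (us-east-1); per-unit prices are assumed/illustrative.
STORAGE_PER_GB = 0.023   # USD per GB-month (assumed)
PUT_PER_1000 = 0.005     # USD per 1,000 PUT/COPY/POST/LIST requests (assumed)
GET_PER_1000 = 0.0004    # USD per 1,000 GET requests (assumed)

storage_gb = 100
put_requests = 70_000
get_requests = 2_000_000

cost = (
    storage_gb * STORAGE_PER_GB
    + put_requests / 1000 * PUT_PER_1000
    + get_requests / 1000 * GET_PER_1000
)
print(f"Estimated monthly cost: ${cost:.2f}")  # roughly $3.45 with these assumed rates

With these assumed rates the total lands in the same ballpark as the $3.31 the calculator reported, so the estimate looks internally consistent.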

Simultaneous connections to a spreadsheet?

I would like to use a spreadsheet to show data to 50,000 to 100,000 people at a time. Can anyone tell me how many people can download the JSON file of a spreadsheet at a time?
The Sheets API has a default limit of 40,000 queries per day.
You also have:
Write/Read requests per 100 seconds: 500
Write/Read requests per 100 seconds per user: 100
Write/Read requests per day: 40,000
As long as you don't exceed those, you'll be fine. However, if you go past the limit, you need to create a billing account so you can ask for additional quota.

Parse 100GB File Storage Limit

Hey guys, so I developed a social network on iOS and used Parse for the back end. Our app has taken off, and over 50,000 images have been posted in ten days. Aside from hitting the 600 req/sec API limit soon, it appears we might fill up the 100 GB storage limit even sooner. Does this limit (file storage) reset monthly, or once you hit 100 GB are you done? It seems like a tiny amount of storage for a PaaS company.
According to the Parse.com website, you receive 2 TB of file storage with any package, not 100 GB. If you're asking whether they give you an additional 2 TB each month, the answer is no. At the beginning of the next month you are still using the space; it does not reset (unlike, for example, bandwidth). This is the case with (probably) all cloud (SaaS, IaaS, PaaS, etc.) providers. You can increase the amount of file storage for 10c/GB per month.
As for database storage, it seems that 100 GB is the hard limit. Again, being storage, you do not get an extra 100 GB per month.
If your database is larger than 100 GB and you are hitting more than 600 req/sec (averaged over a minute, i.e. 36,000 req/min), then you may want to consider building your own infrastructure, perhaps on AWS or similar, so you can scale it properly. You may also want to consider moving your uploaded images out of the database if they are not already: DB storage is considerably more expensive than file storage, in terms of both cost and performance.
Parse.com has larger plans available - up to a point.
HOWEVER, if you are going to be doing 600 requests a second (wow, since that's about 50 million requests a day), you'll need to look at two possibilities:
You can keep your requests under this limit by using local caching, streamlined calls, etc. (see the caching sketch after this list).
You will eventually need to migrate off of the Parse ecosystem.
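To illustrate the local-caching idea in a language-agnostic way (sketched here in Python rather than the app's iOS/Parse stack), a small time-based cache in front of your backend calls means repeated views of the same data within a short window cost only one request. The fetch function and TTL below are assumptions for the example, not part of any Parse SDK:
import time

class TTLCache:
    """Tiny in-memory cache: reuse a result for `ttl` seconds before refetching."""
    def __init__(self, ttl=60):
        self.ttl = ttl
        self._store = {}  # key -> (expiry_time, value)

    def get(self, key, fetch):
        """Return the cached value for `key`, calling `fetch()` only when the entry is stale."""
        entry = self._store.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]
        value = fetch()  # e.g. a call to your backend / Parse REST API
        self._store[key] = (time.monotonic() + self.ttl, value)
        return value

# Hypothetical usage: the same feed requested twice within 60 seconds hits the backend once.
# cache = TTLCache(ttl=60)
# feed = cache.get("feed:user123", lambda: fetch_feed_from_backend("user123"))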
If memory serves, there used to be an option to get a custom plan with more requests/second. It seems to have disappeared from the Pricing page, to be replaced with this:
What is the cost for an app with a burst limit above 600 requests per second? What happens if I require more than 600 requests/second?
We do not provide custom plans for apps that require more than 600 requests per second.
UPDATE: It looks like there is also a hard limit of 100 GB of file storage...
The overage rate for database size is $10/GB, but we only allow increases in increments of 20 GB. When you exceed 20 GB of database size, we will increase your soft limit to 40 GB and begin charging you an incremental $200/month. When you hit your soft limit of 40 GB, we will increase your soft limit to 60 GB... and so on, up to a hard limit of 100 GB.
