What are Travis-CI API rate limits - travis-ci

Since an attack on its API, Travis CI has introduced limits on its REST API. We have a monitoring system that queries this API for the statuses of some projects, and it is now hitting the ceiling. This also has an impact on the web interface.
What are the limits?

Currently, the limit is 10/hour on .org, and 50/hour on .com.

As of August 2017, the limits on .org are as follows (as coded in https://github.com/travis-ci/travis-api/blob/master/lib/travis/api/attack.rb; a simplified sketch of rules in this shape follows the list):
Blacklist:
at most 2 authentications (POST /auth/github) within 5 minutes, otherwise banned for 5 hours
at most 10 POST requests within 30 seconds, otherwise banned for 1 hour
Throttling:
1 authentication per minute
500 requests per minute when not authenticated
2000 requests per minute when authenticated with a GitHub token
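That file expresses rules of this shape with Rack::Attack. Below is a minimal sketch of what such rules look like, assuming Rack::Attack; the names and exact conditions are illustrative, not the real travis-api configuration:

```ruby
# Simplified sketch of Rack::Attack rules shaped like the ones listed above;
# the real configuration lives in lib/travis/api/attack.rb.
class Rack::Attack
  # Throttling: 500 requests per minute per IP when not authenticated.
  throttle("requests/unauthenticated", limit: 500, period: 60) do |req|
    req.ip unless req.env["HTTP_AUTHORIZATION"]
  end

  # Blacklist: more than 10 POST requests within 30 seconds bans the IP for 1 hour.
  # (Older rack-attack versions call this `blacklist`.)
  blocklist("posts/allow2ban") do |req|
    Rack::Attack::Allow2Ban.filter(req.ip, maxretry: 10, findtime: 30, bantime: 3600) do
      req.post? # every POST counts towards the ban threshold
    end
  end
end
```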

Related

Need clarification on how the Google Sheets API limits are applied

From the Usage Limits help page:
This version of the Google Sheets API has a limit of 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user.
Let’s take it apart:
500 requests per 100 seconds per project - This is applied to my project. I use my project credentials to make each request.
100 requests per 100 seconds per user - When I make a request, I also include the OAuth token of a user that permitted me to update their workbook.
Question about the per-user part:
Is there anything the user can do themselves (like reach out to Google) to increase the quota just for them? Or is it me who needs to talk to Google to increase the quota for all users of my project simultaneously?
Thanks!
500 requests per 100 seconds per project - This is applied to my project. I use my project credentials to make each request.
100 requests per 100 seconds per user - When I make a request, I also include the OAuth token of a user that permitted me to update their workbook.
As you can see, there are two types of quotas. Project-based quotas are applied to your project as a whole. User-based quotas are applied to the individual users of your project / application.
Project-based quotas can be extended: you can apply for an extension, and Google may grant it, which will increase the number of requests your project as a whole can make.
User-based quotas are more like flood protection: they ensure that a single user of your application cannot make too many requests at once and flood the server. User-based quotas cannot be extended.
Is there anything the user can do themselves (like reach out to Google) to increase the quota just for them? Or is it me who needs to talk to Google to increase the quota for all users of my project simultaneously?
To answer your question: there is nothing the user can do to increase the quota. This is your project, and only you can request an increase of the project-based quota.
There is nothing you can do to increase the user-based quotas.
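What you can do on your side is back off and retry when a per-user quota error comes back (the Sheets API surfaces it as HTTP 429). A minimal sketch using a plain HTTP client; the URL and token here are placeholders, not real values:

```ruby
require "json"
require "net/http"
require "uri"

# Minimal sketch: retry a Sheets API read with exponential backoff when the
# per-user quota is exceeded (HTTP 429). URL and token are placeholders.
def get_with_backoff(url, token, max_attempts: 5)
  uri = URI(url)
  max_attempts.times do |attempt|
    response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
      request = Net::HTTP::Get.new(uri)
      request["Authorization"] = "Bearer #{token}"
      http.request(request)
    end
    return JSON.parse(response.body) if response.code.to_i == 200
    raise "Request failed: #{response.code}" unless response.code.to_i == 429

    sleep(2**attempt) # back off 1 s, 2 s, 4 s, 8 s, ... before retrying
  end
  raise "Gave up after #{max_attempts} attempts"
end
```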

Does Google charge for API requests made to Google Sheets, Docs, Calendar etc.?

I just wanted to know whether Google charges us for using the APIs for Sheets, Docs, Calendar etc. When I search for the Sheets API, I do see information about request limitations but nothing about pricing.
For Docs I see "All the API are free of charge", but I could not find the equivalent statement for Google Sheets.
So rather than asking only about Google Sheets, I just wanted to know whether Google charges for using the APIs of all of these applications (Sheets, Docs, Profile, Calendar etc.).
The Sheets API is free of charge if you don't exceed the limit of 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user.
The same goes for the Docs API, if the limits below are respected:
Read requests:
3000 per project per 60 seconds;
300 per user per 60 seconds;
Write requests:
600 per project per 60 seconds;
60 per user per 60 seconds;
The Calendar API has the same policy: it is free of charge if the limit of 1,000,000 queries per day is not exceeded.
Essentially, you can check the status of your quota by accessing the Google Cloud Project which uses the API in question and checking the Quotas section.
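If you want to stay under a per-user cap proactively rather than reacting to errors, you can pace requests on the client side. A minimal sketch of such a pacer; the class name is made up, and the defaults mirror the Docs per-user write limit above (60 requests per 60 seconds):

```ruby
# Minimal sketch of a client-side pacer that keeps a single user under a
# fixed number of requests per rolling window (here 60 per 60 seconds).
class RequestPacer
  def initialize(limit: 60, window: 60)
    @limit = limit      # maximum requests allowed per window
    @window = window    # window length in seconds
    @timestamps = []    # times of requests made within the current window
  end

  # Blocks until one more request can be made without exceeding the cap.
  def wait_turn
    now = Time.now
    @timestamps.reject! { |t| now - t > @window }
    if @timestamps.size >= @limit
      sleep(@window - (now - @timestamps.first))
      @timestamps.shift
    end
    @timestamps << Time.now
  end
end
```

Each outgoing write would then call wait_turn on a shared pacer before hitting the API.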
Reference
Sheets API Usage Limits;
Docs API Usage Limits;
Calendar API Usage Limits.
Update (22 November 2022):
Now I see "All use of the Google Sheets API is available at no additional cost. Exceeding the quota request limits doesn't incur extra charges and your account is not billed.".
Please update your information.
https://developers.google.com/sheets/api/limits#pricing

Handle status 429 in Rails API

I built a Twitter clone using a Rails API + React, just for study purposes.
The request logic is quite simple: click on a user, then load their information and tweets by requesting the API. However, if I do this quickly, say 3 times in a row, I receive status 429 (Too Many Requests) with the header Retry-After: 5.
Is there a way to increase the number of requests allowed in a given time window? What would be the correct approach to handle this in such a common situation?
From my understanding, the error information you have shown is correct. It means the request cannot be served because the application's rate limit has been reached for that resource.
Rate limits are divided into 15 minute intervals. All endpoints require authentication, so there is no concept of unauthenticated calls and rate limits.
To overcome this situation, here is an example from the documentation itself.
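Separately from the quoted documentation: if the 429 comes from your own Rails stack rather than an external API, the throttle that produces it (commonly Rack::Attack, which also sets the Retry-After header) can simply be given a larger allowance. A minimal sketch, assuming Rack::Attack is the middleware in use; the path, limit and period are illustrative:

```ruby
# config/initializers/rack_attack.rb
# Sketch only: the route prefix and the numbers below are illustrative.
class Rack::Attack
  # Allow each client IP 30 requests every 10 seconds against the API,
  # instead of a stricter default.
  throttle("api/requests-by-ip", limit: 30, period: 10) do |req|
    req.ip if req.path.start_with?("/api")
  end
end
```

On the client side, the React app can also read the Retry-After header and retry after the indicated number of seconds instead of surfacing an error.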

Using goo.gl url shortener without api key. What quota limits are there?

Goo.gl nicely mentions that you have a limit of 1,000,000 requests per day when using an API key.
https://developers.google.com/url-shortener/v1/getting_started
Quotas:
By default, your registered project gets 1,000,000 requests per day for the URL Shortener API (see the Developers console for more details).
I can't find what the quota limits are when you don't use an API key.
The reason for this is that I could use the API key, but the server(s) will be deployed among various clients, communicating in uncertain conditions where the API key could be sniffed. Aside from that, OAuth would require user interaction, which wouldn't be acceptable in an automated process (which may not even have a UI).
https://support.google.com/cloud/answer/6158857?hl=en
My expected usage would be maybe 10-20 requests per minute during peak hours, or at most 1000 a day. Would this hit goo.gl's non-API limits?

getting all tweets of a twitter user, rate limit problem

I've been trying to get all tweets of some public (unlocked) Twitter user.
I'm using the REST API:
http://api.twitter.com/1/statuses/user_timeline.json?screen_name=andy_murray&count=200&page=1
I go over the 16 pages (page param) it allows, getting 3,200 tweets, which is OK.
BUT then I discovered the rate limit for such calls is 150 per hour(!!!), meaning fewer than 10 user queries per hour (16 pages each). (350 are allowed if you authenticate, which is still a very low number.)
Any ideas on how to solve this? The streaming/search APIs don't seem appropriate(?), and there are some web services out there that do seem to have this data.
Thanks
You can either queue up the requests and make them as the rate limit allows, or you can make authenticated requests as multiple users. Each user has 350 requests/hour.
One approach would be to use the streaming API (or perhaps the more specific user streams, if that's better suited to your application) to start collecting all tweets as they occur from your target user(s) without having to bother with the traditional rate limits, and then use the REST API to backfill those users' historical tweets.
Granted, you only have 350 authenticated requests per hour, but if you run your harvester around the clock, that's still 1,680,000 tweets per day (350 requests/hour * 24 hours/day * 200 tweets/request).
So, for example, if you decided to pull 1,000 tweets per user per day (5 API calls × 200 tweets per call), you could run through 1,680 user timelines per day (70 timelines per hour). Then, on the next day, begin where you left off by harvesting the next 1,000 tweets using the oldest status ID per user as the max_id parameter in your statuses/user_timeline request.
The streaming API will keep you abreast of any new statuses your target users tweet, and the REST API calls will pretty quickly, in about four days, start running into Twitter's fetch limit for those users' historical tweets. After that, you can add additional users to fetch going forward from the streaming endpoint by adding them to the follow list, and you can stop fetching historical tweets for those users that have maxed out, and start fetching a new target group's tweets.
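A minimal sketch of that backfill loop, using the same v1 user_timeline endpoint as in the question (authentication is omitted, and backfill_user with its parameters is just an illustrative name):

```ruby
require "json"
require "net/http"
require "uri"

# Sketch of backfilling one user's timeline: up to `requests_per_user` calls
# of 200 tweets each, paginating backwards with max_id.
def backfill_user(screen_name, requests_per_user: 5)
  tweets = []
  oldest_id = nil

  requests_per_user.times do
    params = { screen_name: screen_name, count: 200 }
    params[:max_id] = oldest_id - 1 if oldest_id # resume below the oldest tweet seen so far

    uri = URI("https://api.twitter.com/1/statuses/user_timeline.json")
    uri.query = URI.encode_www_form(params)
    page = JSON.parse(Net::HTTP.get(uri))
    break if page.empty?

    tweets.concat(page)
    oldest_id = page.map { |tweet| tweet["id"] }.min
  end

  tweets
end
```

On the next day, the stored oldest ID per user is where the harvest picks up again, exactly as described above.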
The Search API would seem to be appropriate for your needs, since you can search on screen name. The Search API rate limit is higher than the REST API rate limit.
