So I'm aware that Twitter has a rate limit of 150 requests per hour.
But for some reason I keep getting the error from Twitter that I have reached my rate limit, which seems impossible considering how few requests I actually make.
I started monitoring the changes in hits left per hour and realized that it drops to 0 within half an hour or so. At that point I thought the problem was my website not having a dedicated IP, so I asked my hosting company to make that change.
However, even after my website moved to a dedicated IP, the remaining hits per hour still decrease at the same rate even when I'm not using it. I honestly have no idea why this is happening.
And an interesting thing:
I tried using the JavaScript code supplied by:
http://code.google.com/p/twitterjs/
and found that even after the limit reaches 0, it still seems able to load tweets.
Anyone know why this is happening?
Test page I was working on:
http://ice3studio.com/twitterTesting/
- 1st section in the white box is JS with PHP caching (which cannot grab the Twitter feed after the limit is reached)
- 2nd section is the JS code from Google Code
I am very new at this so I appreciate any help!
Thanks in advance :D
If the GET request is authenticated, the rate limit applies to the user; otherwise it applies to the IP.
Only GET requests have a rate limit. POST requests have no rate limit.
Twitter JS can load tweets because it runs on the client's end, and every client has a different IP. If you use this library with the same account, it will be rate limited because you are sending authenticated requests.
You can always whitelist your IP and account. That will increase your rate limit greatly.
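If you want to confirm which bucket is being drained (your IP or your account), a quick check of the rate limit status can help. Here is a minimal Python sketch, assuming the v1-era account/rate_limit_status endpoint and its remaining_hits/hourly_limit fields (these may differ in newer API versions); unauthenticated it reports the per-IP bucket, with OAuth it reports the per-account bucket:

```python
import requests

# Unauthenticated call: shows the rate-limit bucket for this machine's IP.
# Add OAuth credentials to see the per-account bucket instead.
# (Endpoint and field names assume the v1-era API; adjust for newer versions.)
URL = "https://api.twitter.com/1/account/rate_limit_status.json"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
status = resp.json()

print("hourly limit:  ", status.get("hourly_limit"))
print("remaining hits:", status.get("remaining_hits"))
print("resets at:     ", status.get("reset_time"))
```

Running this from the server while your site is idle should tell you whether something else on that IP is consuming the limit.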
I'm trying to collect and update data using the Business Information API.
In order to get the API calls to work, I'm only trying to get information about my business using GET requests. However, when calling several methods, I keep receiving the following error:
"Quota exceeded for quota metric 'Requests' and limit 'Requests per minute' ".
This happens both in Postman calls and in the OAuth 2.0 Playground (which, in my eyes, should be a sandbox ready for testing; very frustrating…).
When I look at my quota in the API settings, the requests per minute are set to '0' and I'm not even able to change them. This makes it really hard to test/use the API.
I can't even find out which categories there are for a business location…
For your information: I've already asked for an increase of the quota using the forms, but it seems Google isn't really responsive in this matter.
Can this be solved?
The API will be used to update a group of 50 (or more) locations, instead of bulk-editing with a CSV file.
Any help would be welcome.
Thanks in advance,
Kind Regards,
Seppe
If the quota approval form was ignored, you might still have a chance via API support (https://support.google.com/business/contact/api_default).
They might be reluctant to grant you a quota if your maximum location count is this low though - the API is designed for larger use cases.
Is it documented anywhere that it's meant for larger users? I got approved after being very clear it was only for a handful of locations.
BUT even though I got approved and have access, there are 3 specific quotas (all per-minute) that are set to zero, even though I have tonnes of allowance for all the non-per-minute quotas. Seems like a bug to me.
I can make 10000 "Update Location requests per day" but zero per minute.
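Until the quota is actually raised, it at least helps to recognise the quota rejection programmatically so it isn't mistaken for an auth or request error. A rough Python sketch, where the Business Information API URL, location ID and access token are placeholders for whatever call you're making:

```python
import requests

# Placeholder endpoint and token: substitute the actual Business Information
# API call and a valid OAuth 2.0 access token.
URL = "https://mybusinessbusinessinformation.googleapis.com/v1/locations/LOCATION_ID"
ACCESS_TOKEN = "ya29.placeholder"

resp = requests.get(
    URL,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)

if resp.status_code == 429:
    # A per-minute quota of 0 rejects every call with a quota error,
    # so retrying or backing off won't help until the quota itself is raised.
    err = resp.json().get("error", {})
    print("Quota exceeded:", err.get("message"))
else:
    resp.raise_for_status()
    print(resp.json())
```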
I am trying to fetch all the subscription IDs of a YouTube channel that has 100k+ subscribers. When fetching the first page of results, YouTube properly returns the total number of subscriptions and the next page token.
After a few hundred calls (because you can only fetch 50 results per call), the API no longer provides the nextPageToken, and the listing stops with only ~20k subscriptions listed.
I tried this on several big YouTube channels, and it's always the same behaviour when I reach around 20k subscriptions listed.
In the documentation I couldn't find anything about a limit on listing subscriptions...
Has anybody encountered the same issue? :-)
Thanks
I think you need to check for an error. If you have reached the quota limit, you should get an error when you try to make another request.
The daily quota is 50,000,000 units.
Depending on which part you request from subscriptions.list, some of them (like snippet) count double against the quota.
Math time
If you have 100,000 subscriptions and you have to fetch them in 50-subscription bites, that's going to take you 2,000 requests. Even if you are using one of the double-cost parts, that should still only be around 4,000 quota units.
I don't think the problem is quota; check to see if you are getting an error.
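To make that concrete, here is a rough Python sketch of the paging-plus-error-check idea, assuming an API key and the public subscriptions.list endpoint with a channelId filter (swap in mySubscribers=true plus OAuth if you're listing a channel's subscribers rather than its subscriptions):

```python
import requests

API_KEY = "YOUR_API_KEY"   # placeholder
CHANNEL_ID = "UC..."       # placeholder channel ID
URL = "https://www.googleapis.com/youtube/v3/subscriptions"

subscription_ids = []
page_token = None

while True:
    params = {
        "part": "id",        # cheapest part quota-wise
        "channelId": CHANNEL_ID,
        "maxResults": 50,    # API maximum per page
        "key": API_KEY,
    }
    if page_token:
        params["pageToken"] = page_token

    data = requests.get(URL, params=params, timeout=10).json()

    # Surface quota or other API errors instead of silently stopping.
    if "error" in data:
        raise RuntimeError(data["error"].get("message", "YouTube API error"))

    subscription_ids.extend(item["id"] for item in data.get("items", []))

    page_token = data.get("nextPageToken")
    if not page_token:
        break  # the API stopped handing out tokens (the behaviour described above)

print(len(subscription_ids), "subscriptions collected")
```

If the loop ends around ~20k with no error in the response, the cut-off isn't a quota problem.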
Googling found these, which might be related:
- Youtube api impossible to get all results
- YouTube api page tokens (possible hack)
I am not clear about what the Twitter rate limit, "350 requests per hour per access token/user", means. How are they limiting the requests? How much data can I get in one request?
The rate limits are based on request, not the amount of data (e.g. bytes) you receive. With that in mind, you can maximize requests by using the available parameters of the particular endpoint you're calling. I'll give you a couple examples to explain what I mean:
One way is to set count, if supported, to the highest available value. On statuses/home_timeline, you can max out count at 200. If you aren't using it now, you're getting the default of 20, which means that you would (theoretically) need 10 queries to get the same amount of data. More queries mean you eat up your rate limit faster.
Using statuses/home_timeline again, notice that you can page through data using since_id and max_id, as described in Working with Timelines. Essentially, you keep track of the tweets you already requested so you can save on requests by only getting the newest tweets.
Rate limits are in 15-minute windows, so you can pace your requests to minimize the chance of running out in any given time window.
Use a combination of Streams and requests, which increases your rate limit.
There are more optimizations like this that help you conserve your request limit, some more subtle than others. Looking at the rate limits per API, studying parameters, and thinking about how the API is used can help you minimize rate limit usage.
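As a concrete illustration of the count and since_id points above, here is a minimal sketch against the v1.1 statuses/home_timeline endpoint using Python's requests_oauthlib; the credentials are placeholders, and other endpoints may support different parameters:

```python
import requests
from requests_oauthlib import OAuth1

# Placeholder credentials: fill in your app's keys and the user's tokens.
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_SECRET")

URL = "https://api.twitter.com/1.1/statuses/home_timeline.json"

def fetch_newest(since_id=None):
    """One request, maxed out at 200 tweets, only newer than since_id."""
    params = {"count": 200}            # default is 20; 200 is the maximum
    if since_id:
        params["since_id"] = since_id  # skip tweets we've already seen
    resp = requests.get(URL, auth=auth, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

tweets = fetch_newest()
if tweets:
    newest_id = tweets[0]["id"]        # remember for the next call
    # later: fetch_newest(since_id=newest_id) still costs only one request
```

The first call costs one request for up to 200 tweets; subsequent calls with since_id only pull what's new, which is exactly the saving described above.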
I am a little confused on the Facebook rate limits and need some clarification.
To my knowledge, each application gets 100 million API calls per day, and 600 calls per second per access token.
According to Insights I am currently making about 500K calls per day in total for my application; however, I am receiving a large number of "Application request limit reached" errors. Also in Insights I see a table that has a column called "Fraction of Budget". Four of the endpoints listed there are over 100% (one is around 3000%).
Is Facebook also limiting per endpoint, and is there any way to make sure I don't receive these "Application request limit reached" errors? To my knowledge I'm not even close to the 100M API calls per day per application that Facebook lists as the upper limit.
EDIT: As a clarification, I am receiving error code 4 (API Too many calls), not error code 17 (API User too many calls). https://developers.facebook.com/docs/reference/api/errors/
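While waiting for a definitive answer, it can help to separate the two codes in your client and back off only when the app-level limit (code 4) trips. A rough Python sketch, with the Graph API call and token as placeholders:

```python
import time
import requests

# Placeholder Graph API call and token.
URL = "https://graph.facebook.com/me"
PARAMS = {"access_token": "APP_OR_USER_TOKEN"}

def call_with_backoff(max_retries=5):
    delay = 60  # start with a minute; app-level limits recover slowly
    for _ in range(max_retries):
        body = requests.get(URL, params=PARAMS, timeout=10).json()
        code = body.get("error", {}).get("code")
        if code == 4:        # application request limit reached
            time.sleep(delay)
            delay *= 2       # exponential backoff before retrying
            continue
        if code == 17:       # per-user request limit reached
            raise RuntimeError("User-level limit hit; spread calls across users")
        return body          # success, or some other error to handle upstream
    raise RuntimeError("Still rate limited after retries")

print(call_with_backoff())
```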
I'm using the Twitter search API (for example: http://search.twitter.com/search.rss?q=%23juventus&rpp=100&page=4)
I read the following at http://search.twitter.com/api/:
We do not rate limit the search API under ordinary circumstances, however we have put measures in place to limit the abuse of our API. If you find yourself encountering these limits, please contact us and describe your app's requirements.
The limit seems random: sometimes I can do 150 requests, sometimes 300; generally, after 5 minutes I can make more requests.
I was wondering if it is possible to do more requests.
They detect floods and throttle accordingly, rather than enforcing fixed published limits, which is why it appears random. It will also no doubt depend on load from other sources at the time.
If you need lots more, then they gave you the answer: contact them and describe why.
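Beyond contacting them, the usual defence is to pace requests and back off when a throttled response comes back. A minimal Python sketch using the search URL from the question; the retry delay is a guess based on the roughly five-minute recovery you observed, and since the exact status code Twitter returns when throttling isn't documented here, any non-200 is treated as a signal to slow down:

```python
import time
import requests

# The search URL and parameters from the question.
URL = "http://search.twitter.com/search.rss"
PARAMS = {"q": "#juventus", "rpp": 100, "page": 4}

def search_with_backoff(max_retries=5):
    delay = 60  # seconds; the block seems to clear after a few minutes
    for _ in range(max_retries):
        resp = requests.get(URL, params=PARAMS, timeout=10)
        if resp.status_code == 200:
            return resp.text
        # Throttled (or otherwise rejected): wait and retry, doubling the pause.
        time.sleep(delay)
        delay *= 2
    raise RuntimeError("Still throttled after retries")

print(search_with_backoff()[:200])
```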