Microsoft Graph throttling Excel updates - microsoft-graph-api

I am developing a Node.js app which connects to the Microsoft Graph API.
Often, I get back a 429 status code, which is described as "Too Many Requests" in the error documentation.
Sometimes the message returned is:
TooManyRequests. Client application has been throttled and should not attempt to repeat the request until an amount of time has elapsed.
Other times, it returns:
"TooManyRequests. The server is busy. Please try again later.".
Unfortunately, it does not return a Retry-After field in the headers, even though Microsoft's best-practices documentation claims it should.
This is entirely in development, and I have not been hitting the service much - it has all been during debugging. I realize Microsoft often changes how this works. I just find it difficult to develop an app around a service that does not even provide a Retry-After field and seems to have a lot of problems (I am using the v1.0 endpoint).
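In the absence of a Retry-After header, the usual client-side workaround is exponential backoff with jitter. A minimal Node.js sketch (requestWithBackoff and doRequest are illustrative names, not Graph SDK APIs; doRequest stands in for whatever HTTP client you use):

```javascript
// Exponential backoff with full jitter for 429 responses that lack a
// Retry-After header. `doRequest` should resolve to an object with a
// `status` property (and optionally lower-cased `headers`).
function backoffDelayMs(attempt, baseMs = 1000, capMs = 60000) {
  // Full jitter: random delay between 0 and min(cap, base * 2^attempt)
  const ceiling = Math.min(capMs, baseMs * 2 ** attempt);
  return Math.floor(Math.random() * ceiling);
}

async function requestWithBackoff(doRequest, maxAttempts = 6) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await doRequest();
    if (res.status !== 429) return res;
    // Honor Retry-After if the service ever sends it; otherwise back off.
    const retryAfter = res.headers && res.headers['retry-after'];
    const delay = retryAfter ? Number(retryAfter) * 1000 : backoffDelayMs(attempt);
    await new Promise((resolve) => setTimeout(resolve, delay));
  }
  throw new Error('Still throttled after retries');
}
```

This is only a sketch of the retry pattern; the attempt count and base delay would need tuning against the actual service behaviour.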
When I wait 5 minutes (as I have seen recommended), the service still errors out. Here is an example return response:
{
  "error": {
    "code": "TooManyRequests",
    "message": "The server is busy. Please try again later.",
    "innerError": {
      "request-id": "d963bb00-6bdf-4d6b-87f9-973ef00de211",
      "date": "2017-08-31T23:09:32"
    }
  }
}
Could this relate at all to the operation being carried out?
I am updating the range A2:L3533, all text values. I am wondering if this could impact the throttling. I have not found any guidance about using "smaller" operation sets.

Without seeing your code, it is hard to diagnose exactly what is going on. That said, your range here is enormous and will almost certainly cause issues.
From the documentation:
Large Range implies a Range of a size that is too large for a single API call. Many factors such as number of cells, values, numberFormat, and formulas contained in the range can make the response so large that it becomes unsuitable for API interaction. The API makes a best attempt to return or write to the requested data. However, the large size involved might result in an API error condition because of the large resource utilization.
To avoid this, we recommend that you read or write for large Range in multiple smaller range sizes.
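In practice, that means splitting the A2:L3533 write into several smaller PATCH calls. A sketch of the chunking logic (patchRange is a placeholder for your Graph client call, not a real SDK function; the row math assumes data starts at row 2 and spans columns A to L):

```javascript
// Split a block of row values into smaller ranges so each PATCH
// stays well under the "large range" threshold.
function chunkRanges(values, startRow, lastCol, chunkSize) {
  const chunks = [];
  for (let i = 0; i < values.length; i += chunkSize) {
    const rows = values.slice(i, i + chunkSize);
    const first = startRow + i;
    const last = first + rows.length - 1;
    chunks.push({ address: `A${first}:${lastCol}${last}`, values: rows });
  }
  return chunks;
}

// Usage sketch: PATCH each chunk sequentially so only one request is
// in flight at a time (parallel writes invite more throttling).
async function writeInChunks(patchRange, values) {
  for (const chunk of chunkRanges(values, 2, 'L', 100)) {
    await patchRange(chunk.address, chunk.values); // placeholder client call
  }
}
```

Writing the chunks one at a time, rather than in parallel, also keeps the per-minute request rate down.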

Related

Quota exceeded for quota metric 'Requests' and limit 'Requests per minute' of service 'mybusinessbusinessinformation.googleapis.com' for consumer

I'm trying to collect and update data using the Business Information API.
In order to get the API calls to work, I'm only trying to get information from my business using GET requests. However, when calling several methods, I keep receiving the following error:
"Quota exceeded for quota metric 'Requests' and limit 'Requests per minute' ".
This happens both in Postman calls and in the OAuth 2.0 Playground (which, in my eyes, should be a sandbox ready for testing - very frustrating…).
When I look at my quota in the API settings, I'm not even able to change the requests per minute to anything other than '0'. This makes it really hard to test/use the API.
I can't even find out which categories there are for a business location… 
For your information: I've already asked for an increase of the quota using the forms, but it seems Google isn't very responsive in this matter.
Can this be solved?
The API will be used to update a group of 50 (or more) locations, instead of bulk-editing with a CSV file.
Any help would be welcome.
Thanks in advance,
Kind Regards,
Seppe
If the quota approval form was ignored, you might still have a chance via the API support (https://support.google.com/business/contact/api_default).
They might be reluctant to grant you a quota if your maximum location count is this low, though - the API is designed for larger use cases.
Is it documented anywhere that it's meant for larger users? I got approved being very clear it was only for a handful of locations.
BUT even though I got approved and have access there are 3 specific quotas (all per-minute) that are set to zero, even though I have tonnes of allowance for all the non-per-minute quotas. Seems like a bug to me.
I can make 10000 "Update Location requests per day" but zero per minute.

Throttling of OneNote (Graph) API

We have developed an importing solution for one of our clients. It parses and converts data contained in many OneNote notebooks, to required proprietary data structures, for the client to store and use within another information system.
There is a substantial amount of data across many notebooks, requiring a considerable number of Graph API queries to retrieve it all.
In essence, we built a bulk-importing (batch process, essentially) solution, which goes through all OneNote notebooks under a client's account, parses sections and pages data of each, as well as downloads and stores all page content - including linked documents and images. The linked documents and images require the most amount of Graph API queries.
When performing these imports, the Graph API throttling issue arises. After a certain time, even though we are sending queries at a relatively low rate, we start getting 429 errors.
Regarding data volume, the average section size of a client notebook is 50-70 pages. Each page contains links to about 5 documents for download, on average. Thus, it requires up to 70 + 350 = 420 requests to retrieve all the page content and files of a single notebook section. Our client has many such sections per notebook, and in turn there are many notebooks.
In total, there are approximately 150 such sections across several notebooks that we need to import for our client. Considering the stats above, this means that our import needs to make a total of 60000-65000 Graph API queries, estimated.
To avoid flooding the Graph API service and stay within the throttling limits, we have experimented a lot and gradually decreased our request rate to just 1 query every 4 seconds - at most 900 Graph API requests per hour.
This already makes each section import noticeably slow - but it is endurable, even though it means that our full import would take up to 72 continuous hours to complete.
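Throttling logic like the one described (at most one request every 4 seconds) can be sketched as a simple paced runner. This is an illustrative Node.js sketch of the pacing idea only, not the asker's actual code; the gap value is a parameter:

```javascript
// Paced execution: run tasks one at a time with a fixed minimum gap
// between starts, so the rate never exceeds 1 request per `gapMs`.
function makePacedRunner(gapMs) {
  let nextSlot = 0; // earliest timestamp the next task may start
  return async function run(task) {
    const now = Date.now();
    const wait = Math.max(0, nextSlot - now);
    nextSlot = now + wait + gapMs; // reserve the following slot
    if (wait > 0) await new Promise((resolve) => setTimeout(resolve, wait));
    return task();
  };
}
```

With gapMs set to 4000, awaiting each call through the runner caps the rate at the 900-requests-per-hour figure mentioned above.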
However - even with our throttling logic at this rate implemented and proven working, we still get 429 "too many requests" errors from the Graph API after about 1 hr 10 min, roughly 1100 consecutive queries. As a result, we are unable to proceed with our import of the remaining, unfinished notebook sections. We can only import a few sections consecutively, then have to wait some random while before manually attempting to continue the import.
So this is the problem we seek help with - especially from Microsoft representatives. Can Microsoft provide a way for us to perform this import of 60,000-65,000 pages and documents at a reasonably fast query rate, without getting throttled, so we could get the job done in one continuous batch process for our client? For example, a separate access point (dedicated service endpoint), perhaps time-constrained - e.g., configured for our use within a certain period - so we could perform all the necessary imports within that period?
For additional information - we currently load the data using the following Graph API URL-s (placeholders of actual different values are brought in uppercase letters between curly braces):
Pages under the notebook section:
https://graph.microsoft.com/v1.0/users/{USER}/onenote/sections/{SECTION_ID}/pages?...
Content of a page:
https://graph.microsoft.com/v1.0/users/{USER}/onenote/pages/{PAGE_ID}/content
A file (document or image) eg link from the page content:
https://graph.microsoft.com/v1.0/users/{USER}/onenote/resources/{RESOURCE_ID}/$value
Which call is most likely to cause the throttling?
What can you retrieve before throttling - just page IDs (150 calls total) or page IDs plus content (10,000 calls)? If the latter, can you store the results (e.g. in a SQL database) so that you don't have to make these calls again?
If you can get page IDs plus content, can you then access the resources using preAuthenticated=true? (Maybe this is less likely to be throttled.) I don't actually download images, as I usually deal with ink or print.
I find the OneNote API is very sensitive to multiple calls made without waiting for them to complete; I find more than 12 simultaneous calls via a curl multi technique problematic. Once you get throttled, if you don't back off immediately you can be throttled for a long, long time. I usually have my scripts bail if I get too many 429s in a row (I have it set to bail for 10 minutes after 10 consecutive 429s).
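The back-off-and-bail logic described above can be sketched like this (thresholds taken from the answer; makeThrottleGuard is an illustrative helper name, not part of any SDK):

```javascript
// Track consecutive 429s; once a threshold is hit, tell the caller to
// stop and wait out a long cool-down rather than hammering the API.
function makeThrottleGuard(maxConsecutive = 10, cooldownMs = 10 * 60 * 1000) {
  let consecutive = 0;
  return {
    // Returns true when the caller should pause for `cooldownMs`.
    record(status) {
      consecutive = status === 429 ? consecutive + 1 : 0;
      return consecutive >= maxConsecutive;
    },
    get consecutive429s() { return consecutive; },
    cooldownMs,
  };
}
```

A caller would invoke record() with each response status and sleep for cooldownMs whenever it returns true.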
We now have the solution released and working in production. It turns out that adding ?preAuthenticated=true to the page requests indeed returns the page content with resource links (for contained documents and images) in a different format. Querying these resource links then appears not to impact the API throttling counters - we've had no 429 errors since.
We even managed to bring the interval between calls down from 4 seconds to 2 without any problems. So I have marked codeeye's answer as the accepted one.

How to Change Microsoft Graph requests from throttling Excel requests

I am calling the Microsoft Graph REST API from Node.js (JavaScript). Even a GET operation for a single empty cell comes back with a 429 status code and a "TooManyRequests - The server is busy. Please try again later." error. Another SO question [ Microsoft Graph throttling Excel updates ] has answers pointing to MS documentation about making smaller requests. Unfortunately, the suggestions are rather vague.
My question is: does the size of the file in OneDrive have an impact on throttling? The file I am attempting to update is over 4 MB, but the updates (PATCH) I have attempted are only 251 bytes (12 cells), and I continue to get the error. Even a GET for a single cell receives it. This happened after 72 hours of inactivity. I am using a business account, and unfortunately MS support will not help, as they will only speak to admins.
Assuming this is an unrelated issue: as I have about 3,500 rows (of about 12 columns) to update, what is the best "chunk size" to update them in? Is 50 OK? Is 100 OK? Thank you!
NOTE: This same throttling happens in the Graph Explorer, not just via code. Also, there is no Retry-After field returned in the Response Headers.

How to get estimated time of arrival to multiple destinations on iOS?

I have an App that has the locations of 10 different places.
Given your current location, the app should return the estimated arrival time for those 10 locations.
However, Apple has said that
Note: Directions requests made using the MKDirections API are server-based and require a network connection.
There are no request limits per app or developer ID, so well-written apps that operate correctly should experience no problems. However, throttling may occur in a poorly written app that creates an extremely large number of requests.
The problem is that they give no definition of what a well-written app is. Is 10 requests bad? Is 20 requests an extremely large number?
Has anyone done an app like this before who can provide some guidance? If Apple does begin throttling the requests, people will blame my app and not Apple. Some advice, please.
Investigate the MKRoute class - it contains all the information you need, including expectedTravelTime.
You should also consider MKErrorLoadingThrottled:
The data was not loaded because data throttling is in effect. This
error can occur if an app makes frequent requests for data over a
short period of time.
To prevent your requests from being throttled, reduce the number of requests.
Use completion handlers to know when a request has finished, and only then send another request (or cancel the previous one). From my experience, treat these like regular network requests and just make sure you are not spamming unnecessary requests to the Apple API. But there is no 100% guarantee that Apple won't throttle your requests.

iOS app getting throttled from local searches

I am implementing autocomplete (one search per new character added) in an app that searches for addresses, and I keep getting MKErrorDomain error 3, which is MKErrorLoadingThrottled. This error, according to Apple dev, occurs when
The data was not loaded because data throttling is in effect. This
error can occur if an app makes frequent requests for data over a
short period of time.
I know exactly how many requests are being made: one for each new character in the search query (just as you would expect autocomplete to work). Sure, I am a fast typist, but being able to hit the limit after just 10 or 15 requests seems absurd. Looking at the following two sources, I do not understand why I keep getting throttled.
According to Apple dev:
There are no request limits per app or developer ID, so well-written
apps that operate correctly should experience no problems. However,
throttling may occur in a poorly written app that creates an extremely
large number of requests.
and as James Howard said at a WWDC:
And the other thing I want to talk about is the Usage Limits on this
API.
So, I'm happy to announce that there's no application or developer
identifier wide usage limits.
So, if you have a app that has a lot of users and you want to do a lot
of requests, that's fine.
It'll work.
And the throttling that we do have is really just a first line of
defense against buggy apps.
So, if you put directions requests or local search requests in an
infinite loop, you've got a bug, eventually you're going to get
throttled.
But if you do something reasonable, you say oh, I'm going to just do
directions in response to user input and you know you can do a few of
those because we showed them that example.
Like we did two directions request in response to one user input,
that's fine.
But, you know if you're doing 10,000 every time the user taps on the
screen, then you're going to get throttled.
But, just keep it reasonable and you'll be fine.
Any ideas to why this is happening??
Autocompletion requires a special API, and MapKit doesn't offer such an interface. Firing off dozens of requests to the normal search API causes a tremendous load.
You basically have two options:
Go with Google Places. They have a dedicated Places Autocompletion API. There is even a complete library for iOS on GitHub.
Reduce the number of requests, e.g. by only sending a request once the user has paused typing for 300 ms and no earlier request is outstanding. But that's still no guarantee that Apple won't throttle your requests.
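The pause-before-sending approach is a standard debounce, and the pattern is the same in any language. A sketch in JavaScript for brevity (in an iOS app the equivalent would use a timer that is invalidated on each keystroke):

```javascript
// Debounce: delay firing until input has been quiet for `waitMs`,
// cancelling any earlier pending call when new input arrives.
function debounce(fn, waitMs) {
  let timer = null;
  return function (...args) {
    if (timer) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}
```

Wiring the text-changed callback through debounce(search, 300) means only the final state of a fast typing burst triggers a request.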
MKLocalSearch is primarily intended for finding points of interest (businesses, etc.) within a map's bounds. CLGeocoder is for structured address and location lookups.
The CLGeocoder documentation specifies that CLGeocoder requests are rate limited, and the documentation provides guidance on how to be a good citizen.
Of particular note is the first item in the guidelines: "Send at most one request for any user action". This should be applied to MKLocalSearch as well - if you have multiple requests in flight at the same time, you are VERY likely to get throttled.
This is actually pretty easy to implement: Before a new MKLocalSearchRequest is sent, cancel any pending requests. This makes a lot of sense for implementing autocomplete like you describe: if the user is entering the 4th character, you probably don't need the request or response for the 3rd character.
Run your app with the Time Profiler instrument and see how many calls to that method are made as you type.
I've just written a helper in Swift for making suggestions with the Apple MapKit API. It sends the search request when the user stops typing. https://github.com/ArniDexian/GeocodeHelper
Usage is pretty simple:
func searchBar(searchBar: UISearchBar, textDidChange searchText: String) {
    GeocodeHelper.shared.decode(searchText.trimmed(), completion: { [weak self] (places) -> () in
        self?.dataSource.locations = places
        self?.tableView.reloadData()
        return
    })
}
