I am implementing autocomplete (one search per new character added) in an app that searches for addresses, and I keep getting MKErrorDomain error 3, which is MKErrorLoadingThrottled. According to Apple's documentation, this error occurs when:
The data was not loaded because data throttling is in effect. This
error can occur if an app makes frequent requests for data over a
short period of time.
I know exactly how many requests are being made: one for each new character in the search query (just like you would expect autocomplete to work). Sure, I am a fast typist, but being able to hit the limit after just 10 or 15 requests seems absurd. Looking at the following two source references, I do not understand why I keep getting throttled.
According to Apple's documentation:
There are no request limits per app or developer ID, so well-written
apps that operate correctly should experience no problems. However,
throttling may occur in a poorly written app that creates an extremely
large number of requests.
and as James Howard said at a WWDC:
And the other thing I want to talk about is the Usage Limits on this
API.
So, I'm happy to announce that there's no application or developer
identifier wide usage limits.
So, if you have an app that has a lot of users and you want to do a lot
of requests, that's fine.
It'll work.
And the throttling that we do have is really just a first line of
defense against buggy apps.
So, if you put directions requests or local search requests in an
infinite loop, you've got a bug, eventually you're going to get
throttled.
But if you do something reasonable, you say oh, I'm going to just do
directions in response to user input and you know you can do a few of
those because we showed them that example.
Like we did two directions request in response to one user input,
that's fine.
But, you know if you're doing 10,000 every time the user taps on the
screen, then you're going to get throttled.
But, just keep it reasonable and you'll be fine.
Any ideas as to why this is happening?
Autocompletion requires a special API, and MapKit doesn't offer such an interface. Just firing off dozens of requests at the normal search API causes a tremendous load.
You basically have two options:
Go with Google Places. They have a dedicated Places Autocompletion API. There is even a complete library for iOS on GitHub.
Reduce the number of requests, e.g. by only sending a request if the user has paused typing for 300ms and only if no earlier request is outstanding. But that's still no guarantee that Apple won't throttle your requests.
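The second option can be sketched with a small debouncing helper. This is a hypothetical class (not a MapKit API): each call cancels the previously scheduled work item, so the search only fires once the user has paused typing for the given delay.

```swift
import Foundation
import Dispatch

// Minimal debouncing sketch (a hypothetical helper, not a MapKit API):
// each call cancels the previously scheduled work item, so the action
// fires only after the caller has been quiet for `delay` seconds.
final class Debouncer {
    private let delay: TimeInterval
    private let queue: DispatchQueue
    private var pending: DispatchWorkItem?

    init(delay: TimeInterval, queue: DispatchQueue = .main) {
        self.delay = delay
        self.queue = queue
    }

    func call(_ action: @escaping () -> Void) {
        pending?.cancel()  // drop the scheduled request that has not fired yet
        let item = DispatchWorkItem(block: action)
        pending = item
        queue.asyncAfter(deadline: .now() + delay, execute: item)
    }
}
```

In a search view controller you would create one `Debouncer(delay: 0.3)` and, in the text-changed callback, wrap the actual search request in `debouncer.call { ... }`; only the last keystroke in a burst ever reaches the network.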
MKLocalSearch is primarily intended for finding points of interest (businesses, etc.) within a map's bounds. CLGeocoder is for structured address and location lookups.
The CLGeocoder documentation specifies that CLGeocoder requests are rate limited, and the documentation provides guidance on how to be a good citizen.
Of particular note is the first item in the guidelines: "Send at most one request for any user action". This should be applied to MKLocalSearch as well - if you have multiple requests in flight at the same time, you are VERY likely to get throttled.
This is actually pretty easy to implement: Before a new MKLocalSearchRequest is sent, cancel any pending requests. This makes a lot of sense for implementing autocomplete like you describe: if the user is entering the 4th character, you probably don't need the request or response for the 3rd character.
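A rough sketch of that "at most one request in flight" rule, using an illustrative `Cancellable` protocol in place of `MKLocalSearch` (which does expose a real `cancel()` method); the type names here are made up:

```swift
import Foundation

// Sketch of the "cancel any pending request first" rule. `Cancellable`
// stands in for MKLocalSearch (which also exposes a cancel() method);
// the type and method names here are illustrative, not real MapKit API.
protocol Cancellable {
    func cancel()
}

final class SingleFlightSearcher {
    private var current: Cancellable?

    // `start` fires one request and returns a handle to it, so at most
    // one request is ever in flight at a time.
    func search(start: () -> Cancellable) {
        current?.cancel()
        current = start()
    }
}
```

With MapKit, the `start` closure would create an `MKLocalSearch`, call its `start(completionHandler:)`, and return it, after declaring conformance with something like `extension MKLocalSearch: Cancellable {}`.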
Run your app under the Time Profiler instrument and see how many calls to that method are made as you type.
I just wrote a helper in Swift to help make suggestions with the Apple MapKit API. It fires the search request when the user stops typing: https://github.com/ArniDexian/GeocodeHelper
Usage is pretty simple:
func searchBar(searchBar: UISearchBar, textDidChange searchText: String) {
    GeocodeHelper.shared.decode(searchText.trimmed(), completion: { [weak self] (places) -> () in
        self?.dataSource.locations = places
        self?.tableView.reloadData()
    })
}
Related
Hi, I have an app which uses Mapbox, and I am also using geocoding to search for places and navigate to them. It was working smoothly until I tried the keyword "Nayara" in the search field.
I am getting this error when I search "Nayara" in the text field; the textFieldIsChanging delegate is connected to the geocoding API (async communication, with the results populated in the tableView). I can successfully search all other places but not this one. Is this a bug in Mapbox? Is this the only keyword that has problems, or are there other keywords that make the app behave like this? Expert advice needed. Thanks in advance. Happy coding.
When making multiple async requests, it's possible for the responses to be returned in a different order from the order in which they were requested. This is particularly an issue when requests take a variable amount of time (as is the case for geocoding queries).
In this situation, a query for Nayar probably takes longer than a query for Nayara, and the difference is enough that the results arrive out of order, so the Nayar response overwrites Nayara in the UI dropdown.
Typical solutions to this problem involve either adding a debounce (so that you only make a new API request if an arbitrary amount of time has elapsed between keystrokes), or tracking the timestamp of both the request and the response, and discarding stale responses that arrive out of order.
An example of the latter approach can be seen here: https://github.com/mapbox/react-geocoder/pull/9
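The stale-response idea can also be sketched with a simple sequence counter (all names here are illustrative, not from any library): tag each outgoing request, and accept a response only if its tag matches the most recent request.

```swift
import Foundation

// Sketch of the "discard stale responses" approach: tag each request with
// an increasing sequence number and accept a response only if it belongs
// to the most recent request. All names here are illustrative.
final class LatestResponseGate {
    private var latest = 0

    // Call when a request is sent; returns that request's tag.
    func tagForNextRequest() -> Int {
        latest += 1
        return latest
    }

    // Call when a response arrives; true only for the newest request.
    func shouldAccept(tag: Int) -> Bool {
        return tag == latest
    }
}
```

In the "Nayara" scenario, the response for the older "Nayar" request arrives last, fails the tag check, and never overwrites the newer results in the UI.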
I have an App that has the locations of 10 different places.
Given your current location, the app should return the estimated arrival time for those 10 locations.
However, Apple has said that
Note: Directions requests made using the MKDirections API are server-based and require a network connection.
There are no request limits per app or developer ID, so well-written apps that operate correctly should experience no problems. However, throttling may occur in a poorly written app that creates an extremely large number of requests.
The problem is that they give no definition of what a well-written app is. Are 10 requests bad? Is 20 requests an extremely large number?
Has anyone built an app like this before who can provide some guidance? If Apple does begin throttling the requests, people will blame my app and not Apple. Some advice please.
Hi, investigate the MKRoute class; it contains all the information you need.
This object contains
expectedTravelTime
Also, you should consider LoadingThrottled:
The data was not loaded because data throttling is in effect. This
error can occur if an app makes frequent requests for data over a
short period of time.
To prevent your requests from being throttled, reduce the number of requests.
Use completion handlers to know when a request has finished, and only then send another one (or cancel the previous one). From my experience, treat these requests just like regular network requests; just be sure you are not spamming unnecessary requests to the Apple API. But this is still no 100% guarantee that Apple won't throttle your requests.
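One way to sketch that completion-handler discipline (hypothetical names, with `send` standing in for any network call): never start a new request while one is in flight, and remember only the latest queued query to send once the current one completes.

```swift
import Foundation

// Sketch of serializing requests via completion handlers: a new request is
// only sent once the previous one has reported back, and intermediate
// queries typed in the meantime are collapsed into the latest one.
// `send` stands in for any network call; all names are illustrative.
final class SerialRequester {
    private var inFlight = false
    private var queuedQuery: String?

    // `send` performs the call and must invoke its completion exactly once.
    func request(_ query: String, send: @escaping (String, () -> Void) -> Void) {
        guard !inFlight else {
            queuedQuery = query  // keep only the most recent pending query
            return
        }
        inFlight = true
        send(query) { [weak self] in
            guard let self = self else { return }
            self.inFlight = false
            if let next = self.queuedQuery {
                self.queuedQuery = nil
                self.request(next, send: send)
            }
        }
    }
}
```

Typing "a", "ab", "abc" in quick succession sends only "a" immediately; when its completion fires, only "abc" goes out, so the intermediate "ab" request is never made.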
As the title says, Apple doesn't provide any explicit answer to that question. I don't want to use the Google API because of its request limits, and I wonder if MapKit in iOS 8 has any (so far there have been no such limitations, but with each release of iOS things may change).
If there are no such limits, what are the drawbacks of using MapKit in the iOS 8 release? Are there any cases where the Google Maps API is more helpful?
Thanks in advance
https://developer.apple.com/library/mac/documentation/UserExperience/Conceptual/LocationAwarenessPG/ProvidingDirections/ProvidingDirections.html
Look for the part - Getting General-Purpose Directions Information
"There are no request limits per app or developer ID, so well-written apps that operate correctly should experience no problems. However, throttling may occur in a poorly written app that creates an extremely large number of requests"
The only limitation I'm familiar with is the reverse geocoding of coordinates via CLGeocoder which says:
Geocoding requests are rate-limited for each app, so making too many requests in a short period of time may cause some of the requests to fail. When the maximum rate is exceeded, the geocoder passes an error object with the value kCLErrorNetwork to your completion handler.
Unfortunately, I've never seen this limitation quantified.
Personally, I always assumed this was a caveat to prevent people from writing code that tried to abuse the API, using it to programmatically mine the geocode database by repeatedly reverse geocoding every point on a grid, or what have you. I've never run up against this limitation in standard user interaction with a map.
As far as I know there is no limit on Apple Maps requests. The advantage of the Google Maps API is better map data.
I'm struggling with a problem of implementing dynamic search.
Here what i want to achieve:
In my application there is an option that user (program manager) will be able to search his team members. Each PM has its account on the server side (web service) where it is a table team_members which contains all the team members that correspond to this manager.
Their amount can be more than hundreds.
And client side app which I develop has an option search team members.
I want to implement it dynamically:
e.g. when the user types the first letter, a, I make a query to the server and get all the matches with that letter: Antuan, Barrow, etc.
Then the user types ab, and I make a query which must return Abraham, Abdulla, etc. And so on. All the matching results are shown in a UITableView.
An HTTP query to the server is made each time the text in the UITextField changes. I implement it with dispatch_async: on UITextFieldTextDidChangeNotification I create a dispatch_async block where I make an HTTP request with searchbar.text.
The problem: it works very slowly, and I often get an "unrecognized selector sent to instance" exception.
So my question:
Why is my approach bad? What would be a better solution?
Or is dynamic search a VERY bad idea, and should I search only when the user taps a button?
Thanks. I hope some experienced iOS developers will give me good advice.
Making an HTTP request every time someone types a character is probably never going to be fast enough (nor does it really make sense -- read on). For a certain size of list, the answer would be to pull over the whole list in the background as soon as you present the field (but before the user starts typing in it). Once you have the list, you can start matching, in memory, on the local device. "More than hundreds" isn't very specific, and it depends on network speed, but I'd guess that if your list is less than 50K in payload size, pulling the whole thing will be the easiest way.
If the list is too big for that to be practical, but the list limited by the first character the user types is not too big, then the best approach might be to wait for the first character, fire off your HTTP request asynchronously, and only start the dynamic match once you've received the response containing all items that start with that letter. One thing to keep in mind is this: if you have the list limited by the first character, that's the only HTTP request you ever have to make (unless the user changes the first character) because all possibilities starting with that letter will be in that list. From there, you can pare down the list locally without any further HTTP requests.
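The "fetch once, then filter locally" idea reduces to a one-liner; this sketch (illustrative names, plain prefix matching as one reasonable choice) narrows the cached server response on each keystroke without touching the network:

```swift
import Foundation

// Sketch of "fetch once, then filter locally": after a single HTTP request
// returns every name starting with the first typed letter, later
// keystrokes only narrow the cached list in memory.
func narrow(_ cached: [String], toPrefix prefix: String) -> [String] {
    let p = prefix.lowercased()
    return cached.filter { $0.lowercased().hasPrefix(p) }
}
```

So once the response for "a" is cached, typing "ab" calls `narrow(cached, toPrefix: "ab")` locally instead of issuing another HTTP request.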
As for why you're getting exceptions, it's hard to say without seeing your actual code. Try setting an exception breakpoint in Xcode. This will allow you to stop in the debugger when the exception is thrown, which will show you what's causing it.
Let's imagine an app which is not just another way to post tweets, but something like an aggregator that needs to store/have access to the tweets posted through it.
Since Twitter added a limit on API calls, the app should/may use some cache, and then it should periodically check whether a tweet has been deleted, etc.
How do you manage the limits? How do you think heavily trafficked apps survive without being whitelisted?
To name a few.
Aggressive caching. Don't call out to the API unless you have to.
I generally pull down as much data as I can upfront and store it somewhere. Then I operate off the local store until it runs out and needs to be refreshed.
Avoid doing things in real time. Queue up requests and make them on a timer.
If you're on Linux, cronjobs are the easiest way to do this.
Combine requests as much as possible.
Well, you have 100 requests per hour, so the question is how to balance them between the various types of requests. I think the best option is the approach TweetDeck takes, which allows you to set the percentage for each type and saves the rest for posting (because that is important too):
For the caching, a database would be good, and I would ignore deleted tweets; once you have downloaded a tweet, it doesn't matter if it was later deleted. If you wanted to check anyway, you could in theory just try to open the tweet's page, and if you get a 404 then it's been deleted. That means no cost against the API.
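The TweetDeck-style split can be sketched as simple arithmetic; the category names and percentages below are made-up examples, not anything from the Twitter API:

```swift
import Foundation

// Sketch of a TweetDeck-style split of the hourly budget: give each request
// type a percentage of the 100 requests/hour and reserve whatever is left
// for posting. The category names and percentages are made-up examples.
func allocate(budget: Int, percentages: [(kind: String, percent: Int)]) -> [String: Int] {
    var allocation: [String: Int] = [:]
    var used = 0
    for (kind, percent) in percentages {
        let share = budget * percent / 100
        allocation[kind] = share
        used += share
    }
    allocation["posting"] = budget - used  // the remainder goes to posting
    return allocation
}
```

For example, allocating 50% to the timeline and 20% to mentions out of 100 requests/hour leaves 30 requests reserved for posting.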