Google Places API Daily Limit Issue - iOS

Hi, I have made an app and enabled the Google Places API for that project.
I created a project in the Google developer portal and added the key to my iOS app. After that I started sending requests to the Google API, and my search worked very well. I also know that Google has a per-day request limit on each API.
Today I sent at most 30 requests, and after that the Google API returned an error saying my daily quota of 1,000 requests/day had been reached.
I could not find a solution anywhere. Kindly help me solve this problem.

You need to request more quota:
https://developers.google.com/places/uplift
If you reached the quota while testing your app, make sure that you have used the developer key for that purpose.

What kind of Search are you using?
Some types of searches cost more than others.
For example, a Text Search request has a 10x multiplier. (https://developers.google.com/places/web-service/usage)
Make sure that you look at the usage limits specified for each API/service you are using.
Additionally, the online dashboard shows the real-time number of requests made to each API/service, per project. This can help you work out when these requests are being made and how many are actually issued for each call you believe you are making.
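To make the cost difference concrete, here is a rough Python sketch against the Places Web Service endpoints (the iOS SDK wraps equivalent calls); YOUR_API_KEY and the query strings are placeholders, and the quota check relies on the OVER_QUERY_LIMIT status the web service returns:

    # Minimal sketch: an Autocomplete call vs. a Text Search call against the
    # Places Web Service. YOUR_API_KEY is a placeholder.
    import requests

    API_KEY = "YOUR_API_KEY"

    def places_get(endpoint, params):
        params["key"] = API_KEY
        data = requests.get(
            f"https://maps.googleapis.com/maps/api/place/{endpoint}/json",
            params=params,
        ).json()
        # OVER_QUERY_LIMIT is how the web service reports an exhausted quota.
        if data.get("status") == "OVER_QUERY_LIMIT":
            raise RuntimeError("Daily quota or rate limit reached")
        return data

    # Autocomplete request (counts once against the quota).
    suggestions = places_get("autocomplete", {"input": "pizza near Syd"})

    # Text Search request (billed with the 10x multiplier mentioned above).
    results = places_get("textsearch", {"query": "pizza in Sydney"})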

Related

Is the YouTube Data API totally free?

I am going to implement a Python client that searches for videos on YouTube with different queries. Apparently I should use the YouTube Data API for this. Even though I have read about quota costs, I just want to be sure that using the YouTube API is totally free of charge. Sorry if this is too basic.
Yes, using the YouTube API does not incur any monetary cost for the entity calling the API. If you go over your quota, a 403 error will be returned by the API.
Links:
YouTube API Quota Details
YouTube Quota Calculator
Google already provides a Python client for all of its APIs, including YouTube, which handles authentication, forming and making the API request, as well as some data-type translation (e.g. JSON to dictionary). (link)
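Since you mention a Python client, here is a minimal sketch using that client library (google-api-python-client), assuming a simple API key; YOUR_API_KEY is a placeholder, and an exhausted quota surfaces as a 403 HttpError:

    # pip install google-api-python-client
    from googleapiclient.discovery import build
    from googleapiclient.errors import HttpError

    youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")

    try:
        response = youtube.search().list(
            q="python tutorial",   # the search query
            part="snippet",
            type="video",
            maxResults=25,
        ).execute()
        for item in response["items"]:
            print(item["id"]["videoId"], item["snippet"]["title"])
    except HttpError as err:
        # A 403 here typically means the daily quota has been exceeded.
        if err.resp.status == 403:
            print("Quota exceeded or access forbidden:", err)
        else:
            raise

Note that a search.list call costs roughly 100 quota units, so a free daily quota is consumed far faster than the raw request count suggests.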
Yes it is, but with some restrictions: you can use only 100,000 units per day, and 3,000 requests per second per 100 users. For more quota you have to apply for it. You can apply for an API key or OAuth ID HERE. Hope this helps.

Google Sheets Java API 3 quota

I'm using the Java SpreadsheetService class from gdata 1.47.1, and from time to time my service account gets blocked and asked to solve a CAPTCHA. In the block message Google asks me to prove that I'm not a robot and says I violated some terms. This usually happens when I exceed 18-22 requests per second from one IP and one service account.
Does anybody know whether there is a way to avoid such blocking and/or increase my quota? I found that I can control the quota for the Drive API and many others, but I still can't work out how to control the Spreadsheet quota.
UPDATED
The interesting thing is that using the Spreadsheet API doesn't affect any quota shown in the Google developer console. For example, when I create a document it counts against the Drive quota (I can see it under 'Usage quota'), but when I update cells or get worksheets via the Spreadsheet API it doesn't count against anything.
Project quotas can be seen on the project's developer console page. Typically a project is given:
1,000,000,000 requests/day
1,000 requests/100 seconds/user
You can request more if necessary (you'll incur charges, though). As for a quota specific to the Spreadsheet API, I don't think there's any reference to that; I'd assume it's the same quota the Drive API uses.
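Since the blocking described in the question is triggered by the per-second burst rather than the daily total, a practical workaround is simply to pace the calls. A rough sketch of the idea (shown in Python for brevity, though the same approach ports directly to the Java gdata client; send_request stands in for whatever Spreadsheet call you make):

    # Sketch: spread Spreadsheet calls out so the burst rate never approaches
    # the 18-22 requests/second that triggers the block.
    import time

    MAX_PER_SECOND = 10                 # self-imposed budget, well under the blocking threshold
    MIN_INTERVAL = 1.0 / MAX_PER_SECOND

    def run_paced(requests_to_send):
        for send_request in requests_to_send:
            started = time.time()
            send_request()
            # Sleep off whatever is left of this request's time slot.
            elapsed = time.time() - started
            if elapsed < MIN_INTERVAL:
                time.sleep(MIN_INTERVAL - elapsed)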

What's the quota for Document List API requests?

We're currently integrating Google Drive/Docs access into our mobile apps and use the Google Documents List API for this purpose. Are there any restrictions on the number of requests allowed for a single API key?
I can't find any information in the Google API Console, as the Documents List API is not listed there. I can only activate the Google Drive API (which does not yet support the functionality we need).
The Documents List API does not use API keys in the same way as the newer APIs such as Drive. We (Google) do not give exact quota details for this API, but in general the value is extremely high. You may encounter 503 responses which indicate that you should perform exponential backoff. If, despite this, you are hitting an absolute ceiling, you should contact us and we will investigate, and look to increase your quota.
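The exponential backoff mentioned above is just "retry after exponentially growing waits". A minimal sketch, assuming make_request is a placeholder for the actual Documents List API call and returns a requests-style response object:

    # Retry on HTTP 503 with exponentially growing waits plus random jitter.
    import random
    import time

    def with_backoff(make_request, max_retries=6):
        for attempt in range(max_retries):
            response = make_request()
            if response.status_code != 503:
                return response
            # Wait 1s, 2s, 4s, ... plus up to 1s of jitter before retrying.
            time.sleep((2 ** attempt) + random.random())
        raise RuntimeError("still receiving 503 after retries")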

Twitter - Constant search for term

I wonder if anyone can help me; I'm getting a little confused as to which API to use. If anyone can offer some guidance I would really appreciate it.
I'm trying to create a website where users can monitor Twitter for certain hashtags. The site will continually search Twitter for any new updates and store any tweet related to that particular hashtag. This process will run for up to 60 days.
As far as I can gather, my two options are:
Using the Search API
The problem with this API is that if I have 1,000 users all monitoring different hashtags, I am quickly going to reach my API limit, since I will be making a fair few requests, potentially one every 2-3 minutes. Is there a way to use OAuth in conjunction with the Search API so that the limits are per user and not per application? That way, the limit will be user-specific and I won't have to worry.
Using the Stream API
I thought this might be a better solution, but it seems you are limited in how many connections you can have open. The documentation seems unclear as to how this works... is the connection limit per Twitter account or per server IP? For example, if my site had 1,000 users and each of those users was monitoring a hashtag, would those 1,000 Streaming API connections count against my server's IP or against each user?
You will want to use the Streaming API. You open a single connection that tracks the terms for all of the users. When users add new terms to track, you restart the stream with the updated term list. The single stream runs under a bot Twitter account you create, not your users' accounts.
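A rough sketch of that setup using the tweepy library (tweepy and the helper/credential names below are assumptions for illustration, not part of the original answer): one filtered stream under the bot account tracks every user's hashtag, and is disconnected and restarted whenever the tracked-term list changes.

    # pip install tweepy -- credential variables and save_tweet() are placeholders.
    import tweepy

    class HashtagStream(tweepy.Stream):
        def on_status(self, status):
            # Store the tweet for whichever users track one of its hashtags.
            save_tweet(status)                     # hypothetical persistence helper

    def start_stream(tracked_hashtags):
        stream = HashtagStream(CONSUMER_KEY, CONSUMER_SECRET,
                               ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
        # threaded=True keeps the connection open in a background thread.
        stream.filter(track=tracked_hashtags, threaded=True)
        return stream

    stream = start_stream(["#rails", "#python"])

    # When a user adds a new hashtag: disconnect and restart with the new list.
    stream.disconnect()
    stream = start_stream(["#rails", "#python", "#golang"])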

Google geocoding limit

There is a limit of 2,400 geocoding requests for the Google service. Even if each request is cached and not duplicated, it's possible to exceed this limit when the requests are being made from a Rails app.
Short of purchasing the premium package (which I don't know the cost of), what else can one do?
Thanks
I believe that if you geocode on the client side you won't have any issues with geocoding limits. Calls to google.maps.Geocoder() and google.loader.ClientLocation() both count against the IP of the client machine rather than your server IP. If you need to do some on the server side, I would second Geoff's suggestion to use Geokit's multigeocoder.
I do all of my geocoding from the application servers using Geokit. That gives me a backup of Yahoo Maps via their multigeocoder: if one fails, the other succeeds. Geokit also provides an identical interface to the two services, so you only need to code to the one abstraction layer. The Google limit is per server per day, so if you have multiple app servers you can spread out the load to increase your limit. Yahoo's limit is 5,000/server/day.
Hope this helps, good luck!
This is what I did:
1. Try to use the Navigator's Location functionality.
2. If that fails, use Google Gears (I don't think it consumes the Google limit).
3. If that fails, use the Google Maps API.
4. If that fails (limit exceeded), geolocate based on the user's IP and the MaxMind database (free).
The source here may be of some help: gvkalra.appspot.com/reachme
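The same fall-through idea can be sketched server-side; a rough Python illustration, assuming the Google Geocoding web service as the primary source and a hypothetical ip_lookup() helper (e.g. backed by MaxMind) as the fallback once the quota is gone:

    # Sketch: cache results, try Google first, fall back to IP-based lookup
    # when the daily limit is hit. ip_lookup() is a hypothetical helper.
    import requests

    _cache = {}

    def geocode(address, api_key, client_ip):
        if address in _cache:                      # never spend quota on a repeat lookup
            return _cache[address]
        resp = requests.get(
            "https://maps.googleapis.com/maps/api/geocode/json",
            params={"address": address, "key": api_key},
        ).json()
        if resp["status"] == "OK":
            location = resp["results"][0]["geometry"]["location"]   # {"lat": ..., "lng": ...}
        elif resp["status"] == "OVER_QUERY_LIMIT":
            location = ip_lookup(client_ip)        # coarse fallback once quota is exhausted
        else:
            location = None
        _cache[address] = location
        return location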
You can use Google Fusion Tables to geocode your data. Limits are pretty high there.
