I know that Google used to provide a service some time back where you could enter a keyword and Google would tell you how many searches had been made using that exact keyword and related keywords. This was a very useful feature when performing SEO (Search Engine Optimization). However, I cannot find that particular tool anywhere. Has the service been discontinued? If yes, what are the alternatives?
It has been replaced by the Keyword Planner:
https://adwords.google.com/o/KeywordTool
Some of the options are different, but you can still get keywords and the number of searches from it.
I've skimmed through the Keywords Performance Report section of the API documentation, and couldn't work out whether I could use this report to determine daily keyword costs.
What I want, basically, is to be able to look up a keyword in an API response and get the cost associated with it. Is such a thing possible? Am I looking in the right place?
Apparently it's not possible for the Display Network: costs for all Display Network items are reported under a special ID (3000000), meant to capture all GDN placements in aggregate.
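For search keywords, though, per-keyword cost can be pulled from that report. Here's a minimal sketch using the googleads Python client against the AdWords API; the API version string and the googleads.yaml setup are assumptions, so adjust them to your environment:

    import sys
    from googleads import adwords

    # Credentials are read from the standard googleads.yaml file.
    client = adwords.AdWordsClient.LoadFromStorage()
    report_downloader = client.GetReportDownloader(version='v201809')

    # AWQL: one row per keyword with yesterday's cost (in micros).
    query = ('SELECT Criteria, Cost '
             'FROM KEYWORDS_PERFORMANCE_REPORT '
             'DURING YESTERDAY')

    report_downloader.DownloadReportWithAwql(
        query, 'CSV', sys.stdout,
        skip_report_header=True, skip_report_summary=True)

Display Network spend won't be broken out this way; as noted above, it's lumped under the special 3000000 ID.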
I want to be able to run queries locally comparing the latitude and longitude of locations, so I can filter the addresses I've captured by distance.
I found a free database that has this information for zip codes, but I want it for more specific addresses. I've looked at Google's geolocation service, and it appears it's against the TOS to store these values in my database or to use them for anything other than displaying Google Maps. (If somebody's looked deeper into this and I'm incorrect, let me know.)
Am I likely to find any (free or paid) service that will let me store these lat/lon values locally? The number of addresses I need is currently pretty small, but if my site becomes popular it could grow quite a bit over time. I only need to get the coordinates of each address once, though.
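For reference, once the coordinates are stored locally, distance comparisons reduce to the haversine formula. A minimal Python sketch (the coordinates below are just illustrative):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        # Great-circle distance between two points, in miles.
        r = 3958.8  # mean Earth radius in miles
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
             * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    # Roughly 2,450 miles between New York and Los Angeles.
    print(haversine_miles(40.7128, -74.0060, 34.0522, -118.2437))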
You're correct -- it can't be done with Google's service and still conform to the TOS. Cheers to you for honestly seeking to comply with the TOS.
I work at a company called SmartyStreets, where we process and verify addresses -- and geocode them, too. Google's terms don't allow you to store the data returned from the API, and there are pretty strict usage limits before they throttle or cut off your access.
Screen scraping presents many challenges and problems which are both technical and ethical, and I don't suppose I'll get into them here. The Microsoft library linked to by Giorgio is for .NET only.
If you're still serious about doing this, we have a service called LiveAddress which is accessible from any platform or language. It's a RESTful API that can be called using GET or POST, and the output is JSON, which is easy to parse in pretty much every common language/platform.
Our terms allow you to store the data you collect as long as you don't re-manufacture our product or build your own database in an attempt to duplicate ours (or something along those lines). For what you've described, though, it shouldn't be a problem.
Let me know if you have further questions about address geocoding; I'll be happy to help.
By the way, there's some sample code at our GitHub repo: https://github.com/smartystreets/LiveAddressSamples
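To give a feel for the request/response shape, here's a minimal sketch in Python. The hostname, path, parameter names, and auth fields below are placeholder assumptions, not the actual LiveAddress interface -- see the samples repo and current docs for the real thing:

    import json
    import urllib.parse
    import urllib.request

    # Hypothetical endpoint and parameters -- consult the current
    # LiveAddress documentation for the real interface and auth scheme.
    params = urllib.parse.urlencode({
        'auth-id': 'YOUR_AUTH_ID',
        'auth-token': 'YOUR_AUTH_TOKEN',
        'street': '1600 Pennsylvania Ave NW',
        'city': 'Washington',
        'state': 'DC',
    })
    url = 'https://api.example-geocoder.com/street-address?' + params

    with urllib.request.urlopen(url) as response:
        candidates = json.load(response)

    # A verified address typically comes back with coordinates attached.
    for candidate in candidates:
        print(candidate)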
If you only need to fetch each one once, you could use a screen scraper against http://www.zip-info.com/cgi-local/zipsrch.exe?ll=ll&zip=13206&Go=Go.
Microsoft also provides this kind of service. Check whether this helps: http://msdn.microsoft.com/en-us/library/cc966913.aspx
Basically, I'm looking for a search engine that searches through a given database. The content being searched is text.
You will probably want to use a service such as Solr. The easiest way to get started with it is to find a 'cloud'-based version, such as Websolr. However, the right solution will depend on what language you wish to use when programming your site.
Solutions depend somewhat on language:
1. For Java/C#, you have Lucene/Solr
2. For Python, you have Haystack
You could also do text search in the DB directly via LIKE/ILIKE, but performance depends on the database.
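To make the LIKE approach concrete -- it works, but it scans every row, which is why dedicated engines like Solr exist for larger datasets -- here's a minimal sketch with Python's built-in sqlite3:

    import sqlite3

    conn = sqlite3.connect(':memory:')
    conn.execute('CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)')
    conn.executemany('INSERT INTO docs (body) VALUES (?)', [
        ('Tuning Solr for icon search',),
        ('Getting started with Haystack',),
        ('Plain SQL full scans are slow',),
    ])

    # LIKE does a full table scan; SQLite's LIKE is case-insensitive
    # for ASCII by default, similar in spirit to Postgres's ILIKE.
    term = 'solr'
    rows = conn.execute(
        'SELECT id, body FROM docs WHERE body LIKE ?',
        ('%' + term + '%',)).fetchall()
    print(rows)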
Iconfinder was coded specifically for icon search, and at the time it launched (2007) there were no scripts that worked well for this.
Building a search engine like Iconfinder is not rocket science. I think the hardest parts are getting the SQL tuned and figuring out how to rank the content. At the moment I collect data about impressions and downloads and calculate a value from that. An icon's rank is based on this value (downloads/impressions) and on how well the query keywords match the tags for the given icon.
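As a rough illustration of that kind of scoring -- the blend weight and the tag-match measure below are made-up assumptions, not Iconfinder's actual formula:

    def rank_score(downloads, impressions, query_terms, tags, tag_weight=0.5):
        # Blend a popularity signal (downloads per impression) with a
        # keyword/tag overlap signal. The 50/50 blend is an assumption.
        popularity = downloads / impressions if impressions else 0.0
        overlap = len(set(query_terms) & set(tags)) / len(query_terms)
        return (1 - tag_weight) * popularity + tag_weight * overlap

    # An icon downloaded 120 times over 3,000 impressions, matching
    # one of the two query terms, scores (0.5 * 0.04) + (0.5 * 0.5).
    print(rank_score(120, 3000, ['photo', 'camera'], ['camera', 'lens']))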
What is the difference between:
Google Keyword Tool and
Google Search-Based Keyword Tool
For example, when I search for "photo editing", I get two completely different numbers for monthly searches from Google: 1,500,000 vs. 54,000. Why does Google report two different numbers for the exact same keyword? Any idea?
Here's the difference between the two, as stated by Google:

The main difference between the Search-based Keyword Tool and the Keyword Tool currently in AdWords is that the former generates keyword ideas based on your website, and identifies those currently not being used in your AdWords account. Additionally, the Search-based Keyword Tool provides more detailed data for each keyword, such as category information, a suggested bid that may place the ad in the top three spots of a search results page, and ad/search share. Both tools, however, offer the option of browsing all keywords across all categories.

You may also notice that some of the data (like the monthly search volume) may vary slightly between the two tools, which is due to different methods of calculation at this time.
Let's say I'm just wandering around with my cellphone and I want to know exactly which place of business I'm in. This would seem to be easy, but I don't see a way to do it. It's possible to reverse geocode, but that gives an address range. Google doesn't seem to have an HTTP-based local search that returns this kind of information, though you could make a rough guess from local search results or points of interest. It needs to be through an HTTP API, not an AJAX-driven map. Is there a way to do this?
You might look at GeoAPI, which lets you search for businesses near a particular lat/lon coordinate and returns detailed information about the business (name, type, hours, etc.). It's a simple JSON API with good documentation and examples.
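As a rough sketch of what such a call looks like -- the hostname, path, and parameter names here are assumptions for illustration, not GeoAPI's actual interface:

    import json
    import urllib.parse
    import urllib.request

    # Hypothetical endpoint/parameters -- substitute the real ones from
    # the provider's documentation.
    params = urllib.parse.urlencode({
        'lat': 37.7749,
        'lon': -122.4194,
        'radius': 50,  # assumed to be meters
        'api_key': 'YOUR_KEY',
    })
    url = 'https://api.example-local-search.com/v1/search?' + params

    with urllib.request.urlopen(url) as response:
        results = json.load(response)

    # Each result would carry name, category, hours, and so on.
    for business in results.get('results', []):
        print(business.get('name'))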
There are likely more APIs out there for local business data -- which I personally would love to hear about if people want to add them as answers to this question or comments on my answer. What's your favorite? What are the advantages and disadvantages?