Site indexing by Google (Google Webmaster Tools) - ruby-on-rails

In brief: my site is not being indexed by Google. Google has a lot of docs in their FAQ, but they do not say how long it takes until a site is indexed: hours? days? weeks? (The latter would be an explanation; the site has only been online for a week.)
What I did until now:
registered with Google Webmaster Tools and validated the site both with and without www.
submitted my site's Atom feed as a sitemap in Webmaster Tools and provided a basic text file with 5 basic URLs of the site (main, contact, about, ...) --> it shows 19 URLs, 0 indexed.
robots.txt set to allow everything [User-agent: * Allow: /], last fetched by Google 4 hours ago; a full example follows this list.
Webmaster Tools "fetch as Googlebot" tested OK
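For reference, a minimal robots.txt that allows all crawlers and also advertises the sitemap could look like the following (the sitemap path is an assumption; use whatever URL you actually submitted in Webmaster Tools):

    User-agent: *
    Allow: /

    # Optional: point crawlers at the sitemap you submitted (hypothetical path)
    Sitemap: http://www.communityguides.eu/sitemap.xml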
The site is a Rails site and pretty accessible. It consists mostly of articles, so Google should have no problem fetching it (http://www.communityguides.eu).
Clearly there are no sites with high PageRank linking to it; however, my question is about getting indexed at all. Did I forget something, or do I just have to be patient?

It's definitely more weeks than days. Be patient; it can take several weeks for a first index!

Related

Twitter Search API for around 3 months of data

I have been working for the last 3 days to find a solution for getting old data from the Twitter API, going back around 3 months, which is beyond the API's roughly one-week limit. Can someone help me find the best solution?
You cannot get any tweets from the last 3 months unless you store them in your own database.
The only alternative would be to pay Twitter's Gnip to access historical data.
The Twitter API only provides the most recent tweets for a given search, up to ~3200.
The so-called "firehose" and some datasets are available mostly for research or developer purposes.
There are some services which sell custom datasets, including Twitter itself (which purchased Gnip). The overview by Justin Littman at GWU is rather comprehensive.
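If you decide to build up your own archive going forward, a minimal sketch with the 'twitter' gem could poll the search API periodically and persist what it finds (the credentials, the query, and the Tweet model are all assumptions):

    # Run periodically (e.g. an hourly cron job); each run stores the
    # recent tweets that are still reachable through the search API.
    require 'twitter'

    client = Twitter::REST::Client.new do |config|
      config.consumer_key        = ENV['TWITTER_CONSUMER_KEY']
      config.consumer_secret     = ENV['TWITTER_CONSUMER_SECRET']
      config.access_token        = ENV['TWITTER_ACCESS_TOKEN']
      config.access_token_secret = ENV['TWITTER_ACCESS_TOKEN_SECRET']
    end

    client.search('your query', result_type: 'recent').take(100).each do |tweet|
      # Tweet is a hypothetical ActiveRecord model (twitter_id, text, ...).
      Tweet.find_or_create_by(twitter_id: tweet.id) do |t|
        t.text       = tweet.text
        t.author     = tweet.user.screen_name
        t.tweeted_at = tweet.created_at
      end
    end

Run it for three months and you have your three months of data; there is no way to recover tweets from before you started collecting.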

Google CSE does not find recent pages added to my website

I added a Google CSE to my website that searches only within my website: *.mywebsite.com/*
I verified my website in Webmaster Tools, and after 3-4 hours the engine was able to find all the pages properly.
All this happened about 2 weeks ago.
Two days ago I added 5 more pages to my website, but today the engine still does not find the new pages.
What steps do I have to follow to fix this problem?
Try resubmitting the sitemap in Webmaster Tools, or add the new links in the Google CSE control panel.
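If you want to script the resubmission, Google also accepts sitemap pings over plain HTTP; a one-off Ruby sketch (the sitemap URL is an assumption) could be:

    require 'net/http'
    require 'uri'

    # Ping Google with the (hypothetical) sitemap URL after publishing new pages.
    sitemap = URI.encode_www_form_component('http://www.mywebsite.com/sitemap.xml')
    Net::HTTP.get_response(URI("http://www.google.com/ping?sitemap=#{sitemap}"))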

Using search engines to get URLs

I'm building a portal that lists certain products and automatically gets the prices from the product pages of listed vendors. To get the URL for a product page on a vendor's website, I've been using the Google search API, and it's been working great: the first result is invariably the page for the product. However, now I'm getting errors saying that Google has blocked my website (actually my development machine's IP) from the API because I've been making "automated requests such as scraping" (the only item that applies).
Fine, Google can go jump off a cliff, but... how do product portals generally get URLs for their products? I can enter the URLs manually, but that becomes a problem if the vendor's website changes its URL scheme. I obviously need an automated way to do this.
I'm making no more than 50-60 requests per day, so I don't get what Google wants. Do they want money?
First, they want you to use one of their APIs, not scrape their web page directly. Their Custom Search API is documented here. Once you register, they'll give you an API key. You can get results in JSON format by requesting:
https://www.googleapis.com/customsearch/v1?q=SEARCH_TERMS&key=YOUR_KEY
Second, they do like money, but you might be okay: you're allowed 100 searches per day for free; beyond that you're going to be charged $5 per thousand searches.
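As a rough sketch of that request from Ruby (the API key, the cx search-engine ID, and the query are assumptions), picking out the first result's URL might look like:

    require 'net/http'
    require 'json'
    require 'uri'

    uri = URI('https://www.googleapis.com/customsearch/v1')
    uri.query = URI.encode_www_form(
      q:   'vendor-name product-name',   # hypothetical search terms
      key: ENV['GOOGLE_API_KEY'],
      cx:  ENV['GOOGLE_CSE_ID']          # custom search engine ID
    )

    results = JSON.parse(Net::HTTP.get(uri))
    first_hit = results['items'] && results['items'].first
    puts first_hit['link'] if first_hit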

Tracking Page Popularity in a Time Frame in Rails

I'm very new to web development and this seems like a basic question, so perhaps I just lack the correct terminology to search for it on Google.
On my site I plan to have many dynamically generated pages based on data in a MySQL server, and I would like to know which ones people have been visiting the most in, say, the last 24 hours, so that I can place these most popular pages on the front page of the site. How would I accomplish this in a Rails application?
What you're looking for is a web analytics solution to analyze your traffic, and possibly your marketing effectiveness. Here are some of the most prominent services you could use with your website:
Google Analytics
Chartbeat
Reinvigorate
HaveAMint
GetClicky
Piwik
Woopra
Personally, I use Google Analytics, as its setup is darn simple: configure your account, add a JavaScript snippet to each of the pages you want to track, and you're done.
You could also look for web analytics software that you host yourself. All in all, take a look at this Wikipedia page for more information.
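If you would rather keep the counting inside your Rails app (next to your MySQL data), a minimal sketch could record one row per visit and rank paths over the last 24 hours; the PageView model and its schema are assumptions:

    # app/models/page_view.rb -- hypothetical model: t.string :path + timestamps
    class PageView < ActiveRecord::Base
      def self.most_popular(since = 24.hours.ago, limit = 10)
        where('created_at >= ?', since)
          .group(:path)
          .order('count_all DESC')   # count_all is the alias COUNT(*) gets
          .limit(limit)
          .count                     # => { "/articles/1" => 42, ... }
      end
    end

    # app/controllers/application_controller.rb
    class ApplicationController < ActionController::Base
      after_filter :record_page_view   # after_action in Rails 4+

      private

      def record_page_view
        PageView.create(path: request.path) if request.get?
      end
    end

Your front-page action can then call PageView.most_popular and look up the matching records by path.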

Finding top Twitter users?

There are a large number of sites, like Twitaholic or Twittergrader, that offer rankings of Twitter users based on number of followers, influence, etc. I haven't found much information, though, on how they compute these rankings.
My guess is that they begin with a handful of users and keep exploring the followers' graph, while periodically updating the information of the users they already know about.
So the question is: is this the right approach or is there a more trivial way of doing it?
The sites you mention started years ago, and at that time they were given whitelisting by Twitter, which means that they can make tens of thousands of API requests per hour. Twitter no longer gives out new whitelisted accounts, so this type of analysis cannot be done by new sites. New accounts are only allowed to make 350 API requests per hour.
It is in fact possible to use just the Twitter API to examine and remember everything about every user, which is what quite a few sites do; see also the Twitter streaming API.
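As a rough illustration of the followers-graph approach under the 350-requests-per-hour limit, a breadth-first crawl with the 'twitter' gem could look like this (the seed account, sampling depth, and sleep interval are all assumptions):

    require 'twitter'
    require 'set'

    client = Twitter::REST::Client.new do |config|
      config.consumer_key    = ENV['TWITTER_CONSUMER_KEY']
      config.consumer_secret = ENV['TWITTER_CONSUMER_SECRET']
    end

    queue   = %w[some_seed_user]   # hypothetical starting account
    seen    = Set.new
    ranking = {}

    until queue.empty? || seen.size >= 500
      user = client.user(queue.shift)   # accepts a screen name or numeric id
      next if seen.include?(user.id)
      seen << user.id

      ranking[user.screen_name] = user.followers_count

      # Enqueue a small sample of this user's followers to explore next.
      queue.concat(client.follower_ids(user.id).take(20))

      sleep 25   # 2 API calls/iteration, roughly 288/hour: under the 350 limit
    end

    ranking.sort_by { |_, n| -n }.first(10).each do |name, n|
      puts "#{name}: #{n} followers"
    end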
