SEO Strategies: Directory, separate domain, or sub-domain? - url

What is the optimal SEO strategy for storing a blog?
1) In a directory: www.example.com/blog
2) In a separate domain: www.exampleblog.com
3) In a sub-domain: www.blog.example.com
With a directory, the reputation earned by the blog is transferred directly to the main domain (www.example.com). With a separate domain, any links from the blog to my site would count as backlinks.
I'm leaning towards option 1. What other pros and cons should I consider?

This is covered comprehensively in Sub-domain versus sub-directory (via Webmasters SE), which was updated in November 2012. Look at this answer too, as it specifically describes, with a huge chart, what effect sub-folders (meaning sub-directories in this context) versus sub-domains have on SEO, and how use of a reverse proxy can affect blog SEO. The gist of it is that a sub-domain is preferable to a sub-directory.
EDIT
I may have mis-read the question. If the choice is between
mywebsitename.com/blog
versus
mywebsitenameblog.com
then I would definitely recommend using the sub-directory. This is why:
If you use an entirely different domain name, even if it is just your website's domain with the four letters "blog" tacked on, it will confuse users, because no one does that!
You will need to register and pay for a second domain, which costs more money.
You'll be doing something that is inconsistent with typical website naming conventions, which I'd avoid if I were concerned about SEO and developing an e-commerce website. I don't know whether it will hurt your SEO ranking directly, but it won't help: it's an entirely different domain name, with none of the positive reputation or credibility of your primary domain.
It will be four characters longer, which is never good: it's less convenient to type and more difficult to remember.
Better yet, use a sub-domain of your primary website for your blog. To summarize, you should do the following, in order of best to worst:
blog.mywebsitename.com
mywebsitename.com/blog
mywebsitenameblog.com

There is a slight difference depending on which search engine is looking at it; each may add or subtract value from your blog on the basis of this decision.
Read this blog post from Matt Cutts, or watch this video for a summary.
If you go for another domain, search engines will expect it to be separate content with little relation to your main domain's rank.
I would install the blog in a sub-directory called blog and stop worrying about the actual juice from search engines, as it may vary from one engine to another.
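If the main site happens to be a Rails app (as in the related questions below), the sub-directory setup is a one-line routing concern. A minimal sketch, where Blog::Engine stands in for whatever mountable blog engine you actually use:

    # config/routes.rb
    Rails.application.routes.draw do
      # Serve the blog at www.example.com/blog so the reputation and links it
      # earns accrue to the main domain rather than to a separate one.
      # Blog::Engine is a placeholder for a real mountable blog engine.
      mount Blog::Engine => "/blog"
    end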

Related

How to organize Rails routes to divide content by cities?

I have a restaurant directory Rails app in which I need to categorize the content (restaurant description pages) by city. The cities are stored in the database. The questions that I have:
What is the Rails way of doing this? Is it best to add a scope in routes, as with, for example, the language locale? For example: example.com/en/new-york/restaurants...
Is it better to translate, transliterate, or leave the city names as-is, provided that the content is targeted at locals? For example: example.com/moscow vs example.com/moskva vs example.com/москва, in terms of "Rails-wayness" and SEO friendliness?
In terms of SEO, is it better to use subdomains (new-york.example.com) or subdirectories (example.com/new-york)?
I would appreciate it if you could share your experience on this matter!
You probably don't want the locale/language to be embedded in the URL.
For SEO purposes you probably want to pick one version and go with it all the time. That way you're aggregating all of your "link juice" to one URL. Some search engines will penalize you for having the same content at multiple URLs.
This is a good question, and I'm not entirely sure. I'd be kind of surprised if either one makes a huge difference. (It wouldn't be the first time I've been surprised...)
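For what it's worth, a minimal sketch of the path-scope option, assuming a City model with a slug column holding the one canonical spelling you pick, plus a has_many :restaurants association (all names here are illustrative):

    # config/routes.rb
    Rails.application.routes.draw do
      # Keep the city in the path (example.com/new-york/restaurants) so every
      # page lives under one domain and link equity isn't split across hosts.
      scope ":city", constraints: { city: /[a-z0-9\-]+/ } do
        resources :restaurants, only: [:index, :show]
      end
    end

    # app/controllers/restaurants_controller.rb
    class RestaurantsController < ApplicationController
      before_action :set_city

      def index
        @restaurants = @city.restaurants
      end

      private

      # Look up the city by its canonical slug (e.g. "moscow", not "moskva"),
      # raising a 404 for anything else so duplicate URLs never get indexed.
      def set_city
        @city = City.find_by!(slug: params[:city])
      end
    end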

Scalability of multi-site rails app

I am beginning work on a new Rails project based on the premise of allowing users to create their own "sites." Each "site" would be a subdomain of the root domain (we'll use example.com). So if user Foo wants to create his own site at bar.example.com, each page request to a bar.example.com page would require fetching a row in a sites table based on the subdomain.
My question is not how to code a multi-site app; I think I have a pretty good grasp of that. My question is, from a scalability and performance perspective, would it be better to simply generate a new Rails project for each site a user creates, or is it OK to run all sites out of one Rails app? If numbers are necessary, let's assume I have 1 million users, each with a maximum of 5 sites, and each site bringing in around 1,000 hits a day.
I realize this is kind of a broad question, and mostly depends on my implementation of either method to reach a feasible solution, but any suggestions in terms of the best way to write this, including optimizing the DB, etc. would be appreciated.
It would be far easier to have one Rails app serving millions of subdomains than millions of Rails apps.
Check out this railscast for how to start with subdomains: http://railscasts.com/episodes/221-subdomains-in-rails-3
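Roughly, the single-app pattern from that episode comes down to one lookup per request keyed on the subdomain. A sketch, assuming a Site model with a subdomain column:

    # app/controllers/application_controller.rb
    class ApplicationController < ActionController::Base
      before_action :load_site

      private

      # bar.example.com -> the sites row whose subdomain column is "bar".
      # Everything else in the request is then scoped through @current_site.
      def load_site
        @current_site = Site.find_by!(subdomain: request.subdomain)
      end
    end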
I wouldn't ever consider doing something like this with multiple Rails projects, because of the need to maintain all the code. By keeping it centralized, you can change the functionality of everybody's sites at once.
I think you might also run into memory issues by having all of those copies of Rails instantiated, too.
@Solomon is right. Heroku.com uses the same concept to serve its users' applications.

Creating a different "genre" of my website (i.e., I have a stackoverflow-equivalent and I want to create a serverfault-equivalent)

My site, Rap Genius, explains rap lyrics. I want to create a new site, Rock Genius, that explains rock lyrics – otherwise it'll be the same (same layout, same DB schema; like Serverfault is to Stackoverflow)
What's the best way to do this?
Approach 1: Fork the code
Fork the Rap Genius code, change the relevant parts (e.g., "Rap" -> "Rock"), create a new database and go to town.
Pros: Can get it working quickly
Cons: It'll be somewhat painful to add a feature to both applications. Also it'll be impossible to give Rap Genius access to Rock Genius' data at the DB level
Approach 2: Keep it a single application
Whenever a request comes into my application, check the domain. If it's rapgenius.com, set the SITE_NAME constant to "rapgenius". Create a genre field on user-facing entities (songs, blog posts, etc) and update my queries to use the correct genre based on the SITE_NAME
Create a layer of abstraction above user-facing strings so that I can write <%= welcome_message %> instead of Welcome to Rap Genius! and have welcome_message() take SITE_NAME into account
Pros: Lots of flexibility
Cons: Lots of work!
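Concretely, a rough sketch of what I have in mind for approach 2, using a request-scoped value instead of a true constant (the domain-to-genre mapping below is just illustrative):

    # app/controllers/application_controller.rb
    class ApplicationController < ActionController::Base
      # Illustrative mapping from the incoming domain to a genre.
      GENRES = { "rapgenius.com" => "rap", "rockgenius.com" => "rock" }.freeze

      helper_method :current_genre

      # Derived from the request's domain, so one deploy serves both sites.
      def current_genre
        @current_genre ||= GENRES.fetch(request.domain, "rap")
      end
    end

    # app/controllers/songs_controller.rb
    class SongsController < ApplicationController
      def index
        # Assumes the genre column on user-facing entities described above.
        @songs = Song.where(genre: current_genre)
      end
    end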
Thoughts?
The second approach sounds better to me.
You've already highlighted the main pros and cons - it will definitely be more work, but will be much friendlier to maintain. Is there any chance of a third, fourth, fifth site? If so, there's no question that this is the right way to go.
You'll likely also be able to share user accounts, reputation, and any other kind of community based functionality more easily.
It might be worth looking at Rails i18n stuff for 'translating' static text, based on the domain name. That way you could avoid writing helper methods for every string you want to display.
Then you should be able to 'franchise' the site really easily - add translations of static strings, a handle for the new domain, and maybe some site specific images or CSS and you're done!
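A sketch of that i18n idea, assuming one locale file per site (the file names and keys below are invented):

    # app/controllers/application_controller.rb
    class ApplicationController < ActionController::Base
      # Treat each site as its own "locale"; config/locales/rapgenius.yml and
      # config/locales/rockgenius.yml would each define keys like welcome_message.
      before_action :set_site_locale

      private

      def set_site_locale
        I18n.locale = request.domain == "rockgenius.com" ? :rockgenius : :rapgenius
      end
    end

    # In a view, <%= t(:welcome_message) %> then resolves to
    # "Welcome to Rap Genius!" or "Welcome to Rock Genius!" per domain.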

SEO and URI Structure

Standard SEO caveat: It's a black box, and the algorithms are proprietary, and trying to predict and game the search engines is a crappy way to make a living.
That said, what are the baseline steps you want to take to make sure your content is visible to the major search engines (Google, Bing, etc.)
I'm specifically curious as to what role your URI Information Architecture plays. It's common wisdom that you want keywords in your URI, and you want to avoid the query-string laden approach, but what else beyond that?
A quick example of what I'm talking about. Based on something I read on a forum, I recently exposed a /category/* hierarchy on my site. In the following weeks I noticed a sharp uptick in my page views.
I'm curious what other basic steps a site/weblog should take with its URIs to ensure a baseline visibility.
A few URI tips that have kept me ranking:
Write URIs in English but include a unique ID. SO does this well: http://stackoverflow.com/questions/1278157/seo-and-uri-structure (see the sketch just after this list)
Stay consistent when linking to a page: domain.com/, domain.com/index and domain.com/index.php are different URIs
Use .html extensions, or purely /one/two/ directories for pages
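For the first tip, a hedged Rails-flavoured sketch of one way to get readable words plus a unique ID into the URI, assuming a model with a title column:

    # app/models/question.rb
    class Question < ActiveRecord::Base
      # Yields URLs like /questions/1278157-seo-and-uri-structure: the numeric
      # id keeps lookups unambiguous, the slug keeps keywords in the URI.
      def to_param
        "#{id}-#{title.parameterize}"
      end
    end

    # Question.find(params[:id]) keeps working, because an integer primary-key
    # lookup only uses the leading digits of "1278157-seo-and-uri-structure".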
There are probably hundreds of other tips! The structure of linking plays a very important role too...
Logically break your site down into many categories/subcategories
Link all pages back to your homepage
Don't link to hundreds of pages from your homepage
EDIT: Oh I forgot a very important one - a proper 404 response!
Hopefully that helps a bit
some simple things ...
meaningful and accurate meta fields (especially description, keywords)
a valid hn hierarchy on every page (e.g. h1 h2 h3 h2 h2 h3 h3 h4 h3 h2)
all (text) content accessible to a text browser
check spellings
keep content and display functionality separated (e.g. use HTML and CSS fully)
validate CSS and (X)HTML and use standard DOCTYPES
relevant <title> for each page
sensible site hierarchy and no orphan pages
1) Don't use the www subdomain if you do not have to. If you or your company has made the mistake of using subdomains for asset management, then you are likely forced into using www just to be safe.
2) The biggest problem faced by search engines is redundant URIs for the same page. This is solved by using a canonical link tag in your HTML (a small sketch follows this list). This will perhaps help you more than any other single SEO factor.
3) Make your URIs meaningful. If people can remember URIs well enough to type them out, your SEO will be significantly improved.
The most important factors with URIs are being easy to remember and the ability to signal uniqueness to the search engine. Nothing else about URIs matters for SEO.
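For point 2, a small sketch of emitting that canonical tag from a Rails helper; the helper name is made up, and request.base_url + request.path simply drops any query string:

    # app/helpers/application_helper.rb
    module ApplicationHelper
      # Renders <link rel="canonical" href="https://example.com/current/path">
      # so search engines fold query-string and other duplicate URIs into one.
      # Call it from the layout's <head>.
      def canonical_link_tag
        tag.link(rel: "canonical", href: request.base_url + request.path)
      end
    end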

Django, Rails Routing...Point?

I'm a student of web development (and a college student), so my apologies if this comes off sounding naive or offensive; I certainly don't mean it that way. My experience has been with PHP, and with a smallish project on the horizon (a glorified shift calendar) I hoped to learn one of the higher-level frameworks to ease the code burden. So far I have looked at CakePHP, Symfony, Django, and Rails.
With PHP, the URLs mapped very simply to the files, and it "just worked". It was quick for the server, and intuitive. But with all of these frameworks, there is an inclination to "pretty up" the URLs by making them map to different functions and route the parameters to different variables in different files.
"The Rails Way" book that I'm reading admits that this is dog slow and is the cause of most performance pains on largish projects. My question is: why have it in the first place? Is there a specific shortcoming of the url-maps-to-a-file paradigm (or mod_rewrite to a single file) that necessitates regexes and complicated routing schemes? Am I missing out on something by not using them?
Thanks in advance!
URLs should be easy to remember and say, and the user should know what to expect when she sees that URL. Mapping URLs directly to files doesn't always allow that.
You might want to use different URLs for the same, or at least similar, information. If your server forces you into a 1 url <-> 1 file mapping, you need to create additional files whose whole function is to redirect to another file. Or you use something like mod_rewrite, which isn't easier than Rails' URL mappings.
In one of my applications I use a URL that looks like http://www.example.com/username/some additional stuff/. This can also be done with mod_rewrite, but at least for me it's easier to configure URLs in the Django project than in every Apache instance I run the application on.
just my 2 cents...
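In Rails terms, the same idea might look something like this; the controller, parameter, and legacy file names are invented for illustration:

    # config/routes.rb
    Rails.application.routes.draw do
      # A friendly URI mapped onto a controller action, with no physical
      # /profiles/show.php behind it, plus a named helper (profile_path).
      get ":username", to: "profiles#show", as: :profile

      # Several URLs can point at the same information: here an old
      # query-string URL is permanently redirected onto the pretty one.
      get "profile.php", to: redirect { |_path_params, req| "/#{req.params[:user]}" }
    end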
Most of it has already been covered, but nobody has mentioned SEO yet. Google puts a lot of weight on the URL itself: if that URL is widgets.com/browse.php?17, it is not very SEO friendly. If your URL is widgets.com/products/buttons/, that will have a positive impact on your page rank for "buttons".
Storing application code in the document tree of the web server is a security concern.
a misconfiguration might accidentally reveal source code to visitors
files injected through a security vulnerability are immediately executable by HTTP requests
backup files (created e.g. by text editors) may reveal code or be executable in case of misconfiguration
old files which the administrator has failed to delete can reveal unintended functionality
requests to library files must be explicitly denied
URLs reveal implementation details (which language/framework was used)
Note that none of the above is a problem as long as other things don't go wrong (though some of these mistakes would be serious even on their own). But something always goes wrong, and extra lines of defense are good to have.
Django URLs are also very customizable. With PHP frameworks like CodeIgniter (I'm not sure about Rails) you're forced into the /class/method/extra/ URL structure. While this may be fine for small projects and apps, as soon as you try to make it larger or more dynamic you run into problems and have to rewrite some of the framework code to handle it.
Also, routers are like mod_rewrite, but much more flexible. They are not bound to regular expressions and thus have more options for different types of routes.
Depends on how big your application is. We've got a fairly large app (50+ models) and it isn't causing us any problems. When it does, we'll worry about it then.
