SEO help for replacing a website - asp.net-mvc

I run a small e-commerce site that over the last few years has built up a reasonable search engine status.
I've been working on a new site that uses new URL formats, and I am worried about how to deal with all the broken links and the frustration of customers who find outdated links through search engines.
Can anyone offer advice on how to mitigate or minimize the damage? The old site was done in ASP.NET, the new one in ASP.NET MVC.
Thanks for any help you can offer.

You will need some sort of parallel structure. Ideally, the old site with the old URLs remains fully accessible for some time, but does not get indexed any more.
If that's not feasible, and since you say the site is small, you could establish a URL mapping from old to new and have a 404 handler that attempts to redirect to the new content.
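A minimal sketch of that mapping idea, assuming ASP.NET and intercepting the known old URLs in Global.asax rather than waiting for the 404 itself; all the paths here are hypothetical placeholders:

Imports System.Collections.Generic

Private Shared ReadOnly UrlMap As New Dictionary(Of String, String)

Sub Application_Start()
    ' Hypothetical old-to-new entries; build this list from the real site.
    UrlMap.Add("/aboutus.aspx", "/about")
    UrlMap.Add("/products.aspx", "/products")
End Sub

Sub Application_BeginRequest(ByVal sender As Object, ByVal e As System.EventArgs)
    Dim path As String = Request.Url.AbsolutePath.ToLower()
    Dim newUrl As String = Nothing
    If UrlMap.TryGetValue(path, newUrl) Then
        ' Send a 301 (not a 404) so search engines carry the old URL's standing over.
        Context.Response.StatusCode = 301
        Context.Response.AddHeader("Location", newUrl)
        Context.Response.End()
    End If
End Sub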

You should create permanent redirects, at the route level, for the links you want to preserve. This way search engines will update their references to the new locations.
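For example, a minimal sketch of a route-level permanent redirect in ASP.NET MVC, assuming MVC 3 or later (where Controller.RedirectPermanent is available); the route, controller, and action names are hypothetical:

' In the route registration:
routes.MapRoute("LegacyItem", "viewitem.aspx", _
    New With {.controller = "Legacy", .action = "ViewItem"})

' The controller issues the 301:
Public Class LegacyController
    Inherits System.Web.Mvc.Controller

    Public Function ViewItem(ByVal id As Integer) As ActionResult
        ' Model binding picks up ?id= from the old query string.
        Return RedirectPermanent("/item/" & id)
    End Function
End Class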

As cdonner says, you want to have a handler that reroutes the traffic to its appropriate destination. Even more important, though: when you redirect the client, make sure you send a status code of 301 (Moved Permanently) instead of 404. Search engines will rate you negatively if there are a lot of 404 errors on your site, and you will see your standing decrease instead of increase.

You could set up your old site's .htaccess file to redirect traffic to the new site. Beyond that, you could use mod_rewrite to map requests to pages on the old site to the same (or similar) pages on the new one.
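For instance (a minimal sketch, assuming the old site is served through Apache, as this answer presumes; domains and paths are placeholders):

# One-off 301 for a single moved page
Redirect 301 /aboutus.html http://www.newsite.com/about

# Pattern-based mapping with mod_rewrite
RewriteEngine On
RewriteRule ^products/([0-9]+)$ http://www.newsite.com/item/$1 [R=301,L]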

This is the way I do it when migrating from an old classic ASP site (in Global.asax; note that Response.Redirect on its own sends a 302, so the status code and Location header are set by hand to get a true 301):

Sub Application_BeginRequest(ByVal sender As Object, ByVal e As System.EventArgs)
    Dim fullOriginalPath As String = Request.Url.ToString().ToLower()
    If fullOriginalPath.Contains("/viewitem.asp?itemid=") Then
        PermanentRedirect("/item/" & getItemIDFromPath(fullOriginalPath))
    ElseIf fullOriginalPath.Contains("/search.asp") Then
        PermanentRedirect("/search/")
    ElseIf fullOriginalPath.EndsWith("/default.asp") Then
        PermanentRedirect("/")
    End If
End Sub

' Issues a permanent (301) redirect and ends the request.
Private Sub PermanentRedirect(ByVal newUrl As String)
    Context.Response.StatusCode = 301
    Context.Response.AddHeader("Location", newUrl)
    Context.Response.End()
End Sub

Sounds like you have it figured out, but I just wanted to add one more option: the canonical tag. It may have advantages if, for any reason, you need to keep both the old URL and the new URL active. You can create a copy of the page at the old URL and then add the canonical tag, which tells the search engines "please credit the link value of this page to the following page: www.site.com/newpage".
This line goes in before </head>:
<link rel="canonical" href="http://www.yoursite.com" />
For example, if you have lots of links to certain key pages and those links point to the old URLs, this may help.
A 301 also transfers the link credit, and for pages that have genuinely moved you'll generally want to use a 301 redirect. Also, if you use a URL rewrite rule and all the URLs change in the same way, you can probably use a regex in the rewrite rule to handle all of them in a single step, as sketched below.
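For instance, assuming an Apache server and a hypothetical URL pattern, one mod_rewrite rule can 301 every URL whose shape changed in the same way:

RewriteEngine On
# Any /oldsection/... URL moves to /newsection/... in a single rule
RewriteRule ^oldsection/(.*)$ /newsection/$1 [R=301,L]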

Related

If I change my SEO-friendly URLs, how do I tell Google to forget the old ones?

I have an ecommerce site with over 3000 products. Currently, my URLs look like this:
http://www.muszakiarena.hu/termekek/sencor-sle-1958-led-tv/3112
Now, 'termekek' means 'products', the second segment is the name of the product, and the third is the ID of the product.
I want to remove 'termekek' from the URLs, because it is unnecessary and I hope I'll get better rankings without it. So the new URLs would look like
http://www.muszakiarena.hu/sencor-sle-1958-led-tv/3112
Now, the system already works this way (the product pages show up at www.muszakiarena.hu/sencor-sle-1958-led-tv/3112), but if I change my product links in the navigation to the new format and ask Google to recrawl, I'm afraid it will detect duplicate content.
How do I tell Google to forget the old URLs and only keep the new ones?
You should redirect (with 301) from the old to the new URLs.
That way, all search engines that indexed the pages under the old URLs will learn that the URLs changed as soon as they crawl them again. The same goes for users who bookmarked or published the old URLs: when they visit, they get redirected to the new URL.
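A minimal sketch of that redirect, assuming the site runs on Apache with mod_rewrite (adapt it to your actual stack):

RewriteEngine On
# 301 /termekek/<product-name>/<id> to /<product-name>/<id>
RewriteRule ^termekek/(.+)$ /$1 [R=301,L]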
If using a 301 redirect is not possible in your case, you can use the canonical link type. (But a 301 redirect is preferable.)

Remove multiple indexed URLs (duplicates) with redirect

I am managing a website that has only about 20-50 pages (articles, links, etc.). Somehow, Google indexed over 1000 links (duplicates: the same page with a different string in the URL). I found that those links contain ?date= in the URL. I already blocked them by adding Disallow: *date* to robots.txt, made an XML sitemap (which I did not have before), placed it in the root folder, and imported it into Google Webmaster Tools. But the problem remains: the links are (and probably will stay) in the search results. I would happily remove the URLs in GWT, but it can only remove one link at a time, and removing more than 1000 one by one is not an option.
The question: is it possible to make dynamic 301 redirects from every page that contains ?date= in the URL to the original one, and how? I am thinking that Google will re-crawl those pages, follow the redirect to the original ones, and drop those numerous pages from the search results.
Example:
bad page: www.website.com/article?date=1961-11-1, plus n similar pages with different "date" values
good page: www.website.com/article
automatically redirect all bad pages to good ones.
I have spent a whole work day trying to solve this problem; it would be nice to get some support. Thank you!
P.S. As far as I can tell, this coding question is the right one to ask on Stack Overflow, but if I am wrong (forgive me), redirect me to the right place to ask it.
You're looking for the canonical link element; that's the way Google suggests solving this problem (here's the Webmasters help page about it), and it's used by most if not all search engines. When you place an element like
<link rel='canonical' href='http://www.website.com/article'>
in the header of the page, the URI in the href attribute will be considered the 'canonical' version of the page, the one to be indexed and so on.
For the record: if the duplicate content is not an HTML page (say, it's a dynamically generated image), and supposing you're using Apache, you can use .htaccess to redirect to the canonical version. Unfortunately, the Redirect and RedirectMatch directives don't handle query strings (they match only the URL path), but you can use mod_rewrite to strip parts of the query string. See, for example, this answer for a way to do it.
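For illustration, a sketch of such a rule (assuming Apache with mod_rewrite; adjust the parameter name to your case):

RewriteEngine On
# Match any request whose query string carries a date parameter...
RewriteCond %{QUERY_STRING} (^|&)date= [NC]
# ...and 301 to the bare path; the trailing "?" drops the query string
RewriteRule ^(.*)$ /$1? [R=301,L]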

URL rewriter is messing up my form posts

I need some ideas. I am using a URL rewriter for SEO purposes; maybe I shouldn't be doing this, I am not sure. If you have any comments on that, let me know.
I am making sure that all requests for
www.mydomain.com/home and www.mydomain.com/home/index get directed to www.mydomain.com, as they are all the same page, but with MVC you could obviously reach it through any of these URLs. I am thinking this could cause duplicate content issues for SEO, so I wrote some rules.
This works fine: any request to any of these URLs redirects to www.mydomain.com.
The problem I have is a partial form post that updates through the home controller. It will not post back; in the Net panel it says the URL has permanently moved. I am guessing this is to do with my URL rewriting. Any ideas?
If anyone else has this problem, I have fixed it. I am using the URL rewriter to make URLs friendly, i.e. making them all lowercase to avoid duplicates and taking the trailing slash off, to ensure that no slightly different links go to the same place. Forms do not like this if you call them with capital letters in the names: the form initiates via AJAX, but when you go to update, the Net tab shows a 301 for the update post. You need to make sure you call them with a lowercase controller and action name, as in the sketch below.
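For illustration, a minimal sketch of an AJAX form in an ASPX view whose action URL is already lowercase, so the lowercasing rewrite rule never fires on the POST (controller, action, and element names are hypothetical):

<%-- Posts to /home/updatepost; no capital letters, so no 301 --%>
<% Using Ajax.BeginForm("updatepost", "home", New AjaxOptions With {.UpdateTargetId = "postStatus"}) %>
    <input type="submit" value="Update" />
<% End Using %>
<div id="postStatus"></div>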

How to improve the structure of URLs

From the article at Google's Webmaster Central and their SEO PDF, I think I should improve my website's URL structure.
Right now a news URL looks like "news.php?id=127591". I need to rewrite it to something like "/news/127591/this-is-article-subject".
The problem is: if I change the URL structure to the new one, can I still keep the old one working? And if both URLs work, how do I stop search engines like Google and Bing from indexing one article twice?
Thanks!
Use an HTTP 301 permanent redirect from the old URL to the new URL.
An HTTP 301 redirect communicates a new (permanent) URL for an old (outdated) resource to Google (and other clients). Google will transfer most or all of the value allocated to the old URL to the new URL.
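A hedged sketch, assuming Apache with mod_rewrite; the rewrite layer cannot know the article subject, so this redirects to an ID-only form and leaves it to the application to canonicalize to the full slug:

RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^news\.php$ /news/%1/? [R=301,L]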
Also, in order to improve the architecture of your website, you should keep a clean structure by linking between all its pages/posts. But be careful: you must not do this lightly, or Google's robot will get confused and leave.
Structure is key to your SEO:
1. Find the one page that is the "really important page" for any given keyword.
2. Direct relevant content from other pages to that particular keyword's page.
3. Repeat with every relevant keyword.
I'll leave you this post, where I explain this more in depth, hoping that you understand Spanish: http://coach2coach.es/la-estructura-web-es-la-base-del-posicionamiento/
Yep, you can use robots.txt to exclude news.php, and create an XML sitemap with the new URLs. mod_rewrite can be set to only change directories, with trailing slashes, so all files in your root directory should work fine.
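For example, a robots.txt along these lines keeps crawlers away from the old script (though the 301, as suggested above, remains the stronger signal):

User-agent: *
Disallow: /news.php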

Google indexing urls redirect 301

Let's say my site has the following URLs indexed in Google:
/test/1
/test/2
/test/3
For some reasons, I want those same pages to have the following URLs:
/test/abc
/test/def
/test/ghi
I noticed that even if I use a 301 redirect from /test/1 to /test/abc, the URL /test/1 stays in the Google index for a while after the robot hits the redirect and discovers the change.
Is it normal that it takes a few weeks for the old URLs to disappear from the search engine index, or is there a better way to let it know about the changes?
Should I use the URL removal tool ?
Will a new sitemap in the Google webmaster tools help to get rid of the old URLs ?
Help me see inside the Google black box :)
Answering your questions:
Yes, it's normal for this process to take a few weeks; this is nothing to worry about.
The URL removal tool is only for URLs that no longer exist, you can't use it for URLs that now return a 301 (see: http://www.google.com/support/webmasters/bin/answer.py?answer=59819&hl=en)
An XML sitemap is mainly for telling Google about new pages and pages that have changed recently, so I don't think it will help you here
In short, the index will update naturally, you just need to let Google do its thing.
