Google indexing URLs after a 301 redirect

Let's say my site has the following URLs indexed in Google:
/test/1
/test/2
/test/3
For some reason, I want those same pages to have the following URLs:
/test/abc
/test/def
/test/ghi
I noticed that even if I use a 301 redirect from /test/1 to /test/abc, the URL /test/1 stays in the Google index for a while after the robot hits the redirect and discovers the change.
Is it normal that it takes a few weeks for the old URLs to disappear from the search engine index, or is there a better way to let Google know about the changes?
Should I use the URL removal tool?
Will a new sitemap in Google Webmaster Tools help get rid of the old URLs?
Help me see inside the Google black box :)

Answering your questions:
Yes, it's normal for this process to take a few weeks; this is nothing to worry about.
The URL removal tool is only for URLs that no longer exist; you can't use it for URLs that now return a 301 (see: http://www.google.com/support/webmasters/bin/answer.py?answer=59819&hl=en).
An XML sitemap is mainly for telling Google about new pages and pages that have changed recently, so I don't think it will help you here.
In short, the index will update naturally, you just need to let Google do its thing.
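For reference, the redirects themselves can be expressed very simply. A minimal Apache .htaccess sketch for the URLs in the question (this assumes the site runs on Apache, and example.com is a placeholder domain):

Redirect 301 /test/1 https://example.com/test/abc
Redirect 301 /test/2 https://example.com/test/def
Redirect 301 /test/3 https://example.com/test/ghi

As long as each old URL answers with a 301 like this, Google will swap the entries over on its own schedule.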

Related

How to remove all indexed URLs from Google and reindex again

My website was hacked with a Japanese SEO virus. I have cleaned up the virus and started to resubmit the website to google.com.
What is the best option here to clear all the cached links and snippets and to start the reindex, because Google shows all the Japanese links in site:url?
Google has 2 options:
Temporarily remove URL
Clear cached URL
How do I flush all indexed URLs from Google and force resubmitting?
Thanks in advance!
Log in to the Google Search Console.
Select the resource you want.
Then find the "Remove URLs" subsection in Google Index.
Create a new removal request, enter the desired link in the window that opens, and click "Submit".
You can also remove a whole path prefix at once if there are a lot of URLs, e.g. /your-url/*
https://support.google.com/webmasters/answer/9689846?hl=en
You can use 301 redirects from old URLs to new ones if necessary.
It would also be worth getting advice from SEO specialists, since the situation may be specific; this source can help you.
To speed up the indexing of new URLs or updated information, I would recommend creating a correct structure on the site so that the search robot can find your URLs. Also, create a sitemap.xml, submit it, and add it to Google Search Console.
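As an illustration, a minimal sitemap.xml sketch (example.com and the page path are placeholders; list only the clean URLs you want indexed):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/clean-page/</loc>
  </url>
</urlset>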

Get rid of old links to a retired website in Google search

I have a website that has been replaced by another website with a different domain name.
In Google search, I am able to find links to the pages on the old site, and I hope they will not show up in future Google searches.
Here is what I did, but I am not sure whether it is correct or enough.
Access to any page on the old website will be immediately redirected to the homepage of the new website. There is no one-to-one page mapping between the two sites. Here is the code for the redirect on the old website:
<meta http-equiv="refresh" content="0;url=http://example.com" >
I went to the Google Webmasters site. For the old website, I used Fetch as Google and clicked "Fetch and Render" and "Reindex".
Really appreciate any input.
A few things you'll want to do here:
You need to use permanent server redirects, not a meta refresh. Also, I suggest you provide a one-to-one page mapping (see the sketch after these points). It's a better user experience, and large numbers of redirects to the root are often interpreted as soft 404s. Consult Google's guide to site migrations for more details.
Rather than Fetch & Render, use Google Search Console's (Webmaster Tools) Change of Address tool. Bing has a similar tool.
A common mistake is blocking crawler access to a retired site. That has the opposite of the intended effect: old URLs need to stay accessible to search engines for the redirects to be "seen".
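For example, a minimal server-side sketch in Apache .htaccess on the old site (this assumes the old site runs on Apache; the domain and paths are placeholders, and a real migration would map every old page to its new equivalent):

Redirect 301 /about.html https://example.com/about/
Redirect 301 /products/widget.html https://example.com/shop/widget/

Unlike the meta refresh, these answer with a real 301 status code that search engines treat as a permanent move.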

How to delete old Google URLs with parameters

A month ago I relaunched a website in the TYPO3 CMS. Before that, the site was hosted with the Joomla CMS.
In the Joomla config, SEO links were disabled, so Google indexed the page URLs like this:
www.domain.de/index.php?com_component&itemid=123....
for example.
Now, a month later (after the TYPO3 relaunch), these links are still visible in Google because the URLs don't return a 404 error. That's because "index.php" also exists in TYPO3 and TYPO3 doesn't care about the additional query string/variables; it returns a 200 status code and shows the front page.
In Google Webmaster Tools it's possible to delete single URLs from the Google index, but that way I would have to delete about 10000 URLs manually...
My question is: is there a way to remove these old URLs from the Google index?
Greetings
With this number of URLs there is only one sensible solution: implement proper 404 handling in your TYPO3, or even better, redirects to the same content now placed in TYPO3.
You can use TYPO3's handler (search for it in Install Tool > All configuration); it's called pageNotFound_handling. You can use options like REDIRECT for redirecting to some page, or even USER_FUNCTION, which allows you to use your own PHP script; check the description in the Install Tool.
You can also write a simple condition in TypoScript and check whether typical Joomla params exist in the URL; that way you can easily return a custom 404 page. If you need a more sophisticated condition (for example, you want to redirect links which previously pointed to some gallery in Joomla to the new gallery in TYPO3), you can make use of a userFunc condition, and that would probably be the best option for SEO.
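As a concrete sketch of the pageNotFound_handling option (older TYPO3 versions, roughly 4.x-8.x; the exact value syntax and the target path are assumptions to check against your version), the setting goes into localconf.php / LocalConfiguration.php:

$GLOBALS['TYPO3_CONF_VARS']['FE']['pageNotFound_handling'] = 'REDIRECT:/page-not-found/';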
If these URLs share a manageable number of common indicators, you could handle those links with a rule in your virtual host or .htaccess so that Google runs into the correct error message, as sketched below.
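A minimal Apache sketch of that idea (assuming mod_rewrite is available; the query-string pattern is just one example of a "Joomla-typical" indicator):

RewriteEngine On
# old Joomla URLs look like /index.php?com_component&itemid=123
RewriteCond %{QUERY_STRING} (^|&)itemid= [NC]
# [G] answers with 410 Gone, which tells Google the page is intentionally gone
RewriteRule ^index\.php$ - [G,L]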
I wrote a Google Chrome extension to remove URLs in bulk in Google Webmaster Tools. Check it out here: https://github.com/noitcudni/google-webmaster-tools-bulk-url-removal.
Basically, it's a glorified for loop. You put all the URLs in a text file. For example,
http://your-domain/link-1
http://your-domain/link-2
Having installed the extension as described in the README, you'll find a new "choose a file" button.
Select the file you just created. The extension reads it in, loops through all the URLs and submits them for removal.

How to improve the structure of URLs

From the article at Google's Webmaster Central and the SEO PDF, I think I should improve my website's URL structure.
Right now a news URL looks like "news.php?id=127591". I need to rewrite it to something like "/news/127591/this-is-article-subject".
The problem is what happens if I change the URL structure to the new one. Can I still keep the old one working? If both URLs work, how do I keep search engines like Google and Bing from indexing the same article twice?
Thanks!
HTTP 301 permanent redirect from the old URL to the new URL
An HTTP 301 redirect communicates a new (permanent) URL for an old (outdated) resource to Google (and other clients). Google will transfer most/all of the accumulated value from the old URL to the new URL.
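For example, a minimal .htaccess sketch (assuming Apache with mod_rewrite; the article subject isn't part of the old URL, so this redirects to the numeric form only):

RewriteEngine On
# 301-redirect old news.php?id=127591 style URLs to the new /news/127591/ scheme
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
# the trailing "?" drops the old query string from the target URL
RewriteRule ^news\.php$ /news/%1/? [R=301,L]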
Also, in order to improve the architecture of your website, you should keep a clean structure by interlinking all of its pages/posts. But be careful: you must not do this lightly, or Google's robot will get confused and leave.
Structure is key to your SEO
1. Find the one page which is the "really important page" for any given keyword
2. Direct relevant content from other pages to that particular keyword's page
3. Repeat with every relevant keyword
I'll leave this post for you, where I explain this in more depth, hoping that you understand Spanish: http://coach2coach.es/la-estructura-web-es-la-base-del-posicionamiento/
Yep, you can use robots.txt to exclude news.php, and create an XML sitemap with the new URLs. mod_rewrite can be set up to only rewrite directory-style URLs with trailing slashes, so all files in your root directory should still work fine.
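If you go the robots.txt route, a minimal sketch that excludes the old script while leaving everything else crawlable would be:

User-agent: *
Disallow: /news.php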

SEO help for replacing a website

I run a small e-commerce site that over the last few years has built up a reasonable search engine status.
I've been working on a new site that uses new URL formats and I am worried about how to deal with all the broken links and customer frustration for users finding out dated links through search engines.
Can anyone offer advice on how to mitigate / minimize the damage? The old site was done in ASP.NET, the new one in ASP.NET MVC.
Thanks for any help you can be.
You will need some sort of parallel structure. Ideally, the old site with the old URLs remains fully accessible for some time, but does not get indexed any more.
If that's not feasible, and since you are saying that the site is small, you could establish a URL mapping old-new and have a 404 handler that attempts to redirect to the new content.
You should create permanent redirects for the links you want to preserve (at the route level). This way search engines will update their references to the new locations.
As cdonner says, you want to have a handler that reroutes the traffic to its appropriate destination. Even more important, though, is to make sure that when you redirect the client, you send a status code of 301 (moved permanently) instead of 404. The search engines will rate you negatively if there are a lot of 404 errors on your site, and you will see your standing decrease instead of increase.
You could set up your old site's .htaccess file to redirect traffic to the new site. Beyond that, you could use mod_rewrite to map requests to pages on the old site to the same (or similar) pages on the new one.
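A minimal .htaccess sketch of that idea (hypothetical domains and paths; this assumes the old site sits behind Apache with mod_rewrite, which may not apply to an ASP.NET/IIS host):

RewriteEngine On
# map an old section to its new equivalent, preserving the rest of the path
RewriteRule ^products/(.*)$ https://new-shop.example/catalog/$1 [R=301,L]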
This is the way I do it migrating from an old ASP classic site:
Sub Application_BeginRequest(ByVal sender As Object, ByVal e As System.EventArgs)
    ' Runs for every request: map old ASP classic URLs to their new locations.
    Dim fullOriginalpath As String = Request.Url.ToString.ToLower
    If (fullOriginalpath.Contains("/viewitem.asp?itemid=")) Then
        ' RedirectPermanent (.NET 4+) sends a real 301; a plain Response.Redirect
        ' would overwrite the status code with a 302.
        Context.Response.RedirectPermanent("/item/" + getItemIDFromPath(fullOriginalpath))
    ElseIf (fullOriginalpath.Contains("/search.asp")) Then
        Context.Response.RedirectPermanent("/search/")
    ElseIf (fullOriginalpath.EndsWith("/default.asp")) Then
        Context.Response.RedirectPermanent("/")
    End If
End Sub
Sounds like you have it figured out, but I just wanted to add one more option - the canonical tag - which may have advantages if for any reason you need to keep both the old URL and the new URL active. You can create a copy of the page at the old URL and then add the "canonical" tag, which tells the search engines "please credit the link credit of this page to the following page: www.site.com/newpage".
<link rel="canonical" href="http://www.yoursite.com" /> this line goes in before </head>
For example, if you have lots of links to certain key pages and those links point to the old URLs, this may be a help.
A 301 also passes along the link credit, and for pages that have genuinely moved you'll generally want to use a 301 redirect. Also, if you use a URL rewrite rule and all the URLs change in the same way, you can probably use a regex in the rewrite rule to handle all of them in a single step.
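For instance, on IIS a single regex rule can cover a whole family of old URLs. A sketch for web.config, assuming the IIS URL Rewrite module is installed and borrowing the /viewitem.asp pattern from the code above purely as an illustration:

<system.webServer>
  <rewrite>
    <rules>
      <!-- 301 every old viewitem.asp?itemid=123 URL to the new /item/123 route -->
      <rule name="OldItemUrls" stopProcessing="true">
        <match url="^viewitem\.asp$" />
        <conditions>
          <add input="{QUERY_STRING}" pattern="(?:^|&amp;)itemid=([0-9]+)" />
        </conditions>
        <action type="Redirect" url="item/{C:1}" appendQueryString="false" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>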
