I am moving an old site to Joomla, and when I rebuilt the site, the page URLs changed.
Old URL:
http://www.wengerswanderers.com/bransontour.html
New URL:
http://www.wengerswanderers.com/branson-tour
Is there a way the old URL (www.wengerswanderers.com/bransontour.html) can be redirected to the new URL (www.wengerswanderers.com/branson-tour) so the pages will not lose their rankings in the search engines?
You could use .htaccess:
RewriteEngine On
RewriteRule ^bransontour\.html$ http://www.wengerswanderers.com/branson-tour [R=301,L]
Have a look here for what a 301 redirect means.
Although it seems that some PageRank loss is still inevitable even with a 301 redirect, I don't think there's a better solution.
Use com_redirect.
You can go in and manually enter the from and to URLs for as many redirects as you want.
Related
Google has indexed the cPanel login URL of my HostGator hosting. Ex: mysite.com:2082
It has also indexed 5 pages of my site with www, so I have duplicate content.
For example, both mysite.com/page1 and www.mysite.com/page1 are indexed.
I've tried removing the URLs in Webmaster Tools, but it always adds a slash (/) after the domain.
When I try to submit mysite.com:2082 for removal, the / is added, giving mysite.com/:2082.
Has anyone had this problem?
Can anything be done to remove these pages?
Thanks.
Google has indexed the cPanel login URL of my HostGator hosting. Ex: mysite.com:2082
If you are on a shared host I don't think you can do anything about this unfortunately.
cPanel blocks the crawling of these pages with robots.txt. Unfortunately this can still result in a link-only entry in Google SERPs, with a description such as:
A description for this result is not available because of this site's robots.txt – learn more.
To prevent these pages from being indexed, they need either a noindex robots meta tag or a similar noindex X-Robots-Tag HTTP response header, and the Disallow directive in robots.txt (which prevents the pages from being crawled) would need to be removed. As far as I'm aware, the cPanel pages do not return an appropriate robots meta tag.
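For illustration only, the X-Robots-Tag mechanism looks like the following Apache directive (this is a sketch: the /login path is hypothetical, it requires mod_headers, and on shared hosting you typically cannot edit the server config that serves the cPanel pages):

```apache
# Send a noindex header for responses under this path (requires mod_headers).
# "/login" is a placeholder path, not the actual cPanel location.
<Location "/login">
    Header set X-Robots-Tag "noindex"
</Location>
```

Unlike a robots meta tag, this works for non-HTML responses too, since it is sent as an HTTP header.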
This issue has been discussed in the cPanel forums (some years ago!) and "fixes" have supposedly been released, however, I have seen no change in this behaviour.
To be honest, using robots.txt to block the crawling of these pages is arguably the most efficient method, as it simply blocks the (good) bots from requesting the pages and thus reduces (just a little bit) the load on the server. In order to block these pages from Google's index, you need to allow the pages to be crawled so the robots meta tags (which don't currently exist) can be detected. A bit of a catch-22.
If you are thinking in terms of security, then preventing these pages from being indexed does not really help. It's just security by obscurity at best. The cPanel login pages can easily be found by requesting the standard URL, example.com:2082.
It has also indexed 5 pages of my site with www, so I have duplicate content.
You can set a preference for either www or no-www in Google Webmaster Tools, or you can redirect one to the other in .htaccess. Which URL you prefer is up to you. For instance, to redirect from non-www to www:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule (.*) http://www.%{HTTP_HOST}/$1 [R=301,L]
Although to be honest, Google does a pretty good job of resolving this issue anyway (it is very common). There is no duplicate content "penalty", it's just that if you don't specify a preference then either could be indexed.
For my ASP.NET MVC website, I want to allow access to a certain page, only if the user comes from a link on another certain page (that page could be from completely different URL).
Example:
I want to allow a user access to www.MySite.com/thispage, only if they come from a link on www.MySite.com/thatpage or www.MyOtherSite.com/thatpage
How can this be accomplished?
You'll want to check the HTTP_REFERER header.
You can do that with
Request.UrlReferrer
That said, this isn't real security. Someone could set the referer header of their browser manually.
If this is just a means of preventing hotlinking, it's fine. But if you're only using this to keep people out of private/secure information, you'll want to implement some real form of authentication/authorization.
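As a rough sketch (the controller, action, and allowed URLs below are illustrative, not from the question), the referer check in an MVC action could look like this:

```csharp
using System;
using System.Web.Mvc;

public class PagesController : Controller
{
    public ActionResult ThisPage()
    {
        // UrlReferrer is null when the browser sent no Referer header
        Uri referrer = Request.UrlReferrer;

        bool allowed = referrer != null &&
            (referrer.AbsoluteUri.StartsWith("http://www.MySite.com/thatpage",
                 StringComparison.OrdinalIgnoreCase) ||
             referrer.AbsoluteUri.StartsWith("http://www.MyOtherSite.com/thatpage",
                 StringComparison.OrdinalIgnoreCase));

        if (!allowed)
            return new HttpStatusCodeResult(403); // remember: trivially spoofable

        return View();
    }
}
```

Again, this only deters casual visitors; anyone can forge the Referer header.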
You can restrict it with the Referer header; however, this can easily be manipulated, so if this is very important to you, I would go another route.
What you can do is use .htaccess to prevent the incoming referring link from reaching your site. You could deny the incoming traffic from the site or link in question completely. Or, maybe you simply redirect or forward the incoming traffic from the site or link to somewhere else.
This can be applied to any web site with .htaccess as long as your host has mod_rewrite enabled.
If you are using Dolphin and it is installed in your root/main directory (httpdocs or public_html), then you would add to:
yoursite.com/.htaccess
If you are using Dolphin and it is installed in a subfolder/subdirectory, then you would add to:
yoursite.com/dolphin-directory/.htaccess
Add to the .htaccess file after the RewriteEngine on line, like:
RewriteEngine on
RewriteCond %{HTTP_REFERER} baddomain.com [NC]
RewriteRule .* - [F]
To forward or redirect traffic from the unwanted site or link to somewhere else then add:
RewriteEngine on
RewriteCond %{HTTP_REFERER} baddomain.com [NC]
RewriteRule .* http://en.wikipedia.org/ [R,L]
Note:
- In the RewriteCond line, change baddomain.com to the link or site you want to block.
- In the RewriteRule line, change http://en.wikipedia.org/ to the link or site you would like to redirect or forward it to.
- Be sure to download and back up your original .htaccess file prior to editing it. .htaccess is extremely sensitive and needs to be exactly right, or it can make your entire site error out.
- And be sure to test afterward to ensure everything is working properly. You do not want to accidentally block all traffic to your site.
You could always track which sites they visit, like the big agencies do, and block them if they aren't coming from your utopia :)
How can I change the URLs in my search engine results to my new URLs? I am using IIS 7. Is there any configuration to rewrite URLs?
Thanks.
It's not exactly clear what your scenario looks like, but my best guess is that you want to rewrite URLs like http://example.com/article?id=123&title=foo to something more SEO-friendly like http://example.com/article/123/foo, with an HTTP response status code of 301 Moved Permanently.
Have a look at URL Rewrite for IIS and this blog post for details:
Introduction to URL Rewriting using IIS URL Rewrite Module and Regular Expressions
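For example, a rule in web.config that 301-redirects the old query-string URL to the friendly one might look roughly like this (the rule name, pattern, and paths are illustrative, not taken from your site):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect to friendly URL" stopProcessing="true">
        <match url="^article$" />
        <conditions>
          <!-- capture the id and title from the query string -->
          <add input="{QUERY_STRING}" pattern="^id=([0-9]+)&amp;title=(.+)$" />
        </conditions>
        <action type="Redirect" url="article/{C:1}/{C:2}"
                appendQueryString="false" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

You would also need a matching rewrite (not redirect) rule, or route handling in your application, so that the friendly URL actually serves the content.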
I have a web site with a .htaccess file that looks like this:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^wendabang.com$ [NC]
RewriteCond %{HTTP_HOST} !^www\.wendabang\.com$ [NC]
RewriteRule ^(.*)$ http://www.wendabang.com/$1 [L,R=301]
This morning when I was checking Google Webmaster Tools, I found in the backlink section that Google listed wendabang.com as a site that has links back to the www.wendabang.com site.
I think this issue should be rooted in my .htaccess file, but I cannot spot the problem.
Another shameless question: what should I do if I want to redirect http://www.wendabang.com/ and www.wendabang.com to wendabang.com? I think ditching http:// and www is nice.
Thank you in advance.
The site has multiple issues.
http://www.wendabang.com/ redirects with an HTTP 302 temporary redirect to http://wendabang.com/. An HTTP 302 communicates the start URL to Google as the right URL; this means you are telling Google to treat http://www.wendabang.com/ as the URL it should display to its users.
Additionally, inside your HTML you have
<link rel="canonical" href="http://www.wendabang.com">
and your links look like this:
<a href="http://www.wendabang.com/web-analytics-beautiful-shortcomings.html"
As the canonical tag is treated by Google like an HTTP 301 permanent redirect, you are again redirecting Google to the www version.
And a lot of the links point to the www version as well.
At this stage, I believe it is only due to Google's magnificent error handling that your site is still indexed.
1) Please see this answer for the right way to do a www to non-www redirect via .htaccess: Generic htaccess redirect www to non-www
2) Change your canonical tags to non-www URLs.
3) Change your internal links to non-www URLs.
And http is a protocol; you cannot ditch it (that said, you could ditch it and invent your own protocol on top of the TCP/IP layer, but that is probably not what you want).
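For reference, a generic www to non-www redirect along the lines of the linked answer looks like this (a sketch; adjust the protocol if you serve HTTPS):

```apache
RewriteEngine On
# Capture the host without the "www." prefix and redirect,
# preserving the requested path
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
```

Because it captures the host generically, this works for any domain without hard-coding it.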
I created a Rails app for my client. It was PHP, and I totally rebuilt it from scratch with Rails. The problem is that the site is old and many old pages are ranked in Google. Naturally, many people will click a page link in Google and the page won't be available.
How do you usually handle such a problem?
I need to redirect such requests (for missing old pages) to the main front page of the new Rails app. How can I do that?
Thanks.
Sam
A 301 redirect is meant to be the most efficient and Google-friendly way, and it should preserve your search page ranking.
That said, I haven't tried it in real-life as the next release of my application will be using this approach to restructure a web site.
You might google ".htaccess", apache and "permanent redirect".
Redirecting the user to the front page would be disorienting without a note (flash[:notice]) to let them know what went wrong.
I think it would be better to write some routes in config/routes.rb to handle the old pages and return the new versions of the pages (if they still exist), or else fall back to a 404 page.
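A sketch of what this could look like in config/routes.rb (Rails 3 syntax; the application name and paths are illustrative):

```ruby
# config/routes.rb -- "MyApp" and the URLs are placeholders
MyApp::Application.routes.draw do
  # Old PHP URL with a direct equivalent: redirect() issues a 301 by default
  match '/members.php' => redirect('/members')

  # Fallback: send any other old .php URL to the front page
  # (the constraint allows the dot, which dynamic segments
  # don't match by default)
  match '/:page' => redirect('/'), :constraints => { :page => /.+\.php/ }
end
```

Keeping per-page redirects above the fallback preserves the most PageRank, since each old URL points at its real replacement rather than the front page.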
If you have been able to maintain the URLs in the new application (e.g. /members.php is now /members), you can do the following if you use Apache:
RewriteRule ^(.*)\.php$ /$1 [R=301,L]
This will remove the php extension and do a 301 redirect, and should transfer the pagerank to the new page.
If this is not possible and you must redirect to the new main page this MIGHT work, I have not tried it myself:
RewriteRule ^(.*)\.php$ http://www.example.com/ [R=301,L]