I have changed my domain name and I really don't want to go through the trouble of manually updating all the links I have posted in the articles on my website. Is there some code I can use to change all the links that point to the old domain?
For example, if I have a link somewhere that is oldurl.com/faq
and I want it to change to newurl.com/faq
but without rewriting it manually, so it is done for all the links on my website that start with oldurl.com.
How would I do this?
You can point the DNS for the old domain at the new domain's server and place a redirect rule in the webserver that rewrites requests to the proper domain.
This is probably something you should do anyway, so that any external links you have no control over are pointed to the proper new location.
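As a sketch of that redirect rule, assuming Apache with mod_rewrite is serving the old domain (other webservers have equivalent directives):

```apache
# Once DNS for oldurl.com points at this server, 301 every request
# to the same path on newurl.com, preserving the query string.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldurl\.com$ [NC]
RewriteRule ^(.*)$ https://newurl.com/$1 [R=301,L]
```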
This, however, is not a substitute for fixing your actual links; you should fix those as well. In fact, it is usually best practice not to include the full URL in internal site links, using only relative URIs or paths for such links instead.
Almost any good IDE will give you the ability to search and replace across all files in a site, so doing this should not be too painful.
This is the quick and lazy client-side way to do it:

$(function () {
    // Rewrite every anchor's href, swapping the old domain for the new one.
    // Note: this only patches links in the rendered page at load time;
    // the HTML stored on the server is unchanged.
    $('a').attr('href', function (i, url) {
        return url.replace('oldurl.com', 'newurl.com');
    });
});
I have a problem with Google indexing.
The thing is that I have several domains (language mutations), for example www.example.com, www.example.co.uk, and www.example.de.
So each site has its own language and its own SEO links, for example:
search/en-us/building/window/77/1/
search/en-gb/building/window/77/1/
search/de-de/gebaude/fenster/77/1/
Now Google is indexing those SEO URLs with the wrong domains, like www.example.co.uk/search/de-de/gebaude/fenster/77/1/, which should obviously be www.example.de/search/de-de/gebaude/fenster/77/1/.
Every language is bound to its own domain, so there is no possibility that EN-GB leads to the German language mutation on www.example.de.
I am open to all advice. Thank you in advance.
This kind of thing happens. It's easier than you think to create a bad link: one hardcoded domain name in the wrong place and you have a link to somewhere you didn't mean to link. Once Googlebot finds a site served on a wrong domain, it will crawl the whole thing. Even if you are correct that you would never create such a link, maybe there is an external link, created by a user, in which they got the domain name confused.
The technique that you need to apply is called URL canonicalization. You have two options:
Put the rel canonical link tag on each of your pages. search/de-de/gebaude/fenster/77/1/ would have a canonical URL in the tag with the correct domain: http://www.example.de/search/de-de/gebaude/fenster/77/1/. You'd have to make sure that the domain in the canonical tag is the .de domain whether the page is accessed on the .co.uk domain or on the .de domain.
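For the canonical option, the tag is a link element in the page head (not strictly a meta tag); on the German page it would look something like this, whichever host served it:

```html
<!-- In the <head> of /search/de-de/gebaude/fenster/77/1/, on every domain -->
<link rel="canonical" href="http://www.example.de/search/de-de/gebaude/fenster/77/1/">
```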
Use 301 redirects to correct any URLs that are wrong: have your server detect that the domain name is wrong and 301 redirect www.example.co.uk/search/de-de/gebaude/fenster/77/1/ to www.example.de/search/de-de/gebaude/fenster/77/1/.
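For the redirect option, a sketch assuming Apache/mod_rewrite (the same logic can be expressed in nginx or in application code):

```apache
RewriteEngine On
# If a German-language path is requested on any host other than the
# .de domain, 301 it to the .de domain (repeat per language).
RewriteCond %{HTTP_HOST} !^www\.example\.de$ [NC]
RewriteRule ^search/de-de/(.*)$ http://www.example.de/search/de-de/$1 [R=301,L]
```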
Another random piece of SEO advice: I would remove the word "search" from your URLs. Google doesn't like to index site search results in its search results. I've seen cases where it assumes that the word "search" in the URL indicates that the page shouldn't be indexed for this reason.
I'm trying to transfer an existing .NET site into Umbraco, using umbracoUrlName to make pages map to their existing URLs, so that inbound links still work and SEO isn't affected. I'd really rather avoid redirects, but the problem is that some of these pages are in different folders, and umbracoUrlName seems to ignore slashes.
You are right, umbracoUrlName ignores slashes. You will either have to put redirects in place using something like the 301 Url Tracker or place the nodes inside other nodes to simulate the folder structure.
Alternatively, you could use umbracoUrlAlias, which does accept forward slashes. This doesn't change the original URL, but it does give the page an alias that can also be used to access the node.
I know this is an old issue, and it depends which version of Umbraco you're on, but look at the IUrlProvider to solve your issue - this blog post has all the details:
http://24days.in/umbraco/2014/urlprovider-and-contentfinder/
I would like to hide the webpage name in the url and only display either the domain name or parts of it.
For example:
I have a website called "MyWebSite". The URL is localhost:8080/mywebsite/welcome.xhtml, and I would like to display only "localhost:8080/mywebsite/".
However if the page is at, for example, localhost:8080/mywebsite/restricted/restricted.xhtml then I would like to display localhost:8080/mywebsite/restricted/.
I believe this can be done in the web.xml file.
I believe that you want URL rewriting. Check out this link: http://en.wikipedia.org/wiki/Rewrite_engine - there are many approaches to URL rewriting, and you need to decide which is appropriate for you. Some of the approaches do make use of a configuration file such as web.xml.
You can do this in several ways. The one I see most often is to have a "front door" called a rewrite engine that parses the URL dynamically and internally redirects the request, without exposing details about how that happens (as you would see if you used simple query strings, etc.). This allows the URL you specify to be digested into a request for a master page with specific content, instead of just looking up a physical page at that location to serve.
The StackExchange sites do this so that you can link to a question in a semi-permanent fashion (and thus can use search engines with crawlers that log these URLs) without them having to have a real page in the file system for every question that's ever been asked (we're up to 9,387,788 questions as of this one).
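The "front door" can be as small as one route-matching function. Here is a hypothetical JavaScript sketch of how a StackExchange-style question URL might be digested; the function name, pattern, and returned fields are made up for illustration:

```javascript
// Map a pretty URL onto an internal request, so no physical file
// needs to exist for each question.
function parseRoute(url) {
  // e.g. /questions/9387788/some-title-slug
  const match = /^\/questions\/(\d+)(?:\/([\w-]*))?\/?$/.exec(url);
  if (!match) return null; // not a question URL; fall through to other routes
  // The slug is decorative; only the numeric id selects the content.
  return { controller: 'question', id: Number(match[1]), slug: match[2] || '' };
}
```

So parseRoute('/questions/9387788/my-question') selects question 9387788 regardless of what the slug says, which is why such links stay semi-permanent even if the title changes.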
I am not particularly familiar with DotNetNuke, so please correct me if I am using any wrong terminology.
I have a client who has a bunch of links hardcoded in an HTML module. The URLs look like the following:
http://www.siteurl.org/level1/level2/level3/level4/pageName.aspx
So the URL for a page is basically built from how the menu is constructed. When I change any ordering in the menu, the hardcoded links break. Is there a way to use something like an ID in the URL instead, so that no matter what my menu looks like, the page will resolve properly?
You could use an ID for the pages, linking to
http://www.siteurl.org/default.aspx?tabid=## where ## is the ID for each page.
Now the key will be to find the proper IDs, which you can do by looking at the HTML source of the Admin/Pages page.
That being said, the proper thing to do would be to not MOVE or RENAME pages; this breaks all the old URLs (as you're experiencing), as well as those pages/URLs in any search indexes.
A better way, though more work, would be to create a new page at the new PATH (where you move things to) and then redirect the old page to the new page (in the page settings). This requires quite a bit of work, but it is currently the best way to handle old URLs. I have a video example of this at http://www.dotnetnuke.com/Resources/Video-Library/Viewer/VideoId/213/Renaming-A-Page-In-DotNetNuke-.aspx
From the article at Google's webmaster center and the SEO PDF, I think I should improve my website's URL structure.
Right now a news URL looks like "news.php?id=127591". I need to rewrite it to something like "/news/127591/this-is-article-subject".
The problem is: if I change the URL structure to the new one, can I still keep the old one working? And if both URLs work, how do I keep search engines like Google and Bing from indexing one article twice?
Thanks!
Use an HTTP 301 permanent redirect from the old URL to the new URL.
An HTTP 301 redirect communicates a new (permanent) URL for an old (outdated) resource to Google and other clients, and Google will transfer most or all of the value accumulated by the old URL to the new URL.
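As a hypothetical .htaccess sketch (assuming Apache/mod_rewrite; the exact rules depend on your setup), the two directions look like:

```apache
RewriteEngine On
# Old -> new: 301 only when the browser actually asked for news.php?id=...
# (THE_REQUEST holds the original request line, so the internal rewrite
# below cannot re-trigger this rule and cause a redirect loop).
RewriteCond %{THE_REQUEST} \?id=(\d+)
RewriteRule ^news\.php$ /news/%1/? [R=301,L]
# New -> old: silently map the pretty URL back onto the real script.
RewriteRule ^news/(\d+)(/.*)?$ news.php?id=$1 [L]
```

Note the 301 here lands on the id-only form (the article-subject slug is known only to your application, which should generate the full pretty URL in its own links); the internal rule accepts the URL with or without the slug.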
Also, in order to improve the architecture of your website, you should keep a clean structure by interlinking all of its pages/posts. But be careful: you must not do this lightly, or Google's robot will get confused and leave.
Structure is key to your SEO:
1. Find the one page which is the "really important page" for any given keyword
2. Link relevant content from other pages to that particular keyword's page
3. Repeat with every relevant keyword
I'll leave this post for you, where I explain this more in depth, in case you understand Spanish: http://coach2coach.es/la-estructura-web-es-la-base-del-posicionamiento/
Yes: you can use robots.txt to exclude news.php, and create an XML sitemap with the new URLs. mod_rewrite can be set up to only change directories, with trailing slashes, so all files in your root directory should work fine.
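If you take the robots.txt route, the exclusion is one line; be aware, though, that a page blocked in robots.txt cannot pass its value on through a redirect, which is why the 301 approach in the previous answer is generally preferred:

```
User-agent: *
Disallow: /news.php
```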