Using search engine friendly URLs

My website will use search-engine-friendly URLs built from the subject line of members' postings. A subject might be "how to create a class at runtime", so the URL would be:
www.example.com/topic/how-to-create-a-class-at-runtime
OK, so that gets stored and, with luck, spidered and listed by the search engines. A user can edit their posting at any time, and they can also change the subject line.
My question is: if they change the subject line, should the old link stay active, with a new one added to the database so both point at the same article, or should only the link with the new subject line work? If the latter, I would end up with a lot of dead links from Google etc. whenever users changed their subject lines.

Use a server-side redirect to redirect from the old URL(s) to the new URL.
In your example, you should use a 301 redirect ("Moved Permanently").
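As a sketch of that answer, here is how the slug generation and old-slug lookup might fit together in Python. The `slugify` and `resolve` helpers and both lookup tables are hypothetical, standing in for whatever your database layer actually provides:

```python
import re

def slugify(subject):
    """Lower-case the subject and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", subject.lower()).strip("-")

# Hypothetical stand-ins for database rows: current_slug maps a topic id to
# its live slug; old_slugs maps any retired slug back to its topic id.
current_slug = {42: "how-to-create-a-class-at-runtime"}
old_slugs = {"how-to-make-a-class-at-runtime": 42}

def resolve(slug):
    """Return (status, payload): 200 with the topic id, 301 with the new
    location, or 404 when the slug is unknown."""
    for topic_id, live in current_slug.items():
        if live == slug:
            return (200, topic_id)  # serve the article
    if slug in old_slugs:
        topic_id = old_slugs[slug]
        # Old subject line: answer with a permanent redirect to the new URL.
        return (301, "/topic/" + current_slug[topic_id])
    return (404, None)
```

The key design point is that retired slugs are kept in their own table rather than deleted, so every URL Google ever indexed keeps resolving.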

Related

Redirecting from an old site to a new site, changing the canonical address

I have never done redirects or anything like that.
I have site A: www.sitea.org.
This site had to change its branding: name, logo, URL, etc.
So I bought a new domain, B: www.siteb.org.
They are on different hosts. I copied all the files from site A to site B.
On site B I changed the names, so the two sites now have almost exactly the same content, except for the names.
When someone searched on Google for content on site A, site A would rank highly. Now I want the new URL, www.siteb.org, to appear instead when someone googles that content again.
How can I do that?!
Should I delete the content from site A's server?
First, modify your old site to send a 301 to the new site for all its URLs. Next, tell Google about the change of address as described here. Finally, be patient: it will take some days for the change to be reflected.
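The first step, a blanket 301 from every old URL to the same path on the new domain, could be sketched as a minimal WSGI app in Python. The domain name is the one from the question; in practice you would usually configure this in the web server itself rather than in application code:

```python
def site_move_redirect(environ, start_response):
    """WSGI app for the old host: 301 every request to the same
    path (and query string) on the new domain."""
    new_base = "https://www.siteb.org"
    path = environ.get("PATH_INFO", "/")
    query = environ.get("QUERY_STRING", "")
    location = new_base + path + ("?" + query if query else "")
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]
```

Because the path is preserved, every deep link Google holds for site A maps directly onto its counterpart on site B.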

Why doesn't pmwiki have an ID system like MediaWiki?

Every site running on MediaWiki that I have ever visited lets you replace the title of an article in the URL with ?&curid=[any number]. For example: http://rationalwiki.org/wiki/?&curid=1999, https://en.m.wikipedia.org/w/index.php?&curid=2001
So, since PmWiki is wiki software like MediaWiki and has a similar URL structure, why don't PmWiki site URLs have any kind of ID system?
In PmWiki, page URLs are derived from the page names themselves.
For example, the page MyGroup.MyPage may be reached via either:
http://mywiki/?n=MyGroup.MyPage
http://mywiki/MyGroup/MyPage
(depending on the wiki's Clean URLs configuration)
The SQLite PageStore cookbook recipe (i.e. a PmWiki add-on) can provide shortened URLs.
It should also be noted that page names can stay short, with the (:title ...:) markup in the page itself providing a more detailed title.

If I change my SEO-friendly URLs, how do I tell Google to forget the old ones?

I have an ecommerce site with over 3000 products. Currently, my URLs look like this:
http://www.muszakiarena.hu/termekek/sencor-sle-1958-led-tv/3112
Here 'termekek' means 'products', the second segment is the name of the product, and the third is the product's ID.
I want to remove 'termekek' from the URLs because it is unnecessary, and I hope I'll get better rankings without it. So the new URLs would look like:
http://www.muszakiarena.hu/sencor-sle-1958-led-tv/3112
Now, the system already works this way (the product pages also show up at www.muszakiarena.hu/sencor-sle-1958-led-tv/3112), but if I change the product links in the navigation to the new form and ask Google to recrawl, I'm afraid it will detect duplicate content.
How do I tell Google to forget the old URLs and only keep the new ones?
You should redirect (with 301) from the old to the new URLs.
That way, all search engines that indexed the pages under the old URLs will learn that the URLs changed as soon as they try to crawl them again. The same goes for users who bookmarked or published the old URLs: when they visit them, they get redirected to the new URLs.
If using a 301 redirect is not possible in your case, you can use the canonical link type. (But a 301 redirect is preferable.)
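The redirect logic here is just a prefix strip. A minimal Python sketch (the `redirect_location` helper is hypothetical; your framework would call something like it before routing, answering with a 301 when it returns a location):

```python
def redirect_location(path):
    """If the request path still carries the old '/termekek' prefix,
    return the new location to 301 to; otherwise return None so the
    request is served normally. Example from the question:
    /termekek/sencor-sle-1958-led-tv/3112 -> /sencor-sle-1958-led-tv/3112"""
    prefix = "/termekek/"
    if path.startswith(prefix):
        return "/" + path[len(prefix):]
    return None
```

With this in place the old and new URLs never both serve content, so there is no duplicate-content problem: the old form always answers with a 301.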

How to improve the structure of URLs

From the article at Google's Webmaster Central and its SEO PDF, I think I should improve my website's URL structure.
Right now a news URL looks like "news.php?id=127591". I want to rewrite it to something like "/news/127591/this-is-article-subject".
The problem is: if I change to the new URL structure, can I still keep the old one working? And if both URLs work, how do I stop search engines like Google and Bing from indexing the same article twice?
Thanks!
Use an HTTP 301 permanent redirect from the old URL to the new URL.
An HTTP 301 redirect communicates a new (permanent) URL for an old (outdated) resource to Google (and other clients). Google will transfer most or all of the value accumulated by the old URL to the new URL.
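In code, the mapping from the old query-string URL to the new slug URL might look like this Python sketch (the `subjects` lookup is a hypothetical stand-in for the articles table; the web layer would issue the 301 to whatever this returns):

```python
from urllib.parse import parse_qs

# Hypothetical stand-in for the articles table: id -> slug.
subjects = {127591: "this-is-article-subject"}

def old_news_redirect(query_string):
    """Given the query string of an old-style news.php request, return
    the new URL to 301 to, or None if the id is missing or unknown."""
    params = parse_qs(query_string)
    ids = params.get("id")
    if not ids:
        return None
    article_id = int(ids[0])
    slug = subjects.get(article_id)
    if slug is None:
        return None
    return "/news/%d/%s" % (article_id, slug)
```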
Also, to improve the architecture of your website, you should keep a clean structure by linking between all its pages/posts. But be careful: don't do this carelessly, or Google's robot will get confused and leave.
Structure is key to your SEO:
1. Find the one page that is the "really important page" for any given keyword.
2. Link relevant content from other pages to that keyword's page.
3. Repeat for every relevant keyword.
I'll leave this post for you, where I explain this in more depth, hoping you understand Spanish: http://coach2coach.es/la-estructura-web-es-la-base-del-posicionamiento/
Yep, you can use robots.txt to exclude news.php and create an XML sitemap with the new URLs. mod_rewrite can be set up to rewrite only directory-style paths with trailing slashes, so all files in your root directory should keep working fine.

Good 404 in Rails, with search result on moved pages for new links

I am replacing lots of pages in my database at once, so many pages indexed by Google will get new URLs. As a result, requests for the old pages will land on a 404 page.
So I need to design a new 404 page that includes a search box. I also want the 404 page to grab the keywords from the broken URL in the address bar and show search results based on them, so the user has an idea where to go next to find the new link.
Old URL:
http://abc.com/123-good-books-on-rails
New URL:
http://abc.com/good-books-on-rails
Then when a user comes from a search engine, they arrive via the old URL. The 404 page will do a search on the keywords "good books on rails" and return a list of results, so the user can find the latest URL for that page.
How do I implement this? I will be using Friendly ID, Sphinx and Rails 2.3.8.
Thanks.
You are far better off simply generating the appropriate redirects yourself than expecting your users to do anything unusual when a Google link fails. This won't last forever: Google will eventually reindex you, and if you use 301 (permanent) redirects it will be smart enough to drop the old URLs when reindexing your site. If you don't want to create redirects for hundreds of pages by hand, you'll need to work out the algorithm that maps your old pages to new ones.
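For the URL scheme in this question, that mapping algorithm is just "drop the leading numeric id". A Python sketch of the idea (the function name is hypothetical; in a Rails app the same regex would live in a routes-level redirect):

```python
import re

def new_path_for(old_path):
    """Guess the new URL for an old one by dropping the leading numeric
    id, per the pattern in the question:
    /123-good-books-on-rails -> /good-books-on-rails
    Returns None when the path does not match the old pattern."""
    match = re.match(r"^/(\d+)-(.+)$", old_path)
    if match:
        return "/" + match.group(2)
    return None
```

Any request matching the old pattern gets a 301 to the computed path; anything else falls through to the 404 page.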
