URL structure containing /-/

As a full-stack developer, I'm always curious to see how popular websites structure their DOM, URLs, and so on.
I've noticed GitLab likes to use /-/ in its URLs, for example, https://gitlab.com/project/repo/-/branches
Is there a design purpose to this, or is it simply their own convention?

Related

Prefix or part of URL

I'm going to develop a tourist site about a region. The site will consist of sections about the cities in this region. The sections share the same design and features. The home page of the site will show information from all sections about all cities.
How should I design the site's path structure? I've seen that some sites for the same purpose use subdomain prefixes like:
city.region.com
while other sites just add a path segment to the URL:
region.com/city
Which is the better solution, from both SEO and Rails development points of view?
When you add a prefix to the site name, that is considered a subdomain. Subdomains are treated as separate websites, so if SEO is a goal of yours, you have a chance of showing up multiple times for a single search.
On the other hand, I find it a lot easier to add a path segment to the URL. This is the approach I would take, avoiding any premature optimization (see the sketch below).
Source: http://www.ameravant.com/article/3398-subdomains-and-seo-pros-and-cons-of-subdomains-vs-subdirectories
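The path-segment approach boils down to a route that captures the city as a parameter. Here is a minimal sketch in ASP.NET MVC route syntax (the same pattern exists in Rails as scoped routes); the CityController and route names are hypothetical:

    using System.Web.Mvc;
    using System.Web.Routing;

    public static class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            // Path-based structure: the first URL segment becomes the city parameter.
            routes.MapRoute(
                name: "CityHome",
                url: "{city}",              // e.g. region.com/springfield
                defaults: new { controller = "City", action = "Index" });

            // Deeper section pages keep the city prefix in the path.
            routes.MapRoute(
                name: "CitySection",
                url: "{city}/{action}",     // e.g. region.com/springfield/hotels
                defaults: new { controller = "City", action = "Index" });
        }
    }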

Does site SEO benefit from having a sitemap on top of using microdata?

I have recently been doing a lot of reading on SEO with HTML5 (I am a Rails web developer), and I have been doing a lot of work with microdata, since Schema.org appears to be Google's preferred format.
What I am wondering is whether somebody can explain to me the importance of also including a sitemap.
From what I understand, crawlers simply follow all the links on a page from wherever they enter your site, and can then gather all the data they need from well-written microdata tags.
So what is the additional benefit of including a sitemap, and is it really worthwhile? It is possible that I am misunderstanding the purpose of a sitemap or the functionality of search engine crawlers.
A consumer can only read the Microdata if it has found the document that contains it.
A sitemap is one way (of many) that allows consumers to find the document. Another common way is to follow hyperlinks (from plain HTML, no Microdata needed), but some sites don't link to every document, so consumers would not find those documents that way. A minimal sitemap is sketched below.
(Whether it's worthwhile, e.g. whether there's an SEO benefit, depends on the consumer. A consumer can be a search engine bot, any other bot, a tool, a browser extension, etc.)
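For concreteness, a sitemap is nothing more than an XML file listing the URLs you want consumers to discover, including pages that nothing links to. A minimal sketch that generates one with .NET's XDocument; the URLs are placeholders:

    using System;
    using System.Xml.Linq;

    class SitemapWriter
    {
        static void Main()
        {
            // The sitemap schema is a flat <urlset> of <url> entries.
            XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";
            var sitemap = new XDocument(
                new XElement(ns + "urlset",
                    new XElement(ns + "url",
                        new XElement(ns + "loc", "https://example.com/"),
                        new XElement(ns + "lastmod", DateTime.UtcNow.ToString("yyyy-MM-dd"))),
                    // A page no hyperlink points to; the sitemap is how crawlers find it.
                    new XElement(ns + "url",
                        new XElement(ns + "loc", "https://example.com/orphan-page"))));
            sitemap.Save("sitemap.xml");
        }
    }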

Special kind of server-side include in ASP.NET MVC

I have to create a new ASP.NET MVC page that integrates content provided by a CMS statically on the server side. My MVC page provides a master page with the navigation, and certain links should point to pages of the CMS (which is installed on the same server). It should work like a "server-side iframe".
My idea is to create a controller that loads the CMS page using a web request, extracts the body of the page, and passes the extracted data to the view. The view simply outputs the passed HTML. I also plan to add some logic to forward POST requests to the CMS (for newsletter subscriptions, contact forms, ...).
Now my question is: is it possible to implement this solution, or is there a better way to do this on the server side?
Could you use Application Request Routing to simply hand off requests to your CMS, or do you need to include the externally provided content within an existing master page?
If you need the master page, I would stick with the solution you suggest, although I would investigate the most robust and efficient option for querying the content from the CMS, and whether caching would be a good option.
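A minimal sketch of that controller, including the caching suggested above; CmsBaseUrl, the action name, and the regex-based body extraction are illustrative assumptions, not a definitive implementation:

    using System;
    using System.Net;
    using System.Text.RegularExpressions;
    using System.Web;
    using System.Web.Mvc;

    public class CmsController : Controller
    {
        // Base address of the CMS on the same server; hypothetical value.
        private const string CmsBaseUrl = "http://localhost/cms/";

        public ActionResult Page(string path)
        {
            string cacheKey = "cms:" + path;
            var body = HttpRuntime.Cache[cacheKey] as string;
            if (body == null)
            {
                using (var client = new WebClient())
                {
                    string html = client.DownloadString(CmsBaseUrl + path);
                    // Naive extraction of everything between <body> and </body>;
                    // a real implementation might use an HTML parser instead.
                    var match = Regex.Match(html, @"<body[^>]*>(.*?)</body>",
                                            RegexOptions.Singleline | RegexOptions.IgnoreCase);
                    body = match.Success ? match.Groups[1].Value : html;
                }
                HttpRuntime.Cache.Insert(cacheKey, body, null,
                    DateTime.UtcNow.AddMinutes(5),
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }
            ViewBag.CmsBody = body;
            return View(); // the view emits ViewBag.CmsBody inside the master page
        }
    }

The view would render the fragment unencoded (e.g. via Html.Raw); forwarding POST requests could follow the same pattern with WebClient.UploadValues.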
It is undoubtedly possible, but keeping track of users, authentication, cookies, etc. seems like a really tedious job. Also, embedding CSS classes, hard-coded styling, etc. from the CMS in your MVC site could give you a severe headache.
If the CMS isn't home-brewed, it probably has an API. In that case I would much prefer to use the API to get at the data I need and then render that data using pure MVC. This will give you a much cleaner and more stable integration with the CMS.

Use ASP.NET MVC for HTML brochure websites?

I have a project that will basically be a large HTML brochure website, although some content could become database-driven in the future. I usually use ASP.NET MVC for database-driven websites, but I'm not sure whether to use it for brochure HTML websites.
You'd probably want to use Master Pages even if the content is static. Might as well use MVC to keep headers and footers consistent across the site. (Same goes for any language, really.)
You can host only plain old HTML files in it for now. If the need arises for database-driven content, ASP.NET MVC's routing options make it easy to switch to a dynamic site without breaking the links.
We used the same approach for setting up a dummy website for SEO purposes until the real app was developed, and the switch to dynamic content was effortless.
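For illustration, a single catch-all route plus a trivial controller keeps every URL stable whether the views are static or database-backed; PagesController and the route name are hypothetical:

    using System.Web.Mvc;
    using System.Web.Routing;

    public static class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            // One catch-all route: /about, /contact, /products all resolve to
            // PagesController.Show, which renders a view of the same name.
            routes.MapRoute(
                name: "BrochurePage",
                url: "{page}",
                defaults: new { controller = "Pages", action = "Show", page = "index" });
        }
    }

    public class PagesController : Controller
    {
        public ActionResult Show(string page)
        {
            // Today this renders a static .cshtml; later the same action can pull
            // the page body from a database without any URL changing. In production
            // you would validate 'page' against the set of known pages first.
            return View(page);
        }
    }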
The good thing about ASP.NET MVC (as opposed to WebForms, I assume you're asking) is that you can just use basic HTML and have a designer produce the brochure design required. If this needs to become more "dynamic" at some stage, with forms or a CMS etc., reusing the existing plain HTML will be easier.
Also, if you're using MVC already, it's a no-brainer...
Danny,
It's possibly also a choice over whether your project sponsor wants to pay for Windows hosting or whether they go down the Linux route. If you know for sure that the site will NEVER be required to take data from a database, then you could create an app in MVC (your developer app) and have that app generate the "flat file" site out to HTML files. That way, you could store the elements that make up the content in your developer database and regenerate the entire site when required. This approach would reap dividends: if, for example, you decided to add some jQuery across the site, a single regeneration would do it all in one hit.
This way of generating flat sites would mean that you could, in theory, have an engine that you use for multiple clients, changing only the CSS and content as required (see the sketch below).
Just my tuppence worth...
jim
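A minimal sketch of that generator; the Page record, the shell template, and the content values are hypothetical stand-ins for whatever the developer database holds:

    using System.Collections.Generic;
    using System.IO;

    class FlatSiteGenerator
    {
        // Content records as they might come from the developer database.
        record Page(string Slug, string Title, string Body);

        static void Main()
        {
            var pages = new List<Page>
            {
                new Page("index", "Home", "<p>Welcome...</p>"),
                new Page("about", "About us", "<p>Who we are...</p>"),
            };

            // The shared shell: change the CSS or script includes here and a
            // single regeneration updates every page of the flat site.
            const string shell = @"<!DOCTYPE html>
    <html><head><title>{0}</title><link rel=""stylesheet"" href=""site.css""></head>
    <body>{1}</body></html>";

            Directory.CreateDirectory("output");
            foreach (var p in pages)
                File.WriteAllText(Path.Combine("output", p.Slug + ".html"),
                                  string.Format(shell, p.Title, p.Body));
        }
    }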

Geeky URLs to search-engine-friendly URLs in IIS without sacrificing incoming links

I have a website where my present "geeky" URLs look like:
http://www.bestatdubaiholidays.co.uk/pages/Quote/Details.aspx?GUID=01a25b0c-e0ac-40ba-abd1-298f3abd9612
I want to change these to search-engine-friendly ones, something like:
http://www.bestatdubaiholidays.co.uk/the-palm-atlantis.aspx
or
http://www.bestatdubaiholidays.co.uk/the-palm-atlantis
I have hundreds of incoming links (from ad campaigns and other sites) to my geeky URLs that I want to retain.
So if someone types a geeky URL, I want the address bar to show the equivalent search-engine-friendly URL.
Can anyone help? Referring me to other articles won't help; believe me, I've read every one of them. Any example URLs would be helpful.
Use something like this: http://blog.eworldui.net/post/2008/04/ASPNET-MVC---Legacy-Url-Routing.aspx
You don't have to use MVC; the routing classes are standalone now.
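In outline, that means registering the friendly route, catching the legacy .aspx path, and issuing a 301 so the address bar and search engines switch to the new URL. A minimal sketch; HotelsController, the slug lookup, and the route names are hypothetical:

    using System;
    using System.Web.Mvc;
    using System.Web.Routing;

    public static class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            // Friendly URL, e.g. /the-palm-atlantis
            routes.MapRoute(
                name: "HotelDetails",
                url: "{slug}",
                defaults: new { controller = "Hotels", action = "Details" });

            // Legacy URL: /pages/Quote/Details.aspx?GUID=...
            routes.MapRoute(
                name: "LegacyQuote",
                url: "pages/Quote/Details.aspx",
                defaults: new { controller = "Hotels", action = "Legacy" });
        }
    }

    public class HotelsController : Controller
    {
        public ActionResult Details(string slug)
        {
            // Load the page content by its slug.
            return View();
        }

        public ActionResult Legacy(Guid guid)
        {
            // Look up the friendly slug for this GUID, then issue a 301 so
            // browsers and search engines switch to the new URL.
            string slug = SlugFor(guid);
            return RedirectPermanent("/" + slug);
        }

        private static string SlugFor(Guid guid)
        {
            // Database lookup in reality; hard-coded for the sketch.
            return "the-palm-atlantis";
        }
    }

The permanent (301) redirect is what lets the existing incoming links keep working while the address bar shows the friendly URL.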
I suggest using a front controller. This means that you use the rewrite engine of whatever HTTP server you're using to redirect ALL requests to a single file (index.php or index.aspx or what have you) and then use code in that file to dispatch to the appropriate page. You can redirect from the geeky URLs to the friendly URLs, and if it's a friendly URL, you load the appropriate page.
This would be far easier than writing huge rewrite rules for each type of page you might have. And this way all of the work is done in the language your site already runs in, so you don't have to learn and maintain a separate rules file in its own language just for the redirection.
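A minimal sketch of that single entry point as an ASP.NET IHttpHandler, assuming the server rewrites every request to it; SlugFor and RenderPage are hypothetical placeholders:

    using System;
    using System.Web;

    // A single entry point that the server rewrites every request to; it
    // redirects legacy GUID URLs and dispatches friendly URLs.
    public class FrontController : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string guid = context.Request.QueryString["GUID"];
            if (guid != null)
            {
                // Legacy URL: 301 to the friendly equivalent.
                context.Response.RedirectPermanent("/" + SlugFor(new Guid(guid)));
                return;
            }

            // Friendly URL: hand off to whatever renders the page.
            RenderPage(context, context.Request.Path.Trim('/'));
        }

        public bool IsReusable => true;

        private static string SlugFor(Guid guid)
        {
            return "the-palm-atlantis"; // database lookup in reality
        }

        private static void RenderPage(HttpContext context, string slug)
        {
            context.Response.Write("<html>...page for " + slug + "...</html>");
        }
    }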
