Good day!
I cannot find a complete description of what exactly makes this particular schema useful for a business in the SERP. I really don't understand why an organization should add schema markup if it provides no benefit in search results. Isn't it easier to create an account in Google My Business or in some catalog with reviews? In that case we can see the snippet with rating stars.
For example, here are two snippets from search results:
Organization 1 has schema.org/Organization markup on its page:
[Search result snippet 1]
Organization 2 has no markup on its site, but has a page in the Yelp catalog:
[Search result snippet 2]
Moreover, I cannot understand how "aggregateRating" (based on a collection of reviews or ratings of the item) is calculated.
Can anyone explain this to me?
Check the FAQ section about Schema.org in the About section. The answers related to your question are:
Q: What is the purpose of schema.org?
Schema.org is a joint effort, in the spirit of sitemaps.org, to
improve the web by creating a structured data markup schema supported
by major search engines. On-page markup helps search engines
understand the information on web pages and provide richer search
results. A shared markup vocabulary makes it easier for webmasters to
decide on a markup schema and get the maximum benefit for their
efforts. Search engines want to make it easier for people to find
relevant information on the web. Markup can also enable new tools and
applications that make use of the structure.
Q: Why are Google, Bing, Yandex and Yahoo! collaborating? Aren't you competitors?
Currently, there are many standards and schemas for marking up
different types of information on web pages. As a result, it is
difficult for webmasters to decide on the most relevant and supported
markup standards to use. Creating a schema supported by all the major
search engines makes it easier for webmasters to add markup, which
makes it easier for search engines to create rich search features for
users.
There's also a video on YouTube about using schema for SEO for your business.
Structured data is a standardized format of code that is added to a web page. It communicates specific information about a page to Google, which makes it easier for search engines to crawl your content and index it faster. In other words, it provides the context search engines need to properly categorize your site and recommend it more accurately for relevant search queries.
Google is using this data to make their search engine more accurate by creating a knowledge graph. This graph is an interconnected map of entities that follows the relationship between different terms, facts, data, dates, and more. This allows Google to go from keyword matching to a context-rich search engine, capable of differentiating the Taj Mahal monument from the Taj Mahal casino in Atlantic City.
What it means for SEOs is that Google has given you a way to introduce your client’s brands and companies into their knowledge graph, making them real objects Google knows about and can recommend to users. Check out our structured data guide on how to implement it on your site, including the recommended format for SEO and more on schema markups and aggregateRating.
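To make the aggregateRating part concrete: search engines do not calculate the rating for you. You compute the average value and the count from the reviews you have collected, and publish the result in the markup. A minimal JSON-LD sketch (the organization name, URL, and numbers are placeholders, not taken from any real listing):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "https://www.example.com/",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "bestRating": "5",
        "ratingCount": "127"
      }
    }
    </script>

Here ratingValue is simply the mean of the 127 individual ratings; whether a search engine actually shows stars for it is entirely up to the search engine.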
Suppose I have an MVC application with some static and dynamic web pages. How do I add a search feature to such a site?
I don't want to create a simple page that searches the data contained in the database; I want to be able to index whole pages as they are displayed to the customer.
Any solution for ASP.NET MVC 4/5?
Should I use an existing solution (which one?) or create my own?
Disclaimer: it's the product of the company I work for.
You can use SearchUnit for indexing/searching MVC web sites. There's a free Community version, and a more powerful paid version.
I don't know the specifics of what you need, but it's easier to use and more rounded (e.g. it includes spell checking and many document format parsers) than other options such as Lucene (IMHO; let me know if you disagree).
MVC specifics are here.
I have recently been reading a lot about SEO with HTML5 (I am a Rails web developer), and have been doing a lot of work with microdata, as I have seen that the Schema.org format is Google's preferred format.
What I am wondering is whether somebody can explain to me the importance of also including a sitemap.
From what I understand, the crawlers just follow all the links on a page from wherever they enter your site, and are then able to gather all the data they need from well-written microdata tags.
So what is the additional benefit of including a sitemap, and is it really worthwhile? It is possible that I am misunderstanding the purpose of a sitemap or the functionality of search engine crawlers.
A consumer can only read the Microdata if it has found the document that contains it.
A sitemap is one way (of many ways) that allows consumers to find the document. A common other way is to follow hyperlinks (from plain HTML, no Microdata needed), but there may be sites that don’t link to every document, so consumers would not find these documents that way.
(Whether it's worthwhile, e.g. whether there's an SEO benefit, depends on the consumer. A consumer can be a search engine bot, any other bot, a tool, a browser extension, etc.)
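For reference, a sitemap can be as small as a single XML file that lists the URLs you want found, following the sitemaps.org protocol (the URLs and date below are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2015-01-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/some-unlinked-page</loc>
      </url>
    </urlset>

The second entry is the interesting case: a page that no other page links to would only be discovered through the sitemap.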
I have an existing website with pages that represent products with reviews. I want to use microdata to expose the aggregate reviews on Google. My challenge is that each product page is fairly complex. We spent a lot of money getting the design the way we wanted it. While our search results look good now, we're not sure how to add the aggregate review information.
I reviewed the information found here. However, it looks like we would have to change our page design, and we don't want to do that. Is there a way to get the aggregate review search engine result without changing our design? Ideally, I would really like to just put something in the <head>.
You can scope metadata so that it encapsulates your whole page, even on the body tag. This way the information can be spread around your whole design and still be part of the same entity.
If your information isn't hierarchical enough to form the correct entity, you can use the itemref attribute. To quote the standard:
Note: The itemref attribute is not part of the microdata data model. It is merely a syntactic construct to aid authors in adding annotations to pages where the data to be annotated does not follow a convenient tree structure.
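As a sketch of what that looks like in practice (element ids, values, and the Product type are made up for illustration, not taken from the question):

    <!-- the product entity lives in the main content column... -->
    <div itemscope itemtype="http://schema.org/Product" itemref="product-rating">
      <h1 itemprop="name">Example Product</h1>
    </div>

    <!-- ...while the rating widget sits elsewhere in the layout, outside
         that element's tree; itemref pulls it into the same entity -->
    <div id="product-rating" itemprop="aggregateRating"
         itemscope itemtype="http://schema.org/AggregateRating">
      <span itemprop="ratingValue">4.2</span> out of
      <span itemprop="bestRating">5</span>
      (<span itemprop="reviewCount">89</span> reviews)
    </div>

Because the first div's itemref points at the id of the rating block, a parser treats the aggregateRating as a property of the product even though the two blocks are siblings in the markup.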
I've been asked to work out how to maximize the visibility of an upcoming web application that will initially be available in multiple languages, specifically French and English.
I am interested in understanding how robots, like the Googlebot, crawl a site that is available in multiple languages.
I have a few questions concerning the behaviour of robots and indexing engines:
Should a web site specify the language in the URL?
Will a robot scrape a site in both languages if the language is set through cookies (supposing there is a link that can change the language)?
Should I use a distinct domain for each language?
What meta tag could be used to help a robot in understanding the language of a web site?
Am I missing anything that I should be aware of?
Yes
No
Not necessarily; Google will infer the language. But if you use a different TLD per language you will probably get better exposure in specific countries, at the cost of PageRank being diluted across different domains.
<meta http-equiv="content-language" content="en">
a. You should add a link on every page to the same page in the other languages of the site.
b. For SEO, it's better to use www.mysite.com/en/ than en.mysite.com, because then the PageRank is not diluted across different domains.
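As a sketch, those signals could look like this in each page's head. The hreflang alternate links are not mentioned above, but they are the standard markup form of point (a); all URLs are placeholders:

    <head>
      <meta http-equiv="content-language" content="en">
      <!-- point (a): link each page to its translations -->
      <link rel="alternate" hreflang="en" href="http://www.mysite.com/en/page">
      <link rel="alternate" hreflang="fr" href="http://www.mysite.com/fr/page">
    </head>

Each language version should carry the full set of alternate links, including one pointing to itself.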
Should a web site specify the language in the URL?
No, not necessarily.
Will a robot scrape a site in both languages if the language is set through cookies (supposing there is a link that can change the language)?
No. You should use a content-language attribute as suggested by Eduardo. Alternatively, <html lang='en'> will do the same job AFAIK.
What meta tag could be used to help a robot in understanding the language of a web site?
See above
Should I use a distinct domain for each language?
The Stack Overflow consensus (I'm sorry, I can't for the life of me find the relevant questions! We had huge discussions on this; maybe they were closed as not programming related) is: yes, have a different domain for each country if you want to maximize search engine visibility for that country.
I am developing an international web site - multiple countries, multiple languages. I am trying to create SEO-friendly URLs.
For example, the catalog consists of the Cartesian product Regions x Categories. A typical catalog URL has 7 levels of hierarchy:
www.site.com/en/Catalog/Browse/10/28/London/Category1
The route format is as follows:
"/{culture}/{controller}/{action}/{regionId}/{branchId}/{region}/{branch}"
I have read somewhere that search engines give less relevance to pages deep in the site hierarchy (as determined by the number of slashes in the path). Is this true? Does anybody have info on how much relevance deep pages lose?
I have thought about simplifying the URLs (making them less deep) by using '-' and '+' as delimiters, so now I have routes like, for example:
"/{culture}/friendlyActionPlusControllerName/{regionId}-{branchId}/{region}+{branch}"
ending up with URLs still 4 levels deep in the "folder" hierarchy.
www.site.com/en/services/10-28/London+Category1
Is using + and - in URLs considered a viable approach? Does this kind of shortening help with SEO? Does anyone see any options for how I could further simplify the URLs?
An additional note: the catalog is going to be the main source of search engine traffic. There will be a few content pages as well (with URLs like www.site.com/en/Service1), but these two are going to be the only search traffic generators, so I would like to have them optimized as much as possible.
From my experience, I suggest you use the hyphen (-). If you can keep the number of hyphens to 2 or 3, that is probably the safe route to take, although I have seen people go beyond that and it works just as well.
Basically, I think if you do it in a way that is purely descriptive of the page content, without going overboard, you're okay. However, it should also be noted that the more keywords are in your URLs, the more diverse your page is going to look to search engines.
Also remember that keywords in the title, h1, and metadata should all match up, so the more you have, the more difficult this is to manage.
I realise that this does not answer your question 100% and leads to more questions!
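For what it's worth, a hyphen-only version of the catalog route from the question might be registered like this in ASP.NET MVC (the route name, controller, and action names are assumptions for illustration):

    using System.Web.Mvc;
    using System.Web.Routing;

    public class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            // Matches e.g. /en/services/10-28/London-Category1
            routes.MapRoute(
                name: "CatalogBrowse",
                url: "{culture}/services/{regionId}-{branchId}/{region}-{branch}",
                defaults: new { controller = "Catalog", action = "Browse" },
                // Digit-only constraints keep the {regionId}-{branchId}
                // segment unambiguous when it is split on the hyphen.
                constraints: new { culture = "[a-z]{2}", regionId = @"\d+", branchId = @"\d+" }
            );
        }
    }

Note that this drops the '+' delimiter entirely and relies on the constraints so the routing engine can split the combined segments reliably.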
There is an open-source project on CodePlex that implements URL shortening in ASP.NET MVC; you can get it here:
http://miniurl.codeplex.com/