A web developer created our new intranet in SharePoint 2007. None of the pages have meta titles or descriptions (or any metadata tags at all), and the pages have no semantic structure: divs are used instead of H1, H2, H3, etc.
He said this was a quick out-of-the-box creation, but I'm really concerned about the quality of the page structure and the impact this will have on the effectiveness of the internal search engine.
Does a SharePoint 2007 website come out of the box in this condition, and am I right or wrong to be concerned about this?
Any advice would be appreciated. I would not accept a build like this for an external site, but he said "SharePoint is just like this" and I have no idea.
Many thanks
John
Edit the default master page and add your meta tags there if the whole site is going to share the same meta keywords and description. If you want to customize the meta tags for each page, you will have to find the ContentPlaceHolder that holds the head section of the page and put the page-specific meta tags there on each page.
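For context, SharePoint 2007's default publishing master page exposes a placeholder in the head for exactly this purpose. A rough sketch of both halves (the placeholder name `PlaceHolderAdditionalPageHead` comes from the default master page; the meta tag contents are illustrative):

```html
<!-- In the master page <head>: site-wide meta tags plus a placeholder
     that individual page layouts can fill with page-specific tags. -->
<head runat="server">
    <meta name="description" content="Company intranet" />
    <asp:ContentPlaceHolder id="PlaceHolderAdditionalPageHead" runat="server" />
</head>

<!-- In a page layout (or an individual page): tags for that page only. -->
<asp:Content ContentPlaceHolderID="PlaceHolderAdditionalPageHead" runat="server">
    <meta name="description" content="Description for this page only" />
</asp:Content>
```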
In just about any wiki server you can create a hyperlink to another page on the wiki. How can one create a window that pulls the other page's content into my page?
I can add an HTML snippet with an iframe, but it pulls in the entire page, not just the content.
The JavaScript behind the page is not obfuscated, but it seems to use Prototype and I don't understand it.
Has anyone had any experience with this?
This can't be done reliably. One can create iframes and whitelist the domains they point to, but that just pulls the full page into a small window, which is completely unusable.
What I was looking for is called an "inclusion" or "transclusion", and OS X Server does not have this feature, which has apparently been standard in MediaWiki for years.
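For reference, the MediaWiki feature mentioned above is a one-liner in wikitext (the page name is a placeholder):

```
{{:SomeOtherPage}}
```

The leading colon transcludes a regular article rather than a page from the Template namespace.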
More info here:
Wikis - Is there a wiki in which one can create, within a page, "windows" (partial frames) to other pages?
In jQuery Mobile, we can use the "data-title" attribute to specify a page title for each individual page within a multi-page HTML document, like...
<section id="home" data-role="page" data-title="Home Page Title">
Which of course makes a lot of sense, because there may be several individual mobile pages contained within a single HTML document, so it's good to be able to specify a unique page title for each one, from an SEO perspective.
But what if we also want to specify a META description for each individual mobile page in a multi-page document? Is this possible? Or are we stuck using the single META description in the head section of the document? (That would obviously be less than ideal from an SEO perspective.)
I've been researching this topic online for the past hour, but surprisingly have yet to find any info on it.
Anyone have the scoop on this?
I have a site that is very simple and contains mostly images, a login form, and a link to sign up. No actual text exists in the body except for the footer, which shows the links to the usage terms and the copyright notice.
My site is currently showing up in search engine results with the footer content as the snippet instead of what I put in the <meta name="description"...> tag. Why is this?
How can I stop search engines from showing my site with the footer content as the snippet? Or at least show the meta description first? Do I need to put some text in a title attribute or alt attribute somewhere?
As +Filburt pointed out, you could add your site to Webmaster Tools, which will offer you valuable information about your site's presence on the web and in the Google search results. It may also provide hints about what we think of your meta descriptions :)
Generally, you will want to
write the meta description to be as natural as possible, don't stuff keywords in it,
describe the page's content accurately in this tag,
and to have a unique meta description for each page.
While we can't guarantee that the meta description you provide will be used as the search result snippet, following the above tips will greatly increase the chances.
Here is some more information about the meta description tag: http://www.google.com/support/webmasters/bin/answer.py?answer=35264
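For illustration, a description following those tips might look like this (the page and wording are made up):

```html
<head>
  <title>Blue Widgets: Pricing and Sizes | Example Store</title>
  <!-- Unique to this page, describes its actual content, no keyword stuffing. -->
  <meta name="description" content="Compare prices and sizes for our range of
        blue widgets, including shipping options and volume discounts." />
</head>
```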
It works to some extent to use <meta name="description" /> but Google will complain (and choose to ignore it) when every page has the same description.
If you are interested in how Google deals with your site you could sign up for their Webmaster Tools - they offer a good starting point for SEO-ing your site.
You could add content that is invisible to your visitors, but Google checks for this and considers hidden content to be cheating for page rank, because it used to be a common SEO technique.
Meta tags were a failure and have been broadly ignored ever since the Google era began.
The problem was, humans would put stale, inaccurate, or irrelevant information in the meta tags. This was fifteen years ago when cataloging the Internet still seemed feasible.
Google came along and decided that what a web page actually says was more useful. Everybody else followed suit shortly after.
Now people are trying human-authored metadata again; they're calling it the "semantic web". My hopes are not high.
If you search for "richfaces" on google.com, the first result is about www.jboss.org/richfaces. You can see there that links (menus) like "Downloads", "Demos", and "Documentations" are also displayed. How can I have these links displayed in the search results?
(The "description" meta tag is not enough, I suspect.)
You can't make Google show these links for your site; they will do so only if they deem your site relevant enough to warrant the feature. However, if the links are present and inappropriate, you can have them removed.
See http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=47334 for more details.
These are called Google Sitelinks. Google is pretty tight-lipped about how this feature is automated, but there are a handful of HTML5 tags that are supposed to help make search engines smarter. You can read more about them at O'Reilly's Dive Into HTML5 website. Especially interesting are the "Google Rich Snippets", though they're not exactly what you're looking for.
It might help to put those links in the HTML5 nav tags, like
<nav>Home About FAQ</nav>
and I've heard it tossed around that site navigation should be an unordered list, but I don't know how true that is. Still, it couldn't hurt to do it that way and style the list with CSS.
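Combining the two ideas, the navigation could be a nav element wrapping an unordered list (the link targets here are placeholders):

```html
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/about">About</a></li>
    <li><a href="/faq">FAQ</a></li>
  </ul>
</nav>
```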
I have a large directory of individual names along with generic, publicly available, category-specific information that I want indexed as much as possible in search engines. People don't object to having these names listed on the site itself, but some don't want to show up in search results when they "Google" themselves.
We want to continue listing these names within a page AND still have the page indexed, BUT not have the specified names or keywords indexed by search engines.
Can this be done page by page, or would setting up two pages be a better workaround?
Options available:
PHP can censor keywords if the user agent identifies a robot/search engine
htaccess to keep robots away from the non-censored content, steering them to a second, censored version
meta tags defining words not to index?
JavaScript could hide keywords from robots while leaving them visible to everyone else
I will go through the options and tell you some problems I can see:
PHP: If you don't mind trusting the user agent, this will work well. I am unsure how some search engines will react to different content being displayed to their bots.
htaccess: You would probably need to redirect the bot to a different page. You could use URL parameters, but this would be no different than a pure PHP solution. The bot would index the page it is redirected to, not the page you wanted it to visit. You may be able to use the rewrite engine to overcome this.
meta tags: Even if you could use meta tags to get a bot to ignore certain words, there's no guarantee that search engines would honor them, since there is no set "standard" for meta tags. But that doesn't matter, since I don't know of any way to get a bot to ignore certain words or phrases using meta tags.
JavaScript: No bot I have ever heard of executes (or even reads) JavaScript when looking at a page, so I don't see this working. You could display the content you want hidden using JavaScript, and bots won't be able to see it, but neither will users who have JavaScript disabled.
I would go the PHP route.
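That censoring step can be sketched as a pure function (shown here in JavaScript for brevity; a PHP version would check $_SERVER['HTTP_USER_AGENT'] and use str_ireplace the same way). The bot pattern and the "[removed]" placeholder are illustrative assumptions:

```javascript
// Replace each listed keyword with a placeholder when the request appears
// to come from a search-engine crawler. Note that user-agent sniffing is
// best-effort only: agents can lie, and new bots appear all the time.
function censorForBots(html, userAgent, keywords) {
  const botPattern = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;
  if (!botPattern.test(userAgent)) {
    return html; // Regular visitors see the full content.
  }
  let censored = html;
  for (const word of keywords) {
    // Escape regex metacharacters in the keyword before building the pattern.
    const escaped = word.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
    censored = censored.replace(new RegExp(escaped, 'gi'), '[removed]');
  }
  return censored;
}
```

For example, `censorForBots('<p>John Smith</p>', 'Googlebot/2.1', ['John Smith'])` returns the page with the name replaced, while a normal browser user agent gets the page unchanged.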
You can tell robots to skip indexing a particular page by adding a ROBOTS meta tag:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
UPDATE: The ways I can think of to restrict indexing of particular words are:
Use JS to write those words into the page (see below).
Add a module to the server that strips those words from the rendered page.
JavaScript could be something like this:
<p>
  <span id="secretWord">
    <script type="text/javascript">
      // Assemble the word at runtime so it never appears verbatim in the HTML source:
      document.write('you can protect the word by concatenating strings, using hex escapes, etc.');
    </script>
  </span>
</p>
The server module is probably the best option. In ASP.NET it should be fairly easy to do; I'm not sure about PHP, though.
What's not clear from your post is whether you want to protect your names and keywords against Google, or against all search engines. Google is generally well-behaved: you can use the ROBOTS meta tag to prevent a page from being indexed. But that won't stop search engines that ignore the ROBOTS tag from indexing your site.
Other approaches you did not suggest:
Having the content of the page fetched with client-side JavaScript.
Force the user to solve a CAPTCHA before displaying the text. I recommend the reCAPTCHA package, which is easy to use.
Of all these, the reCAPTCHA approach is probably the best, as it will also protect against ill-behaved spiders. But it is also the most onerous for your users.