Expanding on my original SharePoint 2007 SEO article, here is a great follow-up by Jean-Paul at Microsoft.
SEO involves configuring site structure, navigation, page content, metadata, and labels to improve search engine relevance and ranking. The goal is to make it easier for customers and partners to find you through search engines, notably Bing and Google.
Out of the box, SharePoint 2010 is far more SEO-friendly: more semantic HTML (which also helps with WCAG 2.0 AA compliance), powerful metadata management and capture, and an SEO feature on publishing pages are just a few of the enhancements. In addition, Microsoft released the new IIS 7.0 SEO Toolkit, which helps Web developers, hosting providers, and Web server administrators improve their Web site’s relevance in search results by recommending ways to make site content more search-engine-friendly. The Toolkit includes the Site Analysis module, the Robots Exclusion module, and the Sitemaps and Site Indexes module, which together offer detailed analysis, recommendations, and editing tools for managing your robots.txt and sitemap files. Let’s take a closer look at the benefits of this toolkit:
Improve the volume and quality of traffic to your Web site from search engines
The Site Analysis module allows users to analyze local and external Web sites in order to optimize the site’s content, structure, and URLs for search engine crawlers. It can also be used to discover common problems in site content that negatively affect the visitor experience. The tool includes a large set of pre-built reports to analyze the site’s compliance with SEO recommendations and to discover problems such as broken links, duplicate resources, or performance issues. The module also supports building custom queries against the data gathered during crawling.
Control how search engines access and display Web content
The Robots Exclusion module enables Web site owners to manage the robots.txt file from within the IIS Manager interface. This file controls the indexing of specified URLs by disallowing search engine crawlers from accessing them. Users can view their site using either a physical or a logical hierarchical view and, from within that view, disallow specific files or folders of the Web application. Users can also manually enter or modify a path, including wildcards. Because it is a graphical interface, users get a clear picture of which sections of the Web site are disallowed and avoid typing mistakes.
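The file the module manages is the standard robots.txt format defined by the robots exclusion protocol. As a rough illustration (the paths below are hypothetical examples, not SharePoint defaults), a file that blocks crawlers from system folders while allowing published pages might look like:

```
# Hypothetical robots.txt as managed by the Robots Exclusion module.
# 'Disallow' blocks crawlers from a path; 'Allow' carves out an exception.
User-agent: *
Disallow: /_layouts/
Disallow: /_catalogs/
Allow: /Pages/
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.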
Inform search engines about locations that are available for indexing
The Sitemaps and Site Indexes module enables Web site owners to manage sitemap files and sitemap indexes at the site, application, and folder level to help keep search engines up to date. It allows the most important URLs to be listed and ranked in the sitemap.xml file, and it helps ensure that file does not contain any broken links.
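The sitemap files the module edits follow the sitemaps.org XML protocol. A minimal sketch of such a file (the URL and values below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is required; the other elements are optional hints to crawlers -->
    <loc>http://www.example.com/Pages/default.aspx</loc>
    <lastmod>2010-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```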
Site Analysis Features
- Fully featured crawler engine
- Configurable number of concurrent requests (from 1 to 16), allowing users to crawl their Web site without placing undue load on it
- Support for robots.txt, allowing you to customize which locations the crawler should analyze and which it should ignore
- Support for Sitemap files allowing you to specify additional locations to be analyzed.
- Support for overriding ‘noindex’ and ‘nofollow’ metatags, so you can analyze pages and improve the customer experience even when search engines will not process them
- Configurable analysis limits: maximum number of URLs to download and maximum number of kilobytes to download per URL
- Configurable options for including content from only your directories or from the entire site and subdomains
- View detailed summary of Web site analysis results through a rich dashboard
- Feature rich Query Builder interface that allows you to build custom reports
- Quick access to common tasks
- Display of detailed information for each URL
- View detailed route analysis showing unique routes to better understand the way search engines reach your content
Robots Exclusion Features
- Display of robots content in a friendly user interface
- Support for filtering, grouping, and sorting
- Ability to add ‘disallow’ and ‘allow’ paths using a logical view of your Web site from the result of site analysis processing
- Ability to add sitemap locations
Sitemap and Sitemap Index Features
- Display of sitemaps and sitemap index files in a simple user interface
- Support for grouping and sorting
- Ability to add/edit/remove sitemap and sitemap index files
- Ability to add new URLs to sitemap and sitemap index files using a physical or logical view of your Web site
- Ability to register a sitemap or sitemap index into the robots exclusion file
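Registering a sitemap or sitemap index in the robots exclusion file simply adds a `Sitemap:` directive, which the sitemaps.org protocol allows anywhere in robots.txt. A hypothetical example (paths and URL are placeholders):

```
User-agent: *
Disallow: /_layouts/

# Full URL to the sitemap or sitemap index, so crawlers can discover it
Sitemap: http://www.example.com/sitemap.xml
```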
Read the original article at SharePoint Experts Blog: A note on Search Engine Optimization (SEO)