Google is the first place many people go when they open a web browser. It’s fast, reliable, and accurate. Search engine optimization could be the special sauce that makes or breaks your website, so let’s learn a little SEO from Google. Maile Ohye will be our driver.
Crawling
When you create a website, make its content accessible without forms. Search engine crawlers generally won’t fill out forms or work through dropdowns, so anything hidden behind them goes unindexed.
Cookies
Most search engines don’t accept cookies. Allow non-cookie guest accounts to view basics on your site. Avoid the message, “You must enable cookies to use this site”.
Use static HTML links and text-based content. When you are making a search-friendly site, an all-Flash interface won’t work.
Crawlable architecture
Consider progressive enhancement: put site structure, links, and navigation in static HTML. That sidesteps the JavaScript, form, cookie, and Flash issues above, and it avoids diluting PageRank across separate Flash and non-Flash versions when people share links.
Add fancy bonuses like Ajax and Flash later. Link with descriptive anchor text; the most common anchor text on the web is “click here”, which tells search engines nothing about the target page.
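As a sketch of what that looks like in practice (the URLs and labels here are invented for illustration), plain HTML navigation with descriptive anchor text works before any script or Flash loads:

```
<!-- Illustrative only: static HTML navigation with descriptive anchor text -->
<ul id="site-nav">
  <li><a href="/photography-tutorials/">Photography tutorials</a></li>
  <li><a href="/camera-reviews/">Camera reviews</a></li>
  <li><a href="/contact/">Contact us</a></li>
</ul>
<!-- Tells a search engine nothing: <a href="/camera-reviews/">click here</a> -->
```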
Guidelines for URLs
- Organized
- Sharable between users (each item referenceable among friends)
- No orphan pages
- Largely unique content per URL. Avoid serving different languages on the same URL
YouTube is a good example – one page per “thing”. Site navigation is in HTML, separate from the rich media, with descriptive content alongside it.
Consider sIFR for Flash text. JavaScript can detect whether Flash is installed and show whichever version is appropriate; the HTML text should match the content Flash-enabled users see.
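A minimal sketch of that idea (this is not the sIFR API itself, just plain DOM code; the file name and headline text are made up): serve real HTML text, detect Flash, and only then swap in the Flash rendering of the same words.

```
<h1 id="headline">Weekly photography tips</h1>
<script type="text/javascript">
  // Detect Flash: navigator.plugins works in Netscape/Firefox-style browsers,
  // ActiveXObject covers older Internet Explorer.
  var hasFlash = false;
  try {
    hasFlash = !!(navigator.plugins && navigator.plugins["Shockwave Flash"]);
    if (!hasFlash && window.ActiveXObject) {
      hasFlash = !!new ActiveXObject("ShockwaveFlash.ShockwaveFlash");
    }
  } catch (e) { /* no Flash available */ }

  if (hasFlash) {
    // headline.swf is a hypothetical movie whose text matches the HTML above,
    // so crawlers and Flash-less visitors see the same words.
    document.getElementById("headline").innerHTML =
      '<object type="application/x-shockwave-flash" data="headline.swf"' +
      ' width="400" height="60"><param name="movie" value="headline.swf" /></object>';
  }
</script>
```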
Ajax
Consider using Hijax: give each JavaScript-driven link a static URL in its href as well as the JS function, so a “Click here”-style Ajax link still points at a crawlable page.
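A hedged Hijax sketch (loadSection() is a hypothetical function; the URL is a placeholder): the href is a real, crawlable page, and the onclick intercepts the click for Ajax-capable browsers.

```
<a href="/products/widgets.html"
   onclick="loadSection('/products/widgets.html'); return false;">
  Browse our widgets
</a>
```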
Images
Add descriptive alt-text. Use quality images. Include descriptive textual content near the image.
The examples ranged from nice, to less nice, to better avoided.
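A rough sketch of those three tiers (the file names and wording are invented):

```
<!-- Nice: descriptive alt text, ideally with descriptive copy nearby -->
<img src="/images/dalmatian-puppy.jpg" alt="Dalmatian puppy playing fetch" />

<!-- Less nice: generic alt text that says little about the image -->
<img src="/images/dalmatian-puppy.jpg" alt="puppy" />

<!-- Better to avoid: keyword-stuffed alt text (or none at all) -->
<img src="/images/dalmatian-puppy.jpg"
     alt="puppy dog puppies dogs pup cheap puppies puppy pictures" />
```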
Google Webmaster Central
There is a lot of reference material here: discussion groups, blogs, and tips. You can review crawl errors in Webmaster Tools and correct your crawl mishaps. Use Xenu to detect broken links on your site.
Check whether reported 500 errors are expected or normal, and correct URLs unintentionally restricted by robots.txt. There is a robots.txt generator and analyzer on the site.
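For reference, a minimal robots.txt looks something like this (the paths are placeholders):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
```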
Utilize web server options: support “If-Modified-Since” conditional requests (answering with 304 Not Modified when nothing has changed) and enable gzip compression.
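As a sketch, assuming Apache with mod_deflate (directives differ on other servers), that might look like:

```
# Compress text resources before sending them (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Keep Last-Modified/ETag headers stable so If-Modified-Since requests
# from crawlers can be answered with 304 Not Modified
FileETag MTime Size
```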
Blog post: First date with the Googlebot
Eliminate soft 404s
Correct your web server if it returns a 200 status for missing pages instead of a real 404 (often while redirecting to the homepage).
Drawbacks: it confuses users and causes duplicate content for search engines. Crawlers may not discover new pages or find modified pages as quickly.
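On Apache, for example, a custom error page can be wired up so missing URLs still return a real 404 status (the path is a placeholder):

```
# Serve a friendly error page while still returning a 404 status code.
# Use a local path, not a full URL: a full URL makes Apache redirect,
# which turns the error into exactly the kind of soft 404 to avoid.
ErrorDocument 404 /errors/not-found.html
```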
Indexing
Set the preferred domain, influence canonical URLs, and use 301s properly.
You want one version of your website running: set the preferred domain to www or non-www and 301-redirect one version to the other. Running both at once dilutes your PageRank.
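One common way to do this on Apache is a mod_rewrite rule in .htaccess (example.com is a placeholder; flip the logic if you prefer the non-www version):

```
RewriteEngine On
# 301-redirect the bare domain to the www version so only one copy gets crawled
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```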
Do not duplicate content through URL parameters; this is common with product pages, sort orders, and affiliate IDs. Keep URLs as clean as possible and remove unnecessary parameters. To keep tracking visitor information, 301-redirect URLs with parameters to the canonical version and use cookie data to set the variables, but keep the site accessible without cookies.
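Continuing the hypothetical .htaccess above (the parameter name and path are invented), one sketch is to stash the affiliate ID in a cookie and 301 the parameterized URL onto its clean canonical form:

```
# Remember the affiliate ID in a cookie, then strip it from the URL
RewriteCond %{QUERY_STRING} (?:^|&)affiliate=([^&]+)
RewriteRule ^product\.html$ /product.html? [CO=affiliate:%1:.example.com,R=301,L]
```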
How else can you influence the canonical? For Google, Sitemaps can influence its understanding of the canonical URL, and XML Sitemaps (sitemaps.org) are great in general.
There are generator tools to help make your sitemap. If you have video content, you can submit a Video Sitemap; there are also News, Mobile, Code Search, and other flavors.
If you use a sitemap, you get a lot more stat tracking.
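A minimal XML Sitemap per the sitemaps.org protocol looks like this (the URL and dates are placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/widgets.html</loc>
    <lastmod>2008-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```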
Proper use of response codes
If you have to change a URL, use a 301: it signals search engines to transfer properties, like link popularity, to the target URL.
“301s are your friends.” -Maile Ohye
Blog post: Best practices when moving your site
Anatomy of a search result
Title, snippet, URL, and sitelinks.
Titles matter: they act as a signal of the URL’s contents to search engines. “Untitled” is the most common title on the web, which is not exactly informative. Webmaster Tools will help you find these.
Help your snippets. Snippets give the user more context for each result, and their quality can impact click-through. Influence your snippet with a meta description, which Google can easily use in search results.
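In the page’s head that comes down to two tags (the wording here is illustrative):

```
<head>
  <!-- A descriptive, page-specific title beats "Untitled" -->
  <title>Camera Reviews: Compact Digital Cameras Under $300</title>
  <!-- Google can use the meta description as the snippet -->
  <meta name="description"
        content="Hands-on reviews of compact digital cameras under $300, with sample photos and ratings." />
</head>
```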
Blog post: Improve snippets with a meta description makeover
“Search within a site” is an experimental project right now.
Quick tips
Consider eliminating index.html or index.htm. Be consistent.
If you have a verified sub-domain, you can use Webmaster Tools to set geographic targeting for it.
XML files are not indexed.