Search Engine Optimization (SEO) is a legitimate practice that seeks to improve the technical and content aspects of a website, making its content easier to find, more relevant, and more accessible to search engine crawlers.  Taken to extremes, some practices can be abused.  In the vast majority of instances, SEO work renders a website more appealing to Bing, though performing it is no guarantee of improving rankings or receiving more traffic from Bing.  The main areas of focus when optimizing a website include:

  • <title> tags – keep these clear and relevant
  • <meta name="description"> tags – keep these clear and relevant as well, but use the added space to expand on the <title> tag in a meaningful way
  • alt attributes – use this attribute on <img> tags to describe the image, so that Bing can understand the content of the image
  • <h1> tag – helps users understand the content of a page more clearly when properly used (these head and body elements are sketched just after this opening group of bullets)
  • Internal links – help create a view of how content inside your website is related.  They also help users navigate easily to related content.
  • Links to external sources – be careful who you link to as it’s a signal you trust them.  The number of links pointing from your page to external locations should be reasonable.
  • Social sharing – enabling social sharing encourages visitors to share your content with their networks
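A minimal sketch of how these head and body elements fit together (the site, wording, and image below are placeholder examples, not taken from the source):

      <head>
        <title>Fresh Roasted Coffee Beans | Example Coffee Co.</title>
        <meta name="description" content="Order fresh roasted coffee beans online. We roast single-origin beans to order and ship the same day.">
      </head>
      <body>
        <h1>Fresh Roasted Coffee Beans</h1>
        <img src="/images/roaster.jpg" alt="Coffee beans tumbling inside a drum roaster">
        <p>Related reading: <a href="/brewing-guide">our brewing guide</a>.</p>
      </body>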
  • Crawlability
    • XML Sitemaps – make sure you have these set up and that you keep them fresh and current (a sample sitemap and robots.txt follow this sub-list)
    • Navigational structure – keep it clean, simple and easy to crawl
    • Rich media cautions – don’t bury links to content inside JavaScript
    • Graceful degradation – enable a clean down-level experience so crawlers can see your content
    • URL structure – avoid using session IDs, &, # and other special characters when possible
    • Robots.txt – must be placed at the root of your domain; be careful, as it's powerful; reference sitemap.xml (or your sitemap-index file) in this document (see the robots.txt sketch after this sub-list)
      • Verify that Bingbot is not disallowed or throttled in robots.txt
    • Define high crawl rate hours in the Bing Webmaster Tools via the Crawl Control feature.
    • Verify that Bingbot is not accidentally blocked at the server level by doing a “Fetch as Bingbot”
    • Webmasters are encouraged to use the Ignore URL Parameters tool (found under Configure My Site) in Bing Webmaster Tools to help Bingbot understand which URLs should be indexed and which URLs from a site may be ignored
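For reference, a minimal XML sitemap might look like the sketch below (the URL and date are placeholders):

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>https://www.example.com/coffee-beans</loc>
          <lastmod>2013-06-01</lastmod>
        </url>
      </urlset>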
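And a simple robots.txt illustrating the points above – it references the sitemap file and does not disallow Bingbot (the paths are placeholders):

      # Keep private areas out of the crawl, but do not block Bingbot site-wide
      User-agent: *
      Disallow: /checkout/

      # Point crawlers at your sitemap.xml (or sitemap-index) file
      Sitemap: https://www.example.com/sitemap.xml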
  • Site Structure
    • Links – cross link liberally inside your site between relevant, related content; link to external sites as well
    • URL structure and keyword usage – keep it clean and keyword rich when possible
    • Clean URLs – no extraneous parameters (sessions, tracking, etc.)
    • HTML & XML sitemaps – enable both so that users and crawlers alike can find what they need – one does not replace the other
    • Content hierarchy – structure your content to keep valuable content close to the home page
    • Global navigation – grows out of your hierarchy planning and choice of navigation style (breadcrumb, link lists, etc.) – helps ensure users can find all your content (a breadcrumb sketch follows this sub-list)
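As an illustration, a breadcrumb trail backed by a clean, keyword-rich URL hierarchy might look like this (the site and category names are placeholders):

      <!-- Page URL: https://www.example.com/coffee/espresso-machines -->
      <nav>
        <a href="/">Home</a> &gt;
        <a href="/coffee/">Coffee</a> &gt;
        <a href="/coffee/espresso-machines">Espresso Machines</a>
      </nav>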
  • Rich media warnings – don’t bury links in JavaScript/Flash/Silverlight; keep content out of these as well (the sketch below shows a crawlable alternative)
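For example, a link whose destination exists only inside script is effectively invisible to crawlers, while a plain anchor is easy to follow (hypothetical markup):

      <!-- Hard to crawl: the destination URL lives only in JavaScript -->
      <span onclick="window.location='/specials'">Current specials</span>

      <!-- Crawlable: a plain HTML link the crawler can follow -->
      <a href="/specials">Current specials</a>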
  • On-Page
    • Head copy
      • Titles – unique, relevant, roughly 65 characters long
      • Descriptions – unique, relevant, grammatically correct, roughly 160 or fewer characters
    • Body Copy
      • H1, H2 and other H* tag usage to show content structure on page
      • Only one <H1> tag per page
      • alt attribute usage – helps crawlers understand what is in an image
      • Keyword usage within the content/text – use the keyword/phrase you are targeting a few times and use variations as well (see the sketch after this sub-list)
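A sketch of this body-copy structure – a single <h1>, supporting <h2> tags, and keyword variations – using a placeholder topic:

      <h1>How to Brew French Press Coffee</h1>
      <h2>Choosing a coarse grind for French press brewing</h2>
      <p>…</p>
      <h2>How long to steep coffee in a French press</h2>
      <p>…</p>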
    • Anchor text – using targeted keywords as the linked text (anchor text) to support other internal pages
    • Content
      • Build based on keyword research – shows you what users are actually looking for
      • Down-level experience enhances discoverability – avoid housing content inside Flash or JavaScript – these block crawlers from finding the content
      • Keep out of rich media and images – don’t use images to house your content either
      • Create enough content to fully meet the visitor’s expectations.  There are no hard and fast rules on the number of words per page, but providing more relevant content is usually safe.
      • Produce new content frequently – crawlers respond to you posting fresh content by visiting more frequently
      • Make it unique – don’t reuse content from other sources – critical – content must be unique in its final form on your page
      • Content management – using 301s to reclaim value from retiring content/pages – a 301 redirect can pass some value from the old URL to the new URL
      • <link rel="canonical"> to help engines understand which page should be indexed and have value attributed to it (sketched below)
      • 404 error page management can help cleanse old pages from search engine indexes; the 404 page should return a 404 status code, not a 200 OK code.
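The canonical hint is a single line in the <head>; in this placeholder example, a URL carrying tracking parameters defers to the clean version of the page:

      <!-- Served on https://www.example.com/coffee-beans?utm_source=newsletter -->
      <link rel="canonical" href="https://www.example.com/coffee-beans">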
    • Links
      • Plan for incoming & outgoing link generation – create a plan for how to build links internally and externally
      • Internal & external link management – execute by building internal links between related content; consider social media to help build external links, or simply ask websites for them; paying for links is risky
      • Content selection – planning where to link to – be thoughtful and link to only directly related/relevant items of content internally and externally
      • Link promotion via social spaces – these can drive direct traffic to you, and help users discover content to link to for you
      • Managing anchor text properly – carefully plan which actual words will be linked – use targeted keywords wherever possible (a quick contrast is sketched below)
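A quick contrast in anchor text, using hypothetical pages:

      <!-- Weak: the linked words say nothing about the target page -->
      <a href="/french-press-guide">click here</a>

      <!-- Better: targeted keywords serve as the anchor text -->
      Read our <a href="/french-press-guide">French press brewing guide</a>.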