SEO Best Practices

Mobile SEO Best Practices

Mobile search has never been just one result type. It provides different results and presentation formats depending on whether the search query comes from a feature phone, a smartphone, or a tablet.

  • Google has announced a dedicated crawler for smartphones, separate from the one it currently uses for feature phones, which foreshadows a deeper divergence between the two mobile result types, as well as from desktop results.
  • Until recently, the results for the different mobile types have been assumed to be the same as those for the desktop, or simply weighted toward more Google local results. However, data and research findings detailed in a Covario whitepaper on mobile search validate that there is much more happening with search on mobile platforms.
  • In a search engine's view, the difference between feature phones and smartphones, beyond an ever-expanding list of devices, is that smartphones have a WebKit browser, which is always declared in a smartphone's user agent. This is also why BlackBerry devices before BlackBerry 6 OS, when the WebKit browser was introduced, are served feature phone results.
  • Google's number one rule is "Focus on the user and all else will follow." As such, Google wants to be sure the resulting pages render properly on the user's device, and it provides results based on the user's search intent. The same keyword can carry different intent on a desktop, feature phone, smartphone, or tablet. For example, typing "tacos" on a desktop, I may want information or recipes; on a feature phone, I may want to call a local taco shop; on a smartphone, I want directions to a local taco place; and on a tablet, I want to check reviews or see what items on the menu look like.
  • Google already provides the greatest divergence of search results for feature phones via Googlebot-Mobile, which crawls specifically for those devices. Google has written that your pages may be filtered from those results if they do not render properly or do not declare the proper mobile DocType. This will be Google's approach with smartphones, and ultimately tablets, as well.
  • Standard inbound text links will be marginalized as a ranking factor for the different mobile search types, as popularity will be determined more by sharing (such as through Google+), by rendering via crawlers and headless browsers, and by usability data from all those Android users.
  • Standard SEO ranking elements will be gathered from the desktop version of the page, especially if the mobile page lives on the same URL as the desktop page.
  • Enable user-agent detection to trigger a mobile-type CSS on the same URL as your desktop pages if you can maintain a one-to-one relationship, consolidating link juice in the current search environment.
  • If your mobile site will be smaller than your desktop site, an m. subdomain is the next best option for both smartphone and feature phone results. These mobile results should live on the same m. URL but trigger a different mobile-type CSS based on the user agent.
  • When triggering the mobile-type CSS, the page should also declare the correct mobile-format DocType.
  • Minimize load time not just through page size but also by reducing network/HTTP requests and by increasing compression and cache-control where possible.
  • Use semantic coding with microformats, especially address and location tagging, and the semantic elements introduced in HTML5.
  • Improve rendering and user experience by providing an app-like experience with HTML5 and jQuery.
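The user-agent detection recommended above can be sketched as follows. This is an illustrative Python sketch only, not production device detection: a real site should use a maintained device database, and the stylesheet names and hint list here are hypothetical.

```python
def select_stylesheet(user_agent: str) -> str:
    """Pick a stylesheet for the same URL based on the device's user agent.

    Assumption (from the notes above): smartphones declare a WebKit
    browser in their user agent; feature phones do not.
    """
    ua = user_agent.lower()
    # Hypothetical hint list; a real implementation needs a device database.
    mobile_hints = ("mobile", "blackberry", "midp", "symbian", "nokia")
    is_mobile = any(hint in ua for hint in mobile_hints)
    if is_mobile and "webkit" in ua:
        return "smartphone.css"    # WebKit mobile browser -> smartphone layout
    if is_mobile:
        return "featurephone.css"  # mobile, but no WebKit declared
    return "desktop.css"           # default: desktop stylesheet
```

For example, an iPhone user agent contains both "AppleWebKit" and "Mobile", so it receives the smartphone stylesheet, while a pre-OS 6 BlackBerry user agent declares no WebKit token and receives the feature phone stylesheet.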

SEO for B2B Marketing
  • At the core of what we as B2B marketers promise in organic results is relevant content that answers the intent of a query. It is important to remember that what customers are looking for when they search is information: in some form or fashion, we type keywords to find the content that answers our query.
  • People are constantly connected to the Web via their desktops, laptops, smartphones, e-readers, and tablets. To reach them, marketers need to factor in these different computing formats as they pertain to SEO. An effective B2B SEO strategy ensures that information is properly served on each type of device.
  • Once we have created the right content and rendered it successfully across channels and devices, we need to ensure that our information is worthy of being shared. Developing quality content is the most certain way to achieve this; in a competitive arena, however, the advantage goes to those whose message inspires and compels people to share. As SEO evolves, shareability and other social signals will be a major factor in how the search engines calculate relevance. Not only can sharing be scored methodically in an algorithm, but more importantly, it also represents market demand. B2B marketers should strive to develop information that is findable and able to inspire a like, a +1, a comment, or some other tip of the hat that says: yes, this is the type of information I am interested in consuming.

SEO Title Tag Format Best Practices:
Primary Keyword - Secondary Keywords | Brand 
Brand Name | Primary Keyword and Secondary Keywords
  • If you are trying to rank for a very competitive term, it is best to include the keyword at the beginning of the title tag. 
  • If you are competing for a less competitive term and branding can help make a difference in click through rates, it is best to put the brand name first. 
  • With regard to special characters, pipes are preferred for aesthetic value, but hyphens, en dashes, em dashes, and minus signs are all fine.
  • All else being equal, the closer the keyword is to the front of the title tag, the more ranking weight it carries.
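The two title formats above can be expressed as a small helper. A sketch with a hypothetical function name, assuming an approximate 70-character display limit so the leading keyword is not truncated:

```python
def build_title_tag(primary: str, secondary: str, brand: str,
                    competitive: bool = True, limit: int = 70) -> str:
    """Compose a title tag using the two formats above.

    Competitive term: keyword first ("Primary - Secondary | Brand").
    Less competitive, brand-driven: "Brand | Primary and Secondary".
    """
    if competitive:
        title = f"{primary} - {secondary} | {brand}"
    else:
        title = f"{brand} | {primary} and {secondary}"
    # Truncate so the front of the tag survives in the results display.
    return title[:limit]
```

For a competitive query the keyword leads; when branding drives click-through, the brand leads instead.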
SEO Usefulness of H1 Tags: Best Practices:
  • H1s are important for users but not necessarily for search engines anymore.
  • They are very important for establishing information hierarchy and helping with algorithmically determined semantics, but they seem to be less important for search engine optimization. 
  • Recommended on all pages as an aid for users but don’t stress the importance when other opportunities for SEO improvement are available.

SEO Usefulness of Nofollow: Best Practices:
  • Recommended use of rel="nofollow" to thwart would-be spammers of user-generated content.
  • Recommended use of incentives for cultivating active, trusted users.
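For user-generated content, the nofollow recommendation amounts to rendering every user-submitted link with rel="nofollow". A minimal Python sketch (the function name is hypothetical):

```python
from html import escape

def render_ugc_link(url: str, text: str) -> str:
    """Render a user-submitted link with rel="nofollow" so it passes no
    link juice to would-be spammers of user-generated content."""
    return f'<a href="{escape(url, quote=True)}" rel="nofollow">{escape(text)}</a>'
```

Escaping the URL and anchor text also guards against markup injection in user submissions.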
SEO Best Practices: Use of Alt Text with Images
  • Recommend including alt text for all images on all publicly accessible pages. 
  • Add images with good alt text to pages targeting rankings on competitive queries.
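For example, descriptive alt text names what the image actually shows (the file names and wording here are illustrative):

```html
<!-- Weak: tells the search engine nothing -->
<img src="img0042.jpg" alt="photo">

<!-- Better: descriptive alt text that supports the page's target query -->
<img src="fish-taco-plate.jpg" alt="Grilled fish taco plate with lime and slaw">
```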
SEO Use of the Meta Keywords Tag: Best Practices
  • If you are trying to rank highly in Yahoo, the Meta Keywords tag can be useful. 

The Use of Parameter-Driven URLs
  • Not recommended
  • Search engine crawlers can parse and crawl parameter-driven URLs, but doing so is much more difficult than crawling clean static URLs.
The Usefulness of Footer Links
  • Use footer links sparingly: no more than 25 relevant internal navigational links. 
  • There are many examples of Google penalties tied directly to abusive footer links.
  • Manipulative links in footers are easily detected algorithmically, and appear to have automated penalties applied to them by Google.
The Use of JavaScript and Flash on Websites
  • Using JavaScript or Flash for any navigation important to search engines is not recommended.
  • Search engines' ability to parse these technologies is inferior to their ability to parse HTML, and coding navigation in the former can lead to lower search engine rankings.
The Use of 301 Redirects
  • 301 redirects are recommended as the best way to redirect web pages.
  • 301 redirects deplete between 1% and 10% of link juice. This is an acceptable cost when one URL must lead to another and other options are unavailable.
  • They are much better than the alternatives (JavaScript and 302 redirects), which pass very little juice, if any at all.
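A minimal sketch of the recommendation: consult a permanent-redirect table before serving a page, so an old URL sends engines (and the link juice it has accumulated) to the new one with a 301. The paths and function name are illustrative, not from the original notes.

```python
# Illustrative redirect table: old URL path -> new permanent URL path.
REDIRECTS = {
    "/old-product-page": "/products/widgets",
    "/2010/summer-promo": "/promotions",
}

def resolve(path: str):
    """Return an (HTTP status, location) pair for a requested path.

    301 "Moved Permanently" transfers most of the old URL's link juice;
    a 302 or a JavaScript redirect would pass little or none.
    """
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

In a real deployment the same mapping would typically live in the web server configuration (for example, rewrite rules) rather than application code.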
Blocking pages from Search Engines
  • The meta robots tag (noindex, follow) is generally a better option than robots.txt.
  • Robots.txt files are useful but should be used sparingly, and only when a meta robots tag is not an option.
  • Robots.txt does stop search engine crawlers from visiting a web page, but it does not keep the page from being indexed. It also creates a black hole for link juice, as the engines cannot crawl blocked pages to see any links on them and pass that juice along.
  • Noindex (with follow) also allows link juice to be passed by all links on the page.
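Side by side, the two blocking mechanisms above look like this (the blocked path is illustrative):

```html
<!-- Preferred: the page is crawled but kept out of the index, and the
     links on it still pass juice -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt: the crawler never fetches /private/, so links there pass
# nothing, and blocked URLs can still appear in the index if linked to.
User-agent: *
Disallow: /private/
```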
The Effect of Negative Links from “Bad Link Neighborhoods”
  • The effect of links from bad neighborhoods on good neighborhoods is minimal as long as the links are not reciprocal.
The Importance of Traffic on Rankings
  • The metric of visitors to a given site is not used to help determine rankings.