Basic tips and techniques for improving a website’s findability by search engines and your customers
Before trying more labour-intensive and often more expensive ways to raise your traffic from search engines, every website owner should cover the basics: having good content that guides consumers through a sales process, and knowing what search terms consumers will use to find you (or your competitors) on the search engines. Your mission-critical in-house skillset is (or should be) knowing your products, your customers, and your competitors. We recommend doing a competitive analysis so you can make your site more findable, with better information once consumers reach it. Remember, a poor website won’t improve business no matter how much you spend on SEO marketing.
SEO, or search engine optimization, is the science (some also consider it a bit of a “black art”) of optimizing your website’s positioning on the major search engines. When consumers search for your type of business, product, or service, you want them to find your website high in the search results.
Devil’s advocate: first is not always best when you are in a competitive business and want people to look at competitors in order to realize you have a unique advantage over the others. Some might believe the first listing’s claims, but the ease of getting second, third, and fourth opinions on search engines means consumers are increasingly shopping around, comparing the claims made on websites and assessing each competitor against their own feature checklists.
There are three key components to Search Engine Optimization:
Content – text content relevant to the search is highly important (photos & flash may look good but index poorly).
META tags – these are unseen but coded into each page and can dramatically affect search results (plus or minus).
Social Linking – the more popular a site is, especially with key influencers, the more respected and credible it is considered, relative to the value of its content and META tags.
Search engines all have different algorithms (formulas) for ranking sites in their directories, and these algorithms change from time to time, often without warning. Because of differences in algorithms, you can have excellent placement in one search engine and bad ranking in another.
The basic premise of content ranking is that a page with multiple occurrences of the word or phrase in a user’s search is more relevant than a page that mentions it only once, or not at all. Some search engines look for exact matches (let’s use “car dealer” as an example), some look for plural and verb-tense variations and synonyms (“car dealers”, “automobile dealers”), while others look at word combinations and permutations (“used car dealers”, “dealer in new and used cars”).
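The simplest form of this idea, counting exact-phrase occurrences, can be sketched in a few lines of Python. This is only a toy illustration of frequency-based content ranking; real search engines use far more sophisticated (and secret) scoring, including the plural/synonym handling described above.

```python
import re

def keyword_frequency(page_text, phrases):
    """Count exact occurrences of each search phrase in a page's text.

    Toy model of content ranking: more occurrences = more relevant.
    """
    text = page_text.lower()
    counts = {}
    for phrase in phrases:
        # \b word boundaries so "car" does not also match "carpet",
        # and "car dealer" does not match inside "car dealers"
        pattern = r"\b" + re.escape(phrase.lower()) + r"\b"
        counts[phrase] = len(re.findall(pattern, text))
    return counts

page = "We are Calgary's used car dealer. Our car dealers offer new and used cars."
print(keyword_frequency(page, ["car dealer", "car dealers", "used car"]))
# → {'car dealer': 1, 'car dealers': 1, 'used car': 1}
```

Note how the exact-match approach misses “used cars” when searching for “used car”; engines that handle plural variations would catch it.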
Keep in mind that a keyword dump may look good to search engines by increasing word frequency, but does NOTHING to get a consumer to buy your product or service. Search engines are getting increasingly sophisticated at identifying attempts to manipulate content rankings, including tactics like “doorway pages” that redirect users to pages with lower content rankings, and “white on white” text that is invisible to humans reading a page (this also applies to text in a microscopic font size).
META Tag Issues
There are a number of key META tags that are used not only to rank your results but also to provide information in the search results:
Title tag – names the page, and is typically shown in the browser’s title bar (blue stripe). Word order matters, so don’t lead with your business’s name – unless that’s what your research indicates consumers are looking for. Keep this under about 60 characters.
Description tag – a longer description of what the page covers, often displayed as sub-text in the search results. Keep this under about 120 characters.
Keywords tag – a list of key search words and phrases. Again, order matters, so place the highest-traffic words and phrases near the start of the list. Separate phrases with a comma and a space, avoid duplicates (including case variations), and consider localizing key phrases to include important community names. We have found that the keywords tag should contain no more than 60 words in total. And yes, you can have more than one keywords tag, but that can dramatically slow page loading.
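The rules of thumb above are easy to check automatically before publishing a page. Here is a small Python sketch that flags the common problems; the limits (roughly 60 characters for the title, 120 for the description, 60 words of keywords) are this article’s guidelines, not hard search-engine rules.

```python
def check_meta_tags(title, description, keywords):
    """Flag common META tag problems per the rules of thumb above."""
    problems = []
    if len(title) > 60:
        problems.append("title longer than ~60 characters")
    if len(description) > 120:
        problems.append("description longer than ~120 characters")
    # keywords is the comma-separated content of the keywords tag
    phrases = [p.strip() for p in keywords.split(",") if p.strip()]
    seen = set()
    for p in phrases:
        key = p.lower()  # case variations count as duplicates
        if key in seen:
            problems.append(f"duplicate keyword phrase: {p!r}")
        seen.add(key)
    total_words = sum(len(p.split()) for p in phrases)
    if total_words > 60:
        problems.append(f"keywords tag has {total_words} words (limit ~60)")
    return problems

print(check_meta_tags(
    title="Calgary Used Car Dealers",
    description="New and used cars in Calgary, Alberta.",
    keywords="car dealer, used cars, Car Dealer",
))
# → ["duplicate keyword phrase: 'Car Dealer'"]
```

Running a check like this against every page catches duplicates and over-length tags before the search engines see them.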
There are also a number of localization tags that help search engines direct the right local users to your website, and to specific pages within it:
<meta name="geo.position" content="53.547652;-113.489027" />
<meta name="ICBM" content="53.547652,-113.489027" />
<meta name="Geography" content="Calgary, Alberta, Canada">
<meta name="Language" content="English">
<meta name="country" content="Canada">
TIP: one tag everybody should have is an ICRA tag, which ensures (at least if your site is “PG” rated) that everybody can view your site behind school, office, and library firewalls or when browsers have parental control settings activated. The tag will typically look like this:
<link rel="meta" href="http://www.foundlocally.com/labels.rdf" type="application/rdf+xml" title="ICRA labels" />
Tools for webmasters
One of the best tools we’ve found for looking at your site the way search engines do is http://www.seo-browser.com/, which provides a number of analyses of how your site (or a page within it) appears to search engines, especially in Advanced Mode.
Social Linking Issues
By now everyone has learned that sites like Google (and others) rank sites based on the number of links to them, especially from “authoritative” (high-traffic) sites. It’s a bit like how the “popular girl” or “popular guy” was decided back in school: not by how many people someone actually dated (which could have adverse consequences, if too many, or too wrong), but by how many people said good things about them or thought they were “popular”.
There are a number of things you can do to increase the number of links to your site. First, add yourself to all the search engines and directories that will take you (keep track, so you don’t make duplicate submissions, which waste your time and annoy webmasters), and ensure key industry and professional directories link to your website. Many will do this as a benefit of membership.
There are a number of “search engine directories” called FFA (Free For All) sites which you can submit to in order to create links back to your site. But few users visit these sites, and the search engines have largely ignored them because they add no value to the links. Most web submission software can submit to all of them quickly and effortlessly, but they are not worth the effort of manual submissions.
Are link trades worthwhile? They can be, if you can get a prominent link from a more popular or more prestigious site; otherwise, they’re not helpful. It’s like the well-established “halo effect”, where a person’s prestige improves by dating someone smarter, more beautiful, or wealthier than they are. When offered a link trade, look at the page on the other site where your link will appear. If that page carries many more links than your reciprocal links page will, you will likely be a net loser.
You need to watch out: Google effectively applies a formula that captures the dilution effect of link exchanges, roughly:

Net PR improvement ≈ (their PR ÷ number of links on their page) − (your PR ÷ number of links on your page)

The more links a page carries, the less PageRank each individual link passes along.
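This dilution effect can be sketched with a simple model. The assumption here (a common simplification, not Google’s actual formula) is that a page shares its PageRank equally among its outbound links; all numbers below are made up for illustration.

```python
def link_trade_value(their_pr, links_on_their_page, your_pr, links_on_your_page):
    """Rough model of a reciprocal link trade, assuming a page's
    PageRank is split equally among its outbound links.

    Returns (value_received, value_given). If you receive less
    than you give, the trade leaves you a net loser.
    """
    value_received = their_pr / links_on_their_page
    value_given = your_pr / links_on_your_page
    return value_received, value_given

# Their PR 5 page has 200 links; your PR 3 reciprocal page has 20 links.
received, given = link_trade_value(their_pr=5, links_on_their_page=200,
                                   your_pr=3, links_on_your_page=20)
print(received, given)
# → 0.025 0.15 — their crowded page passes you little, so you lose on the trade
```

Even a link from a higher-PR site can be a bad trade if that site buries it among hundreds of other links.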
Google assigns each site and page a PageRank (PR) between 0 (invisible) and 10 (wildly popular and important), and you can see the Google PageRank of any page if you have the Google Toolbar installed in your browser.