SEO Gap Analysis: How Does YOUR Site Rate? (Part 1 of 2)

An SEO gap analysis is essentially a deficiency assessment: it flows naturally from audit and benchmarking work, measured against the general expectations of average performance within your industry.

SEO gap analyses also provide significant insights that can assist in benchmarking SEO campaigns, as well as a basis for forecasting – assuming you choose to provide that service. While we feel there is some validity in providing a range forecast when you are entirely knowledgeable about your industry and have previous data and trending to utilize, in the majority of cases we strongly recommend that you read the Oilman on SEO forecasting.

An SEO gap analysis breaks down into on-page and off-page factors. This post will focus on on-page SEO elements; tomorrow's post will deal with off-page SEO gap analysis factors.

[Note: The objective of the SEO gap analysis is likely to be better defined, and the results more meaningful, if keyword market research is completed prior to conducting the gap analysis.]

On-Page SEO – 16 Core Gap Analysis Elements

On-page SEO accounts for roughly 20-30% of the total ranking score for a web page. Without effective on-page SEO, unless you resort to Google bombing, you are unlikely to rank for any worthwhile terms.

1. Title tags – are they optimized for the right keywords? We recommend optimizing the title tag for Google and therefore keeping it to a maximum of 70 characters (including spaces). Make sure the pages you want indexed have unique title tags; the duplicate content filter in Google is most often triggered by identical title tags. [Note: While Google is not the be-all-and-end-all of search, most sites receive at least 50-60% of their organic search traffic from Google, so I tend to focus on it.]
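
If you want to spot-check this across a handful of pages, here is a minimal sketch in Python, assuming the requests and beautifulsoup4 libraries are installed; the URLs are placeholders for your own pages.

```python
# Rough title-tag check: flag missing, over-long (70+ character) and duplicate
# titles across a few pages. URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

pages = [
    "http://www.yourdomain.com/",
    "http://www.yourdomain.com/services/",
    "http://www.yourdomain.com/contact/",
]

seen_titles = {}
for url in pages:
    html = requests.get(url, timeout=10).text
    title_tag = BeautifulSoup(html, "html.parser").title
    title = title_tag.get_text().strip() if title_tag else ""
    if not title:
        print(f"MISSING TITLE: {url}")
        continue
    if len(title) > 70:
        print(f"TOO LONG ({len(title)} chars): {url}")
    if title in seen_titles:
        print(f"DUPLICATE TITLE: {url} matches {seen_titles[title]}")
    else:
        seen_titles[title] = url
```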

2. Keyword density might not be ‘all that’ anymore, but it is important from both a usability and a relevancy perspective. If the spiders cannot identify the core focus of your page from the textual content, with a minimum keyword density of around 3%, you’re not terribly relevant; going over 15% might give you some spammy issues. Is the copy easy to read for the human user? Do you have enough textual information to identify what you do in sufficient detail for both types of users? Spiders love copy. Funny thing – people kind of like reading a bit about what you do too. Try out this great keyword density tool.
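
As a rough do-it-yourself version of such a tool, here is a small Python sketch that computes the density of a single keyword in a block of copy; the sample copy is invented and the thresholds simply mirror the 3%/15% guides above.

```python
# Naive keyword-density calculation for a block of page copy (single keyword,
# not phrases). The 3% and 15% thresholds are the rough guides from above.
import re

def keyword_density(copy, keyword):
    words = re.findall(r"[a-z0-9']+", copy.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

copy = "Eye care insurance for optometrists. Compare eye care insurance plans."
density = keyword_density(copy, "insurance")
print(f"Density: {density:.1f}%")
if density < 3:
    print("Below the ~3% floor - the page focus may not be obvious to spiders.")
elif density > 15:
    print("Above ~15% - this may start to look spammy.")
```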

3. Is your anchor text optimized for the pages to which the links are pointing? – Are you ensuring that your inbound link campaign includes your keywords and not only your brand name? Try Jim Boykin’s (Webuildpages.com) backlink text tool.

4. Are you using header tags effectively? – Yes, spiders are interested in the headings on your page. They, like most human type folks, assume that the header is actually the topical heading for a body of related textual content. It’s not a big thing, but then again SEO is made up of lots of little factors which work together cohesively to make a full impression on both human and spider visitors. Consistency, logic and topical relevance are key.
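
A quick way to audit this is simply to list the headings a page actually exposes. The sketch below, assuming requests and beautifulsoup4 and a placeholder URL, prints the h1-h3 tags and flags a missing or duplicated h1.

```python
# List the h1-h3 headings on a page and flag anything other than exactly one h1.
import requests
from bs4 import BeautifulSoup

url = "http://www.yourdomain.com/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

headings = soup.find_all(["h1", "h2", "h3"])
for h in headings:
    print(f"{h.name}: {h.get_text(strip=True)}")

h1_count = sum(1 for h in headings if h.name == "h1")
if h1_count != 1:
    print(f"Note: {h1_count} h1 tags found - usually you want exactly one, "
          "and it should reflect the topical focus of the page.")
```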

5. Are you using keywords in your URLs and image file names? We’ll address URLs first:

a. Most backlinks to a site use either the URL or the page title as the anchor text. Using keywords in URLs is a good idea, so long as you don’t stuff them in and make the URLs totally unmanageable (see the slug sketch after this list). Google continues to parse dashes (hyphens) better than underscores.

b. Spiders cannot yet ‘see’ images. While there is work towards OCR, it’s in its infancy. Image search is also the second most popular type of search, so optimizing your images is well worth doing, and image naming is part of that.
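
To make the hyphen point in (a) concrete, here is a small Python sketch that builds a short, hyphenated slug from a keyword phrase rather than a stuffed one; the phrases echo the optometrist example further down and are purely illustrative.

```python
# Build a short, hyphenated URL slug from a keyword phrase: hyphens rather than
# underscores, and only the first few keywords rather than a stuffed URL.
import re

def slugify(phrase, max_words=4):
    words = re.findall(r"[a-z0-9]+", phrase.lower())
    return "-".join(words[:max_words])

print(slugify("Eye Care Insurance"))
# -> eye-care-insurance
print(slugify("Myopic eye care insurance optometrist Medicine Hat"))
# -> myopic-eye-care-insurance (trimmed rather than stuffed)
```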

6. Alt attributes – referring back to image search above: are your core keywords, on a page-by-page basis, being included in your alt attributes without detracting from their usability? It’s not that hard to include a keyword here and there in a relevant manner without spamming the heck out of both the spiders and the visually impaired.
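
A simple page-level check is just to list every image without alt text. A minimal sketch, again assuming requests and beautifulsoup4 and a placeholder URL:

```python
# Flag images on a page that are missing alt text (or have an empty alt).
import requests
from bs4 import BeautifulSoup

url = "http://www.yourdomain.com/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print(f"Missing alt text: {img.get('src')}")
```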

7. Are your page sizes too big? – Extended download times are bad for both human users and spiders. Broadband penetration has reached 82% of internet users in the US, and 79% in Canada. Check your page sizes using GSiteCrawler or webpageanalyzer – or any number of sites. By the way, this is a great set of easy-to-use SEO tools.

8. What’s your code-to-text ratio like? – Again, webpageanalyzer provides clear insights; a rough sketch covering both page weight (item 7) and code-to-text ratio follows below.
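
If you'd rather eyeball the numbers yourself, here is a rough Python sketch covering both checks: page weight for item 7 and code-to-text ratio for item 8. "Text" here is just whatever visible text BeautifulSoup extracts, so treat the ratio as a ballpark figure, not a substitute for the tools above.

```python
# Rough page-weight and code-to-text check for a single (placeholder) URL.
import requests
from bs4 import BeautifulSoup

url = "http://www.yourdomain.com/"
html = requests.get(url, timeout=10).text

size_kb = len(html.encode("utf-8")) / 1024
text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
ratio = 100.0 * len(text) / max(len(html), 1)

print(f"HTML size: {size_kb:.1f} KB")
print(f"Code-to-text ratio: {ratio:.1f}% visible text")
```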

9. URL structure and session IDs – avoid session IDs in URLs, and keep folder depth to a minimum for the pages you want indexed.

10. Does your primary navigation utilize your core on-page keywords? – Extend your navigation if necessary. A very basic example: ‘Contact us’ versus ‘Contact Telus’. If someone is looking to find out how to contact Telus, they probably won’t search for ‘contact us’ :).

Play it safe and apply that same thought to all primary navigation for your primary keywords. It is highly unlikely you’re going to get a lot of extra traffic from a URL you max out; e.g. assuming you have good quality, relevant content, it’s probable that the URL:

• www.optometrist.ca/eye-care-insurance/ will work as well as
• www.optometrist.ca/myopic-eye-care-insurance-optometrist-medicine-hat/

Make use of local search, and use keywords in URLs and descriptions, but not to excess.

11. Check how many of your pages are indexed in the major search engines. If there is an obvious issue, investigate it. Check each major engine as follows:

a. Google: site:www.yourdomain.com
b. Yahoo!: site:www.yourdomain.com
c. MSN/Live: site:www.yourdomain.com
d. Ask: www.yourdomain.com+site:www.yourdomain.com

I realize that talking about Ask and Yahoo! as major engines at this precise time is a matter of contention, what with Micro-Hoo! and all… Here’s a great site for pretty exhaustive info on advanced Google operators.

12. Website freshness is also a matter of contention. New pages with new content are great – when relevant. An HTML news section is fine. Fiddling around with existing copy on pages that have already been optimized is not such a good plan, as it can indicate a lack of continuity to the engines. Optimize your primary pages and leave them alone for a bit; build backlinks, write news articles and get involved in social media marketing (SMM) and social influence marketing (SIM). Don’t keep changing your core content for no good reason.

13. Outbound links to related, relevant, quality sites are a good thing, but not to the detriment of your business. Ensure that links open in new windows, and that they supplement or complement your offering. In most cases it is not a good idea to place them on your home page. There’s been a bit of a hissy-fit in SEO land regarding the use of nofollow, the prevention of PageRank bleed, and usability. [Personally I agree with Sugarrae, and want to thank her for putting it on my radar via her twitterings. :)]

14. Your domain extension is somewhat important. Getting great backlinks from .edu and .gov sites (where warranted) will be a nice boost to the spiders’ perception of your authority and topical relevance, whatever it may be.

15. Canonical issues and broken links. Check for non-www versions, capitalization issues, poor URL naming and broken links. GSiteCrawler is good for this, and there are lots of other tools out there. 301 redirect to the version you want, use robots.txt to block pages you don’t want indexed at all, and fix your internal links by redirecting to the most relevant pages or by creating a custom 404 error page.
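
For the www/non-www piece specifically, a quick sanity check is to request the non-www version and see whether (and how) it redirects. A minimal sketch, assuming requests is installed and using a placeholder domain:

```python
# Check whether the non-www version of the domain redirects, and whether that
# redirect is a 301 (the kind search engines consolidate).
import requests

resp = requests.get("http://yourdomain.com/", timeout=10, allow_redirects=True)

if resp.history:
    first_hop = resp.history[0]
    print(f"{first_hop.url} -> {resp.url} (status {first_hop.status_code})")
    if first_hop.status_code != 301:
        print("Redirect is not a 301 - engines may not consolidate the versions.")
else:
    print("No redirect: both www and non-www versions may be getting indexed.")
```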

16. Check your robots.txt file. Many sites have excluded their /image/ folder. In most cases this is not a good move.
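
An easy way to confirm what your robots.txt actually blocks is to test a few image paths against it. A small sketch using Python's built-in robots.txt parser; the domain and paths are placeholders:

```python
# Test whether robots.txt blocks typical image paths - many sites disallow
# their image folder without meaning to.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.yourdomain.com/robots.txt")
rp.read()

for path in ["/image/logo.gif", "/images/product.jpg"]:
    url = "http://www.yourdomain.com" + path
    allowed = rp.can_fetch("Googlebot-Image", url)
    print(f"{url}: {'allowed' if allowed else 'BLOCKED'}")
```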

Off Page SEO elements in a gap analysis (with some vague thoughts) will be posted tomorrow, along with our recommendations on how to rate your site based on your findings.


Spread the word:

We'd be honored if you'd help support SEMI by Stumbling us or voting for us on Sphinn.

