Sunday, February 15, 2015

The Ultimate SEO Audit




An SEO Audit is a status check of the overall health of your SEO program. To move the dial in natural search, you need to know what you're doing right, what needs improvement, and what may be hurting your rankings. If you're familiar with SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats), that's the frame of mind you should bring to the SEO Audit.
What follows is a look at some of the criteria you’ll want to take into consideration.
Ok… on with it…

Before you start, sign up for a Webmaster Account with each of the major Search Engines, or just Google if you’re the “80/20 rule” kind of SEO.



  1. Go to Google Webmaster Tools
  2. Create a Google account if you don’t already have one
  3. Verify your site
  4. Set your correct geographical target
  5. Set your preferred domain
  6. Enable enhanced image search (it may drive some traffic, though it's unlikely to be a major source)
  7. Create and upload XML sitemaps (a bare-bones example follows this list)
  8. Identify issues
  9. Look at your external links
  10. Look at the GoogleBot crawl rate
  11. Look at the search phrases
  12. Be alerted to duplicate titles
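Here's a bare-bones sketch of what an XML sitemap looks like; the URL, date, and values below are placeholders for your own pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.mysite.com/san-francisco/russian-hill/</loc>
        <lastmod>2015-02-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
      <!-- one <url> entry per page you want crawled -->
    </urlset>

Most content management systems can generate this file automatically; once it's live, submit its URL through Webmaster Tools.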
Bing Webmaster Tools (free): Sign up and authenticate your site. You'll need to sign up for a Windows Live ID if you don't already have one.

Yahoo Site Explorer (free): Same deal; sign up and authenticate your site. You'll need to sign up for a Yahoo! account if you don't already have one.
While you’re at it, sign up for an SEOmoz PRO account and create a new campaign. Their crawling reports and other tools will automate a heck of a lot of what follows.

TECHNICAL AUDIT

Image credit: CubicCompass
1. Check for page errors and poorly formatted code that might keep search engines from finding/crawling a web page. You don’t have to have 100% valid code, but eliminate as many errors as possible to maximize user experience, crawlability and browser compatibility.
a) Check the validation of your xHTML markup using the W3C Markup Validation Service.
b) Check your CSS using the W3C CSS Validator.
2. URLs should be succinct, descriptive, contain relevant keywords, and leave out non-essential words. Example: Matt Cutts (the head of Google’s WebSpam team) uses the URL “best-iphone-application” for a post titled “What are the best iPhone applications.”
3. URL Character Length: URLs shouldn't exceed 255 characters.
4. Drop Session IDs from URLs: They're ugly, they make it hard for search engines to index content, they create duplicate content issues, and they pose security risks. Peconic SEO discusses some alternatives.
5. No more than 2 or 3 query parameters (the name=value pairs that follow "?" and are separated by "&"): They can make it difficult for search engines to index pages.

Good = http://www.mysite.com/brands.php?nike
Bad = http://www.mysite.com/brands.php?object=1&type=2&kind=3&node=5&arg=6
6. Proximity of Page to Root Directory: If a web page is far (several directory levels) from your root directory (www.mysite.com), search engines will weigh its value less.
Bad = http://www.mysite.com/places/america/california/san-francisco/russian-hill/
Good = http://www.mysite.com/san-francisco/russian-hill/
7. Clicks-from-homepage. Aside from proximity to the root directory, make sure every page on the site is accessible from the homepage within 3 or 4 clicks, or it becomes harder for visitors to find and risks being ignored by spiders.
8. Use hyphens in URLs, not underscores
Bad = www.thedailyanchor.com/this_is_not_seo_friendly/
Good = www.thedailyanchor.com/this-is-seo-friendly/
9. Page file size shouldn't exceed 150 KB.
10. Page load time should be less than 10 seconds on DSL. While you won't be penalized for a load time of 11-16 seconds, long page load times make for a poor user experience and may prevent spiders from crawling your site. You can test page load time using URI Valet.
11. Flash Navigation: Don’t do it, it makes your navigation difficult for search engines to crawl. If for some reason you need to (really? …really?) then be sure the same navigation elements appear in an HTML-only navigation bar in your footer.
12. JavaScript Navigation: Don’t do it. Same reason as for Flash above.
13. Turn off Flash, JavaScript, and cookies in your browser and see if you can still navigate your site.
14. Look through the eyes of a SE spider. Run your site through the YellowPipe Lynx Viewer to see what your site looks like from a search engine spider’s perspective. Are any links / navigation elements missing?
15. Look at the robots.txt file. If you're using a robots.txt file, make sure you aren't excluding important sections of content. A misconfigured robots.txt file can prevent whole sections of your site from being indexed. Tip: use the robots.txt checker in Google's Webmaster Tools.
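For illustration, here's a robots.txt sketch that keeps spiders out of a couple of private directories while leaving everything else crawlable (the paths are placeholders):

    User-agent: *
    Disallow: /checkout/
    Disallow: /admin/
    Sitemap: http://www.mysite.com/sitemap.xml

Be especially wary of a stray "Disallow: /" under "User-agent: *": that single line blocks every spider from your entire site.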
16. Navigation location. Your primary navigation should be located at the top or top-left of your site and appear consistently at the same place on every page. Make sure your navigation links to all major content areas of your site and includes a link back to your homepage.
17. Breadcrumbs. Use ‘em if you can. Not only do breadcrumbs show visitors where in the site hierarchy the current page is located and provide a shortcut to other pages in the hierarchy, but optimizing your Breadcrumbs with keywords will help your SEO efforts.
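A breadcrumb trail is just an ordinary set of links; here's a minimal sketch using the San Francisco example from earlier (the pages and anchor text are placeholders):

    <p class="breadcrumbs">
      <a href="/">Home</a> &raquo;
      <a href="/san-francisco/">San Francisco</a> &raquo;
      Russian Hill
    </p>

Note that the current page (Russian Hill) is plain text, not a link.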
18. Frames. Don’t use ‘em.
19. Splash pages. Don’t use ‘em.
20. Flash. Flash is OK in small doses and yes, Google is getting better at indexing Flash (especially recently), but does a site built entirely in Flash have a chance against a well-optimized site built in HTML? Hint: No.
21. Restricted access. Are there any pages on the site that (1) require a login or are accessible only via (2) a search box or (3) a select form + submit button? These pages will not be indexed by spiders. That's not a problem unless there's restricted content that you want the spiders to index.
22. Broken Links. Use a tool such as the W3C Link Checker to check for broken links. Pages linked to by broken links might not be accessible to SE spiders.
23. Site: command search. Do a Google Site Command search for “site:www.mysite.com” to see how many of your pages have been indexed by Google. This data is notoriously inaccurate so don’t expect it to match up perfectly, but it could indicate a technical problem (or ranking penalty) if you know you have 2,000 pages of content and Google says you have 7.
24. Brand search. Google your company / website name. Even if you aren't ranking well for keywords yet, you should rank #1 for your domain / brand. If not, it could indicate a technical problem or a ranking penalty.
25. 301 vs. 302 Redirects. Do a server header check using the W3C Link Checker again, or the Live HTTP Headers add-on for Firefox, and look for any 302 redirects. If you find any, replace them with 301 redirects: 302 redirects won't transfer link juice (PageRank) to the destination URL, while 301 redirects (theoretically) pass 99%+ of it.
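If you'd rather check by hand, the status line of the raw response headers tells you which kind of redirect you're looking at (the URLs below are placeholders):

Bad = a 302, which passes no link juice:

    HTTP/1.1 302 Found
    Location: http://www.mysite.com/new-page/

Good = a 301, which passes link juice to the destination:

    HTTP/1.1 301 Moved Permanently
    Location: http://www.mysite.com/new-page/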
26. Canonical Homepage URL issues. The URL of your homepage might be displayed in any number of variations, which is fine for visitors but not for SE spiders; they’ll count each page separately. Part of the search engine ranking equation is calculating how many external links a given page has. If you have even 2 iterations of your homepage (e.g. http://site.com and http://www.site.com) then Google will count your homepage as two separate pages. Thus, if you have 1,000 inbound links to http://site.com and 2,000 links to http://www.site.com, Google will count those separately (effectively watering down your link juice) when you really want them to count it as 3,000 links for one page. What to do? Consolidate all versions or iterations of the homepage to one URL.
a.) Decide whether you want a www or non-www URL and then redirect the loser to your preferred choice. Thus, if you choose "http://www.site.com/" then create a 301 redirect to that URL from "http://site.com/" (an Apache sketch follows this list).
b.) Remove default file names at the end of your URLs. Below are some examples. You should create a 301 redirect to your preferred URL for any variation.
    • http://www.site.com/index.html
    • http://www.site.com/index.htm
    • http://www.site.com/index.php
    • http://www.site.com/default.html
    • http://www.site.com/default.php
    • http://www.site.com/home
c.) Consistently use your preferred URL in every internal link on your site. If you choose the www version, then all links should build off of http://www.site.com/
d.) Go to Google Webmaster Tools > Site Configuration > Settings and set your preferred domain.
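If your site runs on Apache, a sketch like this in your .htaccess covers steps (a) and (b); the domain is a placeholder, and you should test the rules on a staging copy before deploying:

    RewriteEngine On

    # (a) 301-redirect the non-www host to the www version
    RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
    RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]

    # (b) collapse default file names (index.html, default.php, etc.) onto the directory URL
    RewriteCond %{THE_REQUEST} ^[A-Z]+\ /(.*)(index|default)\.(html?|php)\ HTTP [NC]
    RewriteRule ^ http://www.site.com/%1 [R=301,L]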

While the following aren’t technical audit requirements, they can help round out the big picture:

1. If you’re auditing an unfamiliar site, determine when the site underwent its last major redesign. Significant changes to navigation, content, and URLs can have an impact on SEO and it’s important to know the SEO history of a website.
2. Evaluate the Content Management System. Is it up-to-date? Easy-to-use? Is it going to give your in-house team or SEO consultant a mental breakdown 4 weeks into the project?
3. Evaluate your host and server. Are you on a shared server, a VPS, or a dedicated server? If you're going to increase the traffic to your site, can your server handle it? Slow servers not only drive visitors away, but also impact indexing of your content by search engine spiders. If your servers are slow, spiders will slow or stop crawling to keep from overwhelming them.
4. Domain age. When was the domain first registered?
5. Domain expiration date. Check with your registrar to see how long until your domain registration expires. Best practice is to keep your domain registered 5+ years out. If your registration expires within the next 3 years, renew it now.

CONTENT AUDIT



Image credit: TopDealerSEO
1. Create high-quality unique content. As I said in the introduction to SEO article, you should only ever rank as high as you deserve to rank, and that means creating high-quality, unique content.
2. Human-focused. Content should be written for humans, not for search engines.
3. Include targeted keywords in your title tags, meta descriptions, header tags, alt attributes, on-page copy, etc. Pages should be optimized for 2-4 keywords per page, with the keywords used 4-5 times per 500 words and no keyword being targeted on more than one page. Example: it’s fine to target “blue widgets” and “green widgets” on either the same page or on separate pages, but don’t target “blue widgets” on two or more separate pages.
4. Internal Navigation and on-page links to other pages on your site should include targeted keywords as anchor text.
5. Enough Content. Every page should contain 200+ words of HTML text.
6. Title Tags are the most important on-page optimization element, so spend time making them perfect:
a) Title Tags should include targeted keywords: In general, use your highest-value keywords at the beginning of the tag and repeat a second time if possible without sounding spammy. Example: “Duvet Covers, Pillow Shams, Duvet Sets & Duvet Cover Sets | Brand Name”
b) Title Tags should be relevant and descriptive and accurately reflect the content of that page. Include the brand name at the end of the title tag (e.g. "Duvet Covers, Pillow Shams, Duvet Sets & Duvet Cover Sets | Brand Name").
c) Title tags should be unique: a given keyword should only be targeted on one page. Over 20MM web pages on Google have no title and 7MM have a title that says “insert title here.” Don’t be that guy.
d) Title Tags should be succinct and not exceed 70 characters in length.
e) DMOZ hijacking. If you're listed in DMOZ, be sure to use the NOODP meta tag (<meta name="robots" content="noodp">) to prevent DMOZ titles and descriptions from being used by Google.
f) Yahoo! hijacking. If you're listed in the Yahoo! Directory, use the NOYDIR meta tag (<meta name="robots" content="noydir">) to prevent Yahoo! Directory titles and descriptions from being used in search results.
7. Meta Descriptions don't affect keyword rankings, but they do affect click-through rates from search engines:
a) Meta Descriptions should be descriptive and describe the page content succinctly and accurately.
b) Meta Descriptions should be unique for every page.
c) Meta Descriptions should be actionable and be written as an “ad headline” for each page. Encourage users to click on the link in search results. Write with sizzle, but be honest, accurate, descriptive. Don’t bait and switch!
d) Meta Descriptions should be succinct and not exceed 156 characters in length.
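Putting the title tag and meta description together, the HEAD of the duvet cover page from the example above might look like this (the description copy is purely illustrative):

    <head>
      <title>Duvet Covers, Pillow Shams, Duvet Sets &amp; Duvet Cover Sets | Brand Name</title>
      <meta name="description" content="Shop hand-stitched duvet covers, pillow shams and complete duvet sets. Order today for free shipping.">
    </head>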
8. Meta Keywords are so 1993. Don't bother. Seriously.
9. Header Tags. Every page should contain an H1 tag that is optimized for relevant keywords.
10. Avoid Duplicate Content: If you have hundreds or thousands of products on your site, you probably have at least some repetitive copy. Flag it, do your best to make the copy unique, and target different keywords on each page.
11. Clean up duplicate URLs with the Canonical Link Element. Google, Yahoo, and Microsoft support a link element to clean up duplicate URLs.
Let's say you have a page about Organic Cotton Sheets that is accessible via multiple URLs (http://www.example.com/products/b335/?pkey=csheet-sets and http://www.example.com/products/b335/?cm_src=rel). You can specify the following in the HEAD section of the document:

    <link rel="canonical" href="http://example.com/organic-cotton-sheets.html" />

That tells search engines that the preferred location of this URL (the "canonical" location) is http://example.com/organic-cotton-sheets.html instead of http://www.example.com/products/b335/?pkey=csheet-sets or http://www.example.com/products/b335/?cm_src=rel.
12. Image alt attributes should be used on all images and should describe the image succinctly (e.g. alt="organic cotton throw").
13. Image filenames should be short but descriptive and use targeted keywords. Don't use long filenames or practice keyword stuffing.
Good = “organic-cotton-throw.jpg” or “spiked-eggnog-recipes.jpg”
Bad = “image27.jpg” or “eggnog.jpg”
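Combining the filename and alt attribute advice, a well-optimized image tag might look like the following (the filename and alt text are illustrative):

    <img src="/images/organic-cotton-throw.jpg" alt="Organic cotton throw blanket" width="400" height="300">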
14. Image files should be stored in one directory (e.g. "http://www.example.com/images/")
15. Internal Links. The general rule of thumb is that there should be fewer than 100 links on a page, mainly 1) to keep from souring the user experience and 2) to keep from dividing PageRank among so many links that each carries only a minuscule amount of link juice. That said, Google may very well spider up to 200 or 300 links.
16. Internal No-followed Links. There probably isn't a need; the days of PageRank sculpting with no-followed links are over. Use the nofollow attribute ONLY when linking out to sites you don't trust (e.g. links in user comments) or to non-essential pages that wouldn't be helpful to include in search results (e.g. "add-to-cart" links). That said, Matt Cutts recently said, "I would let PageRank flow even to your privacy and terms-of-service type pages. Even those sorts of pages can be useful for more searches than you would expect."
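In markup, a no-followed link is just a rel attribute on the anchor tag; for example, on an untrusted link in a user comment (the URL is a placeholder):

    <a href="http://example.com/commenters-site/" rel="nofollow">commenter's site</a>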

LINK AUDIT



First, a quick note: as a blog owner, please don't go on a comment spam tirade trying to grow backlinks by posting crap comments on people's blogs. Google's algorithm downplays the effect of blog comment links, and the webmaster is probably going to no-follow or remove your link anyway. Include links in comments only when they contribute to the conversation.
1. Measure the number of inbound links to the site using SEOmoz Open Site Explorer or Yahoo! Site Explorer. The more links the better, but what really matters is the number of links from unique root domains: while it’s fine to have 20 links from one site, it won’t matter nearly as much as 1 link from 20 unique sites.
2. How many links are from .edu sites? Search engines place higher value on .edu links because not everybody can register these top-level domains, only qualified institutions. AKA more links from .edu sites = better.
3. How many links are from .gov sites? Same as above: search engines place higher value on .gov links because only qualified institutions can register these top-level domains. AKA more links from .gov sites = better.
4. Directories. Has the site been submitted to directories such as DMOZ (free, but practically impossible to get into because almost no editors are active anymore), Yahoo! Directory ($300/yr), BOTW ($100/yr), and other industry-relevant directories?
5. Inbound anchor text. Just as links within your site should use keyword-optimized anchor text, links to your site should use keyword-optimized anchor text. If the majority of links to your site use inadequate anchor text (e.g. "click here") then flag this for later. During the link building stage you can reach out to webmasters of sites that link to you and ask them to modify that anchor text.
6. Is there a natural distribution of links across the site, or are 90% of the links directed at one page? Your homepage will naturally attract the most links, so the more links you have to deeper pages the better.
7. Does the site have links from Link Farms or Free For All (FFA) sites? 1993 is calling. They want their tactics back.
8. Avoid excessive reciprocal links.
9. Paid Links. Don't buy or sell paid links that will flow PageRank and attempt to game Google's search results. Google is okay with some paid links (see the quote below), but be sure to use the nofollow attribute and clearly identify those links as being paid/sponsored.
If you want to sell a link, you should at least provide machine-readable disclosure for paid links by making your link in a way that doesn't affect search engines… For example, you could make a paid link go through a redirect where the redirect url is robot'ed out using robots.txt. You could also use the rel=nofollow attribute… The other best practice I'd advise is to provide human readable disclosure that a link/review/article is paid. You could put a badge on your site to disclose that some links, posts, or reviews are paid, but including the disclosure on a per-post level would [be] better. Even something as simple as "This is a paid review" fulfills the human-readable aspect of disclosing a paid article.
- Matt Cutts, the head of Google’s WebSpam team.
10. Anchor text. All internal and external links should make good use of anchor text (the clickable text in a link, placed within the anchor tag). Anchor text helps users and search engines understand what the page you're linking to is about, and should be short, descriptive and on-topic.
Bad = “click here” or “read more”
Good = “The previous article in this series gave an introduction to SEO.”
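In markup, the good example above is just descriptive text inside the anchor tag (the href is a placeholder):

    The previous article in this series gave <a href="/an-introduction-to-seo/">an introduction to SEO</a>.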
______________
What’s on your SEO Audit checklist? Have anything to add? Leave a comment below.
