The Complete SEO Website Audit Checklist for 2018: Part One

 


SEO Checklist

When starting a new SEO project or taking a fresh look at an existing project, it can be hard to know where to start. That’s why we’ve decided to share our in-house checklist to save you the trouble of making your own and to help you avoid missing the important aspects along the way.

The following steps can be used for a comprehensive technical review and to help generate a plan of action to fix problems on most websites. Of course each website is different, but we intentionally made this checklist generic, so it should be applicable to most sites you come across.

We have used examples from the “New” version of Google Search Console where available. At this time, Google has not finished moving all features from the old system to the new one, so at times we will have to refer to the old reporting methods.

Index

1. Review # of indexed pages

  • The most accurate method to determine the number of indexed URLs (HTML, not images) in Google is to view the Google Search Console Index Coverage Report. The Valid section of this report shows URLs that are actually in the index.
    [Screenshot: Google Search Console Indexed Pages report]

    You can also see the number of URLs that Google is aware of but cannot reach because they are blocked by robots.txt, the meta noindex tag, or another error. Click the Excluded link to view details on where problems may be occurring. This is one of the most comprehensive reports Google Search Console offers on what happens when Google attempts to spider and index the site. You can dig deeper into the data by clicking each line to see more details on why URLs were excluded.

    [Screenshot: Google Search Console Excluded Pages report]

  • Download the Chart Data for future reference.
  • Review the number of pages indexed; ideally, the number will climb as new content is added. The count of indexed URLs will fluctuate, especially on larger sites, as Google changes its algorithm and/or other factors change.
  • The second most accurate method is to do a site:www.yourdomain.com search at Google. The figure Google Search Console reports and the number of results returned by a site:www.yourdomain.com search will vary, sometimes greatly. Note that a site:domain.com search displays URLs from the root domain and any subdomains, whereas site:forum.domain.com only displays URLs from forum.domain.com.
  • If you have an XML sitemap for pages on the site and it has been submitted to Google, the Sitemaps report will display how many of the URLs in the sitemap are in the index. If you use a sitemap index file, as in this example, clicking the link will display each referenced sitemap and its status.
    [Screenshot: Google Search Console Sitemap report]
  • Bing Webmaster Tools > Reports & Data will display the historical number of pages indexed. In many cases this value will not match a site:domain.com search at Bing; we typically found the reporting in Bing Webmaster Tools to be higher than the number of results returned by a site:domain.com search, which may be due to duplicate content or other filtered search results.
    [Screenshot: Bing Webmaster Tools Indexed URLs report]
  • Consider why the number of indexed URLs may be higher or lower between Bing and Google.

2. Review # of indexed images

  • Google isn’t great at reporting the number of indexed images, but if you do include your images in your sitemap, or in a dedicated image sitemap, it will report how many of the images in the submitted sitemap have been indexed. This information is only available in the old Search Console; it’s not available at this time in the new reporting.
    [Screenshot: Google sitemap Indexed Images report]

    Because the new Google Search Console reporting does not show images, you may wish to use a dedicated image sitemap for only images to get better reporting in the new Search Console reports.

  • Do a site:domain.com search in Google Image Search and add &sout=1 to the end of the URL. This will show the “old” type of image search, which displays the approximate # of indexed images. For example –
    https://www.google.com/search?tbm=isch&q=site%3Adomain.com&sout=1

    This number is not all that accurate as image search often won’t display all the images it knows about – typically due to duplicate images and/or alternate image sizes. For example, the report above showed 845 images indexed in the Google Search Console Sitemap Report, but a search like we just described in image search only showed 720 images – that’s quite a discrepancy. If images are important, using a dedicated sitemap can help a lot in monitoring the number of indexed images.

  • Bing will not display the number of indexed images in a sitemap, but you can use the site:domain.com search in Bing Image Search (www.bing.com/images/discover) to do a quick visual review to see how things are going. Remember you can specify specific directories, i.e. site:www.domain.com/forum/images to see more specific results.
  • Review how well the images are indexed using page title, img alt text, caption or nearby text.

3. Review Data from Google and Bing

You should do a full review of all the reports in both Google Search Console (GSC) and Bing Webmaster Tools (BWT). This is valuable data direct from the engines that can be very helpful, so make use of it! At the very least, review these most important sections in GSC.

  • Review any site message reports (Google Search Console & Bing Webmaster Tools).
  • Google Search Console Search Appearance Section, review the Structured Data Report for errors, and the HTML improvements report for miscellaneous problems.
    [Screenshot: GSC Structured Data Errors report]

    To optimize your chances of getting Rich Results in Google Search, take care to minimize and eliminate structured data errors. This is especially important because Google’s requirements change often; what passed with no errors last month may not meet Google’s requirements this month, which can result in a loss of search traffic.

  • Also check the Search Appearance > Rich Cards error report. Rich Cards are especially important in mobile search, where they can result in listings in the top carousel as shown below.
    [Screenshot: Google Search Console Rich Card Error report]

    [Screenshot: Rich Card mobile search example]

  • URLs listed as Enhanceable are marked-up entities that have non-critical errors, such as missing author information. If you can add the missing data, you may improve your chances of generating a Rich Card, such as the recipe example above.
  • Review the HTML improvements report for miscellaneous problems, especially note the Duplicate Title and Duplicate Meta Description sections. Those are both indications of duplicate content which should be avoided.
  • Google Search Console Search Traffic Section, check for any Manual Actions and record the number of Links to your site in the Links to Your Site subsection.
  • Google Search Console Search Traffic Section, closely review the Mobile Usability Report for errors. This is the ONLY method available from Google to scan the entire site for mobile errors. Any errors in this area can affect mobile ranking for the URLs with errors. Clicking through on the reported errors will allow you to test with Google’s Mobile Friendly Test, which can only check one URL at a time but offers more comprehensive reporting.
    [Screenshot: GSC Mobile Usability Errors report]

    When troubleshooting, focus on fixing errors that are genuine mobile compatibility issues. This report can surface a variety of errors that originate from many different causes. For example, if Google has spidered a directory index (www.yoursite.com/blog/wp-content/themes/name/css/directory), those auto-generated Unix directory index pages were never intended to be mobile-compatible pages for your users.

    They don’t make use of your WordPress theme. It’s a good idea to address issues like this regardless of how often the page may get visited; you really don’t want directory index pages in the search index. The point is, unformatted URLs like this do get spidered and will generate errors, but they AREN’T going to affect the ranking of anything that matters.

    Remember that Google’s Search Console is slow to update, so this report may not be that fresh. After you fix errors on your site, Google has to reindex your content and then update its reporting before the errors will be removed.

  • Google Search Console International Targeting Section, review settings and verify they are appropriate.
  • Google Search Console Google Index Section, review the Index Status trend and look for changes; it’s a good idea to download the chart data and save it for record keeping. Also check the Remove URLs section.
  • Google Search Console Google Index Section, check Blocked Resources Section. Verify you have nothing blocked so that Google can render your pages.
  • Google Search Console Crawl Section, Review All Crawl Errors for both Desktop and Smartphone. Large numbers of 404 or other errors indicate poor site quality, something Google mentions in their Quality Raters Guidelines.
    [Screenshot: GSC Crawl Errors report]

    Be careful of Redirect and 404 errors, as they are the most typical problems on desktop sites that redirect to a mobile site. These have the potential to trigger a Google penalty, or a drop in rankings in smartphone search. When you redirect a mobile user from a desktop site to a mobile site, users expect to land on the appropriate page, NOT be redirected to the home page or a missing (404) page.

  • Google Search Console Crawl Section, test a sample of page layouts (home, subpage, category, etc.) using Google’s Fetch and Render Tool as BOTH desktop and mobile. You’re looking for pages that don’t fully render under those tests.
    [Screenshot: GSC Fetch and Render test]

    If you get a Partial Status, click the link to see what Google doesn’t have access to.

    Do not be concerned about blocked resources that are not on your site, such as social plugins, tracking scripts, ad networks, etc. Your goal is for Google to have access to everything that is necessary to render the page correctly. Google understands that some off-site resources, such as Facebook Like buttons, are blocked. Due to blocked resources like these, your URL will show Partial status and a green check mark. This is OK; see our result guide below:

  • Complete – Google can access everything it needs to render your page.
  • Partial – Google could not reach all the resources on your page. Review what resources it could not reach and be sure they are not located on your own site. Off-site resources (like Facebook) are not under your control. It is very common for off-site resources such as https://www.googleadservices.com/pagead/conversion_async.js to be blocked with robots.txt; those types of off-site resources should not be a problem as they don’t affect how your page is rendered.
  • Redirected – Re-enter the final URL your resource is redirecting to. You may want to check to be sure the page isn’t hacked and being redirected off site or someplace you aren’t aware of.
  • Not Found – Correct the URL error for the page you submitted.
  • Not Authorized – Google is getting a 403-error code; it cannot access the page.
  • DNS Not Found – Google could not resolve the URL you submitted.
  • Blocked – You are blocking the URL from Google using Robots.txt
  • Unreachable – Your site may be down.
  • Temporarily Unreachable – Your site is responding very slowly or you have tested too many pages in a row and Google thinks you might be a bot.
  • Error – This indicates an internal error with the tool, not likely something wrong with your website.
  • Google Search Console Crawl Section, check for Robots.txt errors and warnings.
  • Google Search Console Crawl Section, review Sitemap Section – is a sitemap in use? When was it last processed? Are there any issues reported?
  • Record Pages Crawled per Day under Crawl > Crawl Stats for future reference. Does the crawl chart look consistent, or is it trending up or down?
  • Also check Google Search Console Security Issues Section for Malware Warnings.
  • BWT – Review Site Messages.
  • BWT in the Reports & Data Section, Export the Inbound Links and Search Keywords reports for future records. Also review the Crawl Information and Malware reports.
  • BWT in the Diagnostics & Tools Section review the Mobile Friendliness Test Report to look for mobile rendering problems.
  • Malware – If you do run into malware warnings, see “Was your Site Hacked? How to Recover from a Google Malware Attack.”

4. Check Cached Content for important pages

  • How recent are the cached dates? Important and frequently updated pages should be crawled frequently. You can normally check this from the search result pages: do a site:domain.com search, then click the little arrow next to a listing to view the cached version of the page in most cases.

    When you view the cached page, at the top you’ll see the date it was spidered last…

  • Are there additional links showing in the cache that aren’t showing when you look at the same URL in a browser? That may be a sign of Malware or that the server has been compromised. You may want to use the View Source option to look at the actual page code.

5. Is the Site Ranking for Company Name, Brand and Unique Terms?

  • If the site isn’t ranking its home page or specific product pages for terms that it should rank highly for, there may be a penalty or severe indexing problem.

6. Check 404 Error Handler

  • If you request a nonsense page from the site such as http://www.domain.com/blablahbla does the site return a 404 Error Response Page?
  • Error pages should also generate a 404 server header that says HTTP/1.1 404 Not Found. You can use the SEN HTTP Header Analyzer Tool to check the 404 error response from the server; a quick scripted check is also sketched below. If you get any other response code for a missing URL request, the problem must be corrected.
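
If you prefer to verify the response code from a script, here is a minimal sketch in Python. It assumes the third-party requests package is installed, and www.example.com is a placeholder for the domain you are auditing.

    import requests

    def check_404_handler(domain):
        # Request a nonsense URL that should not exist on the site.
        url = f"https://{domain}/blablahbla-seo-audit-test"
        response = requests.get(url, allow_redirects=False, timeout=10)
        print(f"{url} returned HTTP {response.status_code}")
        if response.status_code != 404:
            print("Problem: missing pages should return a 404 status code.")

    check_404_handler("www.example.com")  # placeholder domain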

7. Review robots.txt and .htaccess files

  • Robots.txt is easy to view: just load http://domain.com/robots.txt in your browser. What URLs are blocked in robots.txt? URLs blocked in robots.txt can’t transfer PageRank.
  • Verify that no URLs necessary to render the page, such as images, CSS and JavaScript, are blocked in robots.txt. You can use Google Search Console’s Fetch as Googlebot function to verify that an individual URL has access to everything it needs to render the page; a scripted spot-check is also sketched after this list.
  • What URLs should be blocked, but aren’t in robots.txt?
  • Is there content that is blocked in robots.txt that shouldn’t be?
  • Are there settings in robots.txt that can be removed? For example perhaps you were blocking some content from spidering that has long since disappeared from the search engines and from the site.
  • Are there 301 redirects, or other commands in the .htaccess file that can be removed? For example, 301 redirects placed there 10 years ago probably are no longer needed.
  • Remember that Google no longer supports linking to XML sitemaps in the robots.txt file. If you’re doing that, this is a good time to remove those links; you’ll need to submit the XML sitemaps directly via Google Search Console if you are not doing so already.
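
A minimal sketch for spot-checking robots.txt rules with Python’s built-in robotparser is shown below. The hostname and sample paths are placeholders; substitute URLs that Googlebot must be able to fetch to render your pages.

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()

    # Spot-check paths that Googlebot needs in order to render pages.
    for path in ["/", "/wp-content/themes/name/style.css", "/images/logo.png"]:
        allowed = rp.can_fetch("Googlebot", f"https://www.example.com{path}")
        print(f"{path}: {'allowed' if allowed else 'BLOCKED'} for Googlebot")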

8. Review XML Sitemaps

  • If XML sitemaps are used, are they correct and kept up to date? When were they last updated? (A quick way to count the URLs and check the lastmod dates is sketched after this list.)
  • Are proper and reasonably accurate update frequencies used?
  • Are Video or Image sitemaps in use? Note that Google quit supporting the XML Geo Sitemaps in 2012.
  • If it’s a WordPress site, check to make sure you’re not generating sitemaps for content you don’t want indexed and/or have set to meta noindex. For example image attachment sitemaps, slider sitemaps, category, tag or special feature sitemaps.
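
Here is a minimal sketch that summarizes an XML sitemap: how many URLs it lists and the most recent lastmod date. It assumes the requests package is installed, and the sitemap URL is a placeholder for your own.

    import requests
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def summarize_sitemap(sitemap_url):
        root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
        locs = [el.text for el in root.findall(".//sm:loc", NS)]
        lastmods = sorted(el.text for el in root.findall(".//sm:lastmod", NS))
        print(f"{sitemap_url}: {len(locs)} URLs listed")
        if lastmods:
            print(f"Most recent lastmod: {lastmods[-1]}")

    summarize_sitemap("https://www.example.com/sitemap.xml")  # placeholder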

9. Does the site have an indexable navigation system?

  • Is JavaScript or Flash-based technology used for the menu or other navigation links? Google CAN spider JavaScript, but you should check to make sure it is not having issues spidering the menu.
  • If you’re not sure, turn JavaScript and CSS off in your browser. Can you still use the links? If so, you’re good to go.

10. Check the site with a Smartphone such as an Android or iPhone

11. Conduct a Test Crawl to watch for errors

  • Use SEN’s Super Spider, Xenu, or Screaming Frog’s desktop spider to test crawl the site and observe the results (a quick scripted spot-check is also sketched after this list). Are these spiders reaching all levels of the site?
  • Observe the number of 404 and other errors; these should be fixed.
  • Note how many URLs are redirected (301 or 302); these redirects should NOT be necessary other than as short-term fixes. If you’re redirecting an on-site link, you really need to update the old link to point to the new URL, not redirect it.
  • If the desktop spiders can’t crawl the site, then search engines spiders will likely have problems as well.
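
As a quick supplement to a full desktop crawler, the sketch below fetches the home page, collects same-site links, and reports any 404s and redirects among them. It assumes the requests package is installed; https://www.example.com/ is a placeholder start URL.

    import requests
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = set()

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.add(href)

    start = "https://www.example.com/"
    parser = LinkCollector()
    parser.feed(requests.get(start, timeout=10).text)

    for href in sorted(parser.links):
        url = urljoin(start, href)
        if urlparse(url).netloc != urlparse(start).netloc:
            continue  # skip off-site links in this quick check
        r = requests.head(url, allow_redirects=False, timeout=10)
        if r.status_code >= 400:
            print(f"ERROR {r.status_code}: {url}")
        elif r.status_code in (301, 302):
            print(f"REDIRECT {r.status_code}: {url} -> {r.headers.get('Location')}")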

Duplicate Content

1. Check for off-site duplicate content problems

  • Search for a unique snippet of text within quotes from recent, but not brand new content – does it show up anywhere else?
  • If there are copies of content, attempt to determine the source. For example, are they scraping the site? An RSS feed? Content syndication?
  • How about images, are the site’s images copied to other websites? Use Google Image search to find duplicates. Use the Similar and Visually Similar Search Options to discover variations and alterations.
  • Conduct searches for the company’s phone numbers (toll free and standard), and postal address to help discover additional websites.
  • Has the company moved in recent years? Conduct Searches for previous phone and address references if so.

2. Are both the www and non-www versions of the site indexed?

  • Check using the site:domain.com search command.

3. Is the site indexed by IP address or by host name?

  • Find the site’s IP address using a tool like – whatismyipaddress.com’s Hostname IP Finder.
  • Do a search at Bing.com using the site search operator, example site:74.125.224.68
  • If the site is indexed by IP, that means you have a duplicate content issue on your server. You’ll need to 301 redirect your site’s IP address to your domain name to prevent the IP address from getting spidered.
  • Look to see if the site is indexed by the server host name; some servers are set up to be reachable at servername.hostname.com. You can look for this error by searching for unique text from the page, such as the phone number, and adding a filter to remove your site (and subdomains) from the results: -yourdomain.com. This should return all sites that have that phone number or unique text string indexed EXCEPT for your domain name. Example search. Note: you may find some other problems with this test as well, such as site mirrors.

4. Is there a duplicate issue with mobile content?

  • If a separate mobile site or mobile-specific customization is used, does that create a duplicate content situation on the site, or on an m.domain.com mobile site? Is there a rel=canonical tag in place to address this problem?

5. Are there 301 redirects in place to the preferred (www or non-www) canonical version?

  • Check using the SEN HTTP Header Analyzer Tool to be sure the site is using a proper 301 redirect and not a 302 redirect; a quick scripted check is also sketched below.
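
Here is a minimal sketch that confirms the non-preferred hostname 301-redirects to the preferred one. It assumes the requests package is installed and that www.example.com (a placeholder) is your preferred version.

    import requests

    preferred = "https://www.example.com/"
    alternate = "https://example.com/"

    r = requests.get(alternate, allow_redirects=False, timeout=10)
    print(f"{alternate} -> HTTP {r.status_code}, Location: {r.headers.get('Location')}")
    if r.status_code != 301 or not (r.headers.get("Location") or "").startswith(preferred):
        print("Problem: expected a single 301 redirect to the preferred hostname.")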

6. Do a site:domain.com search

  • At the end of the listed results, are you seeing the “in order to show you the most relevant results, we have omitted some entries very similar to the XXX already displayed” message? This is a strong indication of duplicate content on the site. You may not see this message on the first page of results, so click further into the search results to see if it shows up deeper in the result set.

7. Is the site a dynamic database driven site, or does it use tracking URLs?

  • URL variables and tracking URLs can often cause content to be indexed multiple times. Take a snippet of body text from an indexed page, and then search for it using the site:domain.com “text snippet” method to see if you can find alternate URL structures.
  • Scan the site with a tool like SEN’s Super Spider, Xenu, or Screaming Frog’s desktop spider and sort the results by title and by URL to look for duplication.
  • If the site uses URL parameters, such as ?category=shoe&brand=nike&color=red&size=5, you might want to make use of the URL Parameters Tool in Google Search Console to avoid problems with duplicate content. A rel=canonical tag should also be used in this situation to avoid massive duplicate content.

8. Check for additional Domains, subdomains, subdirectories and secure servers for duplicate content.

  • Does the same content exist on different domains, subdomains, subdirectories, or secure servers (https)?
  • Review Google Webmaster Tools > Search Appearance > HTML Improvements > Duplicate Title Tags report. Duplicate titles often indicate duplicate content pages.
  • Do other domain names point to the same content on the server? This can be a severe problem, and many companies do this without realizing the issues it can cause. Ask if the company owns other domains, get a list of any “extra domains,” check what content they point at, and do a site: search to see if they are indexed.
  • Are there other domains representing the company, such as YellowPages or other promotional domains listing the company at the same address? These can be a source of problems for the Google My Business page, resulting in lower ranking and/or incorrect address citations.

9. Check Image Duplicate Content

  • Load sample image URLs into Google’s “search by image” option at images.google.com and see if the images show up on other sites. Product photos are very likely culprits.

Load Time

1. Google Page Speed Insights

  • Test a sampling of the site’s pages (Home, Product, Gallery, etc.) with Google’s Page Speed Insights and record the mobile and desktop ratings (a scripted way to pull scores via the PageSpeed Insights API is sketched after this list).
  • At the minimum your site should score better than your competition’s site, ideally your site should score in the 80-100 range for both desktop and mobile tests if possible.
  • Note that Google’s Page Speed tests rely on traffic data from Chrome browser users to report actual page speed. Unfortunately, low-traffic sites will likely not see this speed result. Higher-traffic sites should see a report showing both the FCP (First Contentful Paint) and DCL (DOM Content Loaded) speed results.

    The Page Load Distribution is quite helpful for understanding how quickly the URL you submitted loads for most people.

  • If the site is using Google Analytics, review the Behavior > Site Speed reports. The Page Timings section can be sorted by Avg. Page Load Time, showing you the slowest pages on the site. Additionally, you can display the page Bounce Rate in the second column. Specifically, make note of slow pages with high bounce rates as candidates for improvement.
  • Next in the Google Analytics Site Speed section, the Speed Suggestions report will display pages that are ranked by page views. Make note of pages with low scores and high average page load time for improvements.
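
If you want to record scores programmatically, the sketch below queries the PageSpeed Insights API. It assumes the requests package is installed; the endpoint and field names reflect the v5 API and may change over time, and the test URL is a placeholder.

    import requests

    api = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": "https://www.example.com/", "strategy": "mobile"}
    data = requests.get(api, params=params, timeout=60).json()

    # The Lighthouse performance score is reported on a 0-1 scale.
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Mobile performance score: {score * 100:.0f}/100")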

2. Test Website Response Time

We recommend using WebPageTest.org for performance benchmarks; it’s a fast and well-accepted service for analyzing load time. According to Google, sites that take 20 seconds or more to load are likely to see ranking issues for the URL. These tests will help identify areas of the site that need further improvement. Make sure to test the proper protocol used on the site (http/https); failure to do so will result in inaccurate reporting.

  • Test Cold Cache vs. Hot Cache – WebPageTest.org reports page load time on the first visit and on subsequent visits. You should test your home page and other important sections of your site. We recommend testing with both Cable (5/1 Mbps, 28 ms RTT) and Mobile 3G (1.6 Mbps/768 Kbps, 300 ms RTT). Make note of the Load Time, First Byte and Number of Requests for both First View (cold cache) and Repeat View (hot cache). These are important metrics; for comparison, you should see a 2-3 second First View and a sub-one-second Repeat View on Cable. For Mobile, look for better than 5 seconds on First View and 1 second on Repeat View; those values are both reasonably fast (and obtainable).
  • Note the Grade (A through F) reported in each section: First Byte Time, Keep-alive, Compressed Transfer, Compressed Images, Progressive Images and Caching. Each section should score an “A” on an optimized web page. Again, test more than just the home page; it’s important to test major sections of the site. Ideally you should test each of the site’s page templates (product, home, category, gallery, FAQ, etc.) for irregularities.
  • Specifically note pages with a high Time to First Byte (TTFB); these are pages that may have backend problems, hosting issues, DNS issues, etc. that need resolving. Slow TTFB pages are VERY aggravating for users and have been suspected of having an impact on search ranking as a site quality factor. Ideally we would like to see a TTFB below 200 ms if possible; Google Page Speed Insights will ding you points if it’s much slower than that. Also note that TTFB is often load dependent: if the site is getting heavy traffic it will often slow down, so this time will likely vary from test to test. A quick scripted check is sketched after this list.
  • The “Start Render” value is the amount of time before the user sees the page start to load, which is a big factor in high bounce rates. Improvements such as progressive JPG images, removal of render blocking CSS and JavaScript will all improve this value.
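
For a rough TTFB check from your own machine, here is a minimal sketch. It assumes the requests package is installed; response.elapsed measures the time from sending the request until the response headers arrive, which is only an approximation of TTFB (WebPageTest.org remains the more accurate tool). The URLs are placeholders.

    import requests

    for url in ["https://www.example.com/", "https://www.example.com/category/"]:
        r = requests.get(url, stream=True, timeout=30)
        ttfb_ms = r.elapsed.total_seconds() * 1000
        flag = "OK" if ttfb_ms < 200 else "SLOW"
        print(f"{flag}: {url} first byte in {ttfb_ms:.0f} ms")
        r.close()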

3. DNS Server Tests

  • DNS errors can cause severe performance problems; you can test yours for free at dnscheck.pingdom.com.
  • DNS response time increases latency and impacts your site load time, including Time To First Byte (TTFB) performance. Check your DNS performance at www.ultratools.com/tools. Under 10 ms average is a good response time; if it’s over 100 ms, you should work on improvements. A quick local lookup timer is sketched below.
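
The sketch below times DNS lookups from your location using only Python’s standard library. Results depend heavily on resolver caching, so compare the first (cold) lookup with the cached repeats; www.example.com is a placeholder hostname.

    import socket
    import time

    def time_dns_lookup(hostname):
        start = time.perf_counter()
        socket.getaddrinfo(hostname, 443)
        return (time.perf_counter() - start) * 1000

    for attempt in range(3):
        print(f"Lookup {attempt + 1}: {time_dns_lookup('www.example.com'):.1f} ms")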

See How the Pros Engineer a Lightning Fast Website – Part 1 and Part 2 for tips on improving these numbers.

Content

1. Does the home page of the site have indexable text of at least one paragraph?

  • Lack of indexable text can be a severe ranking problem.
  • Rough rule of thumb: if you hit CTRL-A then CTRL-C to copy the text on the page and paste it into a text editor, that text is roughly what is indexable.
  • Having 1500-2500 words on the page would be ideal, but of course in certain situations that’s difficult to do. At least try to have 150 words minimum for your pages – more is better. Pages that have a high number of links, but a small amount of text are typically going to be considered low quality.
  • For pages that you need but that are very light on text, such as gallery pages with very little text content, consider setting them to meta noindex to avoid any Panda penalties.

2. Spelling, Grammar and Content Quality

  • Bing has specifically stated that its engine pays attention to spelling and grammar errors. Google’s Matt Cutts, on the other hand, previously said not to worry about spelling errors in user comments, but he didn’t really address errors in main content. We believe a lack of spelling and grammar errors can be a small quality signal.
  • Does the content read like a magazine or newspaper article, or does it read like someone paid a worker in India $1/hr to produce?
  • Would you trust the content presented on the pages? Does it seem to be written by an expert or someone well versed on the topic?
  • Does the site have multiple articles on the same topic, with very similar content but with only certain keywords switched out? That’s a very good sign of low quality content that users won’t like.
  • If this is an ecommerce site, is there a lack of security signals that would make you not trust them with your credit card?

3. What is the Readability Level of Important Content?

  • Copy the plain text from the home page (not the HTML code) and paste it into an online tool such as the ‘Dale Chall Online Tester’ at www.readabilityformulas.com. Record the grade level, the # of words NOT found in the word list (unique words), and the final score. Also test other important content pages on your site such as FAQs, product pages, white papers, etc. Studies have shown that low readability scores (low grade level) have some correlation with lower ranking. (An offline scoring sketch is shown after this list.)
  • Has readability been compromised for the sake of optimization?
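
If you’d rather score the text offline, here is a minimal sketch using the third-party textstat package (an assumption; install it with pip install textstat). Paste the visible body copy of the page you are checking into the string.

    import textstat

    page_text = """Paste the visible body copy from your home page here.
    Use the real text, not the HTML source code."""

    print("Flesch-Kincaid grade level:", textstat.flesch_kincaid_grade(page_text))
    print("Dale-Chall score:", textstat.dale_chall_readability_score(page_text))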

4. Does the site have over usage of important keywords?

  • Is there excessive keyword repetition within the indexable body text, image alt text, and/or links? Does the content look “spammy”?
  • Is the site a “Your Money or Your Life” website? If so, would you trust the site with your credit card information, email address, or other personal health details?
  • Are outbound links relevant to the topic on the page?

5. Meta Tags

  • Review meta keyword tags. This tag is no longer observed by search engines, so excessive content in the meta keyword tag is typically wasted code, with the exception that the tag is sometimes used for on-site search.
  • Review Meta description tags. Does each page have a unique, well written description? Google Webmaster Tools has a report that shows duplicate Meta descriptions.
  • Look for use of the Meta robots tag. The noindex variable should only be used on pages that indeed should not be indexed. Make sure this tag is not in use on pages that should be indexed.
  • Use of the Meta robots index, follow tag is wasteful and only adds to code bloat on the site. Those settings are assumed and do not need tags to indicate index or follow.

6. Title Tags

  • Does each page have a unique title tag? Google Webmaster Tools has a report that lists pages with duplicate title tags. You should also use a tool like SEN’s Super Spider, Xenu, or Screaming Frog’s desktop spider to scan the site and find duplicate title tags (a quick scripted check is sketched after this list).
  • Is the primary keyword phrase at the beginning or near the beginning of the title tag? This is typically a ranking boost.
  • Are there any title tags over approximately 70 characters long? If so, they will be truncated in search results and the extra text may not be useful. Long title tags can also cause Google to swap out the title for more appropriate text.
  • Do the title tags appear to be spammy, or keyword stuffed? This can reduce click through rates in search results and result in lower ranking.
  • Every page should have a well-optimized title tag; it is still one of the most important on-page ranking factors.
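
Here is a minimal sketch that fetches a handful of URLs and flags duplicate or missing title tags. It assumes the requests package is installed; the URL list is a placeholder for your own (for example, exported from a crawl or an XML sitemap).

    import re
    import requests
    from collections import defaultdict

    urls = [
        "https://www.example.com/",
        "https://www.example.com/about/",
        "https://www.example.com/products/",
    ]

    titles = defaultdict(list)
    for url in urls:
        html = requests.get(url, timeout=10).text
        match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        title = match.group(1).strip() if match else "(missing title)"
        titles[title].append(url)

    for title, pages in titles.items():
        if len(pages) > 1 or title == "(missing title)":
            print(f"'{title}' used on {len(pages)} page(s): {pages}")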

7. Are images SEO optimized?

  • Use of keywords in file name?
  • Use of keywords in img alt text?
  • Do pages that are dedicated to images, like a Gallery, have appropriate title tags to help describe the image?

For detailed image optimization recommendations, see Six Steps to Quickly and Expertly Optimize Your Site’s Images.

8. Text Formatting

  • Are H tags in use?
  • Does the site use more than one H1 tag per page? It shouldn’t, unless you’re working with a Blog where individual post sections are marked with h1 tags for each post title in Categories, Blog Home, etc.
  • Is content formatted well and easy to read?

9. Advertising

  • How many ad blocks are in use throughout the site?
  • What is the approximate ad-to-content ratio in terms of screen real estate?
  • Is over 50% of the above-the-fold content composed of ads? If so, that may be viewed as poor quality by Google and impact ranking negatively. On a desktop, a 1280×1000 pixel area is typically regarded as content “above the fold” – obviously mobile devices are MUCH smaller.

10. Desktop Compatibility

  • Review site using current and older versions of Edge/Internet Explorer, Chrome and Firefox. Check for variations, problems rendering pages, layout variations and using features such as navigation.
  • Is the site usable if you turn off JavaScript and CSS in the browser?

11. Mobile / Tablet Compatibility

  • Check the site using a smartphone such as an Android phone or iPhone; does it have at least minimal functionality? Test with 10″ and 7″ tablets as well, and note problem areas if found. If you do not have access to these devices, at least test with a smartphone emulator such as www.mobilephoneemulator.com and/or BrowserStack.com, or the Chrome DevTools mobile emulation (look for the smartphone icon at the top left).
  • Test with Opera Mobile especially if you’re using web fonts.
  • When using a Smartphone, can you find the address, email contact and phone number?
  • Can you use the site’s main navigation links?
  • If there are videos present, can you view them on the smartphone? Google lists unusable video as a ranking factor for mobile and does not want you to use Flash videos. In fact, there is a penalty if you serve Flash to non-supported devices.

12. Is there Structured Data Markup compatible with Schema.org in use?

  • Is the markup properly formatted? Test using Google’s Structured Data Testing Tool at https://developers.google.com/structured-data/testing-tool/. Excessive errors can result in Google penalties. (A quick script to pull a page’s JSON-LD blocks is sketched after this list.)
  • Check for errors such as incorrect hours, business name, address or phone (NAP).
  • If using other markup formats such as hcard, consider updating to those endorsed by Schema.org.
  • Does the site mark up data in structured data that is hidden from visitors using CSS display:none or visibility:hidden? That practice can easily result in Google penalties, for example if the page uses JSON-LD review markup but there are no easily visible reviews on the page.
  • Is the structured data misleading, inaccurate, or false? If so, it would violate Google’s Guidelines and can result in a penalty for Spammy Structured Markup. An example is a site marking up hidden text in order to generate a review rich snippet, but the actual review is not visible on the page when viewed.
  • Are there aggregate review values marked up while the reviews themselves are not visible? If so, that’s a potential penalty problem. Are reviews posted that were sourced from somewhere other than the site, such as Google My Business, Trustpilot, Yelp, etc.? That’s also not compliant with Google’s Guidelines.
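
The sketch below pulls the script type="application/ld+json" blocks from a page and confirms they at least parse as valid JSON; Google’s Structured Data Testing Tool remains the authoritative check. It assumes the requests package is installed, and the URL is a placeholder.

    import json
    import re
    import requests

    url = "https://www.example.com/product/widget/"
    html = requests.get(url, timeout=10).text

    blocks = re.findall(
        r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        html, re.IGNORECASE | re.DOTALL)

    print(f"Found {len(blocks)} JSON-LD block(s) on {url}")
    for i, block in enumerate(blocks, 1):
        try:
            data = json.loads(block)
            kind = data.get("@type") if isinstance(data, dict) else "(list)"
            print(f"Block {i}: valid JSON, @type = {kind}")
        except ValueError as err:
            print(f"Block {i}: INVALID JSON ({err})")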

13. Are Facebook/G+ Open Graph Tags in use?

  • Are Facebook and Google+ Social Buttons in use? Home page only? Site wide?
  • Test each page’s Open Graph tags using Facebook’s Lint Tester and look for errors. Facebook’s lint test is the most complete method you can use for testing these.
  • Is the content within the tags unique on each page and formatted correctly? (A quick script to list a page’s Open Graph tags is sketched after this list.)
  • When a URL is “liked” on the site, does the presentation on the person’s wall look correct?
  • Are the same Open Graph tags used site wide, or do they correctly reference each page uniquely?
  • Is each Like button properly referencing the URL where it appears, or do all the Like buttons reference the home page or the company’s Facebook page?
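
Here is a minimal sketch that lists the Open Graph tags on a page so you can confirm each URL carries unique, correctly formatted values. It assumes the requests package is installed; the URL is a placeholder.

    import requests
    from html.parser import HTMLParser

    class OGTagParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.og_tags = {}

        def handle_starttag(self, tag, attrs):
            if tag == "meta":
                attrs = dict(attrs)
                prop = attrs.get("property") or ""
                if prop.startswith("og:"):
                    self.og_tags[prop] = attrs.get("content", "")

    url = "https://www.example.com/blog/sample-post/"
    parser = OGTagParser()
    parser.feed(requests.get(url, timeout=10).text)

    for prop, content in parser.og_tags.items():
        print(f"{prop}: {content}")
    if parser.og_tags.get("og:url") and parser.og_tags["og:url"] != url:
        print("Check: og:url does not match the URL that was requested.")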

14. Canonical Issues

  • Is the rel=canonical tag in use on the site? If so, is it being used correctly?
  • Does the internal linking structure agree with the canonical version of the site you prefer? In other words, if you prefer www.domain.com over domain.com, do the links within the site use the www version, for example http://www.domain.com/? (A quick canonical-tag spot-check is sketched after this list.)
  • Remember other content formats, such as .PDF, .DOC, Video, etc. – All content should be using the same canonical version of the domain name.
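
The sketch below reads the rel=canonical tag from a few sample pages and checks that each one points at your preferred hostname. It assumes the requests package is installed; the hostname and sample URLs are placeholders.

    import re
    import requests
    from urllib.parse import urlparse

    preferred_host = "www.example.com"
    sample_pages = [
        "https://www.example.com/",
        "https://www.example.com/category/shoes/",
    ]

    for url in sample_pages:
        html = requests.get(url, timeout=10).text
        canonical = None
        for tag in re.findall(r"<link[^>]*>", html, re.IGNORECASE):
            if re.search(r'rel=["\']canonical["\']', tag, re.IGNORECASE):
                href = re.search(r'href=["\']([^"\']+)["\']', tag, re.IGNORECASE)
                canonical = href.group(1) if href else None
                break
        if not canonical:
            print(f"No rel=canonical tag found on {url}")
            continue
        status = "OK" if urlparse(canonical).netloc == preferred_host else "MISMATCH"
        print(f"{status}: {url} -> canonical {canonical}")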

15. Are iframes or frames in use?

  • Both iframes and regular frames are bad for ranking. Check to make sure the use of these methods is limited and not used to show any major portions of content. Iframe tags are quite common, such as for the Facebook Like button, but they should be avoided for content that needs to be indexed.

16. Is JavaScript or Ajax used to generate content, or used excessively?

  • Search engines typically have a problem reading JavaScript and in general, you should not expect JavaScript generated content to be indexed, or usable by a search engine.
  • JavaScript navigation can make a site hard or impossible to be spidered by search engines.
  • Ajax generated content can be problematic, not impossible, but certainly can be difficult to get indexed. It should be avoided for SEO best practices unless you fully research and test.

17. Does the site have hidden text?

  • Are any other content or links being hidden accidentally? For example, anchor text links in the footer that are the same color as the background. Issues like that can trigger a Google penalty.
  • Use of CSS display:none or visibility:hidden is certainly something Google frowns upon when used improperly. However, even innocent usage may cause a problem, so avoid using these features unless the user can toggle the text or feature on or off. If the user has no way of displaying what is hidden, then you should probably consider not using that method to hide the content.

18. Content Date

  • Is the creation date listed in the content? Last Updated Date? These are quality signals according to Google. Even on older, evergreen content Google likes to see the content date published on the page.
  • Is the Copyright Year in the footer of the site up to date?

19. Your Money or Your Life Requirements

YMYL (Your Money or Your Life) sites, such as those that deal with financial, shopping, medical, legal and safety information, are held to very high quality standards by Google. They should typically have the following pages, as described in Google’s Quality Raters Guidelines:

  • Privacy Policy.
  • Terms of Service.
  • Robust and Complete Customer Service and/or Contact pages.
  • Functioning Order Forms.
  • About Us/Company Info – these are typical pages on a legitimate business website.
  • HTTPS is highly recommended for YMYL content.

The Next Steps in Your Audit

As much as you’ve just digested by reading this resource in its entirety, remember: this is only part one of our Complete Website Audit Checklist! Dive into the second installment to get exact technical details on:

  • Site Architecture
  • Basic Local Search
  • Secure Server Errors / HTTPS
  • Videos and Images
  • ADA Compliance
  • WordPress Updates
  • User Interface Testing
  • Dangerous Remote Resources
  • Dedicated Mobile Sites
  • Accelerated Mobile Pages (AMP)
  • And More!

Now head over to Part Two of this Audit Checklist to continue on with your audit! Or, if you’d rather hire a pro, you can have our team do all this hard work for you! Just check out SEN’s Website Audit & Consultation Packages to get full details.

You’re only as good as the checklist you follow!