Winning the Search Engine Wars


Chapter One –
Search Engine Strategies – A Brief History

Everyone knows a Search Engine is the vehicle people use to find things on the Internet. But many are oblivious to the hyper-competitive, behind-the-scenes strategies used to secure the highly coveted top-ranking positions.

Back in 1996, when the first version of this book was originally written over 200 updates ago, the leading search engines included the all-but-forgotten likes of WebCrawler, AltaVista, Infoseek, Excite, Open Text, Lycos, Inktomi, and the directory site, Yahoo.

[Image: Universe of Search Engines]

Those engines all responded to search queries with results based solely on matching keywords on web pages to keywords (aka, search words) being used in the search queries. Search engine strategy was very simple back in those days. The top search results consisted of whichever pages contained the most keywords that matched the search query. Due to this fact, online marketers began “stuffing” extraordinary numbers of targeted keywords into web pages for the sole purpose of manipulating the search rankings. They went so far as to design entire web pages specifically to rank well for each of their targeted keywords. In many cases this meant flooding the search engine indexes with hundreds, or even thousands, of superfluous web pages in order to dominate the rankings. And that is how the arms race to ‘Winning The Search Engine Wars’ began.

In those early days, whoever knew how to “stuff” the right mix of keywords into a web page literally gained an unfair advantage! And, frankly, it was pretty easy to quickly score a whole bunch of top ranking pages on most any search engine under any topic. You see, back then the search engine formulas, or algorithms, for responding to a search query were pretty basic. And the data compiler programs — aka, robots, bots, spiders, and crawlers — that “crawled” the web simply indexed whatever they saw wherever they found it. Neither the bots nor the algorithms passed judgments based on the “quality” of the site. Nor did they evaluate the trustworthiness of the brand, the credibility of the links, the quality or originality of the content, the popularity of the page, or the reputation of the site. While it’s true today that ALL of these elements and more are factored into the ranking algorithm, in the early days of the commercial web, all that mattered were keywords.

For instance, if you were an online seafood outlet that sold soft shelled crabs via mail order, then you would stuff the keyphrase soft shelled crabs into as many places on your web page as mattered. You would place the keyphrase multiple times into the web page Title and Meta tags, the header, the headlines, and the body copy. You would also repeat it a hundred times or more in the footer of the page, in minuscule white font that matched the background, rendering these keywords invisible to site visitors even though the spiders would “see” them in the source code and therefore index them. From 1995 through 1997, this was the number one strategy for scoring pages at the top of the search engines.

But today’s search engines know all of the tricks. So, it’s not only a waste of time to use them, it’s counterproductive, because your site will be penalized in the rankings if you get caught — and you will get caught!

Regardless, to gain the insight necessary to build today’s top ranking websites, it helps considerably to know the basic history of the arms race for top ranking pages. As one would expect, the engines eventually countered the keyword stuffing strategy by programming their algorithms to recognize it as search engine spam. And from that time forward, the arms race to manipulate the engines has been repeatedly stymied by the search engines’ all-out war on whatever they consider to be search engine spam.

Eventually, getting caught spamming the engines became highly detrimental to a website’s search rankings. Penalties now range from mild to wild. Mild might equate to being dropped a hundred pages back in the rankings until you’ve admitted the error of your ways, corrected the problem and promised in writing to never do it again. Wild would be something like a permanent ban for flagrant and repeated violation of the rules. Neither is good and both tend to cost the website owners dearly due to significant losses of traffic.

Spamming the Engines — a Moving Target

[Image: spam as a moving target]

To combat the strategy of keyword stuffing, the engines switched to an algorithm based on keyword positioning and keyword density (the number of keywords relative to the number of total words on a page). When online marketing experts got ahold of software to crack that formula, the engines countered by adding link popularity to the algorithm. At that point the search engine ranking formula used a combination of strategically placed keywords, kept within a specific keyword density on the page, heavily influenced by the total number of external off-site links that pointed at the page using keyword-rich anchor text.

Anchor text: the words that appear in a link.

Their thinking was that it would be difficult-to-impossible to fake keyword-specific incoming links from external web pages that were (supposedly) under the control of a separate entity.

But, no surprise, online marketers are a creative and persistent lot. They quickly figured out all kinds of fabricated link systems designed to manipulate the search algorithms and score top rankings. They created link exchanges and links pages — pages that were nothing more than a collection of links. And, for a while, these so-called “link farms” were tolerated by the engines until all of a sudden they weren’t. Then the effort shifted to covertly buying links. Link brokers sprang up and were successful, even profitable, for a few years until the engines dropped the hammer by wiping the offending sites all at once from their index.

During these years, the engines began reclassifying mainstay and widely accepted search engine strategies like reciprocal link exchanges as artificial link structures and then later as spam. So, reciprocal links, link exchanges, link farms, buying links, brokering links, and heavily laden anchor-text links — strategies that were at one time widely used and mostly acceptable — all got tagged as “link schemes” and relegated to the list of forbidden strategies. The strategies themselves were derided as “black hat” (as opposed to the Google endorsed “white hat” strategies) in an effort to polarize online marketers into either good guy or bad guy categories. And then Google set out to teach lessons to the nasty “spammers” by severely penalizing their websites in the rankings or, in some cases, completely kicking them out of their index.


It’s important to note that, by this time (circa 2005), ALL of the original search engines (except Yahoo) had been rendered irrelevant by the overwhelming popularity of Google — which didn’t even exist back in 1996 when the search engine wars began heating up. In fact, it wasn’t until 1999 that the fledgling Google first appeared as an almost-unheard-of panelist at the Search Engine Strategies Conference in San Francisco, sharing the stage with all of the aforementioned “leading” search engines. I was there when Google co-founder Sergey Brin proudly exclaimed to the attendees that ‘Google doesn’t worry about spam, you can’t spam Google.’ But online marketers did indeed figure out ways to spam Google. And a few years later, once Google had acquired majority market share and crowded out all of their competition, they changed their mind. Now they say: You had better not even think about spamming Google. And they really, really mean it!

And now that Google is, effectively, the only search engine, they’ve become the tail that wags the dog. They dictate (via recommendations and suggestions) almost everything a website can and cannot do, all the way down to the design of the website itself. Sure, there’s Bing. It matters, but not enough to let you ignore Google’s demanding website “quality” requirements. There are also the social media sites (Facebook, Twitter, Pinterest), review sites like Yelp, and the encyclopedic Wikipedia, all of which influence rankings. But none of them matters nearly as much as Google, even though they definitely do matter! If you defy Google, you probably aren’t going to do very well in any of these other important sources of search traffic either. So, search strategies today center on strategic compliance with Google’s terms of service and Webmaster ‘best practices’ guidelines.

To state it simply, today’s search engine strategies focus on constantly adapting your site to comply with whatever Google currently thinks is a “good” website. In other words, if Google likes you, then all of the others are likely to like you too. Your online web presence will flourish. But, if you fly outside of Google’s “recommendations,” then you probably won’t do very well ranking-wise anywhere. And if you piss Google off, it’s safe to expect that your website will be banished into cyber-Siberia and your efforts to “fix” the problem will be ignored. So pay close attention to the following chapters because your primary goal is to keep your website in Google’s good graces! And remember, even the basics can be a constantly moving target, as Google frequently raises the bar by honing its requirements and refining its suggestions.



Chapter Two –
The Relative Importance of Ranking Components

In Chapter One you learned the importance of keywords to top ranking pages in the early days of search engines. And today, keywords are still an essential component — but they are no longer the only essential component.

Today’s search engines (think Google) factor so many other components into their algorithms that the keywords found on a website are treated as on-page relevance indicators, subordinate to a slew of off-site relevance indicators. The thinking is that:

off-site relevance factors
are difficult-to-impossible for a website owner to manipulate.

Picture the search engine algorithm as a mechanism with internal relevance dials. Each dial controls a component of the overall mechanism, aka, the algorithm.

[Image: search engine algorithm dials]

One dial controls the maximum importance placed on keywords found on a web page. Another dial controls the maximum importance placed on keywords found in links pointing to a web page (aka, anchor text). Another dial controls the maximum importance placed on the trust and authority of the sites that link to a web page. Another dial might control the maximum importance of having the keyword in the domain name. Now, as you will see later, this is a major oversimplification of Google’s actual algorithm. It does include these factors, but many more enter into the actual mix. But for example purposes, let’s continue to see how this might work.

[Image: keyword algorithm dials]

Since Google knows that anybody can put an unlimited number of keywords into any web page that they control, keywords found on a web page might be dialed up to a maximum of only two on a ten-point relevance dial. They are given weight, but not so much that people can stuff a bunch of keywords into their web page and score a top listing. In fact, Google may even penalize a page if their algorithm detects an artificial abundance of keywords or a disconnected relationship between the keywords and the rest of the page’s body content. So, keywords on a web page, if detected within a ‘normal’ range, might only be weighted as high as a two, at best, on the ten-point relevance dial. In other words, no matter how efficient you are at placing keywords on your page in optimized format, that element carries only limited weight relative to the other factors that make up the overall ranking algorithm. And, if Google detects an abnormality (like proportionately too many repeating keywords), that same dial might assign a negative number such as -3 on the ten-point relevance dial.

[Image: keyword negative dial]

This would have an adverse effect on the page’s ranking ability. In search engine optimization (SEO) jargon, we call this a penalty.

[Image: anchor text algorithm dials]

Dial two, off-site anchor text — aka, the keywords used in links pointing to a web page — is theoretically harder to manipulate. After all, if you don’t have control of the external website, then how could you actually control the links that are pointing at your web page? Of course, it could be done by arrangement, secret or not. But assuming that the link is “natural” (meaning it was placed solely on the volition of the linking website and without any payment, arrangement, or enticement), then Google views that link with a higher degree of trust. So, the relevance dial is set to a higher maximum level, maybe a six on the ten-point dial. Now, if Google learns that your anchor text link(s) are contrived — i.e., purchased, negotiated, exchanged, or “unnatural” in any way — then you’re in violation of their “best practices and guidelines.” And, if so, you can expect that dial to work against you in a big way. Maybe a -6 on the ten-point dial.

[Image: anchor text negative algorithm dial]

[Image: trust authority algorithm dial]

Dial three: Google loves trust and authority. There are certain websites on the Internet that are viewed as trustworthy. In essence, these sites are basically whitelisted. They can do no wrong. They have an impeccable reputation. One such site is Wikipedia. Google knows that Wikipedia does not spam Google. Nor does Wikipedia assist others in spamming Google. They diligently patrol their own site with an eye toward eradicating manipulative content and links. Therefore, Google sees Wikipedia as both trustworthy AND authoritative. If your site has a link from Wikipedia, that’s a great link. Why? …because Google’s trust and authority dial is likely to be maximized at around 9 or even 10 on the ten-point relevance dial. Google views a link from Wikipedia as highly trustworthy (because they are hard to get and keep) and as coming from an authoritative site, because Wikipedia is a brand name known for its content integrity. It’s a whitelisted site which Google thinks can do no wrong in regards to search engine relevance factoring.

More examples of highly authoritative and trusted sites are mainstream news sites (CNN.com, NYTimes.com, NationalGeographic.com, etc.) and big name brands (Geico, Nike, Yelp, Budweiser, Amazon, Coke, etc.). To a lesser degree, sites that are linked-to by whitelisted sites, and that then link to you, would be viewed as trusted and authoritative by association, even though the dial might only be set to 5, 6, 7, or 8 on the ten-point relevance scale based upon their slightly subordinate standing relative to the ultra-trusted big name sites. And, obviously, Google dials this up because they consider these kinds of links to be difficult-to-impossible to manipulate.

A while back, Google updated their patent on this topic to include the concept that authoritative sites are industry specific. That means that if you’re in the children’s clothing industry, then the best sites to be associated with are vastly different than if you’re in the restaurant industry. To get even more specific: if you’re in the restaurant industry in Dallas, your trusted sites will be different than if you’re in the restaurant industry in Chicago.

[Image: domain name algorithm dial]

Dial four, keywords in your domain name, might be set to 2 or 3 on the ten-point relevance dial. And this dial could also be influenced by the “importance” placed on the keyword itself. For example, if the keyword in the domain name is purses, a generic term that is synonymous with women’s handbags, then the dial might only register a 2 or 3 in a search for handbags or purses. It sure won’t hurt your ranking efforts, but it may not really help them either. However, if the well-known keyword (actually a brand name) GEICO is in the domain name, then the dial might be set to 9 for the keyword geico or the synonymous search term car insurance. So, in this way, you can see that the search engine algorithm actually has dials within dials depending on extenuating variables. In general, keywords in your domain name are not something to get too worried about, and certainly not something worth going through the trouble of changing domains for!
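To make the dial metaphor concrete, here is a minimal sketch in Python. Everything about it is hypothetical: the dial names, weights, and penalty values are illustrative stand-ins for the book’s four example dials, not Google’s actual factors or numbers.

```python
# Hypothetical "relevance dials" model. The factors, weights, and
# penalties below are illustrative only; Google's real algorithm uses
# many more signals and undisclosed weightings.

# Each dial caps how much a factor can contribute, on a ten-point scale.
DIAL_MAXIMUMS = {
    "on_page_keywords": 2,   # easy to manipulate, so capped low
    "anchor_text": 6,        # harder to fake, so trusted more
    "trust_authority": 10,   # links from whitelisted sites
    "keyword_in_domain": 3,
}

# Detected abnormalities flip a dial negative (a penalty).
PENALTIES = {
    "keyword_stuffing": -3,
    "unnatural_links": -6,
}

def relevance_score(signals: dict, abnormalities: list) -> float:
    """Sum each dial's contribution (signal strength 0.0-1.0 times the
    dial maximum), then subtract any penalties that were triggered."""
    score = sum(
        min(max(strength, 0.0), 1.0) * DIAL_MAXIMUMS[dial]
        for dial, strength in signals.items()
    )
    score += sum(PENALTIES[flag] for flag in abnormalities)
    return score

# A page with decent keywords and strong links, but caught buying them:
print(relevance_score(
    {"on_page_keywords": 0.9, "anchor_text": 1.0, "trust_authority": 0.4},
    abnormalities=["unnatural_links"],
))  # 1.8 + 6.0 + 4.0 - 6.0 = 5.8
```

Notice how one triggered penalty wipes out the entire anchor text contribution; that is the “dial working against you in a big way” described above.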

Now remember, this is a major oversimplification of Google’s actual algorithm. But regardless, you can see that it’s complicated even in our simplistic example. Google says their formula is set to provide a “good user experience,” which is indeed partially true (nice spin anyway). But as you will later see, their algorithm is primarily set to reward brands and sites with the right social signals, all while dissuading search engine marketers (SEMs) from trying to manipulate Google’s search rankings.

Good website vs. Less-Good website – Design Elements that Affect Rankings

Google likes to promote the impression that if you build a “good” website, everything will be wonderful. Google will rank you high and people will visit and give you the money you deserve. However, you must always remember that Google’s concept of a “good” website is constantly evolving.

For instance, whatever the focus of your website, you probably believe you provide the best product, service, or facility, as presented by your very “good” website. But these days, if your site’s web pages aren’t responsive (yeah, what’s that?), or your web page load speed is “slow” (yeah, how slow is slow?), or your site is plain HTTP instead of HTTPS (secure), or some “untrustworthy” or “irrelevant” site has linked to you (which you may not even know about), or you’ve exchanged links with an ‘off topic’ website as a favor to a relative or friend (like a Real Estate Agent or your Webmaster), then Google may very well think your website is less “good” than those that outrank you. This is especially true if you’re competing for top rankings against a nationally known name brand or “trusted” governmental website.

Another example of Google-defined “bad” website behavior is the failure to handle both Mobile and Desktop PC users in THE way that Google “suggests” that you should. Regardless of your intent, if you get any part of the recommended format wrong, you’ll be penalized in the rankings. So, yes, content is important (as Google constantly reminds us) but so is format and function (aka user experience).

The point is that there is a plethora of factors and design elements that were never before factored into the algorithm, or weighted heavily enough to matter, that are indeed factored in today. Collectively, they often become the difference between a top ranking position on the search engine results page (SERP) and a hardly-found listing on the third, fifth, or hundredth page of the search engine results.

So, even if YOU think your website is good, remember that to rank well in the search engines, it has to be good according to Google. So that’s where today’s SEO/SEM strategies begin and the rest of this book will focus on creating web pages that are Good According To Google (GATG).



Chapter Three –
The Search Engine Order of Importance and the Web Page Discovery Process

[Image: Search Engine Market Share]

The pie chart shows that Google attracts 93% of the English-reading eyeballs that search the Internet. This makes Google effectively the only relevant search engine in North America.

For the English-speaking market, only two search engines count: Google and (in a small way) Bing.

Globally, Google has over 93% of the worldwide mobile search market. The Chinese search engine, Baidu, is second with a market share of 4%. Bing is third with 2% of the worldwide market, and Yahoo is fourth with 1% (boosted by Yahoo Japan).

It’s very important to note that the above figures reflect mobile market share, and that Google is the 800-pound gorilla because…

Mobile users have outnumbered their desktop counterparts in most industries since 2016. Today we’re seeing more of a 70/30 split in favor of mobile users.

…which effectively gives Google a monopoly on Mobile Search!

The Web Page Discovery & Update Process

Earlier we mentioned spiders, crawlers, and bots. These are actually sophisticated computer programs that are designed to “crawl” the web and discover new and updated web pages to save to their index. Search engine results would get stale very quickly if they didn’t continually look for new pages and update old ones. And, of course, this discovery and update process must start somewhere. These starting points are known as Seed Sets — sometimes also referred to as the Crawl Frontier.

[Image: Seed Sets]

As the graphic indicates, sites that are part of a core seed set are likely to have a clearly identifiable authority like .gov, .edu, or .org. In addition, they could be a well known commercial entity like NYTimes.com, CNN.com, etc.

A site that is part of a seed set is likely to have the highest Trust and Authority values as the spiders begin their crawl.

Trust is determined by a combination of factors such as website quality, popularity, and incoming links from authoritative sites. For example, a site with a link from NASA.gov would gain trust. But trust can also be lost by linking out to disreputable sites.

[Image: Trust & Authority]

Authority sites would include the likes of the National Cancer Institute, NASA.gov, Wikipedia.org, the Electronic Frontier Foundation, National Geographic, PC Magazine, the Wall Street Journal, and so forth.

Hubs are sites of any size that link to authoritative web pages. It is good to be a Hub and it is good to be linked-to by a Hub. Any page that links to other websites with known authority can be a good page from which to get a link.
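To picture the discovery process in code, here is a toy sketch in Python of a crawler working outward from a seed set. It is purely illustrative: real crawlers add politeness rules (robots.txt), prioritization, and deduplication far beyond this, and the fetch_links function is a placeholder you would have to supply.

```python
from collections import deque

def crawl(seed_set, fetch_links, max_pages=1000):
    """Breadth-first discovery starting from trusted seed pages.

    seed_set    -- starting URLs (the seeds of the crawl frontier)
    fetch_links -- callable that downloads a page and returns the
                   URLs it links to (network code omitted here)
    """
    frontier = deque(seed_set)      # pages waiting to be crawled
    discovered = set(seed_set)      # everything seen so far
    index = {}                      # url -> outgoing links

    while frontier and len(index) < max_pages:
        url = frontier.popleft()
        links = fetch_links(url)    # the "spider" visits the page
        index[url] = links
        for link in links:          # newly discovered pages join the frontier
            if link not in discovered:
                discovered.add(link)
                frontier.append(link)
    return index
```

The key property this illustrates: pages closer to the trusted seeds get discovered (and re-crawled) first, which is why being linked from authority sites and hubs matters.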



Chapter Four –
The Various Types of Search Results

When generating search results, search engines make use of many sources of information which include, but are not limited to, web pages, images and videos. We often refer to this as Universal Search.

High ranking success is dependent on learning where the search results are coming from so you can position your content accordingly. Keep in mind there are many different paths to the top of the search results.

For instance, some search results are heavily influenced by Personalization. This means the results YOU see from your east coast city location might not be the same results that your friend in Seattle is seeing for the same search. That’s because personalized search results can vary according to:

  • Reported Location
  • IP Location
  • GPS Location (Mobile)
  • Search History (signed-in / cookies)
  • Social Networks (Likes / Friends / Circles)
  • Device (Mobile / TV / Desktop)
  • Language
  • Previous Searches

You should expect each of the above personalized factors to significantly affect your search results.

Most search queries generate results from many different sources. Such Universal search results are pulled from a variety of sources like images, video, news, Twitter, Facebook, and so forth. In the screenshot below we see our search for BMW generated AdWords (paid), Knowledge Box, News, YouTube, and Organic results.

[Image: Universal Search]

The content, source, and mix of Universal search varies greatly depending on circumstances such as, but not limited to, query, time, location, history, personalization, and so forth.

Sometimes Google generates a Featured Snippet (formerly known as an ‘Instant Answer’ or ‘Position Zero’ result) as seen in the screenshot below.

[Image: Featured Snippet]

Many search results for more detailed queries return only “organic” search results. These are unpaid search results, also known as natural results. They do not include a Knowledge Box, a News feed, or AdWords (i.e., paid) results. Below is a screenshot of organic search results for the query Wiring Diagram Trailer Lights.

[Image: Organic Search]

Notice that the organic results carry no specific label; anything labeled is something other than an organic result.

Local Results are becoming more prominent, fueled by the explosion of mobile search devices.

[Image: mobile devices]

Google assumes that, if you’re searching on a mobile device, then you’re probably looking for a local result.

The local and mobile search revolution is so active these days that Google seems to be changing the way they present Local and Google My Business listings and their search results almost daily. We expect this element of search to continue to evolve dynamically throughout this year and beyond.

Whenever Google isn’t quite sure what kind of results you’re seeking, they’ll deliver a combination of results that may include Partial Local Results. The screenshot below blends a mix of AdWords (paid ads), Google Shopping, Google My Business listings, Google Maps, and organic search sources.

Bear in mind the Local Search results layout can vary greatly depending on the query and the location of the person who is doing the searching.

News & Real-Time Results are displayed whenever the search query is considered news related or the query is currently a hot topic. (Yes, the screenshot below is dated, but it’s still a great example and remains relevant today.)

[Image: News & Real-Time Results]

The screenshot above shows the results of a Super Bowl query just prior to the game being played. It leads with the Featured Snippet (formerly Instant Answers), which is the game schedule, followed by the most relevant web page in the organic results, and then by breaking news.

The screenshot below shows results from the same Super Bowl query done a short time after the game was played. The Featured Snippet has changed. The top two organic results are displayed just above the News.

[Image: News & Real-Time Search Results]

A month after the Super Bowl, the Featured Snippet is gone and the organic results have risen to the top.

[Image: More News & Real-Time results]

The above three screenshots illustrate how search results change with respect to hot topics in the News.

The screenshot below shows that commercial product queries often trigger a Google AdWords product result. Such Shopping Results include product names and part numbers that are supplied to Google Shopping by advertisers via Google’s Merchant Feed.

[Image: Shopping Results]

As you’ve seen in previous screenshots, certain kinds of queries generate an Instant Answer (aka Featured Snippet) that appears at the top of the search results, immediately followed by ‘People also ask’ links, as seen in the screenshot below.

[Image: Instant knowledge box]

Featured Snippets are also used with Google Now.

The screenshot below shows how Google’s Knowledge Box summarizes data from multiple sources to provide answers for the most popular questions, (a little too) conveniently, so the site visitor doesn’t actually need to leave Google to get their answer.

[Image: Knowledge Graph]

When we add “What is…” to the same query, we get a Featured Snippet and a Knowledge Box.

[Image: Instant Answers]

At times you may see both, or either one, depending on the query. Here are some Featured Snippet quick facts:

  • Featured Snippets have nothing to do with Structured Data Markup or code on your site.
  • Featured Snippets power 80% of Google’s voice search results for their Digital Assistants (Google Home & Google Now).
  • Featured Snippets are NOT currently found in Google’s Local Search Results.
  • There are 3 types of Featured Snippets: List Style – Paragraphs – Tables.
  • Google’s Featured Snippet results are currently very volatile and are constantly changing.
  • There has been a steady increase in Featured Snippets as Google focuses more on Voice Search.

Follow-up Reading: A Pro’s Guide to Understanding Featured Snippets.

Certain search results are served on a Query Deserves Freshness (QDF) basis. When an engine becomes aware of a trending topic, something “HOT” that is showing a clear uptick in interest, they believe the term deserves very up-to-date results and will rank fresh content much higher than older pages that have more PageRank.

[Image: Query Deserves Freshness]

For insight into what might constitute a Hot Trend, visit Google Trends or visit Bing’s Popular Now, accessed from the Bing homepage.

Sometimes it’s difficult to determine the intent of the searcher’s query. Unless the engine knows your history, ambiguous search terms produce results based on Query Deserves Diversity (QDD).

For example, the screenshot below shows a search for touch. But since the keyword, touch has multiple meanings, we find the results include a TV show, music player, and restaurant all called touch.

[Image: Query Deserves Diversity]

Query Deserves Diversity explains why a page with far less authority can rank well when it’s relevant for an alternative meaning in a search.

If a search term matches a Trending Topic that’s generating a spike in traffic, most engines will favor recent content over otherwise superior ranking web pages. Generally speaking, this places News at the top of the listings.

[Image: Trending Topics]

Google, Yahoo and Bing all generate specially tuned results whenever a Brand or Product Name search is used. That makes these kinds of searches difficult to rank well in unless you happen to be the owner of the brand.

[Image: Brand or Product name results]

Mobile Results…

It’s important to understand there are now more people searching on mobile devices than on laptops and desktop computers. All of the screenshots above are taken from desktop search results in order to provide consistent examples of how much search results actually do vary.

But you should bear in mind that, today, your efforts should be focused on mobile device users. Be sure to check all of your sites and specific searches using your mobile device.

[Image: mobile search results]

In 2016 Google released their Accelerated Mobile Pages project, and sites that were testing it began to show up in Google’s Mobile Carousel, specifically large news sites that were already in the Google News Feed. Since then, a number of ways have surfaced for smaller sites to easily get started with the super-fast-loading, restricted HTML format AMP offers.

To be clear, Accelerated Mobile Pages (AMP) are web pages that have been specifically coded according to Google’s AMP guidelines with strictly controlled HTML. Only Google-approved JavaScript, CSS, Structured Data, and HTML syntax are allowed in AMP-compatible pages. By strictly limiting the resources used in AMP pages, they can load FAR faster than your average, unoptimized responsive web page. When loading an AMP page, it’s clear to see why Google is pushing this format. They load almost instantly.

AMP is a complicated topic and not something you need to understand right now. It’s good enough for you to just know these pages exist. However, if you want to learn more, then go to Google’s AMP project site and see SEN’s article here.
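For a feel of what the restricted format looks like, here is a bare-bones AMP page skeleton. It is a sketch, not a complete valid document: in particular, the mandatory amp-boilerplate style block is abbreviated here and must be copied verbatim from the AMP project documentation, and the URLs are placeholders.

```html
<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime is the only general-purpose JavaScript allowed -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <title>Hello AMP</title>
  <!-- Points back to the regular (non-AMP) version of the page -->
  <link rel="canonical" href="https://example.com/hello.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- Mandatory AMP boilerplate CSS goes here; abbreviated in this sketch -->
  <style amp-boilerplate>/* ...required boilerplate, copied verbatim... */</style>
</head>
<body>
  <h1>Hello, AMP world</h1>
  <!-- Images use the <amp-img> component instead of plain <img> -->
  <amp-img src="/images/hello.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>
```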

[Image: Google accelerated mobile search results]

Each of the aforementioned types of search results is, ostensibly, displayed without fee payment. This means the order in which the results appear depends on each search engine’s proprietary algorithm for displaying results in their order of relevancy. Pay-Per-Click (PPC) results are the exception.

PPC results are simply paid advertising. And, these ads are the reason why search engines operate and what makes them profitable. For YOU, they are a quick, but very expensive route to the top of the search results.

The screenshot below shows how these paid ads appear in the search results. Both the Shopping and the AdWords sections, seen below, are Ads that are purchased via the Pay-Per-Click route.

[Image: Pay-Per-Click Ads]

Summary

Universal Search Results include all of the following types of search results:

  • Organic
  • Local Search
  • News & Realtime
  • Shopping
  • Featured Snippets
  • Knowledge Box
  • Social
  • Query Deserves Freshness
  • Query Deserves Diversity
  • AMP results
  • Trending Topic
  • Brand and Product Name
  • Pay-Per-Click

…all of which are pulled from their respective sources.



Chapter Five –
Keyword Research and The Buying Process

Keywords are the cornerstone of search engine optimization and Internet marketing. Keyword Research is arguably the most important activity you’ll engage in as you begin your search engine optimization and marketing (SEO/SEM) efforts. Simply put, Keywords are key to web marketing success and a profitable online business.

Your objective is to find your most valuable keywords, the ones that make you money. Never assume you know which keywords people are using to search for your products or services; that’s a fool’s approach to keyword research. Instead, you must ferret out the money keywords by following the buying process. By doing so you will learn that…

The only type of keyword that consistently converts to sales
is the last one used before making a purchase.

The take-away point is that…

High traffic keywords do NOT typically generate high conversions or high profit!

Amateurs often attempt to rank high for broad, generic search terms that are relevant to their product or service. But, in reality, you won’t make many (if any) sales even if you rank number one for a generic search term like cell phones. Experts know that cell phone BUYERS only use generic terms when they are researching a potential purchase. But, when they are ready to buy, they will search for something very specific like Buy Apple iPhone X best price no contract or samsung galaxy s9+ 256gb coral blue free shipping right before they actually make a purchase.

In other words, the keyword-based customer buying process always involves what we call a long tail keyword search. This is invariably a phrase with descriptive terms, and perhaps a geographical location, that accompany a generic keyword like lawyer. For instance, personal injury lawyer in tampa specializing in defective products would be the type of long tail keyword (actually a phrase) that a person would use just prior to making a purchasing decision.

Let’s look at a real example of how the keyword-based customer decision making process actually works. A while back one of our tech-guys was in the market for a bluetooth headset. He started by searching for bluetooth headset reviews to learn which units came with the best recommendations. In doing his research he learned about bluetooth multi-point and bluetooth 2.1.

That led to learning about and narrowing down his search to three possibilities: the Plantronics Voyager Pro, Motorola Command 1 and the Bose AE 2 bluetooth headsets. He decided on the Plantronics unit and his final search, the money search, was: Plantronics Voyager Pro Plus.

[Image: The Buying Process]

The graphic above shows that his product search began with a very generic search phrase (keyword), in this case, bluetooth headsets. As we would expect, there’s lots and lots of traffic for that term but not a lot (if any) sales. And, although amateurs might be tempted to target that generic keyword, experts know it’s the very specific money keyword (phrase) that will actually produce sales. In the case of purchasing pay-per-click traffic, this is critically important to know. The worst possible amateur mistake is paying for expensive high volume generic keyword traffic that does not convert to sales.

Of course this is a simplification of the process. There are often a variety of factors that increase or decrease the due diligence that consumers apply to the buying process. Generally speaking, the lower the price point the less research a consumer will bother doing. On a high-ticket item however, you should expect the buying process to be more extended. For instance a survey by Polk and Auto Trader found the average new car buyer spent 18-19 hours in research mode and searched for special offers, rebates and incentives 42% of the time.

As previously mentioned, the money “keywords” are actually a phrase. And, in SEO jargon, any keyword phrase of three or more keywords is called a Long Tail Keyword. Such keywords are very specific to whatever you are buying or selling. The good news is that long tail keywords are usually less competitive, making them easier to rank well for, and they’re more likely to convert to sales.

The Keyword Discovery Process

The first step in your keyword discovery process is low tech. All you need to do is ask people how they would search for your product or service. If what you offer solves a problem, then ask them what they would search for to solve that problem. Ask your employees, your vendors, and your customers. Then start asking friends and associates. Heck, you can even ask the cab driver; you get the point. ASK! …it will be the most cost effective and potentially productive aspect of your keyword research.

While you’re in the asking phase, be sure to look for slang, jargon, and dialect. Remember that different people oftentimes search for the same things using different words. For instance, some people use PLC to search for industrial computers. The search term murdered out is jargon for a flat black car with flat black wheels. People who live on the upper peninsula of Michigan are called Yoopers. If you’re selling fire extinguishers to car enthusiasts, it’s good to know that American drivers might store them in the trunk while British drivers might store them in the boot. And, did you know that re-pop is slang for reproduction? That’s why it’s critical to ask!

You should also mine your webserver logs and/or Google Analytics for unique keywords you haven’t yet discovered. Check your business email system, web site search engine data, and your support ticket system if you have one. Monitor relevant blogs, forums and competitors’ web pages (it can’t hurt to see what the other guy found). Use a thesaurus to find common synonyms. Alternate languages and spelling errors are frequently a good source. And remember to add location keywords such as your city and state whenever applicable.

The idea is to identify all of the keywords that people are associating with your product or service and the problem it solves.

4 Free Keyword Discovery Tools

We’ll use the competitive keyword smoothies as our running example. Due to the competitiveness of phrases like smoothies, understanding and embracing very high-quality, long-form, detailed pieces of evergreen content is essential. It’s this type of content, the kind that seeks to ask and answer all possible questions of the audience, that Google loves and tends to rank highest in competitive search results.

Here is an example of how we could use keyword research to put together an Ultimate Smoothies Guide resource:

Keyword Research Strategy #1 – Google Related Searches

One of the best (and fastest) ways to research keywords is to go directly to Google, who provides Related Searches at the bottom of each search results page. For example, if you type smoothies into Google and scroll down to the bottom of the page, you’ll see a list of related searches like this:

[Image: Google Related Searches for the query smoothies]

You can see that Google refines smoothies down to related queries like:

  • healthy smoothies
  • fruit smoothies
  • protein smoothies
  • yogurt smoothies
  • green smoothies

These related searches are GOLD to you when drafting your own Ultimate Smoothie Resource or when doing any keyword stemming research on a base keyword phrase.

If you were to put something like this together, it would be a good idea to have entire sections in the piece on fruit smoothies and protein smoothies and yogurt smoothies and green smoothies. Doing that would result in a complete resource that Google would, pardon the pun, drink up.

This should be your first step when researching a blog post or article in the future.

Keyword Strategy #2 – Google Autocomplete Results

If you’ve ever done a search at Google, then you know that Google uses predictive analysis to actually autocomplete many of your searches as you are typing. This is called Google Instant and is completely controlled by the volume and breadth of searches done by Google users worldwide, and by the content of web pages in its index.

Google has, honestly, seen it all. So it’s relatively easy for them to understand “intent” by users and provide suggested searches that match this intent.

If you were to do a Google search for smoothie, you’d see a long list of suggested searches that Google displays directly under smoothie.

[Image: Google Autocomplete queries for the term smoothies]

Now if you compare this list with our list from the related searches you can see that a couple of patterns emerge. First, notice that smoothies for kids popped in. That would make another kick-ass section in any Ultimate Smoothies Resource.

Second, see that kale and spinach and yogurt show up again? Kale and spinach would fall under the green smoothies section we already established above. So, as you can see this strategy allows you to easily find a few new keyword phrases to include in your resource.

  • smoothies for kids
  • smoothies with kale
  • smoothies with yogurt

Autocomplete is something that constantly changes. Monitoring this now and in the future, especially when you are researching new content or updating existing content, will help you considerably with targeting what’s “hot” with keyword searches. That’s because Autocomplete is driven by real-time search query information provided by Google users worldwide.
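If you want to harvest these suggestions in bulk, the sketch below queries the unofficial suggest endpoint that browser search boxes use. Treat it as an assumption-laden illustration rather than a supported API: the endpoint is undocumented, subject to change, and rate-limited.

```python
import json
import urllib.parse
import urllib.request

def google_suggestions(term: str) -> list:
    """Fetch autocomplete suggestions for a seed term.

    Uses the undocumented endpoint that browser search boxes query;
    it returns JSON shaped like [query, [suggestion, ...]].
    """
    url = ("https://suggestqueries.google.com/complete/search"
           "?client=firefox&q=" + urllib.parse.quote(term))
    with urllib.request.urlopen(url) as resp:
        payload = json.loads(resp.read().decode("utf-8", errors="replace"))
    return payload[1]

for suggestion in google_suggestions("smoothies"):
    print(suggestion)  # e.g. "smoothies for kids", "smoothies with kale", ...
```

Running a seed term through this, saving the output, and repeating later is an easy way to monitor how the suggestions shift over time.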

Keyword Strategy #3 – Ubersuggest Results

If you’ve never heard of Ubersuggest.org then you’ve missed a great way to expand your keyword research. Ubersuggest is a keyword suggestion tool that searches and pulls information from known verticals in Google and other tools.

[Image: Ubersuggest tool]

If you go to this tool, put in a base keyword, and choose Web, you’ll be presented with a long list of queries tied to this base term.

Here’s a brief overview of some keywords this tool returns from the Web vertical:

[Image: Ubersuggest smoothie keyword queries]

If you take this keyword data and apply it to the data lists under Strategies #1 and #2 listed above, you can once again see some clear themes emerge:

  • smoothies for kids
  • smoothies with kale
  • smoothies with yogurt

If you weren’t convinced that kids and kale and yogurt are BIG semantic keywords within the smoothie niche before, now it’s hard to ignore.

Keyword Strategy #4 – Answer the Public

Another great tool that many still haven’t heard of is Answer The Public. This tool is a bit different as it doesn’t return search query results but instead shows all of the most popular questions people are asking based on the keyword you submit.

[Image: Answer the Public keyword tool main page]

This tool gives you countless ideas for potential content based on the questions people are actually asking. When you consider how important Voice Search is becoming, this is priceless. Each one of the lines seen in the image below is another question, and each area of interest is broken down to the finest details. You can see the search results here.

[Image: Answer the Public keyword tool results]

You can download the results to further process them or just use their interface to dig deeper. One of our favorite features of this tool is the Related section. In this case, it gave us 19 results that included these:

  • smoothies recipes
  • smoothies for kids
  • smoothies with kale
  • smoothies with yogurt
  • smoothies to lose weight
  • smoothies without yogurt

Again, if you weren’t convinced that kids and kale and yogurt are BIG semantic keywords within the smoothie niche before, now it’s hard to ignore.

Conclusion

These are just four very quick (but very powerful) ways to perform keyword research on both new and existing blog topics.

Remember, Google wants to rank COMPLETE pieces of content. Think about your main root keyword, then seek to use these strategies to conduct keyword research based on real-world search feedback. Do that, and the content you publish may actually crack that all-important first page of Google results. Good luck!

Pro Tip: Consider installing the toolbar add-on Keywords Everywhere because it adds search volume data to both Google Suggest and Answer the Public.



Chapter Six –
On-page, Internal Ranking Factors

Back in Chapter Two you learned about some of the ranking factors. We used hypothetical dials to illustrate their relative importance. In some cases the dial maximums were set high; in other cases the dial maximums were set low. We even showed you dials that could register a negative ranking score. In this chapter you’ll learn about all of the important internal (on-page) ranking factors and their relative importance on the current algorithm dial.

Let’s start by defining Internal Ranking Factors. These are variable page elements found within your site’s web pages. You have total control over all of these elements since they exist completely within the realm of your website. The internal factors covered in this chapter should be regarded as essential elements of your site’s optimized web presence. Any internal factor NOT covered here should be considered insignificant.

  • The Title Tag

The <title> tag has always been, and still is, the #1 most important internal ranking element. Within the source code of your web page the title tag looks like this: <title>Your title tag keywords go here</title>.

The title tag is intended to tell the search engine what the page is about. That’s why you should put your most important keywords in the title tag. If your page topic is about steel rebar, then the keywords steel rebar should be included in the title tag.

Below is an example of the title tags used by a real-world company that managed to rank 5 of their web pages in the Top 10 search results for the keyword steel rebar:

<title> Steel Rebar – Steel Reinforcing Bar | Harris Supply Solutions </title>
<title> Steel Rebar Sizes – Steel Rebar Stock | Harris Supply Solutions </title>
<title> #4 Rebar – #4 Reinforcing Bar | Harris Supply Solutions </title>
<title> Steel Rebar Supplier – Steel Reinforcement Supplier | Harris Supply Solutions </title>
<title> #6 Rebar – #6 Reinforcing Bar | Harris Supply Solutions </title>

Notice that NONE of the title tags are identical. This is important. You should never have any duplicate title tags anywhere on your website! Duplicate title tags are confusing to search engines and viewed as an error which can negatively affect your search rankings.

Generally speaking, you should limit your title tags to 67 characters. This is intended as a guideline. You might notice that the fourth title tag listed above exceeds the 67-character limit. But it should also be noted that the most important keywords are arranged toward the beginning of each title tag, just as they should be.
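Here is a quick way to audit a site against both guidelines (unique titles, roughly 67 characters). A minimal sketch in Python, assuming you have already collected your pages’ title strings into a list:

```python
from collections import Counter

MAX_TITLE_LENGTH = 67  # a guideline, not a hard limit

def audit_titles(titles: list) -> None:
    """Flag duplicate title tags and titles over the length guideline."""
    counts = Counter(t.strip() for t in titles)
    for title, n in counts.items():
        if n > 1:
            print(f"DUPLICATE ({n} pages): {title}")
        if len(title) > MAX_TITLE_LENGTH:
            print(f"TOO LONG ({len(title)} chars): {title}")

audit_titles([
    "Steel Rebar - Steel Reinforcing Bar | Harris Supply Solutions",
    "Steel Rebar Sizes - Steel Rebar Stock | Harris Supply Solutions",
    "Steel Rebar Supplier - Steel Reinforcement Supplier | Harris Supply Solutions",
])
```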

Always remember the content within your title tag is frequently the text that appears as a link in the search results.

[Image: Title Tag Example]

This screenshot illustrates how Google pulls content from the title tag to provide your descriptive link. If you compare these links to the title tags listed above, you’ll see how Google adapts them in whatever manner they deem appropriate. In many cases they’ll simply shorten them, while in other cases they’ll delete a portion. Sometimes they’ll pull content from within your page that matches the search query and add it to the descriptive link. In this screenshot you may notice that Google truncated the link description in the second result and deleted the middle portion of the title tag in the fourth result, which also happens to exceed the 67-character guideline.

By the way, it isn’t actually necessary to put your company name in your title tags, although Google tends to like it there. In some cases Google will add it for you if it isn’t already there. We mention this because it can be puzzling to see content that isn’t in your title tag appear in descriptive links as though it is. That’s not Google messing with your source code. It’s simply their way of making your descriptive link more relevant to whatever search query is generating the results. So don’t tear your hair out trying to figure out what’s wrong with your title tag. Nothing is wrong. Google routinely makes adjustments that it deems necessary to “provide a good user experience” for its site visitors. We’ve even seen cases where they’ll pull content from a headline tag or an inbound link’s anchor text if the page’s title tag is lacking content relevant to the search query. Their goal is to make the descriptive link match the search query as closely as possible. And fortunately in most cases this works to your benefit because people are more likely to click links that match their search queries.

  • The Meta Description Tag

While not nearly as important to ranking as the Title tag, the Meta Description tag should never be overlooked. It’s frequently the source from which Google pulls the text that’s displayed directly below your descriptive link. Think of it as the enticement for clicking if the content of your link isn’t already compelling enough. The Meta Description tag frequently provides that block of text.

Within the source code, it looks like this:

<meta name="description" content="A good meta description tag entices the searcher to click the link by describing what they'll find when they view the page."/>

Therefore, the Meta Description tag is important because it can affect click-through rates once your pages are actually found in the search engines. Remember, it doesn’t do any good to rank well if your links don’t get clicked.

[Image: Meta Description tag snippets]

On the right we see the Meta Description tag was used fully or partially in three out of five of the page descriptions.

Take note that Google also pulled snippets of text from the body content located at the beginning of two of the pages and near the end of one of the pages. This tells us that a web page’s opening and closing text content is important in regards to SERP descriptions and can be used to entice clicks.

Using your keyword in the Meta Description tag is unimportant with respect to ranking. However, it IS important in regards to click-throughs. Since search engines like to use the keywords that were searched, they will often grab a snippet of text from the page if the Meta Description tag doesn’t contain the keyword that was searched for.

  • Keywords in the Domain Name as a Ranking Factor

The importance dial has been turned down considerably on keywords in the domain name. This is especially true of domains composed of highly popular generic keywords like buycheapairlinetickets.com. These look spammy to Google so they are no longer favored in terms of ranking well. Even if the domain name is an exact match with the keyword in the search, Google takes into account the content quality as well as the quality of the incoming links as they evaluate the relevance of the page. If either the links or the page content are of low quality, then having an exact match domain (EMD) will not help very much, if at all, in terms of ranking well.

Instead, more and more favor is being lavished on brands. It helps considerably to have your unique brand name as your keyword in the domain name. And, if you can manage to get people searching for what you’re selling by using your brand name, then you have the ultimate advantage. That’s what you should strive for in the long run.

  • Headline Tags

Having your best keywords in your <H1> (headline) tag is important. Google looks for keywords in the H1 tag and oftentimes pulls your snippet from this area of your page. We recommend using the H1 tag only once per page and to place it somewhere near the beginning of the body text rather than in the left or right rail content where we often see the navigation portion of the page.

Ideally your keyword should appear early in the headline but keep in mind the headline must read well to site visitors or else it will hurt sales. So the rule is to use a headline intended to attract attention to your product first, and then cater to Google’s algorithm second.

By the way, as you probably know, an H1 headline can appear disturbingly HUGE on a page. In most cases, the font size is too large in terms of creating aesthetically pleasing web page design. The work-around involves CSS (cascading style sheets). By using CSS to adjust the font size of the headline to align with the design goals of the page, you can address both design and SEO concerns at the same time. And, in case you are wondering, using CSS to reduce your H1 font size is perfectly OK with Google.
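For example, here is a minimal sketch of that workaround; the class name and sizes are just illustrative:

```html
<style>
  /* Tame the default H1 size so the keyword-bearing headline still
     reads as an <h1> to search engines, but fits the page design */
  h1.product-headline {
    font-size: 1.4em;   /* instead of the browser default of roughly 2em */
    margin: 0.5em 0;
  }
</style>

<h1 class="product-headline">Steel Rebar Sizes and Stock</h1>
```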

While the H1 tag can help your ranking, H2 and H3 tags are less effective. Regardless, it’s ok to use them, they might help a little and they certainly won’t hurt. But you shouldn’t expect a boost of any significance from keywords in headline tags other than the H1.

  • On-page Anchor Text

The keyword text within your on-page links — the anchor text — provides a bit of help, ranking-wise. But, since Google knows this is easily manipulated, they don’t turn the dial up very high on the algorithm. Regardless, it will usually help and not hurt your ranking efforts provided that you do not abuse the strategy. If Google thinks your on-page anchor text is there to manipulate rankings, they can penalize you. Therefore, we recommend that you limit your keywords within your anchor text to only a few per page. Any more than that could be counterproductive.

  • Keywords in Body Text

As you might imagine, having your keywords in your body text (the page content) is also important. This is what Google indexes and uses to determine if a page is relevant to the search query. In regards to on-page ranking indicators, the keywords found within page content are typically a medium to strong ranking factor on the algorithm dial.

However, it’s a bad idea to stuff or repeat an excessive number of keywords. Doing so will get you penalized. It’s best to sprinkle in your keywords naturally in ways that sound comfortably conversational when you read it out loud. Otherwise your page’s quality score will suffer and its ranking will be hurt.

It’s best to place your keywords toward the beginning of the body copy. It can also be beneficial to place them toward the end of the text as well. If it seems natural to use them in other locations, then do so. Just be sure to avoid using them in the way that a used car salesman might overuse your name when trying to sell you something. If it sounds a little creepy when you read it out loud, then you’ve probably repeated your keywords too frequently.

  • Images

Images are an often overlooked ranking factor. While it’s true that search engines can’t “see” images, they can see the filenames and the Alt tag. Therefore you should name your image files by using applicable keywords like keyword.jpg.

Remember that some people use Image Search as their primary search vehicle. In such cases you’ll want your images to rank well because top ranking images are another great way to drive traffic to your site. The Alt tag provides an opportunity to associate keywords with your images.

<img src="/images/BMWZ4M.jpg" alt="BMW Z4 M Series Roadster Convertible">

However, you should NOT over use this strategy. Using keywords to name images is acceptable to Google. Adding a brief description of the image in the Alt tag is fine. But keyword stuffing your image Alt tag will get you penalized.

You can help the engines index your images more completely by using an XML image site map. For in-depth information on this topic, see SEN’s articles on the subject (requires SEN Membership).
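As a quick illustration, an image site map is an ordinary XML sitemap with image entries attached to each page URL. The file name and URLs below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- image-sitemap.xml: tells crawlers which images belong to which page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/bmw-z4-roadster.html</loc>
    <image:image>
      <image:loc>https://www.example.com/images/BMWZ4M.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```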

  • Keyword Density

Keyword Density is a ratio calculated by dividing the number of times your target keyword appears on the page by the number of total words on the page. For instance, if your keyword appears 10 times in a page with a total of 500 words, the keyword density is 2% (10/500 = 0.02). Although a perfect keyword density ratio was, in the past, an important ranking factor, today that is not the case. Our best advice regarding the ideal keyword density is to simply make it higher than any other word that appears on your page. If you’re selling rebar, then the keyword rebar should have the highest keyword density ratio. This ensures the search engines will accurately determine the topic of the web page.

Remember to keep it natural. Your page content should make sense to humans when they read it. Avoid hammering any keyword too much and don’t stress over trying to get the perfect keyword density ratio because there is no such thing as a perfect ratio anymore.

  • URL Structure

Using a simple URL that includes your targeted keywords will typically provide a slight to medium boost in ranking. Looking at our rebar example, we see the keyword is used in all five of the ‘Harris Supply Solutions’ URLs we found ranked in the top 10 as such:

  • http://www.harrissupplysolutions.com/steel-rebar.html
  • http://www.harrissupplysolutions.com/steel-rebar-sizes-stock.html
  • http://www.harrissupplysolutions.com/4-rebar.html
  • http://www.harrissupplysolutions.com/steel-rebar-supplier.html
  • http://www.harrissupplysolutions.com/6-rebar.html

Don’t get carried away trying to stuff too many keywords into the URL. Remember that when the URL becomes too long, it’s difficult to direct someone over the phone where to go. In addition, long URLs can be difficult to use in an email because the link tends to break at the hyphen if it uses more than one line. So, with these considerations in mind, it’s a good idea to include your keywords in the URL provided that you take the conservative approach.

  • Uniqueness of Content

Having unique content is critically important. That's because search engines tend to view duplicate content as a waste of their indexing resources and counter to what their searchers are looking for. They correctly reason that nobody wants to search for a red widget and find hundreds of red widget pages that are all alike. The engines want to provide searchers with a variety of unique pages. This might include red widget product pages, red widget specifications, red widget reviews, red widget discounts, red widget videos, and so forth.

Of course if you’re writing a blog and producing original content of your own, then uniqueness is easy. But if you’re one of a thousand sites selling a name-brand product, then creating uniqueness is going to be more challenging.

In such cases, the key to overcoming the challenge is in the product descriptions. While it's true that hundreds of merchants may be listing the same products in their shopping carts, you will find that…

only pages with unique product descriptions will typically rank well

…while those that use the brand-name-suggested description are either filtered out or buried in the rankings. And, by the way, this applies to images as well.

So, if you are selling something that a lot of others are also selling, then you must rewrite the product descriptions and rename the product image files so that your product pages and images are not filtered out of the search results as duplicate content.

In cases where it isn’t allowed to change the product description, you can add content to make the page unique. For instance, some sites add user reviews and product demonstration videos. By enriching the manufacturers content you can make your page unique and more deserving of a good ranking.

True, this requires a bit more work. But if you don’t do it, then you can’t expect to rank well because you’re probably competing with the likes of Amazon and perhaps also the name brand company that manufactures the product.

If there’s one ranking dial that’s trending up, it’s Mobile CompatibilityIt’s becoming increasingly important that web pages display properly when viewed on ALL devices and especially on mobile devices.

As far back as 2013 we began seeing a ranking preference given to sites that are fully mobile compatible. Google is committed to providing searchers a quality experience when using their smartphones and tablets. They’ve come out and stated that companies need to design their sites to be responsive to all devices if they want to be successful long term.

This has launched a trend toward what’s known as Responsive Web Design — arguably the most fluid sector of SEO/SEM strategy right now.
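At its simplest, responsive design starts with a viewport meta tag in the page head plus CSS media queries that adapt the layout to the screen width. Here's a minimal sketch (the .sidebar class is hypothetical):

<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* On screens 600px wide or narrower, hide the desktop sidebar */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }
</style>

The same HTML then displays properly on desktops, tablets, and smartphones without maintaining a separate mobile site.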

Concurrent with the emphasis on mobile compatibility is page-loading speed. Mobile compatibility and page-loading speed are elements of a website that Google is increasingly factoring into their ranking algorithm. Their thinking is that quality websites load fast and are responsive enough to display properly on all devices. If your site isn't responsive and fast loading, then it's just not going to be able to compete online.

Google has gone so far as to change the way their algorithm works completely. They're calling it the Mobile-first index, which means that instead of ranking your site based on the way it works on a desktop computer, they're (as of early 2018) focusing on your site from a mobile device perspective. If your site has a responsive design, then it serves the same content on both desktop and mobile and you're fine. However, if you don't have a responsive design, or if you have separate sites for mobile and desktop, then you could have some work to do. You can learn all the details in our resource on the topic (requires SEN Membership).

Page speed is also such a hot topic that we've expounded on it considerably. We suggest carefully studying our article-tutorials on the subject (requires SEN Membership).

  • Broken Videos and Faulty Redirects as Negative Ranking Factors

As the emphasis on Mobile continues to grow, Google is now penalizing sites with page elements that do not display properly or break when viewed on a mobile or tablet device. Most commonly, we’re talking about Flash videos and faulty redirects.

As you may already know, Flash does not work on smartphones or tablets. And for that reason, Google “suggests” that you not use Flash on your website. And, since websites and pages that are 100% smartphone and tablet compatible gain ranking favor over those that aren’t, it means that incompatibility translates to a ranking penalty.

The same is true if your redirects are faulty or if your site returns 404 Page Not Found errors. You can learn more about this by studying the aforementioned tutorials, The Practical Guide to Mobile SEO, Parts 1 & 2.

  • Spelling, Grammar and Readability (i.e. quality) issues as Negative Ranking Factors

Spelling and grammar usage are also factored into Google’s ranking algorithm as quality signals. However, comments that are posted on a web page are not factored into the quality scores.

In addition, there is evidence indicating that rankings are also affected by readability level. Since mathematical formulas like the Flesch-Kincaid test can analyze a document for readability, there is every reason to believe that Google calculates a score based on the average number of syllables used per word and the number of words used per sentence. We already know that Google's Advanced Search provides the option to filter the search results by reading level, so we have hard evidence that Google has the capability to factor readability into their algorithm. The higher the readability level, the better, ranking-wise.
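For reference, the Flesch Reading Ease score (the best known of these tests) is computed from exactly those two averages:

Reading Ease = 206.835 − 1.015 × (total words / total sentences) − 84.6 × (total syllables / total words)

Shorter sentences and shorter words raise the score; a score in the 60 to 70 range corresponds roughly to plain, conversational English.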

  • Page Freshness as a Ranking Factor

Earlier you learned about query deserves freshness (QDF) search results related to hot topics. Since the search engines love new content, it only stands to reason that newer (i.e., fresh) content will have a ranking advantage over older content.

Therefore it is always to your advantage to update your best pages as often as is practical based on the type of content you're presenting. The more up-to-date your web pages, the better you can expect them to rank. You should do everything in your power to avoid having stale, out-of-date content on your site because that will definitely hurt your rankings.

Pay very close attention to dates. Copyright notices, articles, reviews, and product pages that reference dates can be a problem if they indicate anything other than the current year or recent months. Your credibility and your rankings will suffer if you're touting the best widget for 2017 when we're already in 2018. You get the idea.

  • Geolocation Signals

In most cases, especially for businesses that attract customers locally, it’s important to include geolocation signals like your address and phone in addition to your company name. This is typically referred to as your NAP (name, address, phone).

Your NAP should be displayed in several locations on your site.

Be sure to make it consistent!
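Here's a minimal sketch of a consistent NAP block you might place in, say, the site footer (the business name, address, and phone number are hypothetical):

<address>
  Superior Dental Care<br>
  123 Main Street, Marquette, MI 49855<br>
  (906) 555-0123
</address>

Whatever format you choose, repeat it character for character everywhere it appears, both on your site and in external listings.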

The search engines do NOT like multiple phone numbers or locations. It's confusing to their database and your rankings can suffer. If a location keyword is an element of your customers' searches, then be sure to also include it in your title tags. And, if your geographic location has a nickname or slang term, be sure to work that into your content as well. For instance, if you're a dentist in the Upper Peninsula of Michigan, then you know the term Yooper refers to residents of the local region. As such, a search for Yooper dentist produces the following top result:

[Screenshot: Yooper dentist search results listing]

Notice how they’ve work the slang term for the geographic location neatly into their content.

Take note that Local Search is a specialty within search. That’s why we’ve dedicated an entire eBook to the topic. For an in-depth study of local search strategies, get a copy of our Local Search Marketing Book.

  • Spider Friendly Website Architecture

Your site’s layout, aka architecture is important. While it’s obvious you should make it easy for site visitors to navigate, it’s critical that you make it easy for search engine spiders to find all of your pages as they crawl and index your site. You do not want obstacles such as Flash menus or dynamically generated web pages to prevent spiders from finding and following your links.

Spiders can find, follow, and interpret text links best. These are links with normal anchor text. Drop down links, created using CSS, are also easily found, followed and interpreted. If ever you’re in doubt about this, simply disable CSS in your browser and take a look at the page. (Below you see the path for disabling CSS in the Firefox browser. Select No Style.)

[Screenshot: Disabling CSS in the Firefox browser by selecting No Style]

Once you’ve disabled CSS, you should see the links as normal looking anchor text links when the page is viewed. If you can see them, then so can the search engine spiders and you’re good to go.

Image-links are also easily found and followed. However, spiders can’t always tell what the image-link is about unless there are clues like keyword.jpg type file names and a description added to the Alt image attribute.
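To make this concrete, here's a plain text link followed by an image-link carrying those clues (the page and file names are hypothetical, reusing our rebar example):

<a href="/steel-rebar.html">Steel rebar sizes and specifications</a>

<a href="/steel-rebar.html"><img src="/images/steel-rebar.jpg" alt="Steel rebar supplier"></a>

Spiders read the anchor text of the first link directly; for the second, the keyword-bearing filename and alt attribute are all they have to go on.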

Static Links vs. Dynamic links – A static link looks like this:

http://www.yoursite.com/directory/file.html

Of course the web page in that link is: file.html. Static links offer many advantages over dynamic links. For starters, they make more sense to humans and therefore are more likely to get clicked. They help eliminate the problem of duplicate content, which is good because search engines hate duplicate content. Static links display better in print and other media advertising. And static links typically get broken less often than dynamic links because they are shorter and less likely to contain a lot of hyphens.

A dynamic link is generated on the fly by using a database to name the page on request and as needed. The link might look something like this:

http://www.yoursite.com/s/ref=nb_sb_noss/183-1484986-3953124?url=search-alias%3Daps&field-keywords=keyword1%20keyword2

The web page in that link is: 183-1484986-3953124?url=search-alias%3Daps&field-keywords=keyword1%20keyword2 . As you can see the link is much longer and it contains a lot of hyphens and other characters that are likely to break the link when spread across two lines. It’s nearly impossible to remember and difficult to convey over the phone. It wouldn’t display well in print or any other form of advertising media and it could become duplicate content if the same search generated the same page but with a different assigned “dynamic” serial number. You get the idea.

But the worst disadvantage to dynamic links is when a spider gets caught in a loop. This happens when the spider finds a product link, indexes the dynamic URL, and then finds another link to the same product and indexes a different dynamic URL even though it’s the same product page but with a different serial number. And when this process happens again and again, the spider is said to be caught in a spider loop. This is bad in terms of getting your site properly indexed. Most spiders will leave your site to avoid such loops and therefore avoid indexing the rest of your site. Again, if your site isn’t getting properly indexed then your web pages will NOT show up in the search results.

There are times, however, when dynamic links are desirable in terms of integrating product databases with web page display. But the good news is there are workarounds. Many Content Management Systems (CMS), like WordPress for example, can be set up to display static-looking URLs that are actually dynamic. If your system demands dynamic URLs, then we suggest you look into making them as simple, unique, and people-friendly as possible. A sketch of one common workaround appears below.
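For instance, on an Apache server a rewrite rule can map a clean, static-looking URL onto the dynamic script that actually builds the page (the file and parameter names here are hypothetical):

RewriteEngine On
# /products/steel-rebar is what visitors and spiders see...
RewriteRule ^products/([a-z0-9-]+)/?$ /product.php?item=$1 [L]
# ...while product.php?item=steel-rebar quietly does the work

WordPress accomplishes the same thing through its Permalinks settings.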

Your goal should be to design your site architecture so that spiders can find every page on your site by starting at the home page. That does not necessarily mean that your home page must link to all of your pages. It does mean that, by starting at the home page and following links to secondary pages, these secondary pages allow the spiders to eventually find all of your pages. And, once again, remember to avoid links that depend on Flash even though you may hear that spiders are getting “pretty good” at finding such links.

For more information on setting up the ideal site architecture, take a look at our in-depth tutorial:

  • XML sitemaps

XML (Extensible Markup Language) is a type of markup language where tags are created to share information. An XML Sitemap tells the search engines what content you want indexed. Theoretically the search engines should find all of your content by following links. But an XML sitemap can help speed up the process and reduce the chance of spiders missing some content that isn't easily indexed. This is especially true for getting content like images, videos, and product pages indexed.

Most experts agree that an XML Sitemap is essential for keeping the engines up-to-date with your website changes. It helps to ensure that all of your important content is indexed and provides supplemental information (metadata) about your content.

By the way, you should not confuse a navigation site map with an XML Sitemap. The former is simply a page of links for your site visitors to use while navigating your site. The latter is a list-feed intended just for search engines and is not at all seen or used by site visitors.

Although XML sitemaps are not technically required, they are highly recommended. They provide useful metadata for the search engines and they’re especially useful for content other than web pages. They’re fairly easy to generate and there are plug-ins available for WordPress and other CMS systems to help you do this.
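For context, a bare-bones XML Sitemap is just a list of <url> entries like this (the domain and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/steel-rebar.html</loc>
    <lastmod>2018-04-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>

Each additional page gets its own <url> block; the optional <lastmod> and <changefreq> tags are the supplemental metadata mentioned above.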

Additional Reading: The article-tutorial listed below is a few years old but the step-by-step process is still the same today. To learn how to create and feed your XML sitemaps, take a look at:

  • Robots.txt

One of the earliest names given to search engine spiders, crawlers, and bots was robots. Thus, the function of a robots.txt file is to tell spiders what to do when crawling and indexing pages on your site. You might picture your robots.txt file as the tour guide to your site for the search engines. It provides a map that tells search engines where to find the content you want indexed. It also tells them to skip the content you don't want indexed. The end result is a faster and more complete indexing of your site.

If you do not have a robots.txt file, then the spiders will index everything. But regardless of whether that's what you want, we recommend that you have a robots.txt file anyway because the search engine spiders are looking for it.
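Here's a minimal sketch of a robots.txt file, which lives at the root of your domain (the disallowed directories are hypothetical):

User-agent: *
Disallow: /cgi-bin/
Disallow: /checkout/
Sitemap: http://www.yoursite.com/sitemap.xml

The User-agent line addresses all spiders, each Disallow line fences off a directory you don't want crawled, and the Sitemap line points the engines at the XML Sitemap discussed above.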

There is a lot you can do with a robots.txt file to improve the efficiency of getting your site indexed. We highly recommend that you study and bookmark the following tutorial so that when the time comes to implement the various functions of robots.txt you’ll be able to easily create the perfect file that will give you the results you’re looking for.

  • URL Redirection

Redirects, sometimes referred to as URL forwarding, make a web page available under more than one URL address. When attempting to visit a URL that's been redirected, a page with a different URL opens up. For example, www.yourolddomain.com is redirected to www.yournewdomain.com.

Redirects can be used to forward incoming links to the correct new location whenever they're pointed at an outdated URL. Such links might be coming from external sites that are unaware of the URL change. They may also be coming from bookmarks that users have saved in their browsers. Sometimes they're used to tell search engines that a page has permanently moved.

There are two kinds of redirects that you need to know about.

  1. Browser based redirects
  2. Server side redirects

Browser based redirects have fallen out of favor with the search engines due to their frequent use in manipulating the search rankings. For that reason, they can often do more harm than good. If you use them, you'd better know what you're doing. Otherwise, avoid browser based redirects entirely if your site is dependent on good rankings.

Server side redirects are safer and are necessary in specific instances, like when a URL has moved. The two most common are the 301 (permanent) redirect and the 302 (temporary) redirect. Both are highly useful. We recommend that you study the tutorial below in order to gain a full working knowledge of how these valuable webmaster tools can be safely applied. Consider it essential reading.
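As a quick illustration, on an Apache server a single line in the site's .htaccess file produces a server side 301 redirect (the file names are hypothetical):

Redirect 301 /old-page.html http://www.yoursite.com/new-page.html

The 301 status code tells the engines the move is permanent, so they transfer the old URL's ranking signals to the new one.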

  • Duplicate Content

As previously mentioned, search engines hate duplicate content. Their thinking is that it wastes their resources and provides a bad user experience. That’s why they tend to filter out and sometimes penalize sites that clog their index with duplicate content.

The biggest offenders are product pages that all carry the same product description. Google doesn't care where you buy the product. They only care that searchers aren't served the same product page from multiple websites. So they look for the content originator and tend to favor the name-brand company that produces the product, or else they prominently rank a large site like Amazon.com. The rest of the pages selling the same item tend to get filtered out of the rankings.

We also mentioned how duplicate content issues might arise whenever a site uses a dynamic database system to create product pages on the fly. This is to be avoided as well.

And, of course, any other content that duplicates what is already on another site should also be avoided. The bottom line is that Google is looking for original content. Anything that isn’t original reflects badly on the overall site quality. So, you should see to it that your site contains only original content and not something that can be found elsewhere.

  • Canonical URL

Your Canonical URL is your preferred URL. This is relevant because http://www.yourdomain.com and http://yourdomain.com are NOT the same URL even though they both land site visitors on your home page.

As you can see, the first includes the www, the second does not. The fact that these two URLs are not the same is important because it means that some websites might link to one and some might link to the other. In such cases your PageRank (see next chapter) is divided instead of combined. This will hurt your rankings.

Furthermore, Google sees two different URLs with the same content. This creates a potential duplicate content problem. They can’t know which URL you want indexed and this puts your site at a disadvantage in the rankings.

The solution is to choose one "Canonical" URL over the other. It doesn't matter which one you choose. Google doesn't care whether you use the www or not. But you must consistently choose one over the other. Then simply redirect the traffic from the one you're not using to the one you've chosen, as sketched below. In addition, you must see to it that your incoming links are pointed to your Canonical (i.e., preferred version) URL.
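On an Apache server, a 301 rewrite like this sends the non-www version to the www version (swap the two hostnames to go the other way; the domain is a placeholder):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]

Google also supports a canonical link element placed in a page's <head> as a hint about which URL you prefer indexed:

<link rel="canonical" href="http://www.yourdomain.com/">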



Chapter Seven
PageRank, Algorithms, & Search Engine Updates

Google uses several algorithms to rank web pages. The oldest and best known of them all is PageRank. PageRank assigns a number ranging from 0 to 10 indicating the relative importance of a page. For instance, a page that is rated 7 on the PageRank scale (expressed as PR7) is considered to be an important page. Any web page rated PR8 or above is considered a very important page.

Loosely interpreted, the scale breaks down as such:

PR0 (zero) Very Weak
PR1-2 Weak
PR3-4 Moderate
PR5 Strong
PR6-7 Important
PR8-10 Very Important

Here are some examples of well known brands and their corresponding PageRank. As you can see, their home pages range from Important to Very Important.

  • PR6 – GEICO.com, RedBull.com, TacoBell.com
  • PR7 – GE.com, YellowPages.com, TheOnion.com, NFL.com, PGA.com, McDonalds.com
  • PR8 – eBay.com, bing.com, Slate.com, NASA.gov, NBA.com, Weather.com
  • PR9 – NYTimes.com, Yahoo.com, Wikipedia.org, CNN.com, Amazon.com, facebook.com, youtube.com, Coke.com
  • PR10 – Google, twitter.com

According to Google:

“PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.”

To gain some visual insight into how this works, let’s consider the hypothetical network of links in the graphic below.

[Illustration: a hypothetical network of linked pages showing the PageRank (link juice) percentages flowing between them]

Notice first that all of the percentages add up to 100% (actually, 99.9%). That’s the maximum available PageRank. And, since Page B has the most incoming links, it has 38.4% of the available PageRank making it the most important page in this network of linked sites.

Most importantly, you should notice that Page C is the second most important page based solely on the fact that the most important page (Page B) is linking out to Page C. In fact, since Page B is passing along to Page C all of its available link juice (SEO jargon for passing along PageRank), it becomes the second most important page even though it has only ONE link! If Page B were linking out to any additional pages, then Page C would not be getting as much link juice and would therefore be much less important in terms of PageRank.

Next, take a look at Page E. It is linked-to by Page F (which provides it with a fair amount of link juice) and also linked-to by 5 other not-so-important pages which all contribute a small amount of link juice. But it all adds up to give Page E a pretty good boost in PageRank totaling 8.1% of the available PageRank. And the fact that Pages E and F are reciprocally linked means they are passing link juice to each other and thereby increasing each other’s PageRank.

Now take a look at Page A. Although it has only one link, it's getting a fair share of link juice from Page D because Page D is linked-to by the fairly important Page E. In other words, Page E is passing link juice to Page A through Page D.

Confused? …don’t be. Simplified, it works like this. Important pages are king makers! But, generally speaking, the more high quality inbound links a page has, the more important Google thinks the page is. And, the more important a page is, the more link juice that page passes along to the pages it links to. Such link juice can be diluted, however. For instance you might notice that Page E is linking out to three pages; D, F, and B — and therefore passing along only a fraction of its available link juice to each page it links to. If it were linking only to Page D, for instance, then Page D would be getting something like 7.2% of the link juice instead of only 3.9. Get it?

The bottom line is that links from important pages pass along significant amounts of link juice. This helps your PageRank. And the more outgoing links on a page, the more the link juice is diluted. Therefore, you should strive to acquire links from important pages that do not link out to very many, if any, other pages.
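For the mathematically curious, the original PageRank paper expresses all of the above in a single formula. A page A that is linked-to by pages T1 through Tn has:

PR(A) = (1 − d) + d × [ PR(T1)/C(T1) + … + PR(Tn)/C(Tn) ]

where C(Ti) is the number of outgoing links on page Ti and d is a damping factor, typically set around 0.85. The division by C(Ti) is the dilution we just described: every additional outgoing link on a page splits that page's contribution into smaller shares.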

A few years back Google’s toolbar was an easy way to quickly see PageRank and as you can image was a top tool for link building. Over the years PageRank has been seen as an old school term by new SEOs because many believed it was retired when proof of it disappeared from the Google Toolbar officially in April 2016. However, although it’s not available for public viewing, PageRank is still alive and well internally at Google

PageRank is suddenly back on everyone's radar because Bill Slawski, who is known for his brilliant interpretations of Google patents, just reported on an update to Google's original PageRank patent that was granted on April 24th, 2018. Or, as he put it…

 A continuation patent of an updated PageRank was granted today. The original patent was filed in 2006, and reminded me a lot of Yahoo’s Trustrank (which is cited by the patent’s applicants as one of a large number of documents that this new version of the patent is based upon.) 

So what’s changed? Well, the patent goes into detail on how pages are ranked based on how close they are to high quality seed sites that will be used as reference in Google’s link graph. The distance is measured in a complicated way that has everything to do with links. The simplified explanation is that the Internet is now separated into topics/industries and, the closer a site is linked to the Google recognized high quality and spam free seed sites in their industry, then the higher a site will rank.

That means that smaller sites with fewer links have the same chance at high rankings as larger sites with tons of links. If a smaller site gets great links that are close to the authority sites within its niche, that will give it enough of a boost to climb the rankings pretty dramatically. Here's a great quote from the patent:

 One embodiment of the present invention provides a system that ranks pages on the web based on distances between the pages, wherein the pages are interconnected with links to form a link-graph. More specifically, a set of high-quality seed pages are chosen as references for ranking the pages in the link-graph, and shortest distances from the set of seed pages to each given page in the link-graph are computed. Each of the shortest distances is obtained by summing lengths of a set of links which follows the shortest path from a seed page to a given page, wherein the length of a given link is assigned to the link based on properties of the link and properties of the page attached to the link. The computed shortest distances are then used to determine the ranking scores of the associated pages.

Going forward, smart link builders are going to spend far more time identifying the authority sites in their niche and focusing on getting links as close to that source as possible. They’re going to do link analysis on the top trusted sites in their niche and then do their best to get a link from one of the sites that is directly linked from those trusted seed sites.

By the way, competitive backlink tools like the one offered by WebCEO are essential when doing this type of link research.

Smart marketers are going to stay the course and keep focusing on building links naturally by providing amazing resources for users, doing local events to gain local attention, and staying far away from thin, spun content!

The NOFOLLOW tag: There is a way to avoid wasting or diluting link juice (PageRank) when linking out to other sites. It involves using the NOFOLLOW tag. This is an attribute added to the source code of a link that tells Google to ignore the anchor text and to NOT pass along any link juice. It looks like this:

<a href="signin.php" rel="nofollow">sign in</a>

Here are some cases in which you might want to consider using nofollow:

  • Untrusted content: If there’s any chance you’re linking to junk like untrusted user comments or guestbook entries, you should use nofollow on those links. Doing so tends to discourage spammers from targeting your site to manipulate their own rankings and will avoid inadvertently passing PageRank to bad sites that might cause your site to be associated with bad neighborhoods on the web.
  • Paid links: Search engine guidelines require bot-readable disclosure of paid links in the same way that consumers expect disclosure of paid relationships. Using nofollow on paid links, if you have them on your site, satisfies Google’s disclosure requirement. (Learn more about Google’s stance on paid links)
  • Crawl prioritization: Search engine robots can’t sign in or register as a member on your forum or site. Therefore there is no reason to invite Google’s bot to follow register here or sign in links.

While it’s true that PageRank is only a part of Google’s master algorithm for ranking web pages within its search results, it’s the most important one related to links coming from external websites. This makes it critical that you understand the basics of PageRank, which we’ve covered in this chapter so you’ll have the foundation for moving on to the next chapter that covers all of the off-site, external ranking factors.

Algorithms & Updates

In the world of SEO, Google's algorithmic evolution is referred to as Updates. And these updates are often given names that suggest they are algorithms unto themselves. If this only makes partial sense, don't worry about it. It's only important that you know what people are talking about when they use an obscure-seeming term to reference a change in Google's algorithm.

Some of these updates are such minor tweaks that nobody bothers naming them. But some updates are MAJOR changes that completely alter the search rankings for thousands, if not millions, of sites. To say these major changes have been upsetting to online businesses that depend on top rankings would be the understatement of all time, SEO-wise. Suffice it to say that if your site were benefitting from top rankings to the tune of a million dollars a month, and that revenue dropped to near zero overnight as a result of an update, then you'd probably be upset too. Believe us when we tell you that can happen because it most certainly has! Many times.

So, two things.

  1. When you hear someone like us or anyone else using terms like, Hummingbird, Penguin, Panda, Phantom, Dewey, Buffy, Big Daddy, Jagger, Gilligan, Bourbon, Allegra, Brandy, Austin, Florida, Fritz, Dominic, Cassandra, Caffeine, Boston, or Google Dance, please know that we are referring to the nickname that’s been given to one of Google’s algorithmic updates. And, you can bet that if the update is important enough to name, then it has significantly shaken up the order in which sites are being ranked.
  2. In every case there are winners and losers. Some sites benefit from algorithm updates. Some sites lose big time! Generally speaking, the sites that follow our advice tend to win. That’s because we strive to alert you about the trends that Google is favoring or about to start favoring. But more importantly, we warn you about the so-called “winning” strategies that are about to fall out of favor so you can phase them out in time before you get hit with one of Google’s updates.

Of course, we don’t control Google, we aren’t actually privy to their behind-the-scene plans. But we’ve been doing this since 1997 and our track record is excellent at predicting what Google will “hate” next. As a general rule, we figure that…

If Google CAN do something and it serves their interest to do it, then they WILL do it.

So far this logic has proven perfect. It makes it easy to predict what they will do, even if we can't tell you exactly when they will pull the trigger.

So, when we warn you (via your SEN Membership) about some SEO strategy that is currently working, and tell you to phase it out, then you had better listen.

And that’s the best advice you will find in this entire book!



Chapter Eight
Off-site, External Ranking Factors

In Chapter Six we covered the ON-site ranking factors. You should always remember that each of the on-site ranking factors should be included and optimized in order to achieve a baseline ranking advantage over your competitors. If you neglect the on-site ranking factors, your web pages will be disadvantaged out of the starting gate in terms of ranking well.

Regardless, you should also remember that it’s the OFF-site ranking factors that will give your web pages the biggest ranking boost. So, once you’ve optimized the ON-site elements that you actually have direct control over, you must shift your focus to the most important OFF-site ranking factors that include:

  • External Links – links coming from other websites
  • Anchor Text in External Links
  • Link Diversity
  • Domain Trust & Authority
  • Author / Publisher Identification
  • Geo Location Signals
  • Traffic Signals (Toolbars/Browsers/Tools)
  • Quality Signals (Bounce rate/Time on site, etc.)
  • Citations / Reviews
  • Social Signals
  • External Links

External links are the links coming from other websites that are presumably out of your direct control. Google places a lot of relevance on external links because they are hard to manipulate. They rightly figure that websites typically will only link out to other websites when their content is valuable and unique. Therefore your goal should be to create and provide content that compels other sites to link to it. Here’s a great place to start looking for ideas:

  • Anchor Text in External Links

As you have already learned, anchor text is simply the text found within a link. Search engines view anchor text as a strong relevance indicator. They figure that what others think your page is about (based on the anchor text) is more important than what your page actually says it's about. Therefore, external link anchor text is more important, ranking-wise, than any of your on-site relevance factors. After all, if an outgoing link on an external page says: Green Canyon Whitewater Rafting Adventures, then Google can be pretty sure about the content of the page it's pointing toward.

Take note that naturally occurring incoming link profiles will typically have only a small number of keywords in the anchor text. Most will simply include the URL or domain name or company name or brand. If your inbound links contain too many targeted keywords, they’ll look unnatural and may trigger a penalty. Using our example above, a natural incoming link profile might look something like this:
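Such a profile might include anchors along these lines (the domain and anchor wording are hypothetical):

Green Canyon Whitewater Rafting Adventures
www.greencanyonrafting.com
Green Canyon
this rafting outfitter
click here
greencanyonrafting.com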

…and so forth. Notice that none of the links are identical, as one would expect from links that occur naturally and without compensation or other mutual considerations. It is ok, however, if some of the incoming links repeat from time to time. But if most or all of the links were Green Canyon Whitewater Rafting Adventures, then Google would believe the beneficiary site (i.e., Green Canyon) was in control of those links and would therefore discount them or even penalize the site for having an unnatural incoming link profile.

So, to maximize the value of links pointing at your site, the anchor text should be variable and NOT appear to be reciprocal or paid for. Working your best keywords into your domain name and company name, however, can be advantageous because Google attaches relevance to domain names and LOVES company names and brands. Such a strategy gives other sites a legitimate reason for using your targeted keywords in the anchor text of their outgoing links.

  • Link Diversity

The more diverse your incoming links, the better. To use a simple example, let’s say that all of your links are coming from only a few sites and they’re all pointed at your home page. Since there isn’t much diversity, and because they aren’t linking to your sub-pages, these links won’t help your rankings much.

[Illustration: a non-diversified incoming link profile, with a few sites all linking only to the home page]

On the other hand, if your site has a variety of external sites all pointing at your site, with many of the links pointing at sub pages, then your rankings will be helped considerably based on your link diversity. The screenshots above and below illustrate a simplified representation of non-diversified vs. diversified incoming link profiles.

[Illustration: a diversified incoming link profile, with many sites linking to the home page and to sub-pages]

Suffice it to say that you should strive to acquire a diversified incoming link profile from a diversified variety of sites, coming from a diversified variety of locations, using diversified anchor text, referenced by diversified content (social media, news sites, reviews, directories, blog posts, trade articles, etc.), by diversified authors and diversified publishers. As you can see, the more diversity, the better. Search engines really like a diversified incoming link structure. It fosters trust and is therefore a very strong relevance indicator.

  • Domain Trust

Domain Trust is a very important element of the ranking algorithm. Your goal should be to acquire links from domains with a high Trust-rank. Conversely, you should avoid getting links from sites with a low Trust-rank.

Obviously you need links, but even more important is getting links from the right places. For example, if your company sells nutritional supplements, having a link from NASA.gov (a site with a high Trust-rank) will help your rankings FAR more than having a link from a site like cheap-acai-berry.com. The NASA.gov link will add trust to your site while the cheap-acai-berry.com link may even subtract trust.

Earlier you learned about seed sets. These are starting places for search engine robots that crawl the web. The bots start there because seed-set sites are highly trusted. Therefore, a link from a trusted seed-set site is an excellent link. But if you can’t (yet) get a link from a seed-set site, then having a link from a site that has a link from a seed-set site is a very good link to get. It’s not as great as a seed-set link, but it’s a good link because it carries a fair amount of trust based on the fact that it is linked-to by a seed-set site. Such a site will also pass along to you a fair amount of trust.

As you move farther away, link-wise, from a seed-set site, trust decreases because the likelihood of spam increases with each hop. Reverse Trust-rank is also a reality. That’s because search engines measure how many links away from a bad site you are. The closer, in links, your site is to a bad site, the more trust your site loses. Therefore, you need to know that linking to bad sites can decrease your own site’s trust. So the take away message here is:

Be careful who you link to!

To a lesser degree other Trust-rank elements include:

  • Domain age; the older the better – Search engines tend to trust established domains with a track record of playing by the rules.
  • Google Analytics – Your website server stats give Google an abundance of information about your site that can indicate your trustworthiness.
  • Hosting Company – Some website hosting companies have a reputation for sleaziness that will hurt your Trust-rank. You should avoid hosting with companies that host sites of questionable trust, just like you should avoid living in a bad neighborhood.
  • Postal Address – Search engines can easily determine a quality, upscale address from a skid row or drop-box address. It’s better to use a good neighborhood’s physical address than a post office box or demographically questionable address location in your company contact information.
  • Phone Numbers – Remember that phone numbers are unique identifiers. They are easily connected with everything they have ever been associated with. If a phone number is associated with business dealings of dubious endeavor, this can hurt your site’s Trust-rank.
  • Interconnectivity – Always remember that search engines are in the business of connecting all of the data-dots. If your business is mentioned, cited, reviewed, associated, reprimanded, and so forth, the engines "see" this. The esteem, or lack thereof, associated with your website, regardless of where it occurs, does not go unnoticed. Who your website and company are associated with says much about your trustworthiness. All of these interconnectivity signals can and will factor into your Trust-rank.
  • Domain Authority

Domain Authority is somewhat like PageRank, but on a domain-wide level. Factors that contribute to Domain Authority include:

  • High Link Diversity
  • Rate of Link Acquisition
  • Lots of Deep Links to a variety of pages

A domain with high Authority can often get a new page ranked very well, very quickly with only a few internal links. For example, CNN.com can rank a fresh page immediately based on their Trust and Authority.

  • Geo Targeting Signals

Geo Targeting is another element that is factored into the search engine algorithm for reasons that, in many cases, are simple to understand. For instance, someone in Seattle searching for a restaurant probably is looking for a local result. It’s unlikely that a Boston restaurant would be relevant to this specific search for this specific person.

Therefore, as you might expect, the search engines look for geo-targeting signals that assist them in determining what’s relevant for any particular search request. Such signals might include:

  • Top Level Domain Name (.com, .de, .co.uk)
  • Language
  • Geo-location of incoming links
  • Host / IP Address
  • Webmaster Tools’ Geo Targeting Setting
  • Google Places Registration
  • Domain Registration Postal Address
  • Location of Visitors / Google Analytics and other sources.
  • Images may be Geocoded
  • Traffic Factors and User Behaviors

Google and Bing are known to monitor the aggregate behavior of searchers after clicking a link in the search results. And there is every reason to believe they factor such behavior into their algorithm. Specifically, they utilize the tools at their disposal — Google Analytics, Google Toolbar, Bing Toolbar, Internet Explorer, etc. — to determine relevancy based on:

  • Time On Site — The more time a visitor spends on a site, the more relevant that result is to the search that brought them to that page.
  • Bounce Rate — If a searcher clicks a link in the search results but quickly returns from the site (bounces), then Google assumes the result was NOT relevant to the search. And, any site with a consistently high bounce rate is assumed to be an irrelevant site.
  • Click Through Rate (CTR) — If a particular search result is frequently clicked, then it has a high click through rate (CTR). This is a good thing and adds relevance to that page.
  • Conversion Rates — If visits to a page consistently culminate in conversions (i.e., sales, sign-ups, registrations, etc. as measured by Google Analytics), then obviously the page is relevant to the search.
  • Return Visitors — If a high percentage of visitors return to a page, then the page must be relevant to the search that originally brought the visitor to that page.
  • Citations

Citations are external site references to your business, address, phone number or other unique identifiers. They aren’t links. Instead they are mentions that can be distinctly tied to your website and/or company name and used to identify some aspect of quality or association related to your operation, service, product, or people.

Citations are currently influencing the rankings in the Local Pack as much as Reviews and On-page content. That’s why they’re SO worth getting. Use the following resource for guidance on the best way to go about building citations.

  • Social Media

Social Media (i.e., Facebook, Pinterest, Instagram, Twitter, etc.) can be a very powerful traffic generating and brand building tool. Google has made it clear that they don't use social signals within their algorithm, but evidence shows that increased traffic to a page does boost its overall authority.

The following tutorials will help you efficiently leverage the power of social media and save you a lot of trial-&-error time. (requires SEN Membership login).



Chapter Nine
Negative Ranking Factors

As we’ve alluded to throughout this book, there are situations, errors, elements, relationships, and strategies that can work against you, ranking-wise. Collectively, we call them Negative Ranking Factors

The most common Negative Ranking Factors include:

  • Hidden text and/or links
  • Keyword Stuffing
  • Cloaking
  • Buying links from link networks/brokers
  • Getting links from bad sites
  • Linking to low quality or bad sites
  • Over-optimized anchor text
  • Low quality content
  • Redirecting the user with the intent to mislead
  • Server down or connection problems with website
  • Hidden Text and/or Links

Remember in Chapter One we mentioned how webmasters used to stuff web pages with keywords in order to manipulate search rankings? Part of that strategy included hidden text and links that the spiders could see in the source code even though people could not. Well, search engines do not like content that only spiders can see. It's one of the surest ways to get penalized.

For example, take a look at the box below. Inside is a bunch of text and a single link that matches the background color of the page, rendering it invisible. And because the font is set to super-small, we’ve managed to stuff 1280 “invisible” keywords plus one link into a very small space. You can highlight the text inside the box, then copy and paste it into a text editor to see for yourself what a search engine spider would see.

keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword 
keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword 
keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword keyword <a href=’link.com’>Link</a>

The border is there for your convenience, to make it easier to find the text. In the old days, we’d simply make this our header and footer (without the border) and, presto, we’d have a number one ranking within a day or so. Today, we’d get penalized or possibly even banned from the index.

  • Keyword Stuffing

The example above (inside the bordered box) represents a most egregious form of keyword stuffing. It’s pretty obvious that you shouldn’t do that. But a little less obvious is the fact that you should avoid superfluous overuse of keywords in your body content, headline tags, title tags, meta tags and anchor text.

If Google thinks you’re overusing your keywords, they’re likely to penalize your web page’s ability to rank well. Furthermore, such a violation of their guidelines will probably affect your entire site’s ability to rank well. Use this test: Read your content aloud, if it sounds a little weird like you are repeating any particular keyword(s) too often, then you probably are.

  • Cloaking

Cloaking refers to an old SEO technique where the content that’s presented to spiders is different from the content a site visitor would see when visiting the same URL.

This is made possible when the website's server identifies a visiting spider, typically by its known IP address or its user-agent string. When that spider arrives to crawl and index the page, it is served a special page designed specifically for that spider instead of the page a normal site visitor would see. In such cases the normal page is said to be cloaked, since the spider never sees it.
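
To make the pattern easier to recognize (NOT to recommend it!), here's a bare-bones sketch of user-agent cloaking using Python's Flask framework. The route and page contents are hypothetical.

    # Illustration only: this is the pattern Google penalizes, shown so
    # you can recognize cloaking, not deploy it.
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/")
    def home():
        ua = request.headers.get("User-Agent", "")
        # Cloaking servers key off known crawler user-agents (or IPs).
        if "Googlebot" in ua:
            # The spider is fed a keyword-stuffed page visitors never see.
            return "<html><body>keyword keyword keyword ...</body></html>"
        # Human visitors get the normal page, which is 'cloaked'
        # from the spider.
        return "<html><body>The page real visitors see.</body></html>"

    # app.run()  # a sketch, not something to deploy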

One reason for cloaking was to help spiders understand Flash content. In the past, this was considered useful because spiders couldn't easily index Flash. However, cloaking has also been used to manipulate search rankings. For this reason, Google frowns on all forms of cloaking.

Our recommendation is that you avoid cloaking entirely. And, fortunately, there is little need for cloaking these days. The evolution of Responsive Web Design, coupled with the fact that spiders have gotten much better at indexing all forms of content, means there is no good reason to use cloaking as an SEO technique.

  • Buying links from link networks/brokers

Since much of Google’s algorithm is based on naturally occurring incoming links, it stands to reason they would not like links that are presented as “natural” but, in fact, are paid links. Mind you, they don’t care if you buy links, per se, but they strenuously object to any attempts to make those links look like they are naturally occurring, organic, unpaid-for links.

So, I hear you asking: How can they tell the difference? …good question. Here’s how. For starters, Google is in the business of identifying networks and relationships. It’s what they do, and they are VERY good at it! When someone is selling links (i.e., a link broker), Google quickly figures them out by seeing what looks like an unnatural network of link relationships. In addition, the anchor text will often be over-optimized with targeted keywords that make the links look contrived.

The bottom line is that you should avoid buying links UNLESS you ensure that:

  1. The links are nofollowed, telling Google that you do NOT expect to receive any link juice (PageRank) from those links. This satisfies their terms of service (TOS), and in such cases paid links are OK.
  2. Or, there is no possible way that Google could learn the links are paid (good luck with that, btw).

Remember that Google knows about ALL of the brokers, ALL of the networks, and ALL of the websites that sell links. It’s their business to know. If it is mathematically logical for them to know it, they either know it already or else soon will know it. We have never found an exception to this rule.

  • Getting links from bad sites

Earlier we talked about algorithmic factors like PageRank, Trust, and Authority. You've learned that having links from pages with high PageRank and domains with high Trust and Authority can help your web pages rank well even if you have only a few such links.

And, you've learned that links from low PageRank, Trust, and Authority sites will not help you nearly as much ranking-wise. Well, in fact, having links from sites that violate Google's Webmaster Guidelines or Terms of Service can actually hurt your rankings. That's because Google views such sites as bad sites, and the sites they link to as bad neighborhoods. So, as you might imagine, being associated with a bad site or a bad neighborhood will kill your site's ability to rank well.

Therefore, it’s important to pay attention to who is linking to you! You should strive to acquire ONLY links from reputable sites that are trusted in the eyes of Google. This is a quality over quantity issue that you should take very seriously in your quest for top rankings.

  • Linking to low quality sites

Of course you should also be very careful who YOU link to. Not only are you passing PageRank, you are effectively recommending them. If Google sees you linking to (i.e., recommending) sites that violate their guidelines or TOS, then Google will assume you're endorsing dubious practices, and that will hurt your rankings.

  • Over-optimized Anchor Text

As previously mentioned, there can be too many targeted keywords in a link. Such links can work against your ranking efforts when Google detects them. That's why you should do your best to acquire a natural-looking incoming link profile that avoids stuffing all of your best keywords into all of your incoming links.

  • Low Quality Content

It's better to have a few pages of high quality than a lot of pages of low quality (duh!). Make sure you avoid anything that looks computer generated or that simply rehashes content found elsewhere on the web. Your goal should be high quality, unique, and compelling content that other sites will want to link to.

  • Redirecting the user with the intent to mislead

Google will seriously penalize (or ban) any site that misleads users. Therefore it stands to reason they don’t like pages or sites that use redirection to trick visitors into landing on a different page than the one Google indexed. So, don’t do that, ok?

  • Server down or connection problems with website

Google expects that your site will provide a high quality user experience. Glitches and downtime are the antithesis of a good user experience. Therefore you must see to it that your web hosting service provider is reliable and that your pages are not broken, missing, or otherwise contrary to what Google expects them to be based upon what their spider has indexed.
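
Even a simple script can alert you to downtime before Google's spider finds it. Here's a minimal availability sketch, assuming Python's `requests` package and a hypothetical list of your own key URLs; a real setup would run it on a schedule and alert you on failures.

    import requests

    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/products.html",
    ]

    for url in PAGES:
        try:
            resp = requests.get(url, timeout=10)
            flag = "OK" if resp.status_code == 200 else "CHECK"
            print(f"{flag}: {url} returned {resp.status_code}")
        except requests.RequestException as err:
            print(f"DOWN: {url} ({err})")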

Summary

Your first goal in SEO should be to avoid making any of these dumb and avoidable mistakes. If you fail here, it won't matter how good you are at any of the other strategies. So, priority-wise, you must:

  1. First, avoid the mistakes that can get your site penalized or banned.
  2. Second, get your ON-site elements in place and optimized.
  3. Third, focus on continuously sharpening your off-site strategies.

And always remember: if you piss Google off by making these dumb mistakes, it may be a very long time before anything else you do right with your SEO matters.



Chapter Ten
Link Building

Acquiring incoming links from external, high quality websites is of the utmost importance. It's also the most challenging aspect of search engine optimization (SEO). Where you should get links, and how many you need, depends entirely on which websites you're competing with for top rankings. You don't have to be perfect in every sense, just closer to perfect than whoever you're competing with. So, the answer is always relative and different in every case.

Although we expect links will remain one of the most influential ranking factors for the foreseeable future, exactly how the search engines value links will keep evolving as social media, profiles, and authorship play a growing role in establishing authority and trust.

Let’s start with the basics. Below you see an illustration of the simplest and most common link relationship, a one way link.

[Illustration: a one way link]

The next illustration is a view of another common type of link relationship, the reciprocal link.

[Illustration: a reciprocal link]

Both the one way link and the reciprocal link pass PageRank (link juice) to the site being linked to. And, since Google and the other search engines base a large part of their ranking algorithms on these two types of link relationships, such links hold the potential to significantly boost your rankings.

Sometimes, however, there are link relationships that should be ignored by the search engines in order to avoid wasting PageRank or running contrary to Webmaster Guidelines or Terms of Service. In such cases the nofollow attribute or robots.txt file can be used to protect your site from diluting its link juice or violating the rules.

[Illustration: the nofollow attribute]

The illustration above describes a link relationship that the search engines will ignore, but without sacrificing the possibility of the link driving traffic to your site.

You already know that anchor text refers to the text found within a text link. There are several different ways that natural looking anchor text is typically displayed, and some are better than others:

  • <a href='http://domain.com/page.html'>Keyword</a>
    Using your keyword as anchor text is the most powerful type of link. However, be careful not to overdo this. Too many keyword-optimized anchor text links can make your link profile look unnatural and thereby trigger a penalty. We recommend that you limit this type of incoming link to 20% or LESS of your incoming link profile.
  • <a href='http://domain.com/page.html'>Brand Keyword</a>
    Using a brand name with a keyword as your anchor text is powerful and also safer because Google LOVES brands.
  • <a href='http://domain.com/page.html'>www.domain.com/page.html</a>
    Using the URL as your anchor text is less powerful but very safe. There should always be a few of these in your incoming link profile to make it look natural.
  • <a href='http://domain.com/page.html'><img src='http://domain.com/img.jpg' alt='Keyword'></a>
    Using an image link is also good as well as very safe, but it is not as powerful a relevancy factor as a text link. That's why you should always remember to use a relevant keyword in the image's file name as well as in its Alt text.
  • <a href='http://domain.com/page.html' rel='nofollow'>Keyword</a>
    Using nofollow means the link will be ignored by search engines except on a few highly trusted and authoritative sites like Wikipedia.

The general rule of thumb is: if a link is visible, it will be taken into account by the search engines unless it is excluded in robots.txt or assigned the nofollow attribute.

Remember also that Google can read, index and will likely count links in all sorts of file types like: .html, .swf, .pdf, .ps, .dwf, .bas, .c, .kml, .gpx, .hwp, .java, .xls, .ppt, .docx, .odp, .ods, .odt, .pl, .py, .rtf, .svg, .txt, .css, .ans, .wml, .wap, .xml.

As you're no doubt seeing, the key to having a natural looking incoming link profile is to mix it up. You should avoid a large percentage of exact match keyword (EMK) anchor text. You should also avoid participating in link schemes. These include the so-called link wheels, link exchanges, text-link networks, and anything that might be detected or interpreted as paid links (unless they are nofollowed). You should also avoid getting links from off-topic, irrelevant sites, especially in high numbers.
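
If you want to eyeball your own anchor text mix, a rough bucketing script will do. Here's a hedged Python sketch; the keyword, brand name, and anchor list are all hypothetical, and in practice you'd export the anchors from your backlink tool of choice.

    # Bucket incoming anchor text to see whether exact-match anchors stay
    # under the ~20% guideline suggested a few paragraphs back.
    from collections import Counter

    TARGET = "soft shelled crabs"
    BRAND = "crab shack"

    anchors = [
        "soft shelled crabs", "Crab Shack", "www.example.com",
        "click here", "Crab Shack soft shelled crabs", "soft shelled crabs",
    ]

    def bucket(anchor):
        a = anchor.lower()
        if a == TARGET:
            return "exact-match keyword"
        if BRAND in a:
            return "brand"
        if a.startswith(("http", "www.")):
            return "bare URL"
        return "other"

    total = len(anchors)
    for kind, n in Counter(bucket(a) for a in anchors).most_common():
        print(f"{kind}: {100 * n / total:.0f}%")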

Link locations matter as well. The illustration below maps out the prime locations for your links.

[Illustration: link locations matter]

As the graphic shows, links found within the Header and the Body Content are assigned more importance than links found in sidebars and footers based on Google’s 2010 Reasonable Surfer patent that correlates link relevance with link visibility.

The location of a link within the HTML code of a web page matters. Specifically, it is ONLY the first link to a site that passes PageRank. The rest of the page’s links to the same site, whether they are in a list (as seen below) or scattered throughout the page, usually do not pass PageRank.

[Screenshot: link code location matters]

As the screenshot above suggests, PageRank will pass to the page listed in the first link. All of the secondary links are ignored in terms of PageRank unless a fragment identifier (the # portion of a URL, indicating a specific location within a page) makes the link point somewhere new.

This also means that if you have multiple links to the same page on your site, the first link (from top to bottom in the source code) is the link from which Google will use the anchor text to help determine relevancy when ranking your page.
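
Here's a hedged Python sketch of that "first link counts" idea, assuming the `beautifulsoup4` package. It walks a page's links in source order and keeps the anchor text of only the first link to each destination; the sample HTML is hypothetical.

    from bs4 import BeautifulSoup
    from urllib.parse import urldefrag

    html = """
    <a href='http://domain.com/page.html'>Soft Shelled Crabs</a>
    <a href='http://domain.com/page.html'>click here</a>
    <a href='http://domain.com/page.html#recipes'>Crab Recipes</a>
    """

    first_links = {}
    for tag in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        url, fragment = urldefrag(tag["href"])
        key = (url, fragment)  # a #fragment counts as a distinct target
        if key not in first_links:
            first_links[key] = tag.get_text(strip=True)

    for (url, fragment), anchor in first_links.items():
        dest = url + ("#" + fragment if fragment else "")
        print(f"{dest} -> counted anchor: '{anchor}'")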

Based on the various patents that Google and the other engines own, there is every reason to believe that the following factors either do, or soon will, influence the value of outgoing/incoming links in the eyes of the search engines. For instance:

  • Font Size — Normal font size looks right while super-tiny font does not. If any link font-size is considerably smaller than the text surrounding it, that link will probably be discounted.
  • Location — The page location of a link is important. As previously mentioned, links found in headers and body content are viewed with more importance than links found in sidebars and footers.
  • Position of Link (top/bottom) within a list (<li>) — You should expect that links found at the top of a bullet list will be the most valuable. The rest may be ignored in terms of PageRank and anchor text.
  • Font Color as compared to background — If the font color matches the background too closely the link will likely be discounted. This is because the link will be invisible to people and only seen by spiders, a strategy used in the past to spam the search engines.
  • The number of words in anchor text — Stuffing too many words into your anchor text can make it look unnatural and will likely work against you.
  • Actual words in anchor text — Over-optimizing with too many targeted keywords can also work against you.
  • How commercial the anchor text is — Words like Buy, Shop & Cheap tell the search engines that your content is commercial rather than informational and may cause the link to be somewhat devalued.
  • Type of link — Typically a text link will carry more value than an image link.
  • Aspect Ratio of image link — If the dimensions of the image look suspicious, then the value of the image link will be discounted. An example would be a 1-pixel-square image, which would be effectively invisible to people and seen only by spiders.
  • Nearby word content and topical cluster associated with the link — The context and topic cluster of the text that the link is associated with can influence the weight given to the link. Topically associated text is good, topically unassociated text is not good.
  • The URL of the linking page — Having the keyword in the URL of the page that is linking to you can boost the importance of that link.
  • Number of links on the page — If a page links to you with their only link out, then your page will benefit from 100% of the available link juice (PageRank) being passed. However, if that page also links out to 10 other pages, then your page will receive only about 10% of the available PageRank. The fewer links on the pages that link to you, the better.
  • Content found in specific places on a page — The content found in the title and headline tags as well as the body content in the upper portion of the document (i.e., above the fold), helps determine how important an outgoing link will be viewed.
  • User Behavior (Toolbar Data) — Behavior like bookmarking, revisiting, page sharing, etc. influences the importance assigned to a link. And whether or not people actually click the link, matters.
  • Language — can also be a factor, inasmuch as outgoing/incoming links would be expected to match the language of the page they're pointing toward.
  • Click Rate of Link — As previously indicated, how often people click a specific link matters. The more the link is clicked, the more important the link.
  • Page Authorship & Social Authority — Links coming from well known, registered, and socially networked authors are likely to be given more importance than links coming from unregistered, unknown and/or socially unconnected authors.
  • Link acquisition rate is also an important factor. Google figures that natural link profiles are built gradually, over time.

    [Illustration: link acquisition rate matters]

    If you suddenly acquire a disproportionate number of links, that signals a possible link buy and raises red flags. As you know, Google frowns on purchased links unless they are nofollowed. Therefore suddenly acquiring a lot of links can actually work against your ranking efforts.

  • Remember also that your incoming anchor text must vary in ways that make your incoming link profile look natural.

    [Illustration: a natural link profile]

    Links that point deep into your site’s subpages lend the appearance of quality content and help make your incoming link profile look natural.

  • Be sure to focus on Canonical URL consistency.

    [Illustration: canonical URL consistency]

    The graphic above illustrates how PageRank can be needlessly diluted when using more than one of the many different versions of your home page URL, even though all of them will land a site visitor on what is apparently the same page. The ONLY way to avoid such PageRank dilution is to choose one canonical (preferred) URL and see to it that you and others consistently link only to your canonical URL.
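
Here's a hedged Python sketch, assuming the `requests` package and a hypothetical canonical home page, that verifies the common URL variants all redirect to your one canonical URL so PageRank isn't split.

    import requests

    CANONICAL = "https://www.example.com/"
    VARIANTS = [
        "http://example.com/",
        "http://www.example.com/",
        "https://example.com/",
        "https://www.example.com/index.html",
    ]

    for url in VARIANTS:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        verdict = "OK" if resp.url == CANONICAL else "DILUTING"
        print(f"{verdict}: {url} -> {resp.url}")

Any variant that doesn't end up at the canonical URL is a candidate for a 301 redirect.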

Your BEST Link Building Strategies should include:

  • Trust-rank and Authority. Ideally you want links coming directly from trusted domains. If that is not (yet) possible, then your goal should be to get your links from sites that have links from trusted domains. The closer a site is, link-wise, to a trusted domain, the better. The further away a site is, the less the link is worth.
  • Links from on-topic (semantically related) hubs and authorities are highly valuable – If your site sells sewing equipment then you want links from popular sites whose topics include everything related to sewing – sewing instruction, sewing patterns, garment material suppliers, and so forth (based on the Hilltop Algorithm).
  • Links coming from so-called restricted top level domains (TLDs) tend to carry more trust. These include educational institution (.edu), governmental institution (.gov), and military organization (.mil) domains.
  • Links coming from high PageRank pages will also help your rankings considerably. Remember, however, that PageRank is only one of many indicators of a page's importance, trust, and authority.
  • Unique Domains vs. Total Volume. Google favors pages with a fair number of links coming from a wide variety of domains over pages that have lots of links from only a few domains. The more domains that link to you, the higher your site's level of trust.

    It's worth mentioning that Bing, however, is not nearly as advanced as Google. They tend to favor more links regardless of where they come from. So, one could say that Google is more quality oriented while Bing seems more quantity oriented. But since Google is the 800lb gorilla and Bing is the 90lb weakling, we recommend that you focus on creating a link profile that caters to Google over Bing.

  • Deep Link Ratio. As previously indicated, deep links can dramatically improve your site's ability to rank well. They can also help your site get completely indexed. These are links that are NOT to your home page, but rather to the various pages within your site. It's worth noting that highly successful shopping sites run deep link ratios upwards of 80%, meaning that 80% or more of their incoming links point at subpages. (A quick way to measure your own ratio appears just after this list.)
  • Hyperlocal Links. If your business depends on acquiring local customers within a geographic area, it can be highly beneficial to get links from sites that endorse local businesses. These include sites like the Chamber of Commerce, the Better Business Bureau, local trade associations, local directories, schools, hotels, restaurants, and community action groups.
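
And here's the promised deep-link-ratio sketch, in Python: the share of incoming links that point anywhere other than the home page. The backlink target URLs are hypothetical; export your real ones from a backlink tool.

    from urllib.parse import urlparse

    backlink_targets = [
        "https://www.example.com/",
        "https://www.example.com/products/crab-traps.html",
        "https://www.example.com/blog/soft-shell-season.html",
        "https://www.example.com/recipes.html",
    ]

    # A link is 'deep' if its path is anything other than the home page.
    deep = sum(1 for u in backlink_targets
               if urlparse(u).path not in ("", "/"))
    print(f"Deep link ratio: {100 * deep / len(backlink_targets):.0f}%")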

The Four Biggest Mistakes to Avoid in Your Link Building Efforts

  1. Run-of-site links — Having your incoming link on every page of an external site is BAD! In the eyes of Google, it’s a sure sign of a paid-for link. Such run-of-site (ROS) links should be carefully avoided.
  2. Link Farms, Reciprocal Link Networks, Web Rings, Paid Link Networks, and Link Wheels — aka, Link Schemes should be avoided like a leaky boat in a swamp full of alligators.
  3. Linking to low quality sites (aka, bad sites) — examples include topical sites such as gambling, adult, pharmacy, loan, and debt consolidation sites — any site that promotes controversial topics or products, or which uses dubious SEO strategies.
  4. Linking to off topic sites — is a bad idea since going off-topic is the antithesis of relevancy. If your site is about sewing, you should only link out to sites that relate in some way to the endeavor of sewing: clothing, garments, patterns, crafts, sewing equipment, sewing lessons, sewing tips, and so forth. Linking out to your webmaster's design company, your local real estate broker, your brother's vacation rental, or a political action site is a bad idea, since all such sites would be considered off-topic with respect to sewing. But if a situation ever compels you to link to an off-topic site, be sure to nofollow the link.

Summary

Boiled down to its essence, link building is a popularity contest. The search engines want to rank the most popular sites at the top of the search results. Therefore you must do everything in your power to entice people to LOVE your pages because when they do they will:

  • share links to them on other web pages
  • comment about them on blog posts, forums, etc.
  • write reviews, rate products, and talk about your brand and product names
  • Like, +1, Share, Tweet, Instagram, and Pinterest photos and videos of your products or services

…and the engines use all of these indicators to determine the importance of websites and their pages. Therefore it is critical that you make it easy for the engines to “see” how popular your site's web pages are.

To get you started on the right foot building links, check out The Vault on SearchEngineNews.com, which currently has close to 20 resources on Link Building alone!



Chapter Eleven
Building Your Network & Finding the Right Tools

You've made it this far. Congratulations! You've gained an excellent foundational understanding of SEO.

High Five!

As you can see, SEO is complicated. But if you think about it, that’s kinda great because it works to your advantage. Here’s why.

  • If you're a business owner, you now have a big advantage over your competitors. And whether you do the SEO yourself (not recommended, because you'll be taking time away from actually running your business) or you hire it done (highly recommended), you'll KNOW what to expect and you'll know when they're doing it right.
  • If you’re a current or soon-to-be professional SEO, then you now have a foundational awareness of the fundamentals. This gives you an advantage over your competitors who are missing these critical insights and forms a solid foundation upon which you can build your expertise.

But regardless of whether you’re doing SEO for your own business or doing it professionally for business clients, you now know that…

the only constant in SEO is change.

Seriously! Google changes something every month. If it isn't the guidelines, it's their terms of service … or else it's something important within their “acceptable” strategies. Sometimes it's all three, and many times it's something brand new! …or maybe it's something they've just shut down.

Regardless, keeping up on these changes can be daunting!

Honestly, if WE didn’t have a network of pros who share their research and strategies with us, and quickly alert us to new problems and features, we’d be struggling to keep current with all of the changes too!

But, we have solved that problem!

…at least for ourselves. And, we’d like to help you solve that same problem for YOURself.

How?

By inviting you into our network … what we call…

The SEN MasterMind.

Actually, it’s a system which includes…

  1. The SEO Fundamentals Bootcamp which ensures that you’re on the same page as the rest of us. Say goodbye to GUESSING about the important SEO stuff that Google expects you to get right – or else!
  2. An exclusive chat-space where you can post questions 24/7 and get answers from our network of professionals which includes our SEN experts. Sometimes it only takes minutes to solve your most baffling problems.
  3. FastAnswers™ to private questions, delivered via email within one business day by one of our certified SEO experts, each with a decade or more of experience.

This 1,2,3 approach to being SURE you’re getting your SEO right is as good as it gets. It’s like having the collaborative team of an Agency but without the high overhead!

 Start here…

The 2018 SEO Fundamentals Bootcamp, beginning on August 21st, covers the 99 Fundamental Ranking Factors and Strategies that apply to Google’s algorithm.

 Your Next Move…

Whether you connect with us through our Fundamentals Bootcamp and become part of The SEN MasterMind or not, your next move should be to get yourself into a trustworthy network of professionals and business owners who “do” SEO in the wild so they can help you with problems and challenges as they occur in real time.

Consider this to be your most essential action. It's not realistic to think you can know everything all by yourself. Much of our success depends on adjusting to changes in real time and adapting our strategies for unique circumstances. We certainly know from experience how dependent we've become on our own braintrust to always find the best solution. Indispensable!

So your next move is to build your braintrust. That's what we've done for ourselves with The SEN MasterMind — it's our braintrust of people who do SEO for their own businesses or professionally for business clients. Here's the best way to get in.

 Plus, Monitor The Industry Publications

It’s important to stay informed through the industry publications. Google’s blog announcements, guideline changes, and the workshops & events they publish on their YouTube channel are especially important.

There are also a number of forums as well as a plethora of SEO publications and “experts” chiming in on a variety of topics. Some good, some not-so-good. We won’t lie to you, keeping up on everything, while sifting through the junk to find the gems, is a BIG JOB.

In fact, it's such a big job that we have a staff dedicated to finding all of the important stuff and breaking it down into a comprehensive monthly digest — everything you need to know for the entire month, published all on one page, which SEN members receive on the 1st of every month. They avoid being distracted by a daily bombardment of superfluous noise. And they have much more time to actually run their businesses because they don't have to monitor everything on a daily basis. We do that for them.

FYI, we call our SEO digest SEN (Search Engine News), and we include it as one of your benefits when you join our network. In case you're wondering, we've been publishing SEN on the first of every month since early 1997, so you can bet we're pretty good at it.

 Btw, You Can’t Do SEO Without Tools!

SEO is a tool-intensive endeavor. You can't do SEO without tools. It would take too long, even if you could do it. To illustrate this fact, let's take a look at a short list of tasks for which we use the tools of our SEO trade. These are all essential tasks, listed in no particular order of importance:

  • Find all of the coding errors on your site so that Google doesn’t penalize you for being in violation of their terms of service (TOS). Errors on web pages will get you buried in the search results. Google expects your pages to be absolutely error free. So, we have a tool for that.
  • As you know, keywords are important. So we have a tool that tells us exactly which keywords a web page is most optimized for. Besides analyzing our own web pages, we use this tool to quickly learn what keywords our competitors are targeting.
  • Obviously, LINKS play a critical role in determining whether or not your Web pages are highly ranked.

    That’s why we have link tools that tell us how many links we (and our competitors) have. Link tools that identify the best sites from which to get links, and tell us specifically which pages within a site are the most powerful pages to get links from.

    Our link tools also perform backlink quality checks, competitor backlink monitoring, and backlink integrity tracking. They help us find broken links and detect what Google calls unnatural links and toxic links for which we’ll be mightily penalized unless they’re fixed.

    From a practical standpoint, link prospecting and link analysis are impossible without these tools. We wouldn't know which links are helping and which ones are hurting our SEO efforts. Nor would we be able to compare our own link structures to our competitors', which would put us at a great disadvantage.

  • Keeping track of your web page rankings is another important function that should be automated. That's why we have a tool that gives us periodic ranking reports. We also have tools that measure and monitor our Web Buzz, Social Engagement, Brand Mentions, and Social Traffic metrics. Without these tools to monitor our progress and tell us whether our SEO efforts are paying off, we'd be flying blind and guessing.
  • These days, local businesses need a LOT of SEO help. And because Local Search is currently such a hot market, we use another tool that’s designed to locate unverified local business listings in a particular geographic area.

    This helps us quickly find Local prospects who need help complying with the increasingly complicated local search requirements and web-presence guidelines, rules, and ranking strategies. Using this tool makes it easy to find prospects we can turn into clients.

  • We even have an SEO Checklist tool that verifies and validates every optimization stage and type of activity, enabling you to perform SEO effectively even if you're new to the process.

The bottom line is:

If your competitor is using these tools, and you aren’t, your competitor will win.

You simply can’t do SEO without tools. Whether you use our tools or another toolset, you must have access to SEO tools to do SEO.

 Summary – To “Do” SEO, you need…

  1. A firm grip on the history and the fundamentals of SEO.
  2. A braintrust, or some kind of network of like-minded professionals who ask questions and share findings with the group in real time via an open communication channel that is available 24/7.
  3. To monitor the industry publications, forums, blogs, and experts. You’re looking to stay abreast of new product developments, guideline changes, strategy updates, and product or feature shutdowns at the very least.
  4. To apply the SEO Tools of the trade to track, monitor, and analyze the myriad functions that are critical to a site’s overall web presence, marketing, and ranking strategies.

So, where can you find all that?

 Start here — The 2018 SEO Fundamentals Bootcamp where you’ll find all of these components neatly tucked into an all-inclusive package for a fraction of what it would cost you in time and money to recreate them yourself.