The Link Building Book

The History of Link Building

The role links played in Google’s growth

As we will discuss shortly, it was the invention of PageRank and the vastly improved results it led to that helped Google push ahead of their competitors very quickly. Usage of Google outside of Stanford University soon increased and, after a short period of time, Google became a living, breathing commercial entity. Google was attracting lots of users and keeping them happy. As a result, businesses (and SEO practitioners) started paying much closer attention.

Remember that some SEO practitioners had been around before Google came along in the late 1990s. Before then, Yahoo and AltaVista were leading the way. Links, of course, existed before Google, as did anchor text, and SEO professionals were certainly aware of both. One of the earliest references to anchor text I could find was from 1998.

However, Google was quickly gaining momentum, and SEO professionals turned their attention to figuring out exactly how it worked and what made it tick. The sheer power of links and anchor text in Google search results soon became apparent, showing how heavily Google relied on these signals as ranking factors.

This, combined with the first real tangible measure of a link, called PageRank, was the first step toward the link building market that we see today. In fact, PageRank most likely gave birth to the link buying and selling market, since a value (albeit a rather rudimentary one) could be put on the different types of links that were available.
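
For context, the original PageRank formula, as published by Brin and Page, gives a rough sense of how a numeric value could be attached to links (here d is a damping factor, usually set around 0.85, T1…Tn are the pages linking to page A, and C(Ti) is the number of outbound links on page Ti):

PR(A) = (1 - d) + d * (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

In plain terms, a page’s score is made up of small shares of the scores of the pages linking to it, which is why a link from a high-PageRank page was considered far more valuable than one from an obscure page.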

Over the years, many link building techniques have been “in fashion,” and were utilized by SEOs. Some worked well and some worked very, very well. For quite a long time, a link was a link and Google had a very hard time catching out techniques which were against their guidelines and spammy in nature.

It wasn’t until 2012 that we really saw some SEO techniques come under serious fire from Google.

A lot of the large-scale link building techniques (e.g., article syndication, directory submissions, blog comments) were used for a reason – they worked! And not only did they work, but they were also very scalable, and they made link building into much more of a commodity, which suited most SEO agencies. Even some of the very best SEO agencies that preferred to stay on the white hat side of the line used these techniques because they worked and posed little risk. It was certainly possible to get penalized, but smart use of these tactics – particularly on large, established websites – worked very, very well.

It was super easy to offer packages such as “500 directory links” or “1,000 article submission links” because, at worst, they might not have worked the first time, but the agency would keep trying until they did. The reality was that unless your website was brand new and had no other links at all, you were unlikely to get into trouble with Google. Even these new sites sometimes escaped problems if they were able to get just a handful of genuine, good quality links to go alongside all the large-scale stuff.

Why did Google allow this to happen?

This is a question that pretty much everyone who works in the SEO industry has asked at some point. We all knew Google didn’t want to reward these links. Many of these types of links actually ended up making the web (and the search results) a worse place – one might even call it a cesspool.

The thing is, Google always had a few principles they adhered to:

  • They preferred scalable, algorithmic methods of dealing with webspam and tactics that broke their webmaster guidelines.
  • They tended to err on the side of caution when making algorithm updates or penalties. They’d rather not push an update through if they felt that innocent websites might accidentally get penalized.

It was because of these principles that (I believe) Google stood by for so long and allowed these techniques to work. I think Google struggled to find solutions that fit with these principles. Not to mention that the webspam team has never been that big. They had a lot of other things to focus their time on which, arguably, posed more problems for users (e.g., filtering adult results, protecting users from hacked sites, etc.).

However, in early 2011, we saw Google become a lot more aggressive, and it changed the game. The Panda update was released in the U.S. in February 2011 and rolled out globally two months later. While these updates weren’t specifically targeted at link building techniques, they did signal a change in the way Google was willing to deal with spam. Lots of websites were caught in the crossfire and were given the option of telling Google if they felt they’d been hit unfairly.

At the time, Google stated that they would make no manual exceptions to Panda, but instead would incorporate feedback into their algorithms:

While we aren’t making any manual exceptions, we will consider this feedback as we continue to refine our algorithms.

Just over a year later, we saw the release of Penguin. This algorithm update went a step further and sought to actively penalize websites for “over-optimization.” We will discuss Penguin in far more detail later on, but once again, it showed how Google was prepared to be much more aggressive than in previous years.

Today, many updates later, we’re at a point where most SEO professionals steer clear of large-scale, low-quality link building techniques. The truth is that they pose far more risk than they used to, and legitimate businesses can’t afford to take that risk. If you’re messing about with a random side project or affiliate website, things are a little different and low-quality techniques could still work, but likely only for a relatively short period of time.


Historical Google updates related to links

Google pushes out thousands of search algorithm “improvements” every year. They do not publicly announce every single one. However, occasionally, they will push out a bigger than usual update that affects a lot of search results. Sometimes, these are named after the Google engineer who worked on that update (which was the case with Panda).

This section will give you an outline of all known confirmed Google updates that specifically affected the way links are used. For a full list of all updates, I’d recommend this list, which is maintained by Moz.

If you want to see if you’ve been hit by any of these updates, Panguin is a great tool which will overlay your Google Analytics data with all known Google updates.

Penguin 4.0 – September 23, 2016

Nearly two years after the last confirmed Penguin update, Google finally announced another update and confirmed that Penguin was now part of the core algorithm. It was therefore real-time and intended to target spam at a granular level, i.e., pages would be hurt as opposed to entire websites.

Penguin 3.1 – December 10, 2014

Not so much an update as a confirmation: Google stated that Penguin was still rolling out after the last major update and that it was unlikely to have a distinct end-point because it was now intended to be continuous.

Penguin 3.0 – October 17, 2014

The long-awaited Penguin update came along and appeared to be a lot less impactful than anticipated. Because it had been so long since a major Penguin update, the community expected a large impact; however, it appeared that less than 1 percent of queries were affected. As mentioned above, it was later revealed that Penguin was moving to a rolling update, which may explain the initially low impact.

Payday loan update 3.0 – June 12, 2014

Another update from Google around spammy queries, including payday loans. It was suggested that this update was targeted at queries, while the previous one was aimed at the sites themselves.

Payday loan update 2.0 – May 16, 2014

This was the second rollout of an update designed to target “very spammy queries” and was believed to hit industries such as payday loans and gambling. The reaction was mixed, with some SEO professionals not seeing any changes despite being active in these industries.

Penguin 2.1 – October 4, 2013

Matt Cutts announced via Twitter that an update affecting around 1 percent of searches was being rolled out. While this was a smaller impact than the previous update, the reaction was pretty mixed, with lots of webmasters reporting being hit pretty badly.

Payday loan update – June 11, 2013

While this probably isn’t a pure link-related update, I wanted to include it for completeness, mainly because the industries affected are notoriously spammy and often driven by mass-generated link building techniques. The essence of this update was that it targeted specific industries that were well known as targets for spammers because of how lucrative they could be.

Penguin 2.0 – May 22, 2013

There was a lot of expectation around the next big iteration of Penguin. Given the impact of the original rollout, many expected big changes when this version was released. However, based on industry feedback and Mozcast data at the time, the impact didn’t seem too severe. A post by Matt Cutts suggested that about 2.3 percent of English-U.S. searches were affected to the point where a user would notice.

Penguin 1.2 – October 5, 2012

The third iteration of Penguin was a lot smaller than many expected after a previous warning from Matt Cutts that the next release would be a big one. It actually affected around 0.3 percent of English language queries to a noticeable degree. Compare this to 3.1 percent from the first Penguin update, and you can see that the impact was less noticeable.

Penguin 1.1 – May 25, 2012

The second iteration of Penguin was very small, affecting less than 0.1 percent of English language search queries to a noticeable degree. Despite lots of speculation from several SEOs about an update before this, Matt Cutts explicitly said that this was the first actual update since the initial push of Penguin.

Penguin 1.0 – April 24, 2012

The now infamous Penguin update was specifically targeted at fighting web spam. A couple of months prior to this, Matt Cutts did warn of an upcoming “over-optimization” penalty. He did later confirm that Penguin was the update he was referring to, but clarified that the update wasn’t so much targeting over-optimization as it was targeting webspam. The distinction was made because Google still wanted to encourage good quality SEO.

Like Panda, Google seemed open to the idea that some sites might be hit by accident, although they were quick to say that the number of false positives should be quite small because the websites affected had a high probability of using webspam techniques. Despite this, Google released a public form that webmasters could use to tell Google if they felt they’d been affected when they shouldn’t have been. This form is no longer available.

While Google didn’t confirm the exact types of techniques that Penguin targeted, it was widely observed that websites with low-quality or unnatural link profiles were being affected. It was also around this time that Google started sending out unnatural link warnings to webmasters, and many felt the two were connected.

Google did, however, outline a few techniques in the blog post that announced the update, including keyword stuffing and unusual linking patterns. Google had been trying to combat these techniques for many years, but Penguin appeared to be much more aggressive and actively sought to penalize websites heavily for webspam tactics. Previously, Google may have just quietly stopped webspam links from passing any PageRank – or at least that’s what they claimed; the experience of SEOs who worked prior to 2012 often suggested otherwise.

This update changed SEO profoundly. Once-popular link building tactics were now viewed as extremely risky. As previously mentioned, SEOs knew that these techniques were not great quality, but they had posed little risk, even when they didn’t work. Now they were facing the real possibility that the wrong link building tactics could actively hurt a website.

Search quality update – April 3, 2012

This was a rather subtle update because it was bundled in with 50 other changes that Google made around the same time. We don’t know the exact date the update actually started affecting results, but the announcement was made on April 3, 2012.

These are the sections that are most important:

Tweaks to handling of anchor text. [launch codename "PC"] This month we turned off a classifier related to anchor text (the visible text appearing in links). Our experimental data suggested that other methods of anchor processing had greater success, so turning off this component made our scoring cleaner and more robust.

Better interpretation and use of anchor text. We’ve improved systems we use to interpret and use anchor text, and determine how relevant a given anchor might be for a given query and website.

Source: http://insidesearch.blogspot.com/2012/04/search-quality-highlights-50-changes.html

This doesn’t give us much information to go on, and there wasn’t much talk from SEO pros at the time, but clearly Google changed the way anchor text was handled.

Jagger – October 18, 2005

Jagger appeared to be more of a rolling update than a single swoop in one day. Various forum threads and blog posts at the time pointed toward a series of updates that focused more on quality and trust when it came to links. Reciprocal link exchanges with low-quality, unrelated websites were also targeted, which hurt a number of websites because the practice was very popular at the time. There was also talk of the age of a website coming into play, which made it even more complicated to diagnose exactly what had happened.
