When a business is deciding on its marketing strategy and setting its marketing budget, it is essential to get the best possible value for money. Engaging a UK SEO company with an excellent SEO strategy and proven SEO techniques is of the utmost importance, but for many companies price is a major factor, and return on investment matters even more.
We offer bespoke pricing based on your company's needs and the likelihood of delivering a strong return on investment for your money. We are the best affordable SEO agency in the UK. Contact us today for a quote, and we will make sure you get the best value for money from your online marketing.
We Are One of the Best SEO Companies in the UK
Over the years, Google has had to change and adapt to survive. It has been in a constant battle with webmasters who are eager to manipulate its SERPs. Since the algorithm is based on real factors and properties of a website, site owners have been trying to identify those factors and manipulate them for better rankings. Whenever webmasters find a competitive advantage (sometimes called a loophole), Google tries to quickly plug it.
Let’s look at a real example of this struggle.
Over a decade ago, webmasters found out that Google used the Meta Keyword tag as a major ranking factor. What did they do? They began stuffing this tag with keywords to rank well for those terms. What did Google do? It started to ignore the Meta Keyword tag, effectively closing that loophole.
I would like to point out that I do believe Google still looks at the Meta Keyword tag, but not in the way you might think. I suspect the company uses it to help identify spammers: any page with a Meta Keyword tag stuffed with dozens, or even hundreds, of keywords is clearly doing something underhand, or at least trying to.
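To make that idea concrete, here is a minimal sketch in Python of how a stuffed Meta Keyword tag might be flagged. The 25-keyword threshold and the parsing approach are purely my own illustration; Google has never published how, or even whether, it uses the tag this way.

```python
from html.parser import HTMLParser

class MetaKeywordAudit(HTMLParser):
    """Collects the terms found in any <meta name="keywords"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "keywords":
            content = attrs.get("content") or ""
            self.keywords.extend(k.strip() for k in content.split(",") if k.strip())

def looks_stuffed(html, threshold=25):
    """Flag a page whose Meta Keyword tag holds an implausible number of terms.
    The threshold of 25 is an arbitrary illustration, not a published figure."""
    parser = MetaKeywordAudit()
    parser.feed(html)
    return len(parser.keywords) > threshold

page = '<meta name="keywords" content="seo, affordable seo, uk seo agency">'
print(looks_stuffed(page))  # False - a handful of keywords is perfectly normal
```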
Here is another example of a loophole being closed.
A few years ago, webmasters discovered that using a domain name that was the exact keyword phrase they wanted to rank for gave a site a massive ranking boost in the SERPs. This type of domain is called an Exact Match Domain (EMD). In September 2012, Google released the “EMD Update”, which removed that unfair ranking advantage. Hundreds of thousands of EMD sites dropped out of the Google top 10 overnight, putting an end to a large industry that had profited from buying and selling EMDs.
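If you are unsure what counts as an “exact match”, this little Python sketch captures the idea. The TLD list and matching rules are simplified for illustration; this is not how Google actually identified EMDs.

```python
def is_exact_match_domain(domain, phrase, tlds=(".com", ".net", ".org", ".co.uk")):
    """Return True when `domain` is nothing but the keyword phrase plus a TLD."""
    domain = domain.lower()
    for tld in tlds:
        if domain.endswith(tld):
            label = domain[: -len(tld)]
            break
    else:
        return False  # unrecognised TLD; a real check would use a full TLD list
    phrase = phrase.lower()
    # EMDs usually collapsed the phrase or joined its words with hyphens.
    return label in (phrase.replace(" ", ""), phrase.replace(" ", "-"))

print(is_exact_match_domain("buycheapshoes.com", "buy cheap shoes"))     # True
print(is_exact_match_domain("buy-cheap-shoes.co.uk", "buy cheap shoes")) # True
print(is_exact_match_domain("examplebrand.com", "buy cheap shoes"))      # False
```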
Today, EMD sites are rarely seen in Google. The battle between spammers and the search engine rages on to this day. Spammers find loopholes, and Google plugs them.
In September 2011, Google's executive chairman, Eric Schmidt, revealed that Google had tested over 13,000 possible algorithm updates in 2010, approving just 516 of them. Although 516 may sound like a lot (it’s more than one update a day), it certainly wasn’t an unusual year.
Google probably updates the algorithm at least 500-600 times every year. Most of these updates will be minor, but Google does roll out major changes every now and again. We’ll look at the most important ones in the next chapter.
The one thing all Google updates have in common is that they are designed to improve the search results for the people who use the search engine – your potential visitors.
Panda, Penguin, and Other Major Updates
We could go back to the very beginning to see all the changes and updates that Google has made, but I want to focus on those changes that have had the biggest effect on the way we do SEO today. I think it is important that you know this history as it helps you make decisions on which SEO strategies to avoid. So let’s start back in 2011.
Updates to the Google Search Algorithm
This was a huge year in SEO terms, shocking many webmasters. In fact, 2011 was the year that wiped out a lot of online businesses. Most deserved to go, but quite a few innocent victims got caught in the carnage, never to recover.

At the beginning of the year, Google hit scraper sites (sites that used bots to steal and republish content from other sites). This was all about attributing ownership of content back to the original creator and thus penalizing the thieves.
On February 23, the Panda update launched in the USA. Panda (also called “Farmer”) essentially targeted low-quality content and link farms. Link farms were basically collections of low-quality blogs set up to link out to other sites. The term “thin content” became popular during this time, describing pages that really didn’t say much and existed purely to host adverts. Panda was all about squashing thin content, and a lot of sites took a hit as a result.
In March of the same year, Google introduced the +1 button. This was probably expected bearing in mind that Google had confirmed it used social signals in its ranking algorithm. What better signals to monitor than its own?
In April 2011, Panda 2.0 was unleashed, expanding its reach to every country in the world, though still only targeting pages in English. Even more signals were included in Panda 2.0, possibly including user feedback gathered via the Google Chrome web browser, where users had the option to “block” pages in the SERPs that they didn’t like.
As if these two Panda releases weren’t enough, Google went on to release Panda 2.1, 2.2, 2.3, 2.4, 2.5, and 3.1, all in 2011. Note that Panda 3.0 is missing; there was an update between 2.5 and 3.1, but it is commonly referred to as the Panda “Flux”. Each new update built on the previous one, helping to eliminate still more low-quality content from the SERPs. With each new release of Panda, webmasters worried, panicked, and complained on forums and social media. A lot of websites were penalized, though not all deserved to be; Google casually called it unavoidable “collateral damage”.
In June 2011, we saw the birth of Google’s new social network project, Google Plus.
Another change that angered webmasters was “query encryption”, introduced in October 2011. Google said it was doing this for privacy reasons, but webmasters were suspicious of its motives. Prior to this query encryption, whenever someone searched for something on Google, the search term they typed in was passed on to the site they clicked through to.
That meant webmasters could use any web traffic analysis tool to see exactly which search terms visitors had typed to find their pages. Query encryption changed all of this. Anyone who was logged into their Google account when they performed a search would have their search query encrypted, which prevented the search terms from being passed on to the websites they visited. As a result, webmasters increasingly had no idea which terms people were using to find their sites.
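To see what was lost, here is a minimal Python sketch of what analytics tools used to do: read the visitor’s referrer URL and pull the search term out of Google’s q parameter. The function name and example URLs are my own illustration; q is the genuine query parameter in Google search URLs.

```python
from urllib.parse import urlparse, parse_qs

def search_term_from_referrer(referrer):
    """Pull the visitor's search query out of a Google referrer URL, the way
    analytics tools did before query encryption. Returns None when the query
    is absent - which is exactly what an encrypted search leaves behind."""
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    terms = parse_qs(parsed.query).get("q")
    return terms[0] if terms else None

# Before query encryption: the search term travelled in the 'q' parameter.
print(search_term_from_referrer("http://www.google.com/search?q=affordable+seo+agency"))
# After query encryption: the parameter is stripped, so nothing is recoverable.
print(search_term_from_referrer("https://www.google.com/"))
```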
In November 2011, there was a freshness update, which supposedly rewarded websites providing time-sensitive information (such as news sites) whenever visitors searched for topical news and events.