
London SEO Knowledge Base – Welcome To Google’s Algorithm Zoo

If you’ve had anything to do with London SEO, UK online marketing strategies, page rankings, etc., you’ve probably heard of a few of Google’s algorithms. Delightfully, these all seem to be named after animals (or at least a lot of them are – there are a few odd ones out). They’re all important but understanding them can be a bit confusing. If you’ve ever asked “What is the Panda algorithm?” or “What is Google Hummingbird when it’s at home?” then you’re not alone.

I am the London SEO Guy – a search engine enthusiast who has been watching and learning from Google since the year 2000. I'll share my thoughts on the most popular of Big G's updates that carry the names of animals. So welcome to Google's Algorithm Zoo. All of these algorithms are used to manage the rankings of each and every website out there, and all of them have been designed so that users – like you and me and Auntie Caroline – can find what we want without scrolling through page after page of annoying rubbish sites. These algorithms reward what they believe to be "good" sites and penalise the others. For those of us wanting to land on Page 1 of the search results, understanding what each member of the zoo likes is absolutely key.

So let’s start our SEO tour of the zoo by visiting the biggest and most important one of them all…

Google Panda

The Panda algorithm wasn’t actually named after an animal. Instead, it was named after the key computer engineer who headed up the project, Navneet Panda. However, it set the tone for subsequent algorithms and started the trend of being named after animals… unless there really are some engineers working at Google called Hummingbird and Pigeon.

The Panda algorithm was implemented in the early twenty-tens (what are we calling this decade – the teens?). It had to be brought in because a lot of hideously spammy sites (nothing to do with SEO in London, ahem) had worked out how to diddle the very first algorithm, PageRank (which was also named after a person – Google co-founder Larry Page). PageRank mostly looked at how many links pointed to a particular webpage and how important the linking pages were, along with the keywords on the page itself. We all know what the result of that simple idea was: low-quality sites stuffed with keywords and very bad English, hidden content and metadata containing popular search terms that had nothing to do with the site itself, and link farms. People were getting annoyed with these “black hat SEO” sites landing at the top of the rankings while the things they really wanted – like a good local London restaurant’s menu, hours and location – were down on Page 3 or 4. Something had to be done.
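Just to make the idea concrete, here is a tiny sketch of that original link-counting logic in code. This is my own simplified Python illustration – the damping factor, the toy link graph and the function are all made up for demonstration and bear no resemblance to Google's actual implementation – but it shows why a link farm could drag a junk page up the rankings.

```python
# Toy sketch of the classic PageRank idea: a page's score is fed by the
# scores of the pages linking to it. Simplified for illustration only.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}

    for _ in range(iterations):
        # Every page keeps a small base score, then receives a share of the
        # score of each page that links to it.
        new_ranks = {page: (1.0 - damping) / n for page in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * ranks[page] / len(outgoing)
            for target in outgoing:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

# Invented example: two link-farm pages pointing at a spammy page inflate its
# score, while an honest page with no incoming links languishes at the bottom.
toy_web = {
    "local-restaurant-menu": [],
    "link-farm-1": ["spammy-page"],
    "link-farm-2": ["spammy-page"],
    "spammy-page": ["link-farm-1", "link-farm-2"],
}

for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```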

The Panda algorithm rewarded quality sites: sites with good-quality original content (enter White Hat SEO) rather than copied stuff (a common trick used by the black hat brigade), and sites that contained reliable and/or interesting information. It was designed to reward sites with the sort of content you’d expect to find in a magazine or book. The developers created an algorithm that rewarded sites and content by answering questions like “Was the article edited well, or does it appear sloppy or hastily produced?” and “Is this the sort of page you’d want to bookmark, share with a friend, or recommend?” (You can find the full list of 23 questions on Google’s blog.)

Google Panda was first released in 2011, with a patent granted in 2014. It was certainly a breath of fresh air for those of us white-hat SEOs who loved creating good, original, well-written content for websites, and who had been competing with junk written in bad English (presumably by robots) – or watching the articles we had slaved over get copied by some ad-ridden site full of nasties. From then on, real content was the key – hooray!

Google Penguin

Google Penguin was the next member of the zoo, coming out in 2012. Like the Panda, it separated sites using white hat SEO techniques from those using black hat SEO techniques (a clear separation of black and white, like the colouration of actual pandas and penguins – get it?). While the Panda rewarded the good sites, the Penguin had the job of punishing the naughty ones, especially those engaging in shady search engine optimisation practices.

The Penguin algorithm has the goal of pushing sites that use the sorts of tricks users hate down the page rankings. Things like doorway pages, repetitive use of keywords (which I won’t demonstrate because the Penguin will peck me – but you’ve probably come across this sort of barbarity at some point!) and too many ads “above the fold”. “Above the fold” means placed on the page where it’s the first thing that hits you in the face, like whatever sits at the top of an old-fashioned snail-mail letter that really did get folded.

Google Hummingbird

Hummingbirds in nature are very accurate and precise as well as fast. Google’s Hummingbird algorithm is also designed to be fast and precise. The Hummingbird was launched in 2013 and it’s a happy little thing, just like a real live one.

Hummingbird is designed to boost the rankings of pages whose content uses natural language rather than forced keywords. The reasoning was that people often use real language – especially with voice-activated searching – when entering a query; they don’t always type in disjointed collections of keywords or key phrases. However, some of us who follow SEO rules religiously (that’s me with my hand up here) had learnt to type in our specific keywords in this disjointed fashion, and the result was often a page that contained those specific words in that specific order – even if it was disjointed. OK, this made pages easy to find, but the end results were usually a pain to read. You had to wade through all the keyword-heavy content that sent a shudder down your spine. We understood why the web developers had done it, but we had to ignore it to get to the information we wanted – like whether that particular London restaurant had vegetarian options or whether the nearest hardware store was open after 5 p.m.

The Hummingbird also rewarded long-tail keywords (we’d call them key phrases). Is anybody else mentally picturing a glittering hummingbird with a long tail hovering over a hibiscus flower at the moment, or is that just me? This made it very easy if you were trying to track down the source of a quote, for example. For web designers and London SEO experts, it meant that we had to get more creative when thinking up and using these long-tail keywords. The advantage is that they were a lot easier to work into our content, simply because they are part of natural language and they are the sort of thing you’d actually write or say.

The Hummingbird algorithm was developed using machine learning techniques and semantics, which means the team at Google would have had to train it on lots of natural-language text. I’d love to know exactly what they used as a training dataset so the algorithm could learn what natural language sounds like. Did they collect conversations via eavesdropping? Did they use dialogue from books, movies and TV shows? Somebody would also have had to teach the machine semantics, so that it knew that if someone was searching for a nearby vegetarian restaurant in London, it should also recognise things like cafés, bistros, eateries, gastropubs and takeaway shops, as well as vegan establishments. A little bird with search engine optimisation interests told me (ha!) that Wikipedia is involved somewhere.
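Nobody outside Mountain View knows exactly how that semantic layer works, but the basic idea of mapping a query word onto related words is easy to sketch. The little synonym table and functions below are purely my own invention for illustration – the real thing is machine-learnt and vastly richer – but they show how a page saying “vegan bistro” could still match a search for “vegetarian restaurant”.

```python
# Toy sketch of query expansion: broaden the words in a search query with
# related terms before matching against page text. The synonym table is
# invented for this example; it is not anything Hummingbird actually uses.

RELATED_TERMS = {
    "restaurant": {"cafe", "bistro", "eatery", "gastropub", "takeaway"},
    "vegetarian": {"vegan", "plant-based", "meat-free"},
}

def expand_query(query):
    """Return the original query words plus any related terms we know about."""
    expanded = set()
    for word in query.lower().split():
        expanded.add(word)
        expanded.update(RELATED_TERMS.get(word, set()))
    return expanded

def matches(page_text, query):
    """Very naive relevance test: does the page contain any expanded query term?"""
    page_words = set(page_text.lower().split())
    return bool(page_words & expand_query(query))

print(expand_query("vegetarian restaurant"))
print(matches("A friendly vegan bistro near London Bridge", "vegetarian restaurant"))
```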

Google Pigeon

Last but not least on our tour through the Google Zoo, we’ve got the Pigeon. Natural pigeons don’t like being far from home and prefer to stay in their local area. So does Google Pigeon. This is the algorithm for local searches. It means that if you are developing and optimising that website for the vegetarian restaurant in London that we’ve been using in our example, you won’t have to compete with a big outfit on the other side of the world that gets lots of hits and keeps the rest of the zoo happy. Google Pigeon looks at the location of the user and puts sites that are near that place at the top of the rankings, rather than ones further away. This is local SEO in action.
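Google doesn’t publish Pigeon’s internals either, but the nearest-first idea at its heart can be sketched with a bit of schoolroom geometry. The venues, coordinates and distance-only ranking below are assumptions I’ve made up to illustrate the concept – the real algorithm blends proximity with plenty of other relevance signals.

```python
# Toy sketch of local ranking: given the searcher's location, order candidate
# businesses by distance. Venue names and coordinates are invented examples.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def rank_local(results, user_lat, user_lon):
    """Sort results so the nearest businesses come first."""
    return sorted(results, key=lambda r: haversine_km(user_lat, user_lon, r["lat"], r["lon"]))

venues = [
    {"name": "Veggie bistro, Shoreditch", "lat": 51.5246, "lon": -0.0795},
    {"name": "Veggie bistro, Sydney", "lat": -33.8688, "lon": 151.2093},
]

# A searcher standing in central London sees the Shoreditch result first.
for venue in rank_local(venues, 51.5074, -0.1278):
    print(venue["name"])
```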

The cautionary tale here is that you have to let the algorithm know where you’re located so that it can find you and boost you up the local page rankings. This means you have to mention your location and some of the surrounding areas in your page content. It’s not as hard as you might think! London SEO at its finest.

Brian Flores
https://www.techicy.com
Brian is a business editor who writes about various topics such as technology, health and finance. He works along with the colourful folks that build a nation through tech startups. He is also a professional football player and video games enthusiast.
