The statement “they have an algorithm!” has become my mantra. How does Twitter choose which tweets sit at the top of your feed? An algorithm – shorthand for a computer program. What criteria does Google use to choose which results to display? Same answer. I think it is worth digging into the background of this mysterious technological formula. In the early 1990s, the first search engines started indexing the content of the web. To register, a site owner needed to do nothing more than submit their URL to the engine. The engine would then dispatch a crawler (a “search spider”) which, in a manner reminiscent of something out of The Matrix, extracted the links to other sites and sent the content back to be indexed.
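To make that loop concrete, here is a minimal sketch of what an early crawl-and-index cycle might have looked like. This is an illustration, not any engine's actual code; the seed URL, page limit, and the crude regex link extraction are all assumptions for the example.

```python
import re
import urllib.request
from collections import deque

def crawl(seed_url, max_pages=10):
    """Toy crawler: fetch a page, store its text for the index,
    extract links, and follow them breadth-first."""
    index = {}                       # url -> raw page text
    queue = deque([seed_url])
    seen = {seed_url}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
        except Exception:
            continue                 # skip unreachable pages
        index[url] = html
        # Pull out absolute links; a real crawler would also resolve
        # relative URLs and respect robots.txt.
        for link in re.findall(r'href="(https?://[^"]+)"', html):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index
```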

Many of the early search algorithms relied on meta tags: you labelled your pages with relevant keywords, and your website gradually climbed the rankings. This, however, led to keyword stuffing – cramming a page with a phrase to achieve a high keyword density – and to pages with content like the following: “We have a selection of inflatable palm trees in store, and these inflatable palm trees are available for purchase for $14.99 each. Be sure to pick up your inflatable palm tree today before we run out completely, since we only have a limited supply. Our inflatable palm trees are selling like hotcakes and we can’t keep up.” Annoying. And Google agreed with that assessment.
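Keyword density is simply the share of a page’s words taken up by occurrences of a given phrase. A quick sketch, run against the first sentence of the spam example above:

```python
def keyword_density(text, keyword):
    """Fraction of the words in `text` accounted for by `keyword` matches."""
    words = text.lower().split()
    phrase = keyword.lower().split()
    n = len(phrase)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase
    )
    return (hits * n) / len(words) if words else 0.0

spam = ("We have a selection of inflatable palm trees in store, and these "
        "inflatable palm trees are available for $14.99 each.")
print(f"{keyword_density(spam, 'inflatable palm trees'):.0%}")  # 30%
```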

This also meant that search rankings could be readily manipulated, so a search phrase could surface sites utterly unrelated to the query. Keyword stuffing is one of the tactics that fall under black hat SEO: techniques search engines consider unethical, as opposed to white hat methods such as good site design, which earn higher rankings over the long run. In response, the industry leader in search refined its secret formula to incorporate a trust-and-credibility metric: the number of external websites that link back to the content on your page. Link farms, which churned out backlinks and littered the web in an effort to inflate rankings, became one of the many fronts in the never-ending war between SEO manipulators and Google.
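That trust-and-credibility idea is the core of PageRank: a page matters if pages that matter link to it. Here is a compact, simplified power-iteration version; the three-page link graph at the bottom is invented purely for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank. `links` maps each page to the pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # real implementations redistribute dangling mass
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A and B both link to C, C links back to A.
web = {"A": ["C"], "B": ["C"], "C": ["A"]}
print(pagerank(web))  # C, with two inbound links, ends up ranked highest
```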

This is where things start to get interesting: in April 2012, Google rolled out an update called Penguin (reportedly a nod to black hat and white hat SEO). The updated algorithm takes into account not just meta tags and backlinks but also social impact: in essence, how many social networks link to your content, and how many people interact with it and promote it organically.
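To make that blend of signals concrete, here is a toy scoring function combining the three kinds of inputs described above. The weights and signal names are invented for the example; Google’s real formula is secret and vastly more complex.

```python
def rank_score(keyword_relevance, backlink_trust, social_signal,
               weights=(0.5, 0.3, 0.2)):
    """Toy ranking blend: on-page relevance, link-based trust, and social
    engagement, each assumed to be normalized to [0, 1]."""
    w_rel, w_link, w_social = weights
    return (w_rel * keyword_relevance
            + w_link * backlink_trust
            + w_social * social_signal)

# A page with strong social traction can edge out one relying on links alone.
print(rank_score(0.6, 0.2, 0.9))   # ~0.54
print(rank_score(0.6, 0.7, 0.0))   # ~0.51
```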

Google’s algorithm is now so finely honed that it can present you with tailored search results. If you and I were both to Google inflatable palm trees, the top ten results we each got back could be very different. To help you find the information you are looking for, Google takes into account your location, the browser you use, your age, and the websites you frequent. Knowledge is now more readily available than it has ever been.
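As a rough illustration of what personalization can mean mechanically, here is a toy re-ranker that nudges up results from domains a user already visits often. The boost value, the domain signal, and the example URLs are all assumptions made up for this sketch.

```python
def personalize(results, frequent_domains, boost=0.25):
    """Toy personalization: boost results whose domain the user visits often.
    `results` is a list of (url, base_score) pairs from the generic ranker."""
    def adjusted(item):
        url, score = item
        domain = url.split("/")[2] if "://" in url else url
        return score + (boost if domain in frequent_domains else 0.0)
    return sorted(results, key=adjusted, reverse=True)

generic = [("https://megastore.example/palms", 0.71),
           ("https://tikiblog.example/inflatable-palms", 0.69)]
# A user who reads tikiblog.example often sees it promoted to the top.
print(personalize(generic, frequent_domains={"tikiblog.example"}))
```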
