Google Penguin is now part of the core algorithm and runs in real time

Thanks to Green Genie SEO in Rochester

The news had been circulating for a few weeks, and many SEOs had seen signs of this change to the Google algorithm, but today it is official: Penguin has become one of the real-time signals that influence the ranking of a website.

Although John Mueller of Google had explained in an official Hangout that the SERP updates and the related fluctuations in queries were not due to Penguin, today's post on Google's webmaster blog tells us the opposite.

Mueller's words aside, the official news is that, after a testing period in which we saw decisive changes in the SERPs, the algorithm that analyzes spam signals (unnatural incoming links, keyword stuffing and the like) has become part of the 200 or more signals that determine the ranking of a URL for a given query.

What is Google Penguin?
As you can read in an article dedicated to Penguin, this algorithm analyzes how much, and which, spam techniques a website uses and, depending on their severity, adjusts more or less heavily the ranking of the related results in the different SERPs.

In particular, if a site has a large number of links considered artificial, spammy, or low quality, or worse still organized into inbound link schemes, it can see a marked drop in rankings for certain queries, with serious cases disappearing completely from the results page. The same goes for all the other techniques that, although once fruitful, today penalize a website because they are considered spam: hidden text, cloaking and keyword stuffing are some of them.

The most recent Penguin update before this one dated back to 2014, with Penguin 3.0, which is why some SEOs are calling this major new update Google Penguin 4.0.

What changes following this change in the algorithm
Although it was already necessary to pay close attention to how link building was carried out for off-site SEO, today it is even more important to stay alert when acquiring inbound links to your domain.

But let's look in detail at what changes now that Penguin is among the signals of Google's ranking algorithm.

SEO audit: automate the control of the title tag

On the occasion of the birthday of positioning-seo.com (five years online: the domain was acquired on 14 October 2013, though the site went live months later, and it recorded almost 130 thousand organic users in the last year), I decided to give this domain a gift. Like any birthday present, it is not a strictly necessary good (such as a complete SEO audit to fix the total confusion of this portal), but I hope it is content of value: an explanation of the method I have refined over the years to automate, as far as possible, one of the 117 factors I analyze in the audit phase.


Objective analysis
The audit is the most important task an SEO consultant must perform when taking charge of a domain they know nothing about.

Without a detailed report of the strengths and the weaknesses to correct, the optimization work that follows will be disorganized, it will take more time than necessary, and it will be impossible to estimate the working hours to invest in the activity.

For this reason, as (I assume) every colleague of mine has done, I built a checklist of the factors to verify. To do so I started from the excellent file by Giovanni Sacheli, to which I added other analysis factors over time, including an entire sheet for local SEO for audits that require it.

As you can imagine, the checklist includes an entry for the analysis of the <title> tag found in the <head> section of the code. I don't call it a meta tag because, for me, meta tags are only those defined by the <meta> tag.

For a site with a handful of URLs the analysis can be carried out manually, but when there are more than 50 URLs it becomes impractical to invest time this way. It is therefore worth standardizing the method, so that it takes (more or less) the same time on every run. Defining the method may take a long time at first, but that time is recovered on every future audit.

Warning! The analysis of the content of the <title> tag can only be partially automated: an accurate analysis of how much it affects the CTR, and of its relevance to the strategy behind your query research, requires actions that could in principle be automated but which we will see only in part in this article.

In detail, the objectives I set myself are:

- checking that the tag is present (obvious, you say, but sometimes it is simply not defined)
- checking the length of its content (too short and it wastes an opportunity; too long and it is truncated, which can limit its effect)
- detecting all-uppercase text
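The three checks above can be sketched in a few lines of Python. This is a hypothetical helper, not the tooling the author actually uses, and the 30-60 character bounds are a common rule of thumb for title length, not an official Google limit:

```python
import re

# Extract the <title> content; DOTALL handles titles split across lines.
TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

MIN_LEN, MAX_LEN = 30, 60  # rough rule-of-thumb bounds (an assumption)

def audit_title(html: str) -> list:
    """Return the list of issues found in a page's <title> tag."""
    issues = []
    match = TITLE_RE.search(html)
    if not match:
        return ["missing <title> tag"]
    title = match.group(1).strip()
    if len(title) < MIN_LEN:
        issues.append("too short (%d chars)" % len(title))
    if len(title) > MAX_LEN:
        issues.append("too long (%d chars, may be truncated)" % len(title))
    if title.isupper():
        issues.append("all-uppercase text")
    return issues

if __name__ == "__main__":
    page = "<html><head><title>BUY CHEAP LINKS NOW</title></head></html>"
    print(audit_title(page))
```

A regex is enough for a quick first pass over a crawl export; for messy real-world HTML a proper parser is more robust.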

Truths and Lies of Statistics

A completely true fact can also be very deceptive.
How is it possible? As usual, the problem lies not in the data but in the people. To a computer, data is nothing more than data: it is neither good nor bad, neither much nor little. But people always evaluate data: we want to know what it means, whether it is important, whether we should be happy or worried.
Our first reaction is not to process information but to perceive it. Perception is a quick way to categorize the avalanche of inputs we receive, but it is also the equivalent of a security hole: it opens the door to manipulation.
There are many types of manipulation. These are some of the most common:
▪ Attention can be drawn to debates that omit the points of view someone wants to hide (as Noam Chomsky explains)
▪ You can play with vocabulary to give something positive or negative connotations, or to strip those connotations away (e.g. "collateral damage")
▪ You can highlight some figures rather than others: for example, the latest economic figure can be compared with the best or worst in history, with the previous month, or with the same month of the previous year; or the year-to-date total can be compared with the previous year's. Depending on our interests, we choose the comparison that benefits us most
▪ Or you can take the very same pair of numbers and present them in different ways: the impact on our perception can be striking (via Tim O'Reilly)
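The last point can be made concrete with a toy example (the numbers are invented): the same change in a rate sounds very different depending on whether it is reported in absolute or relative terms.

```python
# Invented figures: an unemployment rate rising from 5.0% to 5.5%.
old_rate, new_rate = 5.0, 5.5

# "Unemployment rose half a point" vs "unemployment jumped 10%":
# both statements describe exactly the same pair of numbers.
absolute_change = new_rate - old_rate                      # percentage points
relative_change = (new_rate - old_rate) / old_rate * 100   # percent of the old rate

print("Rose by %.1f percentage points" % absolute_change)
print("Rose by %.0f%%" % relative_change)
```

Neither figure is false; the choice between them is where the manipulation lives.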
The conclusion: data itself does not deceive, but we must be critical of the selection (which information is highlighted and which is omitted), of the presentation and, consequently, of our own perception.

About Me

Over the years, we have worked with Fortune 500s and brand-new startups. These companies may have an international presence or focus strictly on a local clientele, and we can still deliver the results they need.
