Posted by Dr. Pete
The long-awaited Penguin 2.0 (also called "Penguin 4") rolled out on Wednesday, May 22nd. Rumors have been brewing for a while that the next Penguin update would be big and include significant algorithm changes, and Matt Cutts has suggested more than once that major changes are in the works. We wanted to give the dust a day to settle, but this post will review data from our MozCast Google weather stations to see if Penguin 2.0 really lives up to the hype.
Short-Term MozCast Data
First things first - the recorded temperature (algorithm "flux") for May 22nd was 80.7°F. For reference, MozCast is tuned to an average temperature of about 70°, but in practice that average has slipped into the high 60s over the past few months. Here's a 7-day history, along with a couple of significant events (including Penguin 1.0):
By our numbers, Penguin 2.0 was about on par with the 20th Panda update. Google claimed that Penguin 2.0 impacted about 2.3% of US/English queries, while they clocked Panda #20 at about 2.4% of queries (see my post on how to interpret "X% of queries"). Penguin 1.0 was measured at 3.1% of queries, the highest query impact Google has publicly reported. These three updates seem to line up pretty well between temperature and reported impact, but the reality is that we've seen big differences for other updates, so take that with a grain of salt.
Overall, the picture of Penguin 2.0 in our data confirms an update, but it doesn't seem to be as big as many people expected. Please note that we had a data collection issue on May 20th, so the temperatures for May 20-21 are unreliable. It's possible that Penguin 2.0 rolled out over two days, but we can't confirm that observation.
Temperatures by Category
In addition to the core MozCast data, we have a beta system running 10K keywords distributed across 20 industry categories (based on Google AdWords categories). The average temperature for any given category can vary quite a bit, so I looked at the difference between Penguin 2.0 and the previous 7 days for each category. Here they are, in order by most impacted (1-day/7-day temps in parentheses):
33.0% (80°/60°) – Retailers & General Merchandise
31.2% (81°/62°) – Real Estate
30.8% (90°/69°) – Dining & Nightlife
29.1% (89°/69°) – Internet & Telecom
26.0% (82°/65°) – Law & Government
24.4% (79°/64°) – Finance
23.5% (81°/65°) – Occasions & Gifts
20.8% (88°/73°) – Beauty & Personal Care
17.3% (70°/60°) – Travel & Tourism
15.7% (87°/75°) – Vehicles
15.5% (84°/73°) – Arts & Entertainment
15.4% (72°/62°) – Health
15.0% (83°/72°) – Home & Garden
14.2% (78°/69°) – Family & Community
13.4% (79°/70°) – Apparel
13.1% (78°/69°) – Hobbies & Leisure
12.0% (74°/66°) – Jobs & Education
11.5% (88°/79°) – Sports & Fitness
7.8% (75°/70°) – Food & Groceries
-3.7% (70°/73°) – Computers & Consumer Electronics
Retailers and Real Estate came in at the top, with temperatures just over 30% higher than average. Oddly, Consumer Electronics rounded out the bottom with slightly lower than average flux. Of course, split 20 ways, this represents a relatively small number of data points for each category. It's useful for reference, but I wouldn't read too much into these breakdowns.
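For readers who want to reproduce the category comparison, the calculation above is just the one-day temperature versus the mean of the previous 7 days. This is a hypothetical sketch (the MozCast internals are not public); note that the temperatures shown in the list are rounded, so the output won't exactly match the published percentages:

```python
# Sketch of the category flux comparison: percent change of the
# Penguin-day temperature vs. the prior 7-day average temperature.
# Category names and temperatures are taken from the list above;
# the displayed temps are rounded, so results differ slightly from
# the published figures (e.g. 33.3% here vs. 33.0% in the post).

def relative_flux(day_temp, prior_7day_avg):
    """Percent change of the one-day temp vs. the 7-day average."""
    return (day_temp - prior_7day_avg) / prior_7day_avg * 100

categories = {
    "Retailers & General Merchandise": (80, 60),
    "Real Estate": (81, 62),
    "Computers & Consumer Electronics": (70, 73),
}

# Sort from most to least impacted, as in the post's list.
for name, (day, avg) in sorted(categories.items(),
                               key=lambda kv: -relative_flux(*kv[1])):
    print(f"{relative_flux(day, avg):5.1f}%  ({day}°/{avg}°)  {name}")
```

A negative value, as for Consumer Electronics, simply means the category was calmer on Penguin day than over the prior week.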
"Big 20" Sub-domains
Across the beta 10K data-set, we track the top sub-domains by overall share of SERP real-estate. Essentially, we count how many page-1 positions each sub-domain holds and divide it across the entire data set. These were the Big 20 sub-domains for the day after Penguin
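The share-of-SERP computation described above can be sketched in a few lines. This is a hypothetical illustration, not the actual MozCast code; the input format (a list of page-1 results, each a list of sub-domains) is an assumption:

```python
from collections import Counter

def serp_share(serps):
    """Share of page-1 real estate per sub-domain.

    serps: list of page-1 result sets, each a list of sub-domain
    strings (hypothetical input format). Returns each sub-domain's
    count of page-1 positions divided by the total number of
    positions across the entire data set.
    """
    counts = Counter()
    total = 0
    for page in serps:
        for sub in page:
            counts[sub] += 1
            total += 1
    return {sub: n / total for sub, n in counts.items()}
```

Sorting the resulting dictionary by share, descending, and taking the first 20 entries would yield a "Big 20" list like the one tracked here.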