Showing posts with label Google Penguin Update. Show all posts

Saturday, 19 May 2012

Useful Info on the New Google Penguin Update That Went into Effect in Late April 2012

What niches were affected the most by this update?
Gambling, hosting, weight loss, insurance, make money online, pharma, loans. Those are the most competitive niches on the web, and the two websites I manage that dropped fall into the first two niches I mentioned.

Why were these niches affected the most by the Penguin update?
1) More backlinks (broader in variety and greater in quantity)
2) More keyword-oriented content
3) More black hat methods
4) In the end, more spam. They were the models of a working BH SEO strategy. (I believe those websites served to build spam-detection models based on their signals; the crappy results we currently see in those particular niches, and in other tough ones, tend to support that. It is as if those niches have been reset.)

Here are my conclusions, based on the thousands of posts I've been reading over the last couple of days. I've barely slept during that time in order to work efficiently with this update, even though most of my websites came through fine.

Are Link Pyramids Still Effective?

What tends to support my spun-content theory above is that crappy spun Web 2.0 content is ranking in the search engines. (That's a straight WTF for me.)

Web 2.0 properties are usually heavily backlinked and served as buffer websites, and they have increased in ranking (hmm, interesting). So spun, spammed Web 2.0 properties are still ranking.

That led me to four possible conclusions:
1) Link juice no longer passes through more than one tier, counting from the lowest.
2) Backlinks have lost a ton of value (I'll argue otherwise below), and the SEO juice of those Web 2.0 properties comes from the main domain of the Web 2.0 platform.
3) If the outbound links (OBL) of a Web 2.0 property target only one or a very limited number of websites, they pass no value (which is quite a relevant spam signal).
4) A combination of the above factors.

Are Blog Commenting and Xrumer Campaigns Dead?

This is a tough question, and my answer, so far, is no.

I know someone who has been using Scrapebox (SB) massively for a month or two now (30k links indexed in that time) along with keyword-oriented content, and he came through the update easily. It is a very disturbing example!

He did use a load of different anchors, though, and increased his link diversity with many kinds of links, but most of them pointed at his website.

I've also used SB and Xrumer directly on my websites, but only on highly selected lists (.edu / .gov / high PR / auto-approve / and very few high-OBL ones), and they remain unhit!

Did EMDs gain in value?
Yes, I believe they did! Exact-match domains (EMDs) gained in value, that's for sure. "python-hosting" and "make-money-online.co.uk" are striking examples, and so are the Web 2.0 properties that have the keyword in the URL.

Domain authority may also have gained in value, as the ranking of 404 error pages from Yahoo Answers shows, for example.

However, I always thought EMDs would lose relevancy, as they tend to attract more spam websites, so I find this a little confusing, to be honest.

What is left from all this to be responsible for our drops?

1) Massive link pinging
2) Irrelevant contextual backlinks
3) Lost link pyramid juice
4) On-site SEO
5) Anchor diversity
6) Recurring OBL on some Web 2.0 properties or bookmarking accounts

Don't forget that in this update, not only spammed websites got hit, but also white hat ones. So what kind of signals could be shared across the two approaches, beyond the spam? (I'm actually asking you, guys!)

What can I do to fix my website?

1) Add fresh, quality content (this will never hurt you, and you may rank for new keywords, as fresh content may have gained value).
2) On-site optimization such as netlinking, page speed, rich content, sitemaps, and content relevancy. However, pay attention to your internal link titles for now. Be as descriptive as you can and forget keyword stuffing here. I'm still reviewing the effect of such netlinking on my website that was hit; it's possible it was targeted, as I had a keyword that was ranking exclusively thanks to netlinking, and it dropped. I'll come back with results.
3) Manual backlinking, manual Web 2.0 properties, and high-PR sites as usual, as long as there is no feedback on the efficiency of SEO tools. This is what I would actually do, until I'm 100% positive we can use those tools again; the question remains how.
4) For penalized sites, you can 301-redirect your page to a new one with the same content and no backlinks. The penalty most likely won't pass through the 301. I'm testing that method on a dropped page and on a penalized one.
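For point 4, the redirect itself is a one-liner on an Apache host. A minimal .htaccess sketch, with hypothetical page names standing in for your real URLs:

```apache
# .htaccess sketch (hypothetical paths): permanently move a
# penalized page to a fresh URL carrying the same content.
Redirect 301 /old-penalized-page.html http://www.example.com/new-clean-page.html
```

Whether the penalty follows the 301 is exactly what the test above is meant to find out, so treat this as an experiment, not a guaranteed fix.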

Additional Comment
"2) Backlinks have lost a ton of value (going to proove otherwise below) and the SEO juice of those web 2.0 propoties come from the main domain of those web 2.0"

After 3 days of analyzing my sites, I could not agree more with this point.

I have one site still ranking in first spot, and all that site had was 100-200 Web 2.0 properties linking to it. No blasts to those Web 2.0s were done at all. Just pure Web 2.0 links, that's all.

And I have a bunch of sites that totally dropped by 50-100 positions; those sites had Web 2.0s too, but all of those were blasted with thousands of shitty links.

So what happened?
Web 2.0s stopped serving as a buffer.
Web 2.0s stopped passing link juice through to my money sites, and those sites dropped.

Sure, I had some other backlinking sources; unfortunately, I can't see any logical pattern with BMD, AMR, NHS, or similar tools.

My sites have been hit pretty hard, but I have a couple of interesting observations at this point:

Pages with keywords I had targeted in backlink anchor texts have dropped 50-70 places. Pages which were not the targets of those keywords are frequently ranking higher than pages which were targeted (so the site is still ranking for interesting keywords, but on unrelated pages). Some pages are ranking for keywords related to my niche, but which I never targeted (so if I targeted "coffee mugs" I am now ranking for "cups of java"; ok, terrible example, but you get the idea).

I've been spending a painful amount of time analyzing Penguin's effects, and talking with other bulldog SEOs to discover what factors were implemented in the latest algorithm update. One thing we all instantly agreed on was that the Penguin update has been unlike anything we've ever seen before. We expect to see a major revamp and tweaks to the algo soon.

Examples of Low Quality Garbage Rising to the Top:

Credit Card Refinance

5th result on Page 1

The site that's ranked has 7 pages indexed, all of which are default pages for a stock CMS. The total content on this site is on the homepage: a short blurb that was clearly written in a matter of minutes.

Website Stats:

Exact Match Domain

Backlinks: Zero

Age of domain: 2 yrs old

PageRank: 0

Paid Surveys

5th result on Page 1

The site has 140 pages indexed, but most are duplicate, tag, or generally thin pages. Overall the site is very thin, with unmasked affiliate links! Not a site you'd expect to be prominently displayed on the first page of Google for the keyword 'paid surveys'.

Not an exact-match domain, but it includes the keyword 'survey' in the domain.

Website Stats:

Backlinks: 354 (ahrefs)

Notes about backlinks: A great majority of the backlinks to this domain are from BuildMyRank and other networks that have been deindexed entirely. Some directory links, but overall a very thin link profile that is blatantly artificial with little anchor text diversity

Age of domain: 4 yrs old

PageRank: 1

Other Crappy Rankings:

mexico pharmacy

5th result is a movie review for a Christian Movie on a popular movie review website

Zero links pointed to this page with any keywords related to "mexico pharmacy"

It's very unlikely that this was a hack, a redirect, Google masking, or anything more than a ranking mistake.

credit card review

10th result on Page 1 is a Wikipedia link to Amazon's page!

Similarities Among Sites Crushed by Penguin:

Sitewide Above-the-Fold Calls to Action/Forms

I've been spending a god awful amount of time on Google's Webmaster Forum, which is a great resource for finding sites that were negatively affected by the recent algorithm update. Frustrated webmasters provide their full URLs in hopes that someone will point out the horrible mistake(s) they've made so that they can correct it. Don't waste time reading the responses; you'd get better advice from an Eskimo. Regardless, it's a good resource for finding actual URLs of affected sites, in niches that you may never hear about or think about.

One commonality I'm seeing is that sites with large sitewide calls to action/forms above the fold were hit, and hit hard. Think insurance-related websites, investigative websites, and most lead-gen type sites.

Case-Studies of Above-the-Fold Penalty Theory

Site #1

Website Stats:

Backlinks: 9,000

Age of domain: 16 yrs old (well branded)

PageRank: 5

Notice the huge orange button? The large form to enter information in?
Each and every page on this website has this same exact lead-gen form at the top of the page. It's clear that a team of people have poured their hearts and souls into this site, painstakingly optimizing each page with unique, well-written content. But there's a lot of overlap in the design of each page, and that big form takes up most of the real estate ABOVE the fold. Viewing the site at 800x600 resolution, there's NO content above the fold.

All other optimization factors could qualify as amazing: great backlinks, good anchor text, and incoming-link diversity. A very well-branded domain with an ancient registration (16 yrs old!). The SEOMoz team would get their rocks off if they had something to do with this domain; it's that good. It's astonishing: if anything, this domain's optimization is too good (another Penguin theory of mine).

Site #2

Website Stats:

Backlinks: 37,000

An unbelievable backlink profile, with no signs of artificial linkbuilding. Links from .edu's, .gov's, .mil, and just about every authority type of TLD that you can imagine. This is nearly the pinnacle of a backlink profile that SEOMoz would give as an example of what to do for a white-hat, well branded domain.

Age of domain: 12 yrs old

PageRank: 7

Again, notice the white fields at the top of the page? That's how it is on EVERY page of the domain. The top header section is simply a sitewide lead-gen form. According to its owner in the Webmaster forum, the site had sustained amazing rankings until the recent Penguin update. It was a leader in its niche, and had been for many years.

Site #3

Website Stats:

Backlinks: 257

All backlinks appear to be natural, with no sign of manually built links.

Age of domain: 8 yrs old

PageRank: 4

This is a UK-based site that was also hit very hard. According to the owner on the GWT forums, the site plummeted in the rankings after Penguin rolled out. Notice the huge header image? When viewing at 800x600 resolution, there is NO visible content on the site. As for the content itself and the title tags, they could be considered over-optimized, with the primary keyword showing up on just about every page with slight modifications.

Major Takeaways:

While it's still early to determine the actual changes in the algorithm, we can begin to paint a picture and make some hypotheses about potential changes. My gut feeling is that Penguin largely targeted on-site factors rather than off-site factors. Sites that would be considered perfectly optimized are some of the best examples of sites that got crushed in the latest Penguin update.

Above the fold penalty

It's very likely that Google has implemented this in Penguin. Sites with forms, advertisements, or large images that fill up the area above the fold sitewide appear to have been hit hardest. If you think this was your problem, try viewing your site at 800x600 screen resolution: how much unique content is visible in that area? You can use Google's own tool for this.
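Google almost certainly evaluates above-the-fold content with a rendering engine, but as a crude, static proxy you can measure how much visible text precedes the first form on a page. This is my own heuristic sketch, not Google's method, using only the standard library:

```python
from html.parser import HTMLParser

class AboveFoldText(HTMLParser):
    """Rough proxy for the above-the-fold check: count visible
    words seen before the first <form>, versus total words."""
    def __init__(self):
        super().__init__()
        self.before_form = 0
        self.total = 0
        self.seen_form = False
        self.skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            self.seen_form = True
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if self.skip:
            return
        n = len(data.split())
        self.total += n
        if not self.seen_form:
            self.before_form += n

# Hypothetical page: tiny intro, then a big lead-gen form, then content.
doc = "<p>Short intro.</p><form><input name='lead'></form><p>" + "word " * 50 + "</p>"
p = AboveFoldText()
p.feed(doc)
print(p.before_form, "words before the form out of", p.total, "total")
```

A very low ratio of `before_form` to `total` would flag pages like the lead-gen sites described above; the real update presumably measures rendered pixels, not word counts.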

"Bad Backlinks" AREN'T Reason for Ranking Drops

Like many BH SEOs, I've got a ton of domains that I've done testing with. Many test domains with nothing but massive Xrumer and Scrapebox spam skyrocketed in the SERPs after the recent algorithm update.

A couple examples I provided above that increased in rankings have links from BMR, ALN, and other networks that have been deindexed! The rest are lower quality article directory links, low quality social bookmarks, and nothing to really write home about.

If anything, link "penalties" for over-optimization were handed out a few weeks ago, but not as a direct result of Penguin.

Monday, 14 May 2012

How The New Google “Penguin” Algorithm Update Affects Your Business

Thinking of penguins tends to conjure up images of cute, waddling birds. But now, at least in the SEO world, they'll lose part of their innocent image, with the new Google algorithm update aimed at webspam being referred to as the "Penguin Update". This update is expected to impact about 3% of search queries. If you're engaging in black hat techniques, be warned yet again (remember the "Panda Update", anyone?). Google is coming after you, continuing its relentless pursuit of offering only high-quality, relevant results for its users. Here are the details you need to know to ensure your website stays on Google's good side.

Over-optimized websites
Matt Cutts, Head of Webspam at Google, had alluded to this update when he described “over-optimized” websites being punished. This received criticism from the SEO world as it blurred the lines between white hat SEO and webspam. Fortunately he clarified this by explaining, “The idea is basically to try and level the playing ground a little bit, so all those people who have sort of been doing, for lack of a better word, ‘over-optimization’ or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little more level.”
If you’ve been in a frenzy over thoughts of your website being punished by Google either by manual changes or automated internet marketing software, you can calm down. Cutts has confirmed the over-optimization warning was aimed towards webspam, not SEO in general.

The Penguin Update
In his latest blog post appropriately titled “Another step to reward high-quality sites”, Cutts explains:
“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines.
The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.
In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.
We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

What are the implications for SEO?
The Penguin Update specifically targets keyword stuffing, linking schemes and cloaking:
  • Keyword stuffing places repetitive targeted keywords in low visibility areas of a website in hopes of being associated with the term by a search engine.
  • Linking schemes use organized rings of link spammers that spread unrelated links throughout the internet.
  • Cloaking is the most advanced of these methods, whereby a webmaster shows the search engines a fake version of their website specifically designed to game the algorithm.
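As a rough illustration of the first technique, keyword stuffing is easy to approximate with a keyword-density check. The sketch below is my own heuristic with an arbitrary threshold; Google's actual signals and cutoffs are unknown:

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of the words in `text` that are the given keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

def looks_stuffed(text, keyword, threshold=0.15):
    """Arbitrary heuristic: flag text where a single keyword
    makes up more than `threshold` of all words."""
    return keyword_density(text, keyword) > threshold

normal = "We review coffee mugs and other kitchen gear for home baristas."
stuffed = "coffee mugs " * 20 + "buy coffee mugs cheap coffee mugs best coffee mugs"

print(looks_stuffed(normal, "mugs"))   # False
print(looks_stuffed(stuffed, "mugs"))  # True
```

Real detection would also look at where the keywords sit (hidden text, low-visibility areas), but density alone already separates the two samples above.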
Most of these black hat techniques have been around for a while, but Google now has improved measures for targeting and punishing websites that use them. If you’re engaging in any of these black hat techniques, heed Google’s warning.
Cutts specifically mentions, “In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.”

What are the quality guidelines?
What are the guidelines you need to ensure you’re complying with? Here they are below:
1. Avoid hidden text or hidden links.
2. Don’t use cloaking or sneaky redirects.
3. Don’t send automated queries to Google.
4. Don’t load pages with irrelevant keywords.
5. Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
6. Don’t create pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware.
7. Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
8. If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.
While most of these are straightforward, a few of these guidelines suffer from Google’s well-known ‘let’s be as vague as possible so webmasters don’t game our algorithm’ syndrome. What constitutes “substantially duplicate content”, for example? You can head over to Google’s help center for more information on the Google search quality guidelines.

Is SEO dead?
Cutts also mentions, “We [Google] want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites”.
Google has unleashed a fury of updates aimed at promoting quality content, so this Penguin Update isn’t at all surprising. What sparked my interest is that it raises the question of how effective SEO will be in the future. We now have confirmation from Cutts, Head of Webspam, that the updates are also aimed at de-emphasizing the importance of SEO for websites wanting to rank highly in the search results. While Google is in favour of white hat SEO, it would like SEO to matter less, so that those creating great content (but who aren’t savvy about SEO tactics) stand a chance of ranking well on quality content alone.
I was adamant in my previous article “How To SEO Is Not Important Because Content Is King, Says Google Employee” that I didn’t believe the world could survive without white hat SEO. SEO plays an incredibly important part in driving users to your website by ensuring it is a “good match” with relevant keywords your target audience searches for.

I just don’t believe we could ever get to a stage where SEO is becomes unnecessary. By continuously mentioning their goal for an SEO-less future, it de-emphasizes SEO in the eyes of naïve businesses. Yet, we’re nowhere near this euphoric world and there’s no evidence to suggest otherwise. Surely it’s counterintuitive for businesses that write great content to disregard SEO? What are the chances un-optimized content from an un-optimized website ranks highly within Google’s search results in the near future?

Saturday, 12 May 2012

Rewarding High Quality Sites: Google’s Penguin / Webspam Update Has Significant SEO Implications

The recent update to Google’s algorithm (referred to initially as the Webspam update, before Google officially dubbed it Penguin) has caused a good amount of debate within the SEO community.  Although the number of sites affected by it was relatively small, the Penguin update continues to be the topic of much discussion because of the information Google released along with it and its implications for the future of search engine optimization.

Some were concerned that this update spelled the end for SEO; however, their fears appear to be unfounded.  In his blog post introducing the Penguin update, Matt Cutts, head of Google’s Webspam Team, notes:
 
Google has said before that search engine optimization, or SEO, can be positive and constructive—and we’re not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

The Penguin update is another step by Google to reward well-built sites that have been optimized using “white hat techniques” while punishing “webspam.”  The way these terms are defined in Cutts’ post, and Google’s desire to reward sites that offer a good user experience, provide useful insights into conducting SEO in the post-Penguin landscape.

Cutts defines white hat search engine optimizers as those who “often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.”

Black hat webspam, on the other hand, makes use of techniques that offer no benefit to users and exploit loopholes or shortcuts to rank pages higher than they ought to be.  Examples of webspam techniques include keyword stuffing and link schemes.

In short, Google wants white hat optimizers to be able to focus on designing and maintaining quality sites without having to worry that sites optimized with black hat techniques might rank higher despite offering a poorer user experience.  “We also want the ‘good guys’ making great sites for users, not just algorithms, to see their effort rewarded,” says Cutts.

For those who pay attention to the stance Google has been taking on SEO and webspam recently, the Penguin update should have come as no surprise.  For SEO firms that adhere to tested white hat techniques, it was in many ways a validation of those methods.  Keyword stuffing and link building schemes have been obvious things to avoid for some time now, although some have still tried to use them to game rankings.  Doubtless, there will be those who continue to do so, but the Penguin update should make such efforts much less effective, while rewarding white hat strategies like improving site usability, creating great content, or optimizing page load times.

It’s also worth noting that Google rolled out a Panda 3.5 refresh on 4/19/12 during the Webspam / Penguin update.  For more on the Panda 3.5 refresh and the biggest winners and losers in the rankings, check out Danny Sullivan’s  recent post on the topic.

If you have any questions about the Penguin or Panda 3.5 updates or would like to consult with us regarding its implications for your SEO initiatives, please feel free to contact us.  At Collaboration 133, we utilize proven and effective white hat SEO techniques to help our clients maximize their rankings without violating Google’s webspam policies.

Friday, 11 May 2012

When Will The Next Google Penguin Update Take Place

Google Algorithm Updates
Google has been quite active lately, introducing new algorithms and other minor updates. These updates dictate how well your website performs and how much traffic you get from the search engine. Now, Google doesn't want webmasters finding a comfort zone for themselves, because if they do, they will trick Google and affect the quality of the results returned on the SERPs (Search Engine Result Pages). To achieve this, Google introduces lots and lots of updates, both major and minor, to counter webspam. So it might be kinda hard to keep up with them. But blogging is all about flowing with the currents, or with the trends, if you will. So it is very important for webmasters to understand the nature of these updates.
In this post, we will talk about what these updates are all about, and how frequently Google brings updates. We will also talk about some of the flaws in the latest updates, so you know what kind of stuff you should be on the lookout for.

A brief update history

Google Update history
Google was launched in March 1998. In internet terms, that's pretty long ago, considering that the world wide web was invented just 8 or 9 years before Google was born. I can't imagine how internet users survived before Google! We sure as hell can't live without it now :P. Anyways, at that time, Google was just a simple keyword based search engine, and it returned pages that had the most number of user query keywords in them.
The very first update from Google was the Google Toolbar for browsers, and the Toolbar PageRank (TBPR) in 2000. This was when Google started ranking pages intelligently, and the concept of PageRank came into being.
In 2002, Google launched its first official update, code-named the 'Boston' update. Google started monthly indexing of webpages, the so-called Google Dance.
2003 was a pretty busy year for Google. The Boston algorithm was updated, and a new algo was introduced, called the Florida update. This was much like the recent Panda update, and left many websites devastated. The Fritz update ended the monthly Google Dance and now, pages were being indexed on a daily basis.
The Austin update was the major algo in 2004, and it continued Google's crackdown on low quality, black hat SEO websites. Apart from that, Google began investing, and raised its market share.
By the start of 2005, Google emerged as a major search engine, and its share prices doubled. This is when it continued its efforts for fighting against spam by bringing many updates, such as the Bourbon, Allegra, Jagger, and Big Daddy updates. The "nofollow" attribute was introduced around this time as well, and Google started XML Sitemap submission. Google continued working on these updates in 2006 by Supplemental updates. This went on in 2007 as well, although Google started Video, News, and other searches.
2008 wasn't very constructive either, with no algorithm update from Google. In 2009, however, Google introduced the Caffeine update, a major algorithm and infrastructure update. This update was completed in 2010, and a new update was introduced that May, infamously known as the 'May Day' update. It impacted a lot of webmasters around the globe.
In 2011, most of you will remember, Google unleashed the Panda update, which was a major algorithmic update and impacted lots of websites. Google also introduced the +1 button around that time. Google continued to update the Panda in 2012, and more than a dozen Panda updates have been rolled out to this day. The most recent update from Google was the Penguin update, which was again a major algorithm update.

How frequently does Google bring updates?

Well, judging from Google's update history, Google now brings a major algorithm update each year, but it also introduces hundreds of smaller minor updates. Take Penguin, for example. It has only been here for about two weeks, but in that time, Google has introduced more than 50 minor changes and updates. The total count for last year was around 500-600. So, yeah. Google is busy as a bee these days.
Google also updates its previous algorithms now. Panda 3.6 was released just after Penguin rolled out, and indexing is now done every minute. Google also updates the PR of websites every 3-4 months. The latest PR update was at the start of May 2012, so we can expect the next one to take place in August 2012. So fasten your seat belts, and start working on recovering from the Panda or Penguin updates.

What are these updates all about?

As I said earlier, Google introduces these updates to fight webspam. The idea is to bump up quality content in the search results, and push down the ruthless, black-hat SEO websites that use spam and unethical SEO tactics as a shortcut to gain the best rankings. So priority is given to quality and original content, while duplicate and spam content is pushed down. Of course, search engine optimizers know how to duck and dodge these updates, which is why Google is constantly updating to stop them from finding their comfort zone.
There can be another possible implication of all this ruthless updating from Google. We are getting the feeling that Google is trying to monopolize the market for itself. Of course, it already has a monopoly of sorts. But it only has around a two-thirds (64%) share of the industry, and it wants more than that. By introducing all these updates, it makes webmasters follow its own rules. In short, it is trying to bring the game to its own home turf. Other search engines are presented with a problem here. They can either copy the idea, and make Google look like the big guy who is also their mentor, or they can ignore the changes and be swept away by the tide that is Google. Now of course, Google takes a gamble with such big updates, because they don't make webmasters happy. But considering that Google is the biggest in the industry, it can afford the risk. Smaller engines can only look on in awe.

Some problems with Google's algorithms

It also seems that Google isn't entirely aware of what's happening with these updates. There are some loopholes that some webmasters know exactly how to exploit, and Google isn't aware of them. Otherwise, it would never allow such things to happen. Let me give you an example.
Type www.something.com into your browser address bar. You'll find just one word on the entire site: "something". Yeah. That's right! But there's more. If you check the PR of that website, you will find that it's at PR 5! This blog you are on right now is at PR 4. See the problem? Previously, that website came out on top of the SERPs when you searched for the word "something". Thankfully, that's been fixed by the Panda and Penguin updates. But the PR remains the same.
Google has also been returning irrelevant search results since the Penguin update. Search for "Panda Recovery" on Google, and it will show you Pandora Recovery instead. Weird, huh? There are a lot more anomalies of this kind, and WPMU has written a detailed article on them.
What's happening here? Well, I'm not really sure. But it looks like there are some flaws in Google's algorithms, which the veteran black-hat SEOs are striving to exploit. Hopefully, this will be fixed in the next algo update from Google, because it gives black hatters an unfair advantage over people like us who write original, quality content.

Monday, 7 May 2012

Google Penguin Update: Don’t Forget About Duplicate Content

There has been a ton of speculation regarding Google’s Penguin update. Few know exactly what the update does, or how it works with Google’s other signals. Google always plays its hand close to its chest.
“While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics,” Google’s Matt Cutts said in the announcement of the update.
He also said, “The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.”
“We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings,” he said. To me, that indicates that this is about all webspam techniques – not just keyword stuffing and link schemes, but also everything in between.
So it’s about quality guidelines. Cutts was pretty clear about that, and that’s why we’ve been discussing some of the various things Google mentions specifically in those guidelines. So far, we’ve talked about:
Cloaking
Links
Hidden text and links
Keyword stuffing
Another thing on the quality guidelines list is: “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.”
Of course, like the rest of the guidelines, this is nothing new, but in light of the Penguin update, it seems worth examining the guidelines again, if for no other reason than to provide reminders or educate those who are unfamiliar. Duplicate content seems like one of those that could get sites into trouble, even when they aren’t intentionally trying to spam Google. Even Google says in its help center article on the topic, “Mostly, this is not deceptive in origin.”
“However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic,” Google says. “Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.”
Google lists the following as steps you can take to address any duplicate content issues you may have:
  • Use 301s: If you’ve restructured your site, use 301 redirects (“RedirectPermanent”) in your .htaccess file to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)
  • Be consistent: Try to keep your internal linking consistent. For example, don’t link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.
  • Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We’re more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
  • Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
  • Use Webmaster Tools to tell us how you prefer your site to be indexed: You can tell Google your preferred domain (for example, http://www.example.com or http://example.com).
  • Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. In addition, you can use the Parameter Handling tool to specify how you would like Google to treat URL parameters.
  • Avoid publishing stubs: Users don’t like seeing “empty” pages, so avoid placeholders where possible. For example, don’t publish pages for which you don’t yet have real content. If you do create placeholder pages, use the noindex meta tag to block these pages from being indexed.
  • Understand your content management system: Make sure you’re familiar with how content is displayed on your web site. Blogs, forums, and related systems often show the same content in multiple formats. For example, a blog entry may appear on the home page of a blog, in an archive page, and in a page of other entries with the same label.
  • Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
Don’t block Google from crawling duplicate content. Google advises against blocking, because then it won’t be able to detect when URLs point to the same content, and will have to treat them as separate pages. Instead, use the canonical link element (rel="canonical") to indicate your preferred URL.
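As a quick illustration of the 301 and canonical advice above (the domain and paths here are hypothetical examples, not anything from Google's documentation), on Apache a permanent redirect might look like:

```apache
# .htaccess -- permanently redirect a restructured URL so users,
# Googlebot, and other spiders all end up on one canonical page.
Redirect 301 /old-widgets.html http://www.example.com/widgets/
```

And on any remaining duplicate pages themselves, the canonical hint goes in the page's head, e.g. `<link rel="canonical" href="http://www.example.com/widgets/">`, so Google consolidates signals onto your preferred URL instead of splitting them.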

Saturday, 5 May 2012

Latest Google Algorithm: Penguin Algo Update

PENGUIN UPDATE! Latest Google Algorithm Update as of April 24, 2012

There have been widespread reports about the latest algorithm change, the "webspam update", which is now officially called the PENGUIN UPDATE.

Many say this algo is a total failure: the webspam update was released to fight webspam and reduce rankings for spam sites, but in the process many legitimate sites' rankings have gone down or vanished from the top 100 lists.

Other updates that took place in April:
April 19 - Panda Update 3.5
Parked domain classifier update - a mistaken update by Google, which Google later clarified
April 24, 2012 - Webspam update, i.e. the Penguin Update

Google's spam update has done more damage to legitimate sites than to spam sites. I have also noticed that ecommerce sites were hit less by the update than service-sector sites.

Google Penguin Algo Update: Questions

  1. Is off-page SEO now more important than on-page?
  2. Will backlinks from social bookmarking sites like folkd.com be devalued? Many social sites have lost traffic during this update.
  3. Is blog commenting now treated as spam? Is blog commenting on legitimate sites spam too?
  4. Are forum signature links completely dead?
  5. Can a site still rank without doing any SEO? (Because, per Matt Cutts, a site can do fine, or even better, without SEO.)

Tuesday, 1 May 2012

The Google Panda Fiasco And The “Moronic” Google Penguin Update

The other day Google released yet another Google Panda update. Panda 3.3, 3.4, or maybe it was even Panda 3.5? Soon after, they released what they're calling the "webspam algorithm update", which they later named the "Google Penguin Update".

The specific names don’t really matter; what matters is how they affect you. (Quick Note: this post isn’t about panda/penguin specifics, it’s about Google, algorithm changes and the future of online marketing – a little more generic)

According to Google, the last public update was Panda 3.4 which they officially launched on March 23. But since then there’s been a dramatic change in the algorithm. A change that was supposedly designed to fight spam and improve the quality of search results. (The damn Penguin)
Unfortunately, the Penguin update has been the last nail in the coffin for thousands of people around the globe. A month or so ago I wrote a post about a previous Panda update that was said to affect just 1% of search queries.

At the time hundreds of thousands of webmasters around the world proved that statistic to be very wrong. The results were catastrophic.  And now… Matt Cutts has said this update will “only affect 3% of search queries”. To that I say what a load of BS.

If you log on to any online marketing related forum or even the Google Webmaster’s forum itself, you’ll find thousands upon thousands of webmasters complaining about this update. It was supposed to target sites that were keyword stuffing and using other dodgy black hat tactics. (It has and hasn’t)
When really, all it's done is bring terrible results to the top of Google and push great websites to the bottom. Ok, so it has done some good. Many great sites have risen, but countless others have lost their rankings. In my opinion, collateral damage is not ok.

Google has pushed all the BRANDS to the top. Brand sites are now dominating, more so than ever before. And the big thing that’s got so many marketers outraged is this: niche sites have been killed. 
Really, there are only three types of sites. (Minus article directories and web2.0’s)
  • Brands
  • Niche Sites
  • Authority Sites
As you probably already know, a huge percentage of Internet Marketers make their money from niche sites. Niche sites are generally sites that are built to make money. So it seems Google is out to destroy our incomes and improve search results too?

All jokes aside… Niche sites are typically sites built around a single topic and most of the time they are between 10 and 300 pages.
The problem is a lot of niche sites are full of terrible content. Usually they are 5-10 page sites built for the sole purpose of ranking for 1-10 keywords and making a small profit. Internet Marketers have been building these sites for years because they are guaranteed earners.
Over the past few years however, marketers have gotten lazy and built terrible websites full of outsourced content they know nothing about. Most of those sites have been hit, which is fair enough. But what about the excellent niche sites? What about the sites with 10-300 pages of superb content that genuinely helps people?

Yep, those sites have also been crushed, which makes me furious. The funny thing is, black hat SEOs have been badly penalized, which is fair enough to a degree. I believe that webmasters should be able to use whatever SEO they like to blast their sites to the top, just as long as they provide value and deserve to rank there.

It’s not just niche sites that have been hit, individual blogs and small businesses have also been destroyed.
The problem is it’s so damn random. Countless white hat SEO users have been hit. People who’ve followed Google’s rules to the T; they’ve been spanked. I was actually talking to a woman the other day who spent 4 years of her life building a health and beauty blog.

The blog had over 600 pages of excellent content. After this update she practically lost every single ranking and her traffic came to a halt… literally overnight. Not cool at all Google…
So niche sites have been penalized badly and thrown to the bottom of the pile; they now rank third regardless of their individual quality or link profiles. Next up: authority sites. It's kind of difficult to define authority sites, as they are essentially big niche websites.

Authority sites are enormous sites that specialize in one broad topic. Sites that cover everything on a topic and generally have excellent content. There’s a fine line between authority sites and brand sites. Authority sites as of this last update, have been placed below brand sites.

I used to have a couple of authority sites in obscure niches; they each had 300+ blog posts and were incredible resources for their visitors. I used white hat SEO on them in terms of link building. Obviously, white hat link building doesn't technically exist, but it's what I call white hat link building…

I built manual backlinks using the 1 article = 1 backlink principle.
I only did guest blogging, article marketing, web2.0’s and social bookmarking + social signals.  While those sites haven’t been sandboxed or thrown to the bottom, they’ve lost a crap load of traffic. All because of Google’s stupid algorithm changes, especially the damn penguin.
My point is, even though those sites followed the rules to the T; they’ve lost a load of traffic and rankings. Which to me is completely unfair. So niche sites have been killed off big time and so have authority sites. The majority of medium to large authority sites have lost a substantial percentage of their traffic if not all.

The problem is this: it never seems like Google targets a consistent set of websites. Small sites, big sites, sites using black hat SEO and sites using white hat SEO all get hit, while many awful sites just slip through the cracks and even thrive as others get demolished.

Useful Info On the new Google Penguin Update

What niches were affected the most by this update?
Gambling, hosting, weight loss, insurance, make money online, pharma, loans --> Guys, those are the most competitive niches on the web, and the 2 websites I managed which dropped fall into the first 2 niches I mentioned.

Why were these niches affected by the Penguin update?
1) More backlinks (broader, and greater in quantity)
2) More keyword-oriented content
3) More black hat methods
4) In the end, more spam. These sites were the models of a working BH SEO strategy. (I believe those websites served to build spam-detection models based on their signals; the crappy results we currently see in those particular niches, and other tough ones, tend to support that. It is as if they have reset those niches.)

Here are my questions, based on the thousands of posts I've been reading the last couple of days. I've barely slept over that time in order to work efficiently through this update, even though most of my websites came through fine.

Are Link Pyramids still efficient?

What tends to support my spun content theory above is that crappy spun Web 2.0 content is still ranking in the search engines. (That's a straight WTF for me.)

Web 2.0 properties are usually heavily backlinked and served as buffer sites, and they have increased in ranking (hmm, interesting). So spun, spammed Web 2.0 properties are still ranking.

That led me to four possible conclusions:
1) Link juice now passes through no more than 1 tier, counting up from the lowest.
2) Backlinks have lost a ton of value (I'm going to argue otherwise below), and the SEO juice of those Web 2.0 properties comes from the main domain of the Web 2.0 platform.
3) If the outbound links (OBL) of a Web 2.0 property target only one or a very limited number of websites, they pass no value (which is quite a relevant spam signal).
4) A combination of the above factors.
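To make conclusion 1 concrete, here is a toy PageRank-style sketch of juice flowing up a pyramid. This is my own illustration of the classic damped link-graph model, not anything Google has published, and the tier sizes are arbitrary:

```python
# Toy PageRank-style model of a 3-tier link pyramid: spam blasts point
# at Web 2.0 "buffer" properties, which point at the money site.
# Purely illustrative; real ranking systems are far more complex.

DAMPING = 0.85  # classic PageRank damping factor

def pagerank(links, n_iter=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(n_iter):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outs in links.items():
            share = DAMPING * rank[page] / len(outs)
            for target in outs:
                new[target] += share
        rank = new
    return rank

# Tier 2 (9 spam pages) -> Tier 1 (3 Web 2.0 buffers) -> money site
pyramid = {f"spam{i}": [f"web20_{i % 3}"] for i in range(9)}
pyramid.update({f"web20_{i}": ["money_site"] for i in range(3)})
ranks = pagerank(pyramid)

# Same structure with the bottom (spam) tier stripped away
flat = {f"web20_{i}": ["money_site"] for i in range(3)}
ranks_flat = pagerank(flat)

print(f"money site, full pyramid: {ranks['money_site']:.4f}")
print(f"money site, no spam tier: {ranks_flat['money_site']:.4f}")
```

In the classic model the spam tier still feeds the buffers, which feed the money site. If Google now discounts or breaks that bottom hop, the buffers keep only their baseline score and pass far less on to the money site, which would match conclusion 1.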

Are Blog Commenting and Xrumer Campaigns Dead?

This is a tough question, and so far my answer is no.

I know someone who has been using SB massively for a month or 2 now (30k links indexed in that amount of time), with keyword-oriented content, and he came through the update easily. It is a very disturbing example!

He used a load of anchors, though, and increased his link diversity with many kinds of links, but most of them were targeted at his website.

I've also used SB and Xrumer directly on my websites, but only with highly selected lists (edu / gov / high PR / AA / and very few high-OBL ones), and they remain unhit!

Did EMDs gain in value?
Yes, I believe they did! The URL gained in value, that's for sure. "python-hosting" and "make-money-online.co.uk" are striking examples, and so are the Web 2.0 properties which have the keyword in the URL.

Domain authority may also have gained in value, as the ranking 404 error pages from Yahoo Answers show, for example.

However, I always thought they would lose relevancy, as they tend to attract more spam websites, so I find this a little confusing, to be honest.

What is left from all this to be responsible for our drops?

1) Massive link pinging
2) Irrelevant contextual backlinks
3) Lost link pyramid juice
4) On-site SEO
5) Anchor diversity
6) Recurring OBLs on some Web 2.0 properties or bookmarking accounts

Don't forget that in this update, not only spammed websites got hit, but also white hat ones. So what kind of signals could be shared across the 2 approaches, beyond the spam? (I'm actually asking you the question, guys!)

What can I do to fix my website?

1) Add fresh, quality content (this will never hurt you, and it may rank for new keywords, as fresh content seems to have gained value).
2) On-site optimization: internal linking, page speed, rich content, sitemap, content relevancy. For now, pay close attention to your internal link titles: be as descriptive as you can, and forget keyword stuffing here. I'm still reviewing the effect of such internal linking on my website that was hit; it's possible it was targeted, as I had a keyword which ranked exclusively thanks to internal linking, and it dropped. I'll come back with results.
3) Manual backlinking, manual Web 2.0 properties, and high-PR sites as usual, as long as there is no feedback on the efficiency of SEO tools. This is what I would actually do, until I'm 100% positive we can use those tools again; the question remains how.
4) For penalized sites, you can 301-redirect your page to a new one with the same content and no backlinks. The penalty more than likely doesn't pass through the 301. I'm testing that method on a dropped page and on a penalized one.

Additional Comment
"2) Backlinks have lost a ton of value (going to prove otherwise below) and the SEO juice of those web 2.0 properties come from the main domain of those web 2.0"

After 3 days of analyzing my sites, I could not agree more with this point.

I have 1 site still ranking in first spot, and all that site had was 100-200 Web 2.0 properties linking to it. No blasts were done to those Web 2.0s at all. Just pure Web 2.0 links, that's all.

And I have a bunch of sites that totally dropped by 50-100 positions; those sites had Web 2.0s too, but all of those were blasted with thousands of shitty links.

So what happened?
Web 2.0s stopped serving as a buffer.
Web 2.0s stopped passing link juice through to my money sites, and those sites dropped.

Surely, I had some other backlinking sources; unfortunately, I can't see any logical pattern with BMD, AMR, NHS, or similar tools.

My sites have been hit pretty hard, but I have a couple of interesting observations at this point:

  • Pages with keywords I had targeted in backlink anchor text have dropped 50-70 places.
  • Pages which were not the target of those keywords are frequently ranking higher than the pages which were targeted (so the site is still ranking for interesting keywords, but on unrelated pages).
  • Some pages are ranking for keywords related to my niche which I never targeted (so if I targeted "coffee mugs", I am now ranking for "cups of java"; ok, terrible example, but you get the idea).

I've been spending a painful amount of time analyzing Penguin's effects, and talking with other bulldog SEOs to discover what factors were implemented in the latest algorithm update. One thing we all instantly agreed on was that the Penguin update has been unlike anything we've ever seen before. We expect to see a major revamp and tweaks to the algo soon.


Monday, 30 April 2012

Google Penguin Update: 5 Types of Link Issues Harming Some Affected Websites

Are you angry and looking for answers about why your rankings vanished after Google released its Penguin update? Early analysis indicates that one common factor is the profile of links pointing to your website.
The main purpose of the Penguin update is to put a deep freeze on web spam in Google's search results. By extension, a big piece of that web spam appears to be links from low-quality networks.

Natural Links

Before we get into the new findings, first it’s important to understand a bit about Google and links.
Above all, Google considers links as editorial "votes". So, theoretically, the sites that receive the most votes should rank higher on Google because more people find them valuable.
Google analyzes the quantity, quality, and relevance of websites that link to yours. When Google looks at your link profile, they’re looking at such things as what types of websites link to yours, how quickly you acquired these links, and the anchor text (the clickable words) used by the linking website. When Google's algorithm detects such things as a large number of new links or an imbalance in the anchor text, it raises a big red flag.
As Google and many SEOs have preached for years, you’ll attract more links by creating unique, worthwhile content that others will want to link to naturally. If you want to learn more about Google, links, and link building, definitely read our posts “Why Links Matter”, "Filthy Linking Rich", and “Introduction to Google PageRank: Myths & Facts”.
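Google doesn't publish what counts as "an imbalance in the anchor text", but you can at least eyeball your own distribution. A minimal sketch, assuming you have a backlink export as (source URL, anchor text) pairs; the link data and the 50% flag below are made-up illustrations, not known thresholds:

```python
from collections import Counter

def anchor_distribution(backlinks):
    """backlinks: iterable of (source_url, anchor_text) pairs.
    Returns each normalized anchor's share of the whole profile."""
    counts = Counter(anchor.lower().strip() for _, anchor in backlinks)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.most_common()}

# Hypothetical backlink export (e.g. from a link-research tool)
links = [
    ("http://blog-a.example", "red widgets"),
    ("http://blog-b.example", "red widgets"),
    ("http://forum.example", "Red Widgets"),
    ("http://news.example", "Acme Widget Co"),
    ("http://dir.example", "http://www.acme-widgets.example"),
]

dist = anchor_distribution(links)
for anchor, share in dist.items():
    flag = "  <-- suspiciously concentrated?" if share > 0.5 else ""
    print(f"{share:5.0%}  {anchor}{flag}")
```

A natural profile tends to be dominated by brand names and raw URLs; if one exact-match commercial phrase owns most of your profile, that is exactly the kind of red flag described above.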

Unnatural Links

For companies that have been hit by the Penguin update, one common theme appears to be a severe lack of natural links, according to a blog post by Glenn Gabe at G-Squared Interactive. He noted five common issues these sites are all facing:
  1. Paid text links using exact match anchor text: For companies that want to rank for a certain term (such as “red widgets”) one way to accomplish this is by buying links from other websites with that exact matching anchor text. This is against Google’s guidelines, as Google would consider this a paid link that exists solely to manipulate PageRank, rather than to provide any value to visitors.
  2. Comment spam: Two things proved problematic for websites trying to unnaturally rank for specific keywords: signatures in comments that contained exact match anchor text; and people who used a spammy user name (e.g., Best India SEO Company) as exact match text.
  3. Guest posts on questionable sites: Although guest posts are a legitimate way to earn links to your site, sites dinged by the Penguin had links pointing to their website from sites filled with low-quality articles where the focus was on the anchor text rather than the content.
  4. Article marketing sites: Thin content featuring links with exact match anchor text were another common factor among affected sites.
  5. Links from dangerous sites: Do you have inbound links from sites that have been flagged for malware, numerous pop-ups, or other spammy issues? This was another factor that caused websites to lose their Google rankings, so links to and from web spammers or “bad neighborhoods” are a danger.
Ultimately, the Penguin update didn’t really change anything that Google has deemed unacceptable. Google has just evolved its algorithm to catch up to those who try to loophole their way to higher Google rankings (and, to be fair, some who simply don't know any better or fully understand SEO). If any (or all) of the above are your sole link building tactic(s), you probably aren't doing enough to rank prominently long-term on Google anymore.
For those unfamiliar, Google has a section devoted to link schemes and makes no secret that such practices “can negatively impact your site's ranking in search results.”

Penguin Recovery?

So, fix all these link issues, eliminate any instances of keyword stuffing, spun content, cloaking, and other spammy tactics and you're guaranteed a Penguin recovery, right? Not necessarily. There are never any magical guarantees for gaining or regaining top search rankings and Google is notoriously tight-lipped about the exact signals it uses to detect web spam.
Additionally, Google is constantly making tweaks to its search algorithm. So check your traffic in analytics and make sure your traffic indeed was impacted starting on or after April 24. If your traffic vanished before this date, another change might be to blame – there was also a parked domain classifier issue the week prior to Penguin's launch in addition to the latest Panda refresh on April 19.
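Dating the drop is simple enough to script. A sketch, assuming you've exported daily visit counts from your analytics package (the numbers and the 30%-drop threshold are invented for illustration):

```python
from datetime import date, timedelta

PENGUIN_LAUNCH = date(2012, 4, 24)

# Hypothetical daily visit counts exported from your analytics package.
daily_visits = {
    date(2012, 4, 20): 980,
    date(2012, 4, 21): 1010,
    date(2012, 4, 22): 995,
    date(2012, 4, 23): 1020,
    date(2012, 4, 24): 420,
    date(2012, 4, 25): 390,
    date(2012, 4, 26): 410,
    date(2012, 4, 27): 405,
}

def avg(days):
    vals = [daily_visits[d] for d in days if d in daily_visits]
    return sum(vals) / len(vals)

before = avg(PENGUIN_LAUNCH - timedelta(days=n) for n in range(1, 5))
after = avg(PENGUIN_LAUNCH + timedelta(days=n) for n in range(0, 4))

print(f"avg daily visits before Apr 24: {before:.0f}")
print(f"avg daily visits from Apr 24:   {after:.0f}")
if after < 0.7 * before:
    print("Drop coincides with Penguin; if it started earlier, suspect "
          "Panda 3.5 (Apr 19) or the parked-domain glitch instead.")
```

The point is just to anchor the drop to a date: a cliff on or after April 24 points at Penguin, while an earlier one points at the other changes mentioned above.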
Regardless, with the new tag team of Panda and Penguin, Google can put the smack down on websites that appear to be creating or supporting spam to increase their rankings in search engines. So even if you fix all these link signals, you still must make sure you have quality content.
But even beyond that, there are hundreds of other factors at play that Google's algorithm looks at. Among them:
  • Does your site have too many ads?
  • Does your site have fresh content?
  • Is your business on Google+?

Life After Penguin

While it’s much easier to blame Google and sign a petition begging Google to kill its Penguin update, this isn't the time to give up. Now is the time to look at your website, do a proper, careful evaluation of your inbound link profile, clean up your website, and devise a smarter marketing and business strategy that doesn't rely on Google for the majority of your traffic and income, so you can escape the endless loop of Google algorithm updates.
This isn’t to say Google or any search engine's results are perfect; in fact, now might be a good time to check out alternatives like Google's closest competitor, Bing, or upstarts Blekko and DuckDuckGo. Google has created a Penguin feedback form for those who feel websites have been hit unfairly, but this update is algorithmic as opposed to a manual penalty (i.e., reviewed by a human), so don’t expect to see whatever rankings you’ve lost miraculously restored overnight.
If you’re a small business, there are ways to Google-proof your marketing. And don't forget to look for non-Google-based link opportunities.

But above all, sometimes when these algorithmic changes roll out, one of the wisest moves is to be patient and carefully analyze any changes before you react blindly to the latest penalty – because by the time you do that, Google will release the latest Panda or its next iteration of Penguin, and you'll be trapped again in the endless loop of relying solely on a third party (Google) for your livelihood.