Saturday 19 May 2012

Useful info on the new Google Penguin update, which went into effect in late April 2012.

What niches were affected the most by this update?
Gambling, hosting, weight loss, insurance, make money online, pharma, and loans --> Guys, those are the most competitive niches on the web, and the two websites I managed that dropped fall into the first two niches I mentioned.

Why were these niches affected by the Penguin update?
1) More backlinks (broader and greater in quantity)
2) More keyword-oriented content
3) More black hat methods
4) In the end, simply more spam; these sites were the models of a working BH SEO strategy. (I believe those websites served to build spam-detection models based on their signals; the crappy results we currently see in these and other tough niches tend to support that. It is as if Google has reset those niches.)

Here are my questions, based on the thousands of posts I've been reading over the last couple of days. I've barely slept in that time, trying to work efficiently with this update, even though most of my websites came through fine.

Are Link Pyramids still effective?

What tends to support my spun-content theory is that crappy spun Web 2.0 properties are ranking in the search engines. (That's a straight WTF for me.)

Web 2.0 properties are usually heavily backlinked and served as buffer websites, yet they increased in ranking (hmm, interesting). So spun, spammed Web 2.0 properties are still ranking.

That led me to four possible conclusions:
1) Link juice now passes no more than one tier up from the lowest level.
2) Backlinks have lost a ton of value (I'll argue otherwise below), and the SEO juice of those Web 2.0 properties comes from their main domains.
3) If the outbound links (OBL) of a Web 2.0 property target one or a very limited number of websites, they pass no value (which is quite a relevant spam signal).
4) A combination of the above factors.

Are Blog Commenting and Xrumer Campaigns Dead?

This is a tough question, and my answer so far is no.

I know someone who has been playing with Scrapebox (SB) massively for a month or two now (30k links indexed in that time), with keyword-oriented content, and he came through the update easily. It is a very disturbing example!

He did use a load of anchors, though, and increased his link diversity with many kinds of links, but most of them pointed straight at his website.

I've also used SB and Xrumer directly on my websites, but only on highly selective lists (edu / gov / high PR / AA / and very few high-OBL ones), and they remain unhit!

Did EMDs gain in value?
Yes, I believe they did! The URL itself gained in value, that's for sure. "python-hosting" and "make-money-online.co.uk" are striking examples, and so are the Web 2.0 properties with keywords in the URL.

Domain authority may also have gained in value, as the ranking 404 error pages from Yahoo! Answers show, for example.

However, I always thought EMDs would lose relevancy as they tend to attract more spam websites, so I find this a little confusing, to be honest.

What is left from all this to be responsible for our drops?

1) Massive link pinging
2) Irrelevant contextual backlinks
3) Link pyramid juice loss
4) On-site SEO
5) Anchor diversity
6) Recurring OBL on some Web 2.0 properties or bookmarking accounts

Don't forget that in this update, not only spammed websites got hit but also white hat ones. So what kind of signals could be shared across the two approaches beyond the spam? (I'm actually asking you guys!)

What can I do to fix my website?

1) Add fresh, quality content (this will never hurt you, and you may rank for new keywords, since fresh content appears to have gained value).
2) On-site optimization: internal linking, page speed, rich content, sitemaps, content relevancy. However, pay attention to your internal link titles. Be as descriptive as you can and forget keyword stuffing here. I'm still reviewing the effect of such internal linking on my website that was hurt; it may have been targeted, since a keyword that ranked exclusively thanks to internal linking dropped. I'll come back with results.
3) Manual backlinking, manual Web 2.0 properties, and high-PR sites as usual, as long as there is no feedback on the effectiveness of SEO tools. This is what I would actually do until I'm 100% positive we can use the tools again; we will, and the question that remains is how.
4) For penalized sites, you can 301-redirect your page to a new one with the same content and no backlinks. The penalty most likely doesn't pass through the 301. I'm testing that method on a dropped page and on a penalized one (a minimal sketch follows).
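For point 4, here is a minimal sketch of the idea using Flask; the route names are hypothetical placeholders, and an .htaccess rule or any other server-side 301 would do the same job:

```python
# Minimal sketch of the 301 approach from point 4: permanently redirect a
# penalized URL to a fresh copy of the same content. Route names are
# hypothetical placeholders.
from flask import Flask, redirect

app = Flask(__name__)

@app.route('/penalized-page')
def old_page():
    # 301 = permanent redirect; the theory above is that the penalty
    # most likely does not follow the redirect to the new URL.
    return redirect('/fresh-page', code=301)

@app.route('/fresh-page')
def new_page():
    return 'Same content, republished at a new URL with no backlinks.'

if __name__ == '__main__':
    app.run()
```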

Additional Comment
"2) Backlinks have lost a ton of value (going to proove otherwise below) and the SEO juice of those web 2.0 propoties come from the main domain of those web 2.0"

After three days of analyzing my sites, I could not agree more with this point.

I have one site still ranking in first spot, and all that site had was 100-200 Web 2.0 properties linking to it. No blasts to those Web 2.0s were done at all. Just pure Web 2.0 links, that's all.

And I have a bunch of sites that totally dropped 50-100 positions; those sites had Web 2.0 links too, but all of them were blasted with thousands of shitty links.

So what happened?
Web 2.0s stopped serving as a buffer.
Web 2.0s stopped passing link juice through to my money sites, and those sites dropped.

Surely, I had some other backlinking sources too; unfortunately, I can't see any logical pattern with BMD, AMR, NHS, or similar tools.

My sites have been hit pretty hard, but I have a couple of interesting observations at this point:

  • Pages with keywords I had targeted in backlink anchor text have dropped 50-70 places.
  • Pages that were not the targets of those keywords are frequently ranking higher than the pages that were targeted (so the site is still ranking for interesting keywords, but on unrelated pages).
  • Some pages are ranking for keywords related to my niche that I never targeted (so if I targeted "coffee mugs" I am now ranking for "cups of java" - ok, terrible example, but you get the idea).

I've been spending a painful amount of time analyzing Penguin's effects and talking with other bulldog SEOs to discover what factors were implemented in the latest algorithm update. One thing we all instantly agreed on was that the Penguin update has been unlike anything we've ever seen before. We expect to see a major revamp and tweaks to the algo soon.

Examples of Low Quality Garbage Rising to the Top:

Credit Card Refinance

5th result on Page 1

The site that's ranked has 7 pages indexed, all of which are default pages for a stock CMS. The site's total content sits on the homepage: a short blurb that was clearly written in a matter of minutes.

Website Stats:

Exact Match Domain

Backlinks: 0

Age of domain: 2 yrs old

PageRank: 0

Paid Surveys

5th result on Page 1

The site has 140 pages indexed, but most are duplicate, tag, or generally thin pages. Overall the site is very thin, with unmasked affiliate links! Not a site you'd expect to be prominently displayed on the first page of Google for the keyword 'paid surveys'.

Not an exact match domain, but it includes the keyword 'survey' in the domain.

Website Stats:

Backlinks: 354 (ahrefs)

Notes about backlinks: The great majority of the backlinks to this domain are from BuildMyRank and other networks that have been deindexed entirely. There are some directory links, but overall it is a very thin, blatantly artificial link profile with little anchor text diversity.

Age of domain: 4 yrs old

PageRank: 1

Other Crappy Rankings:

mexico pharmacy

The 5th result is a review of a Christian movie on a popular movie review website.

Zero links pointed to this page with any keywords related to "mexico pharmacy".

It's very unlikely that this was a hack, a redirect, Google masking, or anything more than a ranking mistake.

credit card review

10th result on Page 1 is a Wikipedia link to Amazon's page!

Similarities Among Sites Crushed by Penguin:

Sitewide Above-the-Fold Calls to Action/Forms

I've been spending a god-awful amount of time on Google's Webmaster Forum, which is a great resource for finding sites that were negatively affected by the recent algorithm update. Frustrated webmasters provide their full URLs in hopes that someone will point out the horrible mistake(s) they've made so they can correct them. Don't waste time reading the responses; you'd get better advice from an Eskimo. Regardless, it's a good resource for finding actual URLs of affected sites, in niches that you may never hear or think about.

One commonality I'm seeing is that sites with large sitewide calls to action/forms above the fold were hit, and hit hard. Think insurance-related websites, investigative websites, and most lead gen type sites.

Case-Studies of Above-the-Fold Penalty Theory

Site #1

Website Stats:

Backlinks: 9,000

Age of domain: 16 yrs old (well branded)

PageRank: 5

Notice the huge orange button? The large form to enter information in?
Each and every page on this website has this same exact lead gen form at the top of the page. It's clear that a team of people have poured their hearts and souls into this site, painstakingly optimizing each page with unique, well-written content. But there's a lot of overlap in the design of each page, and that big form takes up most of the real estate ABOVE the fold. Viewed at 800x600 resolution, there's NO content above the fold.

All other optimization factors could qualify as amazing: great backlinks, good anchor text, and incoming link diversity. A very well branded, ancient domain (16 yrs old!). The SEOMoz team would get their rocks off if they had something to do with this domain, it's that good. If anything, it's astonishing that this domain's optimization is too good (another Penguin theory of mine).

Site #2

Website Stats:

Backlinks: 37,000

An unbelievable backlink profile, with no signs of artificial link building. Links from .edu, .gov, and .mil domains, and just about every authoritative type of TLD you can imagine. This is nearly the pinnacle of a backlink profile, the kind SEOMoz would give as an example of what to do for a white-hat, well-branded domain.

Age of domain: 12 yrs old

PageRank: 7

Again, notice the white fields at the top of the page? That's how it is on EVERY page of the domain. The top header section is simply a sitewide lead gen form. According to the owner's post in the Webmaster forum, the site had sustained amazing rankings until the recent Penguin update. It was a leader in this niche, and had been for many years.

Site #3

Website Stats:

Backlinks: 257

All backlinks appear to be natural, with no sign of manually built links.

Age of domain: 8 yrs old

PageRank: 4

This is a UK-based site that was also hit very hard, according to the owner on the GWT forums; he says the site plummeted in the rankings after Penguin rolled out. Notice the huge header image? When viewed at 800x600 resolution, there is NO visible content on the site. The content itself and the title tags could also be considered over-optimized, with the primary keyword showing up on just about every page with slight modification.

Major Takeaways:

While it's still early to determine the actual changes in the algorithm, we can begin to paint a picture and make some hypotheses about potential changes. My gut feeling is that Penguin largely targeted on-site factors rather than off-site factors. Sites that would be considered perfectly optimized are some of the best examples of sites that got crushed in the latest Penguin update.

Above the fold penalty

It's very likely that Google has implemented this in Penguin. Sites with forms, advertisements, or large images that fill up the area above the fold sitewide appear to have been hit hardest. If you think this was your problem, try viewing your site at 800x600 screen resolution: how much unique content is visible in that area? You can use Google's own tool.
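If you'd rather script the check than eyeball it, here is a rough sketch using Selenium (it assumes Firefox and the selenium package are installed; the URL is a placeholder):

```python
# Rough sketch: screenshot a page at 800x600 to see what actually sits
# above the fold. Assumes Firefox and the selenium package are installed.
from selenium import webdriver

URL = 'http://www.example.com/'  # placeholder; point this at your own page

driver = webdriver.Firefox()
driver.set_window_size(800, 600)  # the resolution discussed above
driver.get(URL)
driver.save_screenshot('above_the_fold.png')
driver.quit()

print('Open above_the_fold.png and ask: how much unique content is visible?')
```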

"Bad Backlinks" AREN'T Reason for Ranking Drops

Like many BH SEOs, I've got a ton of domains that I've done testing with. Many test domains with nothing but massive Xrumer and Scrapebox spam skyrocketed in the SERPs after the recent algorithm update.

A couple of the examples I provided above that increased in rankings have links from BMR, ALN, and other networks that have been deindexed! The rest are lower-quality article directory links, low-quality social bookmarks, and nothing to really write home about.

If anything, link "penalties" for over-optimization were handed out a few weeks ago, but not as a direct result of Penguin.

Thursday 17 May 2012

How To Increase Your Google Adsense Revenue

1 – Make sure the ads appearing on your site are closely related to your content; all ads should be relevant to your site.

2 – Use your main keyword or theme in the title and create keyword-rich content. You can use keywords in the content 1-3 times, bolded or underlined.

3 – Always use the HTML H1 heading tag for the content title of the page and put your keyword in it. This tag should be placed at the beginning of your content.

4 – If your page has pictures or data in table format, use wide ads (336×280, 300×250 or 250×250), because these are the best-performing ad sizes.

5 – If your page has many vertical links, you can use the link ad format, with a border the same color as the page.

6 – The background of the ads should be the same as the background of the page or pages.

7 – Read and obey the AdSense rules: NEVER click your own ads or ask anyone else to.

8 – Filter out irrelevant ads by applying an ad filter.

9 – Monitor your earnings and performance regularly.

Wednesday 16 May 2012

Organic SEO Vs PPC

Organic SEO :

In organic search engine optimization, website owners or webmasters spend time and effort to get listed at the top of the major search engines. Natural optimization is much better than a PPC campaign because it costs nothing but time: we follow the search engines' guidelines and find good websites to increase our site's backlinks. We also need fresh content for our pages, meaning no duplicate or copied content. Organic search engine optimization, not PPC, is the long-term marketing strategy, so website owners should focus on natural search.

Advantages of Organic Search Engine Optimization :

1. No ongoing investment in a PPC campaign is required.
2. Organic search engine optimization delivers results over the long term.
3. Traffic that comes through organic search results is free, so the money saved on PPC campaigns can be put into other projects.

PPC :

PPC is a time-saving process that provides immediate results, which is very useful for a new business. Google and other search engines take some time to crawl a new website, so it can pay to get listed at the top of the sponsored ads on popular search engines like Google, Yahoo, and MSN. It requires good keyword selection and attractive content to convert visitors into business. It is a temporary solution for getting traffic to a website: if you have the money, you can run a PPC campaign; otherwise, you cannot.

We have lots of options for increasing traffic and earning good positions in the search engines: plenty of free article directories and blogs can be used to increase our websites' backlinks. It takes time, but organic SEO, not PPC, is the lasting solution.

Advantages of PPC Campaign :

1. Instant business results.
2. We can stop our PPC campaign at any time.
3. We can use any keyword to list our website.
4. We can send visitors to any landing page.
5. Perfect for time-limited promotions.
6. It works even for poorly designed websites or ones whose rankings are low in the search engines.

Tuesday 15 May 2012

How to Get Rid of Unwanted Backlinks

It used to be that sites linking to yours couldn’t harm you. Then along came Google’s Penguin Update.
Now for many websites (and a lot of business models that involve selling 50,000 links for $10) the sky is falling. Websites that have built an unnatural looking backlink profile using a strategy of aggressive exact match anchor text usage are setting off Google’s spam alarm.

So what now? Well, there’s really no way around a good old-fashioned backlink audit. You can pay somebody to sort through them and identify the worst offenders, or you can do it yourself.
If you decide to go it alone, the first challenge is answering: what are bad links? On the upside, they usually aren't too hard to find.

Now that Yahoo’s Site Explorer has gone the way of Google supplemental results, if you want really good backlink data there’s probably going to be a cost involved in even doing your own link excavation. A few tools offer some free info but require a payment for the really deep data.
If you've been hit by Google Penguin or an unnatural links penalty, you’re gonna want to drill, baby, drill. The more backlinks you can evaluate the better.
A few of my favorite backlink tools:
  • SEOMoz Open Site Explorer
  • ahrefs Site Explorer
  • Majestic Site Explorer
All three of these tools will give you a really detailed, clear look at your backlinks and even your anchor text distribution. The issue is that you may get different data from all three: each has its own index and may report different backlink numbers. So while using only one tool may let a few links fall through the cracks, right now, using more than one of these tools may be your best shot at finding the backlinks you need to get rid of.
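Since each tool reports from its own index, one pragmatic approach is to export a CSV from each and merge them before auditing. A rough sketch follows; the file names and URL column headers are assumptions, so adjust them to whatever your exports actually contain:

```python
# Rough sketch: merge backlink CSV exports from several tools and
# de-duplicate by linking URL. File names and column names below are
# assumptions; adjust them to your actual exports.
import csv

EXPORTS = {
    'ose_export.csv': 'URL',                    # e.g. Open Site Explorer
    'ahrefs_export.csv': 'Referring Page URL',  # e.g. ahrefs
    'majestic_export.csv': 'SourceURL',         # e.g. Majestic
}

seen = set()
merged = []
for path, url_column in EXPORTS.items():
    with open(path, newline='') as f:
        for row in csv.DictReader(f):
            url = row.get(url_column, '').strip()
            if url and url not in seen:
                seen.add(url)
                merged.append(url)

with open('merged_backlinks.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['source_url'])
    writer.writerows([u] for u in merged)

print(len(merged), 'unique linking URLs across all exports')
```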

Taking Out the Backlink Trash

Once you get past the exploratory phase of digging through your backlinks and identifying the ones you want to get rid of, how do you get them taken down?
The answer is simple: It’s not easy.
Now, I welcome anyone to weigh in on how they’ve managed to get backlinks removed, but in my experience, the only surefire way I know of to have backlinks taken down is to ask. Somebody put it up, so somebody needs to take it down, and that means reaching out and making the request.
Virginia Nussey on the Bruce Clay Blog offered some advice on how to send link removal requests:
Create a template email requesting link removal that you’ll send to the webmasters in charge of the links identified as low quality. The template should candidly explain that you are an SEO or site owner trying to recover from a Google penalty and would he or she please remove the following links. List the URLs where the links can be found, the URL on your site they point to, the anchor text ─ all the info needed to easily find the link you’re requesting removed.
She noted there are four possible outcomes. The website either removes the link and tells you, removes the link and doesn’t tell you, doesn’t reply or do anything, or responds by saying they will remove the link if you pay them.

During this process, Nussey suggested keeping detailed records of your efforts to remove links (she suggests compiling a spreadsheet with linking URL, contact name, contact email, date of link removal request, and response/action taken by the linking site), which you could then send to Google as part of a reconsideration request to show you’ve made the effort to remove every bad link aimed at your site.
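If you keep that spreadsheet as a CSV, you can drive the outreach from it directly. A small sketch, with hypothetical column names and purely illustrative wording:

```python
# Sketch: generate removal-request emails from the tracking spreadsheet
# described above. Column names and the template wording are illustrative.
import csv
from string import Template

TEMPLATE = Template(
    'Hi $contact_name,\n\n'
    'I am the owner of $my_site and I am trying to recover from a Google '
    'penalty. Could you please remove the link on $linking_url pointing to '
    '$target_url (anchor text: "$anchor_text")?\n\n'
    'Thank you!\n'
)

with open('removal_tracking.csv', newline='') as f:
    for row in csv.DictReader(f):
        # Expected columns: contact_name, contact_email, linking_url,
        # target_url, anchor_text (plus the date/response columns you keep).
        print('To:', row['contact_email'])
        print(TEMPLATE.substitute(row, my_site='example.com'))
```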

And as though the idea of having to contact a bunch of people wasn’t bad enough, in a lot of cases figuring out who to contact may prove to be your biggest hurdle. But don’t worry, there are more than a few ways to skin this cat.

Go to the Source

Go back to the company or freelancer you hired in the first place. If it wasn’t you and it was your predecessor or a company you contracted with, then hunt down the original agreement. Follow the money backward if you have to.

If you purchased backlink services, whoever sold them to you should be your first contact. If you bought in bulk, there's a decent possibility the process was automated, and hopefully whatever easy method put you on 1,000 websites overnight can also take you off. If they don't have an automated system, then they probably have a network of minions they can send the message to.

The point is, if you didn’t get these backlinks yourself, start by reaching out to the person who did. Best case scenario: they may be able to turn what could be a long, drawn-out burden into a conversation with a single point of contact. If you can make it this simple, count yourself lucky and think very carefully about whose help you enlist for link acquisition in the future.

Look for a Contact on the Site

If you can’t get them taken down as easily as they went up, then your next course of action may be to start trying to reach each site whose links you’d like to ditch. I know, it sounds awful and it probably will be.
The biggest challenge here, aside from the annoyance of having to send all of those link removal request emails, is finding people to receive them. You may have better luck finding contacts at some blogs, especially those independently selling paid reviews or link space.

Although some people make it pretty easy for you to find them on the web, others make it more difficult.
For those who are intent on turning you into a cyber-Nancy Drew, there are a few tricks for hunting people down. However, in a lot of cases there may not be an individual behind the blog or site your link is on, because it's part of a large blog network. In that case, where you can't reach a real person for one site, you're going to have to go to the top. Fortunately, there are a few tools to help you figure out which way is up.
  • Domaintools: If you want to find out who owns the site your link is on, visit DomainTools or type "whois.sc" in front of a URL. Either way will get you to some very useful info. Now, this is another free-ish tool, with the most useful and interesting info hiding behind a paid curtain. But without paying, you can often find names, email addresses, and even companies associated with URLs, provided they aren't hidden behind a private registration. Even if you can't get those details, you might find the name of the company that owns the domain, as well as how many other domains it owns and is associated with.
  • C-Class Checker: If you have a list of all the links you want to get rid of, you can run them through a bulk C-class checker to see how many of them are on the same C-class (see the sketch below). It matters because multiple sites hosted on the same C-class may cut down the number of individual sites you have to contact: there's a good chance that websites on the same C-class are connected through one main entity. That means you only have to contact one person, or company, to try to get the links removed from that entire group of sites.
  • SpyonWeb: If you only have one URL to work with, this tool lets you find out what other domains it is associated with. Just put in a website URL, an IP address, or even a Google Analytics or AdSense code, and you can find all of the websites connected to it. Not only may you find a major website with a person you can contact, you can also see whether any of your other links come from the same network. If one site is really bad, chances are you can afford to lose all the links from the network it is in.
  • Social Media: Whether you’re trying to reach the company that bought your links, a major site in a network or a website that is linking to you, if they have social media profiles, you can use them. I’d strongly advise reaching out to people quietly, and politely as your initial approach. However, if after a few friendly emails you still can’t get results, go all social on them. Outcries on Facebook can get calls returned, and dissatisfied tweets can result in refunds.
Social attacks should be a last resort. In a lot of cases, the kinds of sites where you find bad links won't have social profiles available. If they didn't take the time to write content with complete sentences, they probably didn't bother to make a Facebook page. So you'd most likely have to look for the company behind the network to even find someone to complain to.
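As a sketch of the C-class idea from the list above: resolve each linking domain to its IP address and bucket the domains by the first three octets. The domain list is a placeholder:

```python
# Sketch of the C-class check described above: resolve each linking domain
# and group by the first three octets of its IP. Domains are placeholders.
import socket
from collections import defaultdict

domains = ['spammy-blog-one.example', 'spammy-blog-two.example']

groups = defaultdict(list)
for domain in domains:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        continue  # the domain no longer resolves
    c_class = '.'.join(ip.split('.')[:3])  # e.g. '192.0.2'
    groups[c_class].append(domain)

for c_class, members in groups.items():
    if len(members) > 1:
        # Same C-class often means one host or network behind the sites:
        # one contact may be able to remove links from all of them.
        print(c_class, '->', members)
```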

Getting your links taken down may be as easy as placing the order for them was, or it might be a miserable battle that eats up months of your life. If you do manage to lose the links that have caused you problems, consider it a major win either way.

Monday 14 May 2012

How The New Google “Penguin” Algorithm Update Affects Your Business

Thinking of penguins tends to conjure up images of cute, waddling birds. But now, at least in the SEO world, they'll lose part of their innocent image, with the new Google algorithm update aimed at webspam being referred to as the "Penguin Update". It is expected to impact about 3% of search queries. If you're engaging in black hat techniques, be warned yet again (remember the "Panda Update", anyone?): Google is coming after you, continuing its relentless pursuit of offering only high-quality, relevant results for its users. Here are the details you need to know to ensure your website stays on Google's good side.

Over-optimized websites
Matt Cutts, Head of Webspam at Google, had alluded to this update when he described “over-optimized” websites being punished. This received criticism from the SEO world as it blurred the lines between white hat SEO and webspam. Fortunately he clarified this by explaining, “The idea is basically to try and level the playing ground a little bit, so all those people who have sort of been doing, for lack of a better word, ‘over-optimization’ or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little more level.”
If you’ve been in a frenzy over thoughts of your website being punished by Google either by manual changes or automated internet marketing software, you can calm down. Cutts has confirmed the over-optimization warning was aimed towards webspam, not SEO in general.

The Penguin Update
In his latest blog post appropriately titled “Another step to reward high-quality sites”, Cutts explains:
“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines.
The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.
In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.
We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics”.

What are the implications for SEO?
The Penguin Update specifically targets keyword stuffing, linking schemes and cloaking:
  • Keyword stuffing places repetitive targeted keywords in low visibility areas of a website in hopes of being associated with the term by a search engine.
  • Linking schemes use organized rings of link spammers that spread unrelated links throughout the internet.
  • Cloaking is the most advanced of these methods, whereby a webmaster shows the search engines a fake version of their website specifically designed to game the algorithm (a simplified sketch follows this list).
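For clarity on that last point, here is a deliberately simplified sketch of what user-agent cloaking looks like, shown only to illustrate the tactic Penguin punishes, not as something to deploy:

```python
# Deliberately simplified illustration of cloaking, the tactic Penguin
# targets: serving crawlers a different page than human visitors see.
from flask import Flask, request

app = Flask(__name__)

@app.route('/')
def home():
    user_agent = request.headers.get('User-Agent', '')
    if 'Googlebot' in user_agent:
        # The fake, keyword-stuffed version shown only to the crawler.
        return 'cheap widgets best widgets buy widgets ...'
    # The real page shown to human visitors.
    return '<h1>Welcome to our widget shop</h1>'
```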
Most of these black hat techniques have been around for a while, but Google have now improved their measures for targeting and punishing websites that use them. If you're engaging in any of these black hat techniques, heed Google's warning.
Cutts specifically mentions, “In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.”

What are the quality guidelines?
What are the guidelines you need to ensure you’re complying with? Here they are below:
1. Avoid hidden text or hidden links.
2. Don’t use cloaking or sneaky redirects.
3. Don’t send automated queries to Google.
4. Don’t load pages with irrelevant keywords.
5. Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
6. Don’t create pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware.
7. Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
8. If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.
While most of these are straightforward, a few of these guidelines suffer from Google’s well-known ‘let’s be as vague as possible so webmasters don’t game our algorithm’ syndrome. What constitutes “substantially duplicate content”, for example? You can head over to Google’s help center for more information on the Google search quality guidelines.

Is SEO dead?
Cutts also mentions, “We [Google] want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites”.
Google have unleashed a fury of updates aimed at promoting quality content, so this Penguin Update isn’t at all surprising. What sparked my interest is that it raises the question of how effective SEO will be in the future. We now have confirmation from Cutts, Head of Webspam, that the updates are also aimed at de-emphasizing the importance of SEO for websites wanting to rank highly in the search results. While they are in favour of white hat SEO, Google would like it to matter less, so those creating great content (but who aren’t savvy with SEO tactics) stand a chance of ranking well on quality content alone.
I was adamant in my previous article “How To SEO Is Not Important Because Content Is King, Says Google Employee” that I didn’t believe the world could survive without white hat SEO. SEO plays an incredibly important part in driving users to your website by ensuring it is a “good match” with relevant keywords your target audience searches for.

I just don’t believe we could ever get to a stage where SEO is becomes unnecessary. By continuously mentioning their goal for an SEO-less future, it de-emphasizes SEO in the eyes of naïve businesses. Yet, we’re nowhere near this euphoric world and there’s no evidence to suggest otherwise. Surely it’s counterintuitive for businesses that write great content to disregard SEO? What are the chances un-optimized content from an un-optimized website ranks highly within Google’s search results in the near future?

Saturday 12 May 2012

Rewarding High Quality Sites: Google’s Penguin / Webspam Update Has Significant SEO Implications

The recent update to Google’s algorithm, referred to initially as the Webspam update before Google officially dubbed it Penguin, has caused a good amount of debate within the SEO community. Although the number of sites affected by it was relatively small, the Penguin update continues to be the topic of much discussion because of the information Google released along with it and its implications for the future of search engine optimization.

Some were concerned that this update spelled the end for SEO; however, their fears appear to be unfounded.  In his blog post introducing the Penguin update, Matt Cutts, head of Google’s Webspam Team, notes:
 
Google has said before that search engine optimization, or SEO, can be positive and constructive—and we’re not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

The Penguin update is another step by Google to reward well-built sites that have been optimized using “white hat techniques” while punishing “webspam.” The way these terms are defined in Cutts’ post, and Google’s desire to reward sites that offer a good user experience, offer useful insights into conducting SEO in the post-Penguin landscape.

Cutts defines white hat search engine optimizers as those who “often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.”

Black hat webspam, on the other hand, makes use of techniques that offer no benefit to users and exploit loopholes or shortcuts to rank pages higher than they ought to be. Examples of webspam techniques include keyword stuffing and link schemes.

In short, Google wants white hat optimizers to be able to focus on designing and maintaining quality sites without having to worry that sites optimized with black hat techniques might rank higher despite offering a poorer user experience. “We also want the ‘good guys’ making great sites for users, not just algorithms, to see their effort rewarded,” says Cutts.

For those who pay attention to the stance Google has been taking on SEO and webspam recently, the Penguin update should have come as no surprise. For SEO firms that adhere to tested white hat techniques, it was in many ways a validation of those methods. Keyword stuffing and link building schemes have been obvious things to avoid for some time now, although some have still tried to use them to game rankings. Doubtless, there will be those who continue to do so, but the Penguin update should make such efforts much less effective, while rewarding white hat strategies like improving site usability, creating great content, and optimizing page load times.

It’s also worth noting that Google rolled out a Panda 3.5 refresh on 4/19/12 during the Webspam / Penguin update.  For more on the Panda 3.5 refresh and the biggest winners and losers in the rankings, check out Danny Sullivan’s  recent post on the topic.

If you have any questions about the Penguin or Panda 3.5 updates or would like to consult with us regarding its implications for your SEO initiatives, please feel free to contact us.  At Collaboration 133, we utilize proven and effective white hat SEO techniques to help our clients maximize their rankings without violating Google’s webspam policies.

Friday 11 May 2012

When Will The Next Google Penguin Update Take Place?

Google Algorithm Updates
Google has been quite active lately, introducing new algorithms and other minor updates. These updates dictate how well your website performs and how much traffic you get from the search engine. Google doesn't want webmasters finding a comfort zone for themselves, because if they do, they will trick Google and affect the quality of the results returned on the SERPs (Search Engine Result Pages). To prevent this, Google introduces lots and lots of updates, both major and minor, to counter webspam. So it might be kinda hard to keep up with them. But blogging is all about flowing along with the currents, or with the trends, if you will. So it is very important for webmasters to understand the nature of these updates.
In this post, we will talk about what these updates are all about, and how frequently Google brings updates. We will also talk about some of the flaws in the latest updates, so you know what kind of stuff you should be on the lookout for.

A brief update history

Google Update history
Google was launched in 1998. In internet terms, that's pretty long ago, considering the World Wide Web was invented just eight or nine years before Google was born. I can't imagine how internet users survived before Google! We sure as hell can't live without it now :P. Anyway, at that time Google was just a simple keyword-based search engine, returning the pages that contained the greatest number of the user's query keywords.
The very first update from Google was the Google Toolbar for browsers, and the Toolbar PageRank (TBPR) in 2000. This was when Google started ranking pages intelligently, and the concept of PageRank came into being.
In early 2003, Google launched its first official named update, code-named 'Boston'. At the time, Google re-indexed the web monthly, the so-called Google Dance.
2003 was a pretty busy year for Google. After Boston came more updates, including a new algo called the Florida update. Much like the recent Panda update, it left many websites devastated. The Fritz update ended the monthly Google Dance, and pages began to be indexed on a daily basis.
The Austin update was the major algo of 2004, and it continued Google's crackdown on low-quality, black hat SEO websites. Apart from that, Google held its IPO, and its market share grew.
By the start of 2005, Google had emerged as a major search engine, and its share price doubled. It continued its fight against spam with many updates, such as Bourbon, Allegra, Jagger, and Big Daddy. The "nofollow" attribute was introduced around this time as well, and Google started XML Sitemap submission. Google continued working on these updates in 2006 with the Supplemental updates. This went on in 2007 as well, although Google also launched Video, News, and other vertical searches.
2008 wasn't very eventful either, with no major algorithm update from Google. Google did, however, go on to introduce the Caffeine update, a major algorithm and infrastructure overhaul. This update was perfected in 2010, and a new update was introduced that May, infamously known as the 'May Day' update. It impacted a lot of webmasters around the globe.
In 2011, as most of you will remember, Google unleashed the Panda update, a major algorithmic update that impacted lots of websites. Google also introduced the +1 button around that time. Google has continued to update Panda in 2012, and more than a dozen Panda updates have been rolled out to this day. The most recent update from Google was the Penguin update, again a major algorithm update.

How frequently does Google bring updates?

Well, judging from Google's update history, Google now ships a major algorithm update each year, but it introduces hundreds of smaller, minor updates. Take Penguin, for example: it has only been here for about two weeks, but in that time Google has introduced more than 50 minor changes and updates. The total count for last year was around 500-600. So, yeah. Google is busy as a bee these days.
Google also keeps updating its previous algorithms: Panda 3.6 was released just after Penguin rolled out, and indexing is now done by the minute. Google also updates the PageRank of websites every 3-4 months; the latest PR update was at the start of May 2012, so we can expect the next one around August 2012. So fasten your seat belt and start working on recovering from the Panda or Penguin updates.

What are these updates all about?

As I said earlier, Google introduces these updates to fight webspam. The idea is to bump quality content up in the search results, and push down the ruthless black-hat SEO websites that use spam and unethical SEO tactics as a shortcut to the best rankings. So priority is given to quality, original content, while duplicate and spam content is pushed down. Of course, search engine optimizers know how to duck and dodge these updates, which is why Google is constantly updating to stop them from finding their comfort zone.
There may be another implication of all this ruthless updating from Google: the feeling that Google is trying to monopolize the market. Of course, it already has a monopoly of sorts, but with only around a two-thirds (64%) share of the industry, it wants more. By introducing all these updates, it pushes webmasters to follow its own rules; in short, it is bringing the game to its home turf. Other search engines face a dilemma here: they can either copy the idea, making Google look like the big guy who is also their mentor, or ignore the changes and be swept along by the tide that is Google. Of course, Google takes a gamble with such big updates, because they don't make webmasters happy. But as the biggest player in the industry, it can afford the risk; smaller engines can only look on in awe.

Some problems with Google's algorithms

Now, it also seems that Google isn't entirely aware of what's happening with these updates. There are some loopholes that some webmasters know exactly how to exploit, and Google isn't aware of that; otherwise, it would never allow such things to happen. Let me give you an example.
Type www.something.com into your browser address bar. You'll find just one word on the entire site: "something". Yeah, that's right! But there's more: check the PR of that website and you will find it's at PR 5! This blog you are on right now is at PR 4. See the problem? Previously, that website came out on top of the SERPs when you searched for the word "something". Thankfully, that's been fixed by the Panda and Penguin updates, but the PR remains the same.
Google has also been returning irrelevant search results since the Penguin update. Search for "Panda Recovery" on Google, and it will show you Pandora Recovery instead. Weird, huh? There are a lot more anomalies of this kind, and WPMU has written a detailed article on this.
What's happening here? Well, I'm not really sure, but it looks like there are some flaws in Google's algorithms, which the veteran black-hat SEOs are striving to exploit. Hopefully this will be fixed in the next algo update from Google, because it gives black hatters an unfair advantage over people like us who write original, quality content.

Thursday 10 May 2012

Link Building Tips to Drive Traffic to Your Website

You may have an amazing website, but not many people will see it if other sites aren't linking to it.

Relevant inbound links from authoritative, trusted and/or quality websites are every search marketer's dream. (An inbound link, also called a backlink, is a link from an external site that points to content on your site.) Google, which owns about 66 percent of the search engine market according to comScore, sees such links as votes of confidence for your content. Because Google wants to serve users the most relevant, freshest, trustworthy results, inbound links from trusted sites to yours can go a long way toward pushing your content up in search result rankings.

Of course, obtaining those inbound links takes considerable time, effort and resources. There are also a lot of myths and misunderstandings related to link building. For example, some believe Google will penalize you for getting too many links too quickly (not necessarily) or that reciprocal links are a surefire way to boost your rankings (it depends).

To help your site develop a quality inbound link profile, we've collected 25 top link-building strategies and tips from three experts.

Set Your Link-Building Foundation

1. Put someone in charge.

Because link building is time-consuming and resource intensive, someone needs to be responsible for driving the effort, Fasser says. "You need someone focused on actively managing the program, promoting the right content and always looking for new opportunities."

2. Set up a process for monitoring and measuring progress.

From the beginning, have a method in place--usually accomplished via SaaS tools--to monitor and measure your link-building efforts on a regular basis. "If you don't have that process set up, when someone asks how effective your link-building campaign is, you won't have a good answer," Fasser says. "And if you don't have a good answer, you're not likely to get the time and resources you need to continue the link building."

3. Don't outsource your entire link-building campaign.

"You can't outsource 100 percent of your link building or website promotion to a third-party and expect to get the same results you'd get if you had someone doing it in-house. You need someone in-house who really knows your industry," Ward says, since that will give link campaign strategies both context and focus.

Every site, Ward adds, "was designed with a specific and potentially unique audience in mind, specific objectives for that audience and specific subject matter. Doesn't it make sense that every site is going to require a specific approach to link building and content publicity? You can't cookie-cutter the process."

4. Begin by examining the links on your own site.


Unlike most inbound links, the links on your own site are entirely within your control. Take a close look at how you're linking to your own content. Are you using keyword-rich anchor text to point to relevant content elsewhere on the site? (Anchor text is a hyperlinked phrase, such as "click here," that links to content that typically exists on another web page.) If your anchor text is not keyword-rich, revise it, Fasser says. This can help the content being linked to get a boost in search engine relevancy.
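A rough sketch of auditing your own pages for generic anchor text, using requests and BeautifulSoup; the page list and the set of "generic" phrases are assumptions to tune for your own site:

```python
# Rough sketch: flag internal links whose anchor text is generic
# ("click here") rather than keyword-rich. The page list and the set of
# generic phrases are assumptions; tune both for your own site.
import requests
from bs4 import BeautifulSoup

PAGES = ['http://www.example.com/']  # placeholder pages to audit
GENERIC = {'click here', 'read more', 'here', 'more', 'this page'}

for page in PAGES:
    soup = BeautifulSoup(requests.get(page).text, 'html.parser')
    for link in soup.find_all('a', href=True):
        anchor = link.get_text(strip=True).lower()
        if anchor in GENERIC:
            print(page, ': rewrite anchor', repr(anchor), '->', link['href'])
```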

5. Create a baseline of existing inbound links.

Use a tool such as SEOMoz's Open Site Explorer to see which sites are currently linking to yours as well as the anchor text used in those inbound links, Fasser advises. This provides a snapshot of your complete inbound link profile, which is useful for tracking progress.

Open Site Explorer can help you improve your link-building strategy by providing a quick look at your inbound links--and the ones your competitors have.

Open Site Explorer data can be exported in CSV format. The basic tool is free. Additional features are included in subscription plans that start at $99 monthly.

6. Study your competitors' links.

You can also use tools such as Open Site Explorer to investigate the links your competitors have, Fasser says. This can provide ideas for directories and other sites to pursue.

7. Go after links your competitors don't have.

It's not enough to simply find out which links your competitors are getting and go after them. At best, that will simply put you on an equal footing with them. You should also pursue inbound links your competitors don't have, Ward says.

8. Focus on link quality, not quantity.

Relevant links from a few high-quality, trusted, authoritative sites are worth more in SEO terms than a ton of links from low-quality sites, Mastaler says.

9. Develop a list of top-priority keywords and use them in your online content.

Determine which keywords have the most search volume, are the least competitive, and have the highest relevancy to your business and its products or services, Fasser advises. Use those keywords in your blog posts, white papers, press releases, and other online content. "When you get links from other sites to your content, you'll be more likely to get good-quality anchor text links using your important keywords."

10. Begin with the low-hanging fruit.

Ask for links from industry connections. Suppliers, donors, employees, retired employees, industry associations, forums, fraternal organizations, and anyone else with whom you're affiliated can be a great place to start your link-building strategy, Mastaler notes. Any individual or entity with which you have "a point of commonality" can serve as low-hanging fruit in your link-building efforts, she adds. Ask them to link to your resources page, blog, or another page on your site, or for a listing in their directory.

11. Focus on directories relevant to your industry.

General Web directories are fairly useless in helping your site rise in search result rankings or attract targeted traffic, Ward says. A far better strategy, he adds, is to go after vertically oriented, curated directories maintained by people with "extreme knowledge or passion" who take their time to "collect useful resources."

The best Web directories are those maintained by people who are doing it out of passion, not for SEO. "Google loves and respects these sites because there's a layer of human quality control involved," Ward explains. "The more heavily edited or curated the content is, the more likely it is that Google will respect an anchor text link from that site."

12. Go after a diverse set of links.

The best link-building practice is to obtain inbound links to pages across your site, not just your home page, from a variety of domains using different anchor text keywords, Fasser advises. Just as it's important not to invest in a single stock, the same holds true for your link portfolio: ideally, you want links from many sources. A diverse set of links and anchor text keywords also gives you more credibility with search engines.
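One way to see how diverse your anchor text actually is: tally the anchors in a backlink export. A sketch, with a hypothetical file name and column header to match whatever your tool exports:

```python
# Sketch: measure anchor text diversity from a backlink CSV export.
# The file name and column name are hypothetical; match your tool's export.
import csv
from collections import Counter

anchors = Counter()
with open('backlinks_export.csv', newline='') as f:
    for row in csv.DictReader(f):
        anchors[row.get('Anchor Text', '').strip().lower()] += 1

total = sum(anchors.values()) or 1
for anchor, count in anchors.most_common(10):
    # If one exact-match keyword dominates, the profile looks unnatural.
    print(repr(anchor), count, '({:.0%})'.format(count / total))
```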

13. Focus on relevant links.

An inbound link from a site that's relevant to your business is worth more for ranking purposes, as well as for attracting targeted traffic, than a link from your cousin Billy's site about his favorite beer. "Getting a blog or other site that writes about things related to your product is the way to go," Fasser says.

14. Develop high-quality content.

Google's Panda update of 2011 pushed pages it considered to have poorly written and/or spammy content way down in its rankings. As a consequence, Web sites need to focus on creating high-quality content that's informative, useful and relevant, Fasser says. Not only will high-quality content keep you out of Google's crosshairs, it will help you attract inbound links and targeted traffic.

15. Create infographics and make them easy to share.

Infographics are extremely popular and can increase site traffic, Mastaler says. Other sites often link to them, and they can get lots of Tweets and Facebook likes.

For example, BlueGlass Interactive developed a content marketing infographic that Mashable subsequently hosted. As a result, the infographic has attracted more than 3,800 Tweets, 650 Google +1s and 1,100 Facebook likes.

The key to getting your infographics posted and shared is to make them visually compelling, informative, and neutral in tone; that is, not about your company. It's OK to put your brand on an infographic aimed at consumers, Mastaler adds, as long as you understand that businesses will be less likely to share it.

16. Create custom widgets.

Customize a widget that delivers information relevant to your business, make the widget easy to post on other sites (via cut and paste), and embed a link back to your site, Mastaler suggests. She recommends Widgetbox, an online service that lets you use existing widgets or create custom ones for $25 monthly and up.

For a monthly fee that starts at $25, WidgetBox will help you build a custom widget that you can easily post on other sites.

Together, infographics and widgets are "a great use of your time" in delivering ROI to your link-building strategies, Fasser adds.

17. Write product reviews.


Well-written reviews of products related to your industry or niche are ideal "linkbait" to post on your site, says Mastaler. Include images (and credit the source) with your reviews to drive engagement. To help each review get noticed, post a link to it and a description on LinkedIn, Quora and Twitter. Create a Pinterest board with photos of the products you've reviewed; each pin (or photo) will include a link back to your site. Video and podcast reviews are another way to attract links and traffic.

18. Develop social media press releases.

A social media press release typically includes one or more photos, social sharing links and video clips. As such, it's more likely to get picked up by other sites, Mastaler says. Services such as BusinessWire and PRWeb will host your release and distribute it to news services and media outlets across the Web. Be sure to include your top keywords and one or more anchor text links back to your site within the release.

You can use services such as BusinessWire to host press releases. Include keywords and at least one anchor text link back to your site for even better visibility.

19. Don't forget online forums.

Online forums are "a tremendous resource," Mastaler says, since that's where you'll find people who are passionate and are often active bloggers. If you can connect with them in a meaningful or helpful way without overdoing a sales pitch, forum members may reward you with a link.

Other Helpful Link-building Strategies

20. Be sure you really need a link before you pursue it.

Before you request an inbound link, ask yourself if you really have a good chance of getting it, Fasser advises. "Link building eats up a lot of time and resources, so make sure you've taken the time to understand the site and its content and if it's truly relevant for what you do."

21. Reciprocal links aren't necessarily a bad--or good--strategy.

"Many people mistakenly make a blanket statement that a particular link-building tactic is good or bad" in terms of SEO effectiveness, Ward says. "The reality is, its just not that simple."

His advice: "Always ask yourself if you would pursue a link (reciprocal or not) if there were no such thing as Google. Instead, do it because swapping links with another site will be beneficial in some way to your site's visitors." As one example, it makes perfect sense for a local veterinarian to exchange a link with a dog grooming service in the area.

22. Big, sudden changes in your inbound links may--or may not--get you into trouble.

Some worry that if their site suddenly attracts a ton of inbound links, Google will suspect black hat or unorthodox link-building activity is occurring and penalize that site in the rankings, Ward says.

The truth is, he says, it depends on the site, its history, the links and the circumstances. If a company is suddenly in the news, its site is likely to gain thousands of inbound links in a few days, with no penalty from Google. Conversely, if about 8 percent of your inbound links had keywords in them and, suddenly, 30 percent of your links are keyword-rich, Google might be suspicious.

"I hate to compare Google to an IRS auditor, but, in some ways, it's true. Google is auditing your site, looking for things outside the norm," Ward says. That's why it's best to grow links naturally by developing and publicizing great content, instead of hiring someone to plant thousands of identical anchor text links to your site on low-quality websites within only a few days.

23. Make content easy to share over social media.

Whenever you post new content on your site, such as a white paper or video, Fasser says, be sure it's easy to share across social media. Social media updates containing links are great for building traffic and awareness. You should also share the new content with a tweet or social media update that includes a relevant keyword and a shortened link, such as one from bit.ly, to the content.

24. Your site's ideal link builders will do the job for free.

"The person who is your best link builder is the one who visits your site, likes it, and wants to share it with others," Ward says. That's why it's important to ask yourself what can someone do with your content once they see it, he adds. "It's a mistake not to give people a way to share your content with a Google +1, Facebook Like, or Twitter button. Make it easy for them."

25. Don't put all your eggs in the Google basket.

Too many people put too much emphasis on getting traffic from search engines, Ward says. "The more of your traffic that's coming from Google, the more precarious your position is. Your rankings are fluid and subject to every Google algorithm update," he says. "I've had clients call me and say that, all of a sudden, they're no longer ranking well and it's costing them hundreds of thousands of dollars a month."

Instead, your goal should be to get traffic from a variety of sites, of which Google is simply one. Though achieving this takes time, Ward acknowledges, it gives you a solid, stable foundation that will serve you well in the long run.

Wednesday 9 May 2012

How Google Handles Font Replacement

How does Google view font replacement (i.e. Cufón, sIFR, FLIR)? Are some methods better than others? Are they all good, or all bad?


“So we have mentioned some specific stuff like SIFR that we’re OK with. But again, think about this,” says Cutts. “You want to basically show the same content to users that you do to Googlebot. And so, as much as possible, you want to show the same actual content. So we’ve said that having fonts using methods like SIFR is OK, but ideally, you might concentrate on some of the newer stuff that has been happening in that space.”

“So if you search for web fonts, I think Google, for example, has a web font directory of over 100 different web fonts,” Cutts says. “So now we’re starting to get the point where, if you use one of these types of commonly available fonts, you don’t even have to do font replacement using the traditional techniques. It’s actual letters that are selectable and copy and pastable in your browser. So it’s not the case that we tend to see a lot of deception and a lot of abuse.”

“If you were to have a logo here and then underneath the logo have text that’s hidden that says buy cheap Viagra, debt consolidation, mortgages online, that sort of stuff, then that could be viewed as deceptive,” he adds.

Tuesday 8 May 2012

Does Over-Optimization Put Your Site on a Flag List?

Does over-optimization put your site on a flag list?

According to Matt Cutts, head of Google's search spam team, Google is working on a tweak to the algorithm that will punish sites that are too optimized for SEO.

The basic idea is to level the playing field a little bit. So all of those people who have sort of been doing over-optimization, or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little bit more level.

After news of the pending over-optimization algorithm update hit the web the other week, a few of my clients wanted to know how it was going to affect their websites. Were their sites at risk of an over-optimization penalty? I read a few more recaps of the panel discussion to get a better understanding of what exactly "over optimization" means to Google, and I told them that as long as a site has been practicing white hat SEO all along (which theirs definitely have since my company started managing them), it is in no danger of being flagged for over-optimization. Matt Cutts even admitted, "If you're white hat or doing very little SEO, you're not going to be affected by this change."

This update is designed to help smaller, mom-and-pop websites that produce great content and are user-friendly perform better in the search engines, even if they don't have the SEO budget to compete with the big brands. Personally, I think that's great. There are plenty of smaller websites out there that do everything by the book when it comes to SEO, but they are blocked out of the SERPs because bigger brands can just dump money into their SEO campaigns. Giving these quality websites the opportunity to really make an impact in the search engines is long overdue.

Monday 7 May 2012

Google Penguin Update: Don’t Forget About Duplicate Content

There has been a ton of speculation regarding Google's Penguin update. Few know exactly what the update does, or how it interacts with Google's other signals; Google always plays its hand close to its chest.

"While we can't divulge specific signals because we don't want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics," Google's Matt Cutts said in the announcement of the update.

He also said, "The change will decrease rankings for sites that we believe are violating Google's existing quality guidelines."

"We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings," he said. To me, that indicates that this is about all webspam techniques: not just keyword stuffing and link schemes, but everything in between.

So it's about quality guidelines. Cutts was pretty clear about that, and that's why we've been discussing some of the various things Google mentions specifically in those guidelines. So far, we've talked about:
  • Cloaking
  • Links
  • Hidden text and links
  • Keyword stuffing

Another thing on the quality guidelines list is: "Don't create multiple pages, subdomains, or domains with substantially duplicate content."

Of course, like the rest of the guidelines, this is nothing new, but in light of the Penguin update it seems worth examining the guidelines again, if for no other reason than to provide reminders or to educate those who are unfamiliar. Duplicate content seems like one of those issues that could get sites into trouble even when they aren't intentionally trying to spam Google. Even Google says in its help center article on the topic, "Mostly, this is not deceptive in origin."

"However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic," Google says. "Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results."
Google lists the following as steps you can take to address any duplicate content issues you may have:
  • Use 301s: If you've restructured your site, use 301 redirects ("RedirectPermanent") in your .htaccess file to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console. A minimal example appears after this list.)
  • Be consistent: Try to keep your internal linking consistent. For example, don’t link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.
  • Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We’re more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
  • Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
  • Use Webmaster Tools to tell us how you prefer your site to be indexed: You can tell Google your preferred domain (for example, http://www.example.com or http://example.com).
  • Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. In addition, you can use the Parameter Handling tool to specify how you would like Google to treat URL parameters.
  • Avoid publishing stubs: Users don’t like seeing “empty” pages, so avoid placeholders where possible. For example, don’t publish pages for which you don’t yet have real content. If you do create placeholder pages, use the noindex meta tag to block these pages from being indexed.
  • Understand your content management system: Make sure you’re familiar with how content is displayed on your web site. Blogs, forums, and related systems often show the same content in multiple formats. For example, a blog entry may appear on the home page of a blog, in an archive page, and in a page of other entries with the same label.
  • Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
Don’t block Google from duplicate content. Google advises against this, because it won’t be able to detect when URLs point to the same content, and will have to treat them as separate pages. Use the canonical link element (rel=”canonical”).

Saturday 5 May 2012

Latest Google Algorithm: Penguin Algo Update

PENGUIN UPDATE! Latest Google algorithm update as of April 24, 2012

There have been widespread reports about the latest algorithm change, called the "webspam update," which is now said to be called the PENGUIN UPDATE.

I have found that many say this algo is a total failure: the webspam update was released to fight webspam and reduce rankings for spam sites, but in the process many legitimate sites' rankings have gone down or vanished from the top 100 lists.

Other updates that took place in April:
April 19 - PANDA UPDATE 3.5
Domain classifier update - a mistaken update, which Google later clarified
Webspam update / Penguin Update - April 24, 2012

Google's spam update has done more damage to legitimate sites than to spam sites. I have also noticed that ecommerce sites were hit less by the update than service-sector sites.

Google Penguin Algo Update Questions

  1. Is off-page SEO more important than on-page?
  2. Will backlinks from social bookmarking sites like folkd.com be devalued? Many social sites have lost traffic during this update.
  3. Is blog commenting considered spam? Is blog commenting on legitimate sites also spam?
  4. Are forum signature links completely dead?
  5. If a website doesn't do SEO, will it still rank? (Because, according to Matt Cutts, a site can do fine even without doing SEO.)

Recovery Tips for the Google Penguin Update

I have collected some tips here.

1. Avoid hidden text or hidden links.
2. Don’t use cloaking or sneaky redirects.
3. Don’t send automated queries to Google.
4. Don’t load pages with irrelevant keywords.
5. Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
6. Don’t create pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware.
7. Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
8. If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

Thursday 3 May 2012

According to the NY Times, Google will be reviewed by the FTC

http://www.nytimes.com/2012/04/27/technology/google-antitrust-inquiry-advances.html?_r=3&pagewanted=1&hp 

For those who suffered dearly from this update - business wrecked, site missing from the listings, loss of revenue - you should know that you can report this to the FTC.

Here goes

https://www.ftccomplaintassistant.gov/

The complaints will carry weight if they're filed in bulk.

LET'S REPORT THIS TO THE FTC!

PASS THIS MESSAGE AROUND!!

QUOTE FROM FTC WEBSITE:

The Federal Trade Commission, the nation's consumer protection agency, collects complaints about companies, business practices, identity theft, and episodes of violence in the media.

Why: Your complaints can help us detect patterns of wrong-doing, and lead to investigations and prosecutions. The FTC enters all complaints it receives into Consumer Sentinel, a secure online database that is used by thousands of civil and criminal law enforcement authorities worldwide. The FTC does not resolve individual consumer complaints. 

Please note the last line: "The FTC does not resolve individual consumer complaints." Like I mentioned before, the complaints will carry weight if they're filed in bulk.

Wednesday 2 May 2012

A Possible Penguin Solution

There's so much talk about this Penguin update, and a lot of competing theories, but most things have been clear from the start.

1. Google provided examples showing that contextual links from unrelated pages are exactly what they are fighting right now. That means this kind of unrelated contextual link can pass negative juice. It probably goes further: unrelated sidebar/footer links can also pass negative juice.

A lot of the spam in the SERPs happens because many old sites used such tricks for a long time, interlinking all of their sites and doing link exchanges (often not in the same niche).

Google is now penalizing sites with such unrelated links, but it looks like so few sites pass this 'quality test' that there is a lot of spam in the SERPs, and mostly broad results now (Wikipedia, YouTube, and Amazon).

My theory is that a page category classifier system has now been launched for inbound/outbound links: if too many links to or from your site are not in the same category (or in an LSI-related or parent category, e.g. cancer -> health), the site gets hit. Also avoid sites with mixed content, which are hard to categorize.
Just check AdWords' ad group ideas; it shows that they have some kind of classifier now.

2. According to Matt Cutts' speech, they are looking to give a bonus to fresh sites that don't have a lot of links but do have great content. So right now they are analyzing user factors (time on site, bounce rate) for many more sites.

The SERPs are still updating, which is why they keep changing these days. It looks like the big movement will continue for a long time (people will try fixes, and new 'Zebra' updates will probably come soon). User factors are going to be one of the most important ranking factors in the future.

I really don't understand why people aren't looking at the things big G showed them before the update, instead of spinning strange theories on all the forums. G isn't using any randomization, just these factors for now. Also, backlink profiles and on-page SEO won't tell you the full story, because this update is about link categorization.

General sites can link to niche sites, but the opposite only in limited amounts. Many unrelated links pointing to sites in other niches raise a red flag.

Really, it's a good thing, because a lot of the competition has vanished, but it's hard to tell what they will do next (the trouble is the small number of sites passing the quality mark).

So, the ways to a Penguin solution:

1. Remove all links from your pages to unrelated URLs (ones not in your niche). When a health site links to a mortgage site (in context or not), it looks bad to Penguin.

This absolutely needs to be done now to recover your positions. Also keep Google's other new penalization factors in mind, such as ads at the top of the screen and pages that use nothing but keywords in the title/description/keywords/h1/bold; mix keywords with LSI terms and some additional text. I have found that many pages rank not for their optimized keyword but for something else. So try to use LSI terms rather than the exact keyword in your backlinks, and use the full page URL as the anchor at least 50% of the time.
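If you want to audit your own pages for unrelated outbound links before pruning, here's a minimal sketch (Python, assuming the requests and beautifulsoup4 packages are installed; the niche-term list is a hypothetical vocabulary you'd maintain for your own niche):

```python
# Minimal outbound-link audit sketch (assumes: pip install requests beautifulsoup4).
# NICHE_TERMS is a hypothetical vocabulary you would maintain for your own niche.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

NICHE_TERMS = ["health", "fitness", "diet"]  # example niche vocabulary
MY_DOMAIN = "www.example.com"                # your own host, to skip internal links

def audit_outbound_links(page_url):
    """List external links on a page and flag ones with no niche term in
    either the target URL or the anchor text (candidates for removal)."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if not host or host == MY_DOMAIN:
            continue  # skip relative/internal links
        text = (a["href"] + " " + a.get_text()).lower()
        related = any(term in text for term in NICHE_TERMS)
        print(("OK      " if related else "REVIEW  ") + a["href"])

audit_outbound_links("http://www.example.com/some-page")
```

Anything flagged REVIEW is only a candidate; a human still decides whether the link genuinely serves visitors.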

2. Inbound links - the hardest part. But I think removing outbound links will be enough for now; a few of my sites have shown a stable rise after the removal of unrelated outbound links.

Tuesday 1 May 2012

PENGUIN UPDATE

After the Penguin update my site dropped below the .com site, two Wikipedia pages, and the .co.uk site, and that is where I was placed.

So I did some link building (.edu and .gov), and so far it's working. I also visited the number one site and noticed that it has about 10 same-site links on the first page and a very good link structure, with two sidebars full of related same-site links. This site uses the keyword in the domain and in the first page's <title>, like my site. The first page has about 150 to 200 words, no more, and the main keyword is not mentioned anywhere on it, but every other word on the site is related to the niche, as if the site was created for the reader, not the search engine.
My site has the keyword in the domain name and in the title, description, H1, and bold text. The link structure is not amazing; it's related, but I could have done a better job.
But I did go for a good design, so my site was not hit too badly by the PENGUIN UPDATE, because it is not a bad site; still, I can see very clearly that if it wasn't for my backlinks, I would have been anyway.
I will tweak my site based on that number one site, and I'm sure I'll be back on the spot again.


I HOPE MY POST HELPS SOME OF YOU GUYS WITH PENGUIN RESEARCH AND UNDERSTANDING. This is my experience so far.


My site and Penguin update

I was on the 1st page of Google for my main keyword and also for many keywords related to my niche.
After the update I dropped, and that is where I am now for my main keyword.
I also had hundreds of keywords related to my niche on the first page of Google, with millions of competing results. I got my site to #1 mostly using ScrapeBox, with only a bit of on-page SEO.

SO BELOW, I'll explain what's happening NOW.

When I started this site a few years back, the top site had the keyword in the domain with the .com extension, then there was Wikipedia, then a site with the keyword in the domain with the .co.uk extension. So I got myself a domain with the keyword and the .org extension and used WordPress to create the site.
I used some SEO plugins like All in One SEO etc., and did some on-page SEO without overdoing it (no more than 100 words), so it didn't take me long to get ranked. Then I started doing some manual link building and eventually got to position 2, passing even Wikipedia but not the site with the keyword + .com in the domain name.
Then I got ScrapeBox, the best software I ever got. This got me to the top position. I just used crappy, over-spammed lists that I could get hold of here and there; I'm too lazy to scrape myself :) I stayed in the top position for over 2 years and made xxxxx of $.
I survived the PANDA UPDATE with no change to my rank in Google.

The Google Panda Fiasco And The “Moronic” Google Penguin Update

The other day Google released yet another Google Panda update. Panda 3.3, 3.4, or maybe it was even Panda 3.5? Soon after, they released what they're calling the "webspam algorithm update," which they later named the "Google Penguin Update."

The specific names don’t really matter; what matters is how they affect you. (Quick Note: this post isn’t about panda/penguin specifics, it’s about Google, algorithm changes and the future of online marketing – a little more generic)

According to Google, the last public update was Panda 3.4, which they officially launched on March 23. But since then there's been a dramatic change in the algorithm, a change that was supposedly designed to fight spam and improve the quality of search results. (The damn Penguin.)

Unfortunately, the Penguin update has been the last nail in the coffin for thousands of people around the globe. A month or so ago I wrote a post about a previous Panda update that was said to affect just 1% of search queries.

At the time hundreds of thousands of webmasters around the world proved that statistic to be very wrong. The results were catastrophic.  And now… Matt Cutts has said this update will “only affect 3% of search queries”. To that I say what a load of BS.

If you log on to any online marketing related forum, or even the Google Webmaster forum itself, you'll find thousands upon thousands of webmasters complaining about this update. It was supposed to target sites that were keyword stuffing and using other dodgy black hat tactics. (It has and it hasn't.)

When really, all it's done is bring terrible results to the top of Google and push great websites to the bottom. OK, so it has done some good: many great sites have risen. But countless others have lost their rankings, and in my opinion collateral damage is not OK.

Google has pushed all the BRANDS to the top. Brand sites are now dominating, more so than ever before. And the big thing that's got so many marketers outraged is this: niche sites have been killed.

Really, there are only three types of sites (minus article directories and web 2.0s):
  • Brands
  • Niche Sites
  • Authority Sites
As you probably already know, a huge percentage of Internet Marketers make their money from niche sites. Niche sites are generally sites that are built to make money. So it seems Google is out to destroy our incomes and improve search results too?

All jokes aside, niche sites are typically sites built around a single topic, and most of the time they are between 10 and 300 pages.

The problem is that a lot of niche sites are full of terrible content. Usually they are 5-10 page sites built for the sole purpose of ranking for 1-10 keywords and making a small profit. Internet marketers have been building these sites for years because they are guaranteed earners.

Over the past few years, however, marketers have gotten lazy and built terrible websites full of outsourced content on topics they know nothing about. Most of those sites have been hit, which is fair enough. But what about the excellent niche sites? What about the sites with 10-300 pages of superb content that genuinely helps people?

Yep, those sites have also been crushed, which makes me furious. The funny thing is, black hat SEOs have been badly penalized, which is fair enough to a degree. I believe that webmasters should be able to use whatever SEO they like to blast their sites to the top, just as long as they provide value and deserve to rank there.

It’s not just niche sites that have been hit, individual blogs and small businesses have also been destroyed.
The problem is it’s so damn random. Countless white hat SEO users have been hit. People who’ve followed Google’s rules to the T; they’ve been spanked. I was actually talking to a woman the other day who spent 4 years of her life building a health and beauty blog.

The blog had over 600 pages of excellent content. After this update she practically lost every single ranking, and her traffic came to a halt literally overnight. Not cool at all, Google.

So niche sites have been penalized badly and thrown to the bottom of the pile; they are ranking in third place regardless of their individual quality or link profiles. Next, authority sites. Authority sites are kind of difficult to define, as they are essentially big niche websites.

Authority sites are enormous sites that specialize in one broad topic: sites that cover everything on that topic and generally have excellent content. There's a fine line between authority sites and brand sites, and as of this last update, authority sites have been placed below brand sites.

I used to have a couple of authority sites in obscure niches; they each had 300+ blog posts and were incredible resources for their visitors. I used white hat SEO on them in terms of link building. Obviously, white hat link building doesn't technically exist, but it's what I call white hat link building.

I built manual backlinks using the 1 article = 1 backlink principle. I only did guest blogging, article marketing, web 2.0s, and social bookmarking plus social signals. While those sites haven't been sandboxed or thrown to the bottom, they've lost a crapload of traffic, all because of Google's stupid algorithm changes, especially the damn Penguin.

My point is, even though those sites followed the rules to a T, they've lost a load of traffic and rankings, which to me is completely unfair. So niche sites have been killed off big time, and so have authority sites. The majority of medium to large authority sites have lost a substantial percentage of their traffic, if not all of it.

The problem is this: it never seems like Google targets certain websites. Small sites, big sites, sites using black hat SEO, and sites using white hat SEO all get hit. Many awful sites seem to just slip through the cracks and even thrive while others get demolished.

How To Index Pages In Under 26 Seconds

Okay people, here is my new tutorial after so long. And as usual with me, no long crap with fillers: straight methods that work for me, anytime and every time.

I see so many people asking how to index pages. They try everything, and yet they fail. So for them, here's a little help.

Use these methods, and like me, you'll also be able to index your pages faster than you can imagine.

NOTE: If you expect to be able to index your Xrumer/SB links with these methods, I would request that you close this thread ASAP.

1. Install Wordpress

This is a must! Whether you have a blog (dahh!), an eCommerce site, a membership site, a personal site, a portfolio site, an online gallery, etc., use WP. Believe me, you won't regret it!

2. The Pinging Services

Now, after you publish a new page/post, make sure you ping the page to these URLs. The following is my personal list, which I use on each of my sites and for my clients as well.
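(The original list of ping URLs didn't survive the repost, so I won't reinvent it.) Under the hood, these services all speak the same standard weblogUpdates.ping XML-RPC call that WordPress itself sends. Here's a minimal sketch in Python, with Ping-O-Matic's well-known endpoint as a stand-in for whatever list you use:

```python
# Minimal weblogUpdates.ping call -- the same standard XML-RPC ping WordPress sends.
import xmlrpc.client

PING_ENDPOINTS = ["http://rpc.pingomatic.com/"]  # stand-in; substitute your own list

def ping_services(site_name, site_url):
    for endpoint in PING_ENDPOINTS:
        server = xmlrpc.client.ServerProxy(endpoint)
        # Spec: weblogUpdates.ping(weblogName, weblogUrl) -> {'flerror': bool, 'message': str}
        result = server.weblogUpdates.ping(site_name, site_url)
        print(endpoint, "->", result)

ping_services("My Blog", "http://www.example.com/")
```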

3. The 5-Minute Quick Indexing

Simply enter the page URL, title, and RSS URL, and hit Go. Do this for both of the above sites. (Obviously, you have to select the services first.)

Ping your URL using

Add your page

That's it. Simple, and to the point.

Doing all of the above takes less than 5 minutes, and I have personally indexed my new WP posts in about 26 seconds... FLAT!