
View Full Version : Competition Watch



Administrator
06-20-2012, 11:15 AM
Since Google's Penguin update came out, the Web has been flooded with tons of Penguin survival strategies, magic Penguin-killing bullets, etc. What one often hears these days is "diversify your anchor texts," "remove crappy links," "create unique content," and so on and so forth.

However, none of these strategies seems capable of doing the simple job of bringing one's site back to where it was before Penguin.

As Google has said, Penguin is about punishing those optimizers who use "illegal" SEO techniques such as sneaky redirects, keyword stuffing, link spam, copying somebody else's content (duplicate content), and the like.


However, the questions people ask nowadays are:

* If I used 15 keywords per page, would that be keyword stuffing?
* Which of my links are in violation?
* I'm an affiliate. Would using the merchant's description on a partner site create duplicate content?
* etc.

Is there a satisfying answer to these questions? There is, and this answer is "competition watch," the point of which is to scrutinize the top 10 sites ranking for your target keywords in the post-Penguin SERPs.

How Many Keywords are Too Many?

Let's say your main keyword is "snake leather shoes." Now, how many times can you safely repeat this keyword on your site without being considered a keyword-stuffer? It's not hard to figure out!

See who ranks in the top 10 for that word and carefully analyze the following aspects of their site:

* The number of keywords in page titles;
* The number of keywords in site copy (in general);
* The number of keywords in anchor texts.

The ratio of keyword occurrences to the overall amount of text on a page is often referred to as "keyword density." Analyzing the competition helps you understand what keyword density Google apparently rewards with high rankings.
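To make the comparison concrete, here is a minimal sketch of how keyword density can be measured. The function, the sample copy, and the phrase "snake leather shoes" are illustrative assumptions, not part of any particular SEO tool; real tools may tokenize and count differently.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share of words in `text` that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count non-overlapping occurrences of the phrase as a word sequence.
    n, m = len(words), len(phrase_words)
    hits = 0
    i = 0
    while i <= n - m:
        if words[i:i + m] == phrase_words:
            hits += 1
            i += m
        else:
            i += 1
    return hits * m / n

# Hypothetical page copy, for illustration only.
copy = ("Snake leather shoes are back in style. Our snake leather shoes "
        "are handmade from ethically sourced snake leather.")
print(f"{keyword_density(copy, 'snake leather shoes') * 100:.1f}%")  # → 33.3%
```

Running this against the copy of each top-10 competitor gives you a range of densities that currently rank well, which is a far safer target than any fixed "N keywords per page" rule.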

Shaping a Winning Duplicate Content Strategy

Let's say you have an e-commerce store - the type of website that is prone to duplicate content. There are different views on how to deal with internal dupe content. Some people recommend using 301 redirects, some folks tell you to employ canonical tags, while others suggest blocking it with a robots.txt file.

So, what's the best practice in your particular case? To find out, check who ranks in the top 10 for your keyword and examine how they use robots.txt rules, canonical tags, and 301 redirects on their site. In that respect, on-page optimization software can help.
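The first two of those checks can be done offline once you have fetched a competitor's robots.txt and HTML. The sketch below uses only the Python standard library; the sample robots.txt rules and the example.com URL are made-up illustrations. (Detecting 301s would additionally require issuing an HTTP request and inspecting the status code, which is omitted here.)

```python
from html.parser import HTMLParser
from urllib.robotparser import RobotFileParser

def blocked_paths(robots_txt: str, paths, agent: str = "*"):
    """Return the subset of `paths` that `robots_txt` forbids `agent` to fetch."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [p for p in paths if not rp.can_fetch(agent, p)]

class CanonicalFinder(HTMLParser):
    """Collect href values of <link rel="canonical"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

# Hypothetical competitor robots.txt, for illustration only.
robots = "User-agent: *\nDisallow: /cart/\nDisallow: /search\n"
print(blocked_paths(robots, ["/shoes/", "/cart/item1", "/search?q=snake"]))

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" href="https://example.com/shoes/"></head>')
print(finder.canonicals)
```

Feeding each competitor's pages through checks like these quickly reveals whether they lean on robots.txt exclusions or on canonical tags to handle their duplicates.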

Then, if you see that some pages of a competing site are restricted from indexing, check whether there are duplicates of those pages on the site. If there are, employ a similar strategy to avoid duplicates on YOUR site.
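To check whether two pages really are duplicates of each other, a simple similarity measure is enough. The sketch below compares pages by the Jaccard similarity of their 3-word shingles; the sample page texts and the choice of shingle size are assumptions for illustration, not a standard from any SEO tool.

```python
import re

def shingles(text: str, k: int = 3):
    """Set of k-word shingles from normalized text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of the two pages' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical product-page snippets, for illustration only.
page_a = "Genuine snake leather shoes, handmade in Italy. Free shipping worldwide."
page_b = "Genuine snake leather shoes, handmade in Italy. Free returns within 30 days."
print(f"similarity: {similarity(page_a, page_b):.2f}")  # → similarity: 0.50
```

Pages scoring near 1.0 are the ones a competitor is most likely to have redirected, canonicalized, or excluded from indexing, so they are the ones worth examining first.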