Sep 14

I was talking to a webmaster friend yesterday and he asked me if the All in One SEO plugin really does any good. He had been using the All in One plugin and couldn't see any significant results from it. I asked him what he had done with it, to which he replied that he had installed it and was waiting for the results. Aha... I asked the same question to a few more webmasters, and 3 out of 7 told me they weren't really sure how to use the plugin. They had installed it and left it as it was, expecting results.

So that's exactly why this post comes to you today. It is for those who have been using the All in One SEO plugin (but don't fully understand it) and for those who are planning to use it.

What is the All in One SEO plugin for?

The All in One SEO plugin does not take care of everything in SEO. It takes care of three things on your blog.

1 - Your title tags
2 - Your meta tags (keywords and description)
3 - The noindex properties of your pages

Your title tag is the line of text that appears in the browser window while your page loads.

On a normal blog, titles are navigation helpers and appear something like Yourblogname >> Blog Archive >> Post title, looking almost the same (only the last part changes) on all the posts.

Results?

All the pages will carry more or less the same title. And you might already know that title tags are one of the most important on-page SEO factors.

All in One SEO plugin helps you have unique Page Titles for each post, completely configurable so that you can “make them” SEO friendly.

Essentially, you can use the title tags uniquely on post pages, homepages, category pages, tag pages, archives etc.
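As a sketch, here is roughly how those title settings might look; the %…% placeholders are the plugin's template macros, though the exact macro names and defaults can vary between plugin versions, so check your own settings page:

```
Home Title:        Your Blog Name - primary keywords here
Post Title Format: %post_title% | %blog_title%
Page Title Format: %page_title% | %blog_title%
```

Tweak the post format first; that is where most of your search traffic lands.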

The real power of All in One SEO plugin

The real power lies not in the right settings but in how you use it on individual posts to leverage your keywords.

The All in One SEO plugin will help you show a completely different page title (what appears in the browser window) from the original post title. On normal blogs, both are the same. Using the All in One SEO plugin, you can change the page title after publishing your post and make it more palatable for the search engines.

I agree that SEO friendly page titles are not always human friendly. For the same reason, many times we are confused about which format to follow. Now, All in One SEO helps you out of the situation. When you publish a post, create a user friendly page title, maybe the same as the post title. But once it has been read by everyone and pushed to the archives, it might be a good idea to change the page title to a much more SEO friendly format.

So a user reads the user friendly version (spotted in the post title on the page) and search engines see a keyword rich SEO version (seen in SERPs and the browser window). Just make sure you don't overdo it.

The same formula should work with meta descriptions too. I'm unsure whether they directly affect search rankings, but hey, we have the option, so use it anyway.

Some more power uses for All in One SEO

All in One SEO comes with an option to selectively use "noindex" meta tags on posts, pages or categories.

With this option, you can choose whether you want to block search engines from indexing category pages, single post pages, pages and tag pages. It simply adds a noindex meta tag when the option is checked in the options menu.

This is an excellent tool for avoiding duplicate content issues on your blog. You could opt for noindex-ing your category, archive and tag pages and leave the post pages alone, so that there is no duplication. Check out the screenshot to see what I mean.


Gone are the days when SEO tools were all about page rank checks and keyword density checks. Today we have SEO tools to analyze backlinks, Wikipedia links, Digg submissions, Delicious bookmarks, neighborhood IP checks and all those external SEO metrics.
Here's a list of twelve such free tools that you can use to analyze the stats of your site.

  1. SEO Digger [ Keyword Research, Rank Checker ] - Checks your site for the top keywords/keyword combinations that your site is already ranking high for. Shows the corresponding positions for those keywords on the search engines.
  2. Quark Base [ Complete site metrics ] - Gives all the information you would ever need on a website. Right from the hosted ip to page rank, wikipedia links and more. Compare with your competition to see where you lack.
  3. SEO Meter [ Google crawl rate checker ] - Checks for the frequency by which Google crawls your website.
  4. SiteYogi [ Site health check ] - Checks overall site stats on various engines and social media. Checks for all basic data like indexed pages on various search engines. Data is not accurate sometimes.
  5. Exact Factor [ Rank checks ] - Checks one or more domains for ranks and Google/Yahoo site index stats. Broken sometimes.
  6. Internal Pages PR checker [ Page Rank Checker ] - Checks all your internal pages for their Page Rank. Good to determine your internal link structure.
  7. Multiple keywords rank Check [ Site stats check ] - Checks ranks for multiple keywords.
  8. URL Metrix [ Site stats checker ] - Checks all the site stats like pages indexed on Google, backlinks and everything you could think of on SEO basics.
  9. URL Trends [ SEO Site stats checker ] - Free SEO reports with details on social media inlinks, user demographics, keyword analysis, incoming links, traffic stats etc.
  10. Website Grader [ SEO Site stats checker ] - Checks and analyses all the site information right from inbound links to keyword stats.
  11. Site Rake [ Log files analyzer ] - Upload your site log files and this tool will help you analyse all your stats.
  12. Xinu Returns [ Site stats checker ] - Checks all your site stats including link backs, digg submissions, wikipedia links etc.
Sep 08
Filed Under (Search Engine Optimization) by Mani Karthik on 08-09-2008

Ann Smarty wrote an interesting article the other day on SEJ, discussing the metrics to measure your SEO success rate. It's indeed a very bold article... a question that even the so-called "professional" SEOs try to dodge.

Ann points out that there are three types of metrics you can track to measure the success rate of your campaigns: keyword rankings, search traffic and conversions.

All three metrics above are great. In fact, SEOs have been following the same metrics to measure success rates for a long time, and I can't think of any other metric to add. I would add, though, that the success rate of your SEO campaigns depends completely on your client type and business requirements. None of these metrics can stand alone and give you significant results. Most of the time, it's a balanced equation involving all three elements that makes the right formula. Based on the business type and website model, one of them may weigh more while the others weigh less.

Let me give you an example that involves all the three elements equally.

When I work with websites that are heavily targeted at niche audiences, like "bus ticket booking.com", they have a business model that relies heavily on gaining top ranks. Their audience is clearly the people searching for a particular term or terms on Google, and the site desperately needs the top rank for them.

Now, why would they target those set of keywords? Traffic.

Now, keyword research suggests that there are 11k people searching on Fridays for "bus ticket booking", so the website has got to rank for that keyword. Being at the top gives it 11k visitors every Friday. So far so good.

Now, assuming the website sells bus tickets, it should also find out how many of those 11k people coming through the search engines actually buy tickets. Finding out the conversion ratio helps check whether we are getting quality traffic or not.

So it's a cycle here.

1. Find out the keywords that generate maximum traffic and pick the ones that best suit your site. (Numbers aren't everything. Are they?)

2. Optimize for those keywords to get the traffic. Enjoy it.

3. Create “target” pages and check how much of your traffic actually converts into a sale/download/whatever.

4. Analyze which keywords pass quality, converting traffic and which generate non-converting traffic.

5. Back to Step 1.
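The loop above, especially step 4, can be sketched in a few lines of code. The numbers and the 2% threshold below are hypothetical, purely for illustration; in practice they would come from your analytics and sales reports.

```python
# Hypothetical keyword stats: keyword -> (search visits, sales).

def conversion_rate(sales, visits):
    """Fraction of search visitors who actually convert."""
    return sales / visits if visits else 0.0

def classify_keywords(stats, threshold=0.02):
    """Split keywords into converting vs non-converting buckets (step 4)."""
    converting, non_converting = [], []
    for keyword, (visits, sales) in stats.items():
        if conversion_rate(sales, visits) >= threshold:
            converting.append(keyword)
        else:
            non_converting.append(keyword)
    return converting, non_converting

stats = {
    "bus ticket booking": (11000, 330),  # 3% of visitors buy tickets
    "bus timetable": (5000, 25),         # only 0.5% convert
}
good, bad = classify_keywords(stats)
```

Keywords that land in the non-converting bucket go back to step 1 for re-evaluation.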

Essentially, SEO can be a Swiss Army knife for your website that delivers results across the board, but specialized effort helps you zero in on your targets. And finding out what your target is and what it's not is the real skill of a webmaster.

I might be talking vaguely here, but I hope the point is clear, and that's all I was trying to say.

Sep 04

But, honestly, it's fair to have one (or a few). There are so many changes and algorithm updates happening that it's not possible for everyone to stay updated and tuned in on SEO. Many of us bloggers carry certain SEO misconceptions simply because we haven't kept ourselves updated and aligned with Google.

Here are some SEO misconceptions that I found common to more than one person while engaging in conversations with them. Feel free to share your thoughts on them. I hope you will clear them up, grow yourself up to SEO 2.0 and break out of the old-school SEO kid image.

  1. Meta tags and descriptions affect the SERPs
    Meta tags are somehow the most loved onsite "optimization" technique for bloggers; it's the one place many use for stuffing in as many keywords as possible. As a matter of fact, meta tags (keywords and description) no longer help you impact your search engine rankings. If they are relevant to the content of the website, that's fine, but it doesn't help to add every possible keyword option into them.
    I'd say the meta description is a good tool to increase your "clickability" from the SERPs, so you might want to write a compelling and attractive copy there, but that's about it.
  2. Keyword match on domain means a top rank
    While selecting domains it's in fact a good idea to have a perfect keyword match, but that doesn't mean you have to go to the extent of selecting a domain like searchengineoptimizationservicesonlinefreeindia.com.
    It's ugly, and it doesn't help you; it may rather hurt you. Just go through any Google SERP and see if you can find a first-page listing for any domain longer than about 20 characters. You can't, can you? So don't overdo this bit of selecting domains for a perfect keyword match.
    Had keyword matching been the only metric for top ranks, flickr.com wouldn't rank for "photo sharing" - would it?
  3. 6% Keyword density means better Optimized pages
    One of the most popular formulas for SEO copywriting is a 6% keyword density. Sadly, this isn't very strong these days and has been reduced to one of the thousands of metrics available to Google for deciding what the content on your website is all about. I'm not saying you should completely forget about it. You can certainly follow a 6-7% keyword density if it makes sense to you and reads naturally, but don't chase it mathematically, and not if it makes for bad, unreadable copy.
    Of course, Google has many other methods to find out what the content on your website is all about. So let's say it works on a case by case basis.
  4. Don’t link to others or you’ll lose your page rank
    We've discussed this many times already. Linking to other websites will not bring down your page rank; it only passes link juice to them without reducing the original page rank. Your page rank is completely dependent on who links to you, not whom you link to.
  5. Flash means bad SEO
    Initially, flash was a big monster SEOs had trouble with, and we always recommended people either avoid flash or go for HTML alternatives. But things have changed, and fortunately search engines can now spider links inside embedded flash files, and even crawl the text in them. You may have to guide them to it if they don't do it automatically, but clearly, flash is not an SEO spoiler anymore. We've learned to live with it.
  6. You have to wait weeks together to get indexed on Google
    Getting indexed on Google is not hard any more. Gone are the days when you had to submit your site to Google and wait for weeks to see it appear in the SERPs. Now, all you need is 48 hours to get indexed on Google.
  7. Directory submissions will boost your ranks
    How I wish directory submissions worked. No, no way! It's an old-school concept and has no impact whatsoever on the Google SERPs. Yes, you can show off the number of incoming links, but invalid, non-authoritative links from directories are no good compared to relevant, contextual links, fewer in number but from sources Google trusts.
  8. Link building campaigns are cool
    Link building is good, really good. But I somehow can't agree with the "campaign" part of it. It's not an exercise you ought to do, but something that should support the growth of your site. As a matter of fact, natural link building, "convincing" people to link to us using methods such as social media optimization and link baiting, is the best way to do it. Short-term link building and sudden backlinks can raise red flags at Google, so refrain from such "artificial" link campaigns; rather, focus more on content development that leads to natural links.
  9. Two-month quick fixes are all that you need
    One of the things that makes you lose your entire credibility with the search engines is trying to do something too quickly and hastily. Even with all the metrics in place, it's definitely not possible to gain consistent ranks on Google within a short span of time, so none of those quick fixes are going to work. Take your time, set your targets and work towards them in a healthy way.
  10. Traffic does not help in SEO
    Traffic is a great "catalyst" in gaining a good reputation with Google. It may not directly help you in the SERPs, but it sure helps you get into the good books of Google if you can consistently produce good traffic. Having good traffic just means that you have some potentially good content, doesn't it? And Google may well know it. So gaining traffic is great and an easy way to get some traction.

Essentially, it means that there is no ready-made SEO formula that, if applied, will give you instant results. All the metrics have to be weighed and calculated to design the right strategy for each website.
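The keyword-density idea from misconception 3 is easy to check for yourself. Here is a minimal sketch for a single-word keyword (multi-word phrases would need a slightly smarter matcher); the sample copy is invented for the example:

```python
def keyword_density(text, keyword):
    """Share of words in the copy that are the (single-word) keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# 3 of the 10 words are "seo": a 30% density, far beyond the old
# 6% rule of thumb, and a sign of stuffed, unreadable copy.
copy = "seo tips and seo tools support your daily seo work"
density = keyword_density(copy, "seo")
```

If the number this returns makes your copy read badly, that matters far more than hitting any particular percentage.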

Sep 01
Filed Under (News) by Mani Karthik on 01-09-2008

DSB is going along pretty consistently, with focused traffic and healthy growth, showing every sign of a loyal readership. Revenue is increasing consistently. I wish I had more than 24 hours to pick up more projects; as of now I'm booked 18 hours a day.

Google analytics stats

Alexa ranks may not be the best way to represent your traffic, but that's the only method I've been using to compare site traffic with others, and this time around it's looking interesting, with DSB in the same league as SEOmoz and Matt Cutts (now, I don't want to comment on that).


There are of course a bunch of basic SEO elements that you should check your site for occasionally, like W3C validation, redirect errors, sitemap generation errors etc. These crop up unknowingly and are identified only when a problem occurs. For example: if you are using the All in One SEO plugin, there is an option in the settings to specify the XML file template. Now, if you uploaded the SEO plugin to a directory after renaming it, chances are that the template fails to load and your sitemap will be broken. Whereas if you upload the plugin directory without renaming it, there won't be errors. This happened to me some time back with another site of mine.

So, basic errors like this one go unnoticed and require regular checks. A few things that you must check regularly are:

1 - The sitemap of course
Check for updates, the frequency settings, unwanted URLs etc. Rebuild the sitemap if updates have not been picked up, and check the sitemap access from Google Webmaster Tools; if the template is broken, check the folder name over FTP and point to the right URL.

2 - Broken links
Check for broken links in Webmaster Tools. If you find them, download the entire table and correct them by either setting up a 301 redirect or pulling them down from the site.

3 - Titles
You know that titles are crucial, so check them for duplications and wrong code. Sometimes a clash between the theme code and plugins can even produce double entries. You might want to do a bit of code tweaking to correct it.

4 - Meta headers
Of course, meta headers like description and keywords are no longer going to help decide your SERP ranks, but the meta description tag is very important as it decides the "clickability" on the SERPs. Assuming your site appears on the first or second page of the SERPs, if people are to click it, the meta description (the snippet) should be interesting and compelling. So make sure you don't end up using the same meta description for all posts; instead make it exclusive to each article.

5 - Typos in URLs
Well, I can't say for sure whether these are errors or not, because some bloggers say they get good traffic from typos. But for the guys who want to have things in place, make sure you check your site listings on the SERPs with a "site:" search and look for typos in URLs. If there are typos, make sure you put a redirect from the old URL to the new one before the new one is updated in the Google index.

6 - Dual H1 headers etc
Sometimes, due to your theme options, two or more H1 tags can occur on a single page, which is not good. It is always advisable to use the standard format of H1, H2, H3, H4... in decreasing order of text priority. H1 should generally occur only once on a page; duplication of the rest is fine.

7 - Robots.txt exclusions
Even though we might carefully use the robots file to exclude files and folders, over time there can be files and articles we placed wrongly, like filing an article into a category that was once blocked by the robots file. It is advisable to check the robots.txt file once in a while for indexing issues.
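As a hypothetical example, a leftover rule like this one silently keeps every article later filed under that category out of the index (the path is invented for illustration):

```
# robots.txt - hypothetical leftover rule
User-agent: *
Disallow: /category/old-stuff/
```

If a live article ever lands under /category/old-stuff/, the search engines will quietly skip it, which is exactly the kind of issue a periodic robots.txt review catches.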

8 - Inline style v/s Stylesheet clash
We all have issues with stylesheets (at least I do), and sometimes we use inline styling to get things fixed. Excessive use of inline styling leads to bloated code, and this can result in bigger file sizes and cross-browser incompatibilities. It is advisable not to use inline styling much; if there are particular styles you've been reusing, try to incorporate them into the stylesheet.

9 - Redirect Issues
This is not very common, but it happens to guys who have moved around between hosts a lot and deal with Apache mod_rewrite redirects. Although 301 redirects are fine, using generalized redirect rules can cause unwanted redirects on articles that came later, like a 301 redirect issued for all URLs in one category that also redirects articles published after the redirect was set up.
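A hypothetical .htaccess rule shows the trap. The paths and domain below are invented: the catch-all was written for the old posts in /news/, but it also redirects every /news/ article published after the move.

```
# .htaccess - hypothetical over-broad rule: catches ALL /news/ URLs,
# including articles that did not exist when the rule was written.
RedirectMatch 301 ^/news/(.*)$ http://example.com/archive/$1
```

Scoping the pattern to the specific old URLs (or to a dated prefix) avoids hijacking future articles.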

You can either do these checks manually or use the numerous site validation and SEO check tools available online. I use this tool for basic checks, and of course the Google Webmasters console occasionally.


I was speaking to a young blogger who asked me, "Aren't all Blogspot blogs owned by Google? So why doesn't Google promote them up the SERPs?"

Well, I'd be out of a job in that case, and I'm glad it's not so. The answer is that Google wants accurately indexed data that's completely based on relevancy. It wants to show the most relevant sites for a keyword, be it on the Google or Yahoo network. So it's accuracy and relevance that take the front seat (and all the rest of the seats too); nothing else matters, to be honest. And Google has an algorithm that's constantly updating to flush out irrelevant results while filling in the most precise and accurate information.

So, in the flurry of blogs, how can one make sure that a website meets the right relevancy criteria and gets the right visibility on the search engines, including Google?

Clearly, the answer has to break out of H1 tags and keyword densities into more complex SEO and SM strategies.

So imagine you have a very young blog or a pretty old blog with great content. Somehow, you’ve not been able to get the right kind of visibility and movement.

You are in the Google index but not on the first page.
You have great content but don't get a lot of comments on it.

Let me suggest a simple ten-step strategy that will catapult your site towards the right traffic and the right search engine love. I'll make sure I don't use any jargon; if something incomprehensible creeps in, feel free to comment with your doubts.

Step 1 - Get a clean, error free, SE optimised backend
This is the foundation of your entire empire, so take a lot of time analyzing and tweaking it. Whatever you do today is an investment in the future, and if you don't do it right today, you never will. Get a completely search engine friendly and optimised theme/design/platform. Once you set it right, forget about it and don't waste any more time on it. Do it once and for all.

Step 2 - Find out how strong your competition is
Next, find out who you are sharing your domain with. I don't want to call this competition because, who knows, at some point they simply won't be. Find out who the big players already in your domain are; if they are too big, find a niche, or else study your competition thoroughly. Find out what they are doing, how good and how bad. Many times, when you are clueless about where to start, just take advantage of your competition and draw inspiration from them. Do what they are doing, but in a better way; don't ape them. The goal is to cover whatever they've done in a better and fresher way. Don't rush it; plan accordingly. For example, say there is a big blog in your domain with 360 articles to its credit. Plan on writing 4 posts per day, and in three months you'll be ahead of it.

Step 3 - Find out where and how big your market is
Also, find out who else you are competing with, and how big the audience is for all of them collectively. If there are five great blogs in the domain you want to start in, remember that the subscribers of all five blogs are the audience you want. It doesn't mean you have five guys to fight; the audience across the blogs is largely the same, and you just need to beat one of them and you're up there. Finding out how big your market is really matters, as you don't want to spend your time and energy on something that isn't there at all.

Step 4 - Understand your market and audience
Next up, find out what your audience's behavior is. What type of content do they like, and what do they hate? What would they buy and what would they deny? Carefully analyzing your competition will answer many of those doubts. Pick up clues from it and mark them down as your core values. You will leverage only those kinds of topics and products. This ensures zero wastage of resources and maximum returns on your efforts.

Step 5 - Find out where the demand is, and make it your core competency
Now that you know whom you are catering to and whom you are competing with, find out where the demand lies. This is all about finding new opportunities. Your competition might not be doing this, but while you do your homework, you can easily find out what's missing in the market. Once you find this out, plan ahead on how, and with what tools, you are going to deliver to it precisely. This should give you a clear idea of what you need to do when you start and what your core competencies should be.

Step 6 - Discover what you can deliver - Stretch
Now that you know what your product is and what your core competency is, find ways to deliver it. Your competition churns out 5 articles daily? Then you've got to do 10 posts a day, and with all the targeted data we found in the steps above. This initial stretch work will give you a great push that puts you in top gear.

Step 7 - Raise your quality bar, Increase your time spent
Now, while you do the stretch work, it is quite possible that you lose focus or face lethargy. For this, raise your quality scores. Try to do exceptionally well; churn out exclusive stories that no one has done before, and people will instantly recognize you.

Step 8 - Indulge in networking, utilize it
You cannot win alone. All the big guys are already up there and have a huge fan following. If you have to do something similar, with less time in hand, the only thing you can do is contribute to others, as many as possible, and try to network more. Write it down somewhere in your mind that you are not here to benefit from others but to help them win. As you try to help others more, your network will grow without your noticing. The only way to succeed in social networks is by contributing more to them. It is directly proportional.

Step 9 - Listen to feedback and tweak yourself
Now, while you are going full throttle, spare some time for feedback. Feedback helps you check whether you are on the right path or not. Listen to criticism and suggestions, share them with others and filter out the weeds. If a particular element in your template code is creating problems for users, tweak it. Do this constantly over a period of time, and at some point all that's left to deal with is the weeds you filtered out earlier. You know what to do with those.

Step 10 - Remain consistent at whatever you are today
That brings us full circle. Now go back to step 1 and check whether you are still in the same domain and closing in on the competition. If you notice any remarkable changes, like in the SERPs, that means you are doing things right. The only thing you have to keep doing is be consistent with your model. Keep researching niches and keep delivering articles that are better in quality than your competition's.

Well, the points above are strategies that will help you catapult your product/service/blog to the major leagues in a shorter time if you are starting fresh. I haven't mentioned the technical side here because it differs on a case by case basis and cannot be generalized. What strategy you pick depends largely on what domain you are in and what competition you are facing, so that's up to you.

If you are looking for an example of how this 10-step model works, here's a client of mine called Great Wraps. They are one of the first sandwich franchise chains in America. Based in Atlanta, Great Wraps had a pretty cool website (beware of the flash stuff, it'll start making funny sounds), but they somehow couldn't make it to the top of the SERPs, which even some latecomers managed. I have worked on this site for a while now; we are only about a quarter of the way through the total effort and we are making good progress. For keywords we weren't ranking for at all, we are now on the first or second page. That's a good start, according to me. Apart from the keywords, it's the quality of content and the clarity in what you want that will clear your path to success. With a good vision and planned strategies like the 10-step one above, you can hit the sweet spots wasting zero energy and time. So start with the low-hanging fruit today.

Aug 21

What is a ‘Nofollow’ link attribute? Is it the same as ‘Nofollow’ meta tag?
Many of us often get confused by this question. Most of the people I ask believe that there is only one nofollow tag, the one that's got to do with links. I'm inclined to believe that there are two usages of the nofollow tag.

One - The Meta Nofollow “tag” and..
Two - The Nofollow link attribute..

Technically, nofollow performs a basic function: it instructs the search engines how to value a particular link, by either following it or not. Despite the directive, search engines behave differently in how they interpret nofollow.

The Nofollow Link attribute
A nofollow link attribute is used on selected links by adding the rel="nofollow" attribute to them.
As far as Google is concerned, it does follow the link (technically) but does not index the linked page's content or pass any value to it (link juice or page rank). In effect, the linked page is irrelevant to Google.
Format example: <a href="http://www.dailyseoblog.com" rel="nofollow">Anchor Text</a>

The Nofollow meta tag
This meta tag is added to a page, instructing the search engines to stay away from the content and/or the links on that page.
Format example: <meta name="robots" content="noindex,nofollow">
There are two factors specified here: one, the content, and two, the links.
The noindex attribute means none of the content on the page will be indexed by the search engines (it may be read but not saved or remembered).
The nofollow attribute specifies that all the links on the page must be ignored and given no value.
(If the page is to be indexed and the links ignored, the attribute can be content="index,nofollow")

When and How to use the Nofollow link attribute on blogs - The Good practices
If you are on WordPress, by default all user-generated links (in the comment area) are nofollowed unless you modify the code or use a nofollow remover plugin.

  1. Use nofollow on sponsored links or advertisers
    Sponsors or advertisements on the site may not be relevant to your site's content all the time, so it's better to nofollow them and let the advertiser know why you're doing it.
  2. Use nofollow on comments
    Comments can throw up any number of completely unrelated links, which is a major concern for your authenticity. While removing nofollow on comments will encourage commenting, it can largely affect the quality of content on your site.
  3. Do not use nofollow on all the links on a page if content is genuine
    Some bloggers believe they can conserve all the Google link juice coming into their website by nofollowing every external link so none of it flows out. This is not true. Linking to external sites will not affect your pagerank; pagerank is based purely on who links to you. I'd say it's foolish and selfish not to link to any external website.
  4. If there are pages or information you don't want to appear on Google, use the noindex meta tag
    Be wise. There might be some content on the website, like duplicate content or some files/PDFs, that you don't want to appear in the Google index. Use the noindex meta tag wisely in these cases.
  5. Use nofollow on external links that are not related to your sites content
    When linking to websites for reference, check whether the site's content matches or complements yours. If it does not, and it risks falling into a bad neighborhood, it's good to nofollow those links.
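Point 5 can even be automated. Here is a deliberately naive sketch (it assumes plain double-quoted hrefs and links that carry no existing rel attribute; a real implementation should use an HTML parser). The domain and the sample links are assumptions for the example:

```python
import re

OUR_DOMAIN = "dailyseoblog.com"  # assumption: your own domain

def nofollow_external(html):
    """Add rel="nofollow" to anchor tags that point off our domain."""
    def fix(match):
        tag, url = match.group(0), match.group(1)
        if OUR_DOMAIN in url:
            return tag  # internal link: leave untouched
        # splice rel="nofollow" in before the closing '>'
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a href="([^"]+)"[^>]*>', fix, html)

html = ('<a href="http://spam.example/">x</a> '
        '<a href="http://dailyseoblog.com/">y</a>')
out = nofollow_external(html)
```

The external link gains rel="nofollow" while the internal one stays as it was.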

Essentially, the selective use of nofollow is encouraged because it tells Google that you respect their algorithm and will do what you can to support it, not corrupt it.
It is possible to ignore all this completely and link to anyone and everyone you find interesting; there is nothing wrong with that, as Google has its own ways of finding out whether a website is good/bad/ugly and does not depend on one particular website.
But as a good practice, it's always good to stick to good networks and selectively allow/disallow links, so that you gather authority and authenticity as you move along.

For more information on how differently Google, Yahoo and Live see nofollow tags, check this article.

Aug 07

I overheard this question at a discussion: "What happens when I link to many sites from my site? Does it affect my Page Rank? I have a looong blogroll, you know."

Indeed a very genuine doubt. Fortunately, there's nothing to fret about.

Let me take an example now.

Let’s assume we have four sites: Site A, and Sites 2, 3 and 4.

This is how the linking pattern is for all of them.

Site A (PR 6)
      |
      v
Site 2 (PR 5)  [your site]
     /   \
    v     v
Site 3   Site 4

Now, Site A has a lot of incoming links from valuable sites, so it manages to get a PR 6. As per the diagram above, Site A links to Site 2, which gets a PR 5.

Now, let’s assume that Site 2 links to a lot of sites (including Sites 3 and 4, up to Site 10).

The question is whether Site 2 will lose its PageRank by linking to ten different sites.

And the Answer is NO.

What happens instead is that Site 2 keeps its PageRank, while each of the ten sites it links to gets a smaller share, probably around PR 1 (PR 5 divided across ten links).

Had it linked to just one site (say, only Site 3), that site would’ve received something closer to PR 4, because the PR 5 is not shared with ten other sites.

So in effect, when a site links to more sites, nothing much happens to its own PageRank, but it does affect the sites it links to. If the PR 5 site links to fewer sites, each of them has a higher chance of getting a better PR; if it links to ten different sites, the PR 5 has to be shared among all of them, so each gets less. Either way, there is no effect on the original site’s PageRank of 5.
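The toy "link juice" arithmetic above can be sketched in a couple of lines. Keep in mind this is only the simplified model used in the example; real PageRank adds a damping factor and uses a logarithmic scale, so treat the numbers purely as an illustration:

```python
def outbound_share(source_pr, n_links):
    """Toy model from the example above: the source keeps its own
    PageRank, and each outbound link passes an equal share of it.
    Real PageRank is damped and log-scaled; this is only the
    simplified arithmetic used in the post."""
    return source_pr / n_links

# Site 2 (PR 5) linking to ten sites: each link passes a tenth of the share
print(outbound_share(5, 10))   # 0.5
# Linking to a single site passes the whole share
print(outbound_share(5, 1))    # 5.0
```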

This is also why you should be careful while “selecting” links from other websites. If many other sites share a page’s links with you (like on a blogroll), you are likely to get less PR from it, while if you source links from pages that link to fewer sites, you have a better chance of getting a higher PageRank.

So guys with long blogrolls - no problem, add more.

Related reading: Importance of incoming links.


With new websites sprouting up every day, at one point or another (soon after the launch frenzy, mostly), this question gets asked.

You launch the best products, the best website with the most modern technology, the best content and so on, and one fine day you find that there’s this old website which has been ranking for your targeted keyword on Google for years. They don’t have a web 2.0 website, probably just a bunch of HTML pages, but they’ve been there for years and you know that Google loves it. You can’t think of one good reason why you should rank better than the old guy.

Well, let’s explore some possibilities.

Let’s take this example. Great Wraps Sandwich Franchise is a food chain that sells American wraps (not normal sandwiches). Now, they are in an interesting situation. The food franchise industry is big in the US, and let’s assume that Great Wraps wants to rank for the top keywords that the old websites have already been ranking for, for years. What does it do to rank higher?

Strategy 1
Adding more fresh content on the site
Yes, that means a blog. As you can see, the site is built mostly on Flash and has very little text to support it. Google loves fresh text and new pages, so this is a great opportunity for Great Wraps: make sure new, fresh content is dished out daily, targeting a series of keywords around the primary keyword.

Strategy 2
Tweaking the most important pages and targeting them for the primary keywords
Along with the numerous pieces of fresh content being dished out, it’d also be a good idea to rework the available text on the most important pages and tweak it to leverage the primary keywords. This would involve adding more textual content with the SEO metrics in place, and also reducing the Flash content and its prominence.

Strategy 3
Targeted link building
Apart from the on-site optimization, it’d also be nice to source some very relevant, high-authority, very targeted links. One or two, and that’s about it. No directories, no link farms; maybe some .edu’s if possible.

Strategy 4
Competition digger and finding the alternate way
I like this one; it’s one of my personal favorites. Analyse your competition, and it’s very likely they’ll still be sticking to the highly concentrated, saturated keywords. Older websites, though they rank high for the core keywords, often fail to diversify their reach. This is an opportunity. Being a new site with the latest tools, what you should focus on are the long-tail keywords and the primary keyword variations. And I think it’s very effective: suppose the older website gets 1,000 visitors and ranks no. 1 for “main keyword”. Forget that, and focus on five variations - “main keywords”, “main keyword place”, “main keyword also” and the like. If you make 500 visitors on each keyword, that’s 2,500, clearly above the old guy.
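The long-tail arithmetic works out like this (note that 500 × 5 = 2,500, and all visitor figures here are hypothetical, purely for illustration):

```python
# Hypothetical figures from the example above, not measured data.
head_keyword_visitors = 1000       # old site, rank #1 for "main keyword"
variations = 5                     # long-tail variations the new site targets
visitors_per_variation = 500       # assumed traffic per variation
long_tail_total = variations * visitors_per_variation
print(long_tail_total)             # 2500 - more than double the head keyword
```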

Strategy 5
Use the social media and make friends everyday
Now, with all that fresh text, link building and alternate routes opened up, use social media to get a consistent burst of fresh traffic. You know that Google loves sites with more traffic; once they find that you have traffic and the basic stuff in place, it’s a cakewalk, and only a matter of waiting to see your site on top.

So essentially, getting a new site to rank better than an age-old site is very much possible, provided you know what to do and what not to do. :D Again, SEO is not about formulas but about alternate strategies.
