Filed Under (Search Engine Optimization) by Mani Karthik on 10-10-2007

We saw earlier that Google is now indexing sites faster than ever before. Much of this has to do with the frequency of content changes happening on your site, and blogs are at an advantage here compared to static-page websites.

(Screenshot: Google quick indexing)

This is a screenshot of my indexing results that I took this morning. Soon after I published an article (Reader's Questions), it was absorbed/indexed by the Google bot. It took only 39 minutes (see the screenshot). The indexing may well have happened earlier; 39 minutes is simply when I first checked.

One more proof that you can decide yourself how often the crawler should visit your site. Feed it fresh content often, and the poor crawler will visit frequently.

If you'd like to stay updated with SEO, grab the RSS feed now!

Filed Under (News) by Mani Karthik on 09-10-2007

Google has officially announced that it has tweaked its settings: if you update your content more often, the crawlers will absorb your data more frequently.

The quicker the content updates, the quicker the crawling. This applies to the data from your verified blog in the Google Webmaster dashboard.
I'm assuming this might have some effect on the data collected by the Google search bot too. Anyway, the more frequent the content updates, the better. Rule of thumb!

Sep 06

(Photo: gold SEO tips. Courtesy - Somma)

Here is a compilation of all the search engine optimization tips for bloggers (WordPress, Blogger, TypePad, and other platforms) featured here on DailySEOblog. Bookmark this page and you can refer to them any time. Honestly, quite a few of them are really good tips, particularly catering to Google, that are not featured elsewhere. Some of them are basic SEO stuff you may want to have a look at.

Importance of primary and secondary keywords - What are primary and secondary keywords? Why should you select them, and how will they help you rank high on search engines?

Five sure shot tweaks to rank high on Google - Must read for bloggers - The top 5 things you should do to ensure high ranks on Google. Only the relevant SEO tips for Google.

How interlinking your pages will help you rank higher on the SERPs - The importance of interlinking, and why and how you should interlink your pages so that the Google bot catches them.

Optimize your Wordpress categories to avoid duplicate content - Wordpress creates a lot of duplicate content by default. Here are tips on how you can reduce it.

Create a user-navigation sitemap for your blog - Crawlers like it - Sitemaps are not only for crawlers, here is how you can create a manual sitemap with ease.

How to avoid duplicate content on Wordpress? - All the tips you need to know on avoiding duplicate content on Google.

Use server location to your advantage to rank high on Google - If you are hosting a website, did you know that the location of your server can give you an edge on regional search engines?

Optimize the robots.txt file for Wordpress, allow your blog to rank high - Robots.txt is an often-ignored file which is actually an excellent tool that can help you get more files indexed on Google and thus rank high. Here are the tweaks.

Create and submit a sitemap for Yahoo - Sitemaps are different for each search engine, here’s how you can create a sitemap customized for Yahoo in easy steps. 

Tweak the title tags of your blog to rank high - Title tags are very important for getting search engines' attention. Here are tips on how to write an attractive title.

Highlighting your content with an SEO-friendly layout - Having great content is not enough. Here are tips on how to dress it up to cater to the spiders.

Importance of footer text in SEO - Footer text is an ignored element which can be used effectively to feed information to the search engines.

How to build a sitemap for large websites and blogs - Building a sitemap for a small blog or site is easy. But if you have a large website, it turns messy. Learn how you can still get a great sitemap ready without mess.

Importance of allinanchor text - What allinanchor text is and why you should take care of it.

What are supplementary results? Do they affect you? - Everyone's nightmare, once upon a time.

5 vital SEO stats that you should keep track of - If I were to suggest five SEO metrics you should constantly track to keep your positions intact, here they are.

Does Google PageRank affect your rankings? - Google PageRank is often given more importance than it actually deserves. Read this article to study the facts.

Importance of incoming links - Why incoming links are important and how they help you rank high.

Keyword usage in site content - facts and myths - How you should use keywords in your site content. Dos and don'ts.

How to SEO an Ajax-ified site - When your site is heavy on AJAX, SEO becomes tough, but here's how you can effectively harness the power of AJAX and SEO together.

How image ALT tags help you rank high - Why you should use image ALT tags, and where and how to use them.

Flash and SEO - moving together in the Web 2.0 times - How to optimize pages built in Flash.

Creating SE-friendly post titles - The importance of titles, and how you should write them to get the edge over other bloggers with the same content.

How to get indexed by Google in 48 hours - Getting indexed on Google is not easy, but here's a tip that will get you indexed in 48 hours or less.

Selecting keywords for SEO - What are keywords and what keywords should you select for your blog?

SEO friendly layout - Ensuring an SEO-friendly layout will help you rank high on search engines automatically. See what factors make your blog SEO friendly.

Filed Under (SEO Misc) by Mani Karthik on 06-09-2007

Let's talk today about good and bad menus. Good menus are those that are SEO friendly; bad ones are those that are not. Simple logic.

When dealing with clients, one thing that's common to almost all of them, and the one that troubles me most, is the use of fancy JavaScript navigation menus. If you use one, please refrain from it. If you are a blogger, there is very little chance that you are using one; JS-enabled menus are mostly found on portfolio-like pages.

A few words about portfolio pages. They are a gold mine for onsite optimization, because there is usually so much wrong with them. They are all frills and no substance. They are stupid. They make a lot of noise.

Many a time, they have animated JS-enabled menus that are not crawlable by bots. A big mistake, and our subject of discussion today.

Crawlers, by default, avoid anything that is JavaScript-driven, and AJAX. This is one shortcoming of modern crawlers, even Google's: they haven't devised a technology that helps them crawl the contents of JS-enabled and AJAX elements on a webpage. They had a similar problem with Flash, but that is almost rectified now.

  • JavaScript/AJAX-powered navigation menus may look cool and attractive, but avoid them wherever possible. Use CSS instead; you can create almost equally spectacular and eye-catching menus (see the sketch below).
  • Don't use images for menu items; try CSS instead. It's cleaner, quicker, and easier to manage.
  • If it's absolutely necessary, keep the JS menus, but submit the linked pages to Google.
  • Avoid sound effects, please; they're so '90s!
  • Use text to link to pages in the navigation menus if possible.
  • Place the navigation menus either at the top, in the left-hand column, or somewhere else the crawlers won't miss them.
  • Don’t embed them in frames - big mistake!

So the idea is that navigation menus should be simple, crawlable, and easy to navigate, thus serving their purpose. Don't let them be too fancy, complex, and JS-enabled; that defeats the purpose.
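To give you an idea, here is a minimal sketch of a crawlable, text-based CSS menu. The class name, labels, and URLs below are placeholders; adapt them to your own site.

<!-- Plain HTML links - crawlers can follow these without executing any script -->
<ul class="nav">
  <li><a href="/">Home</a></li>
  <li><a href="/about.html">About</a></li>
  <li><a href="/portfolio.html">Portfolio</a></li>
</ul>

<style type="text/css">
/* Text links floated into a horizontal bar - no JavaScript, no images */
ul.nav { list-style: none; margin: 0; padding: 0; }
ul.nav li { float: left; }
ul.nav li a { display: block; padding: 8px 14px; text-decoration: none; }
ul.nav li a:hover { background: #eee; }
</style>

Because the links are ordinary anchor tags, the crawler sees them just as a text browser would.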

You can find excellent resources on making CSS navigation menus here:
CSS Tutorials on Navigation Menus | CSS Menu Help | High Quality CSS Menus

Filed Under (Search Engine Optimization) by Mani Karthik on 20-08-2007

With the advent of WordPress and the array of other blogging software, poor old forums have lost their charm. Three years ago, phpBB was the most-used forum software; almost every webmaster had a forum about something. It was an instant hit from its launch because of its folksonomic nature, though in a less refined way compared to the latest software.

One major reason forums failed is their non-SEO-friendly nature.

  • Forums did not have SEO-friendly URLs
  • They did not allow bots to crawl the inner pages
  • They had SEO-unfriendly titles
  • Forums in general did not treat search engines as important at all, so they left no room for tweaks either.

With all these problems in front of us, let's see how we can make forums (phpBB mainly) SEO friendly.

  • Optimizing the page titles
    Page titles are a crucial factor for search engine optimization. Crawlers pick up the title from every page, and it's important to give maximum information here in a non-spammy way. Here's how you do it with some quick edits.

    - Edit file ‘templates/subSilver(or your template)/overall_header.tpl’

    Replace -

<title>{SITENAME} :: {PAGE_TITLE}</title>

with

<title>{PAGE_TITLE}</title>

or

<title>{PAGE_TITLE} :: {SITENAME}</title>

Edit viewtopic.php

Replace -

$page_title = $lang['View_topic'] . ' - ' . $topic_title;

with

$page_title = $topic_title;

This drops the generic "View topic" prefix, so the keyword-rich topic title leads the page title.
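For example, assuming a forum called MyForum and a topic titled "Optimizing page titles", these two edits together change the title tag roughly like this:

Before: MyForum :: View topic - Optimizing page titles
After: Optimizing page titles :: MyForum

The informative part of the title now comes first, which is what we want the crawlers to pick up.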

  • Tweaking the robots.txt file to allow better search engine indexing.
    Robots.txt can be used to control which pages are available to search engine robots and which are not. On phpBB, a lot of subdirectories are crawlable by default, but this is not required. We only need certain folders and files to be crawled, so that only the necessary information is available to the crawler.

    Including the code below in your robots.txt will disallow all the unwanted folders from being crawled by search engines.
    This way, only the necessary information (the posts) is available to crawlers, and the junk is filtered out.

User-agent: *
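# The lines below keep phpBB's system folders and utility scripts
# out of the index, so crawlers spend their time on actual posts.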
Disallow: /admin/
Disallow: /db/
Disallow: /images/
Disallow: /includes/
Disallow: /language/
Disallow: /privmsg.php
Disallow: /profile.php
Disallow: /search.php 
Disallow: /templates/
Disallow: /common.php
Disallow: /config.php
Disallow: /faq.php
Disallow: /viewonline.php
Disallow: /groupcp.php
Disallow: /login.php
Disallow: /memberlist.php
Disallow: /modcp.php
Disallow: /posting.php
  • Plugins to help create sitemaps for phpBB
    Here are some cool plugins/mods for phpBB that will help you create SEO-friendly URLs and even sitemaps, which will help you get more pages indexed on Google. Here goes -
  1. Google, MSN and Yahoo sitemap generator mod for phpBB - This mod creates a Google sitemap index and auto-generated Google sitemaps for phpBB.
  2. Google Puller - This hack creates pages with 500 links to posts on your community. The words are taken from actual posts on your community (from the search_word table) and they link to the post that contains that word.
    The aim of the hack is to help search engines like Google or MSN to index your pages.

Part 2 of this post will follow, with more methods for optimizing your forum software for more search engines.

Aug 13

I had to write about this. There are a lot of fraud SEOs around. No doubt. Many of them somehow download a pirated copy of some popular SEO book (there are many around) and become SEOs overnight (that's not my idea; I read it on the DP forums). Then they go and set up a one-page website with Google, MSN, and Yahoo logos and an (unlicensed) dart picture, saying something like "The world's best SEO - Rank No. 1 guaranteed". Have you seen one?

It's easy to make them out as frauds from their overly detailed testimonials and super-duper client lists (some even have Yahoo on the client list - what the..?). It's essential for genuine SE optimizers to make the public aware of these frauds, hence this post.

This post is inspired by this article by Philip Lenssen. I thought it might help you tell fraud SEOs from genuine ones by their claims. Though Philip includes many white-hat terms here, I think it's fair to assume that most of them are used more by fraud SEOs than by genuine ones.

He says

In the SEO industry agencies, experts and even bloggers have adopted a special mode of speech not to say slang that might be misunderstood by outsiders like clients, website visitors or the general public. To help you understand what search engine optimization experts really mean I devised this real glossary of SEO speak:

And here is the article -

What they say…
What they mean…

We offer Search Engine Optimization/SEO
We assume you are the Google bot and want you to index this page for both keywords

We offer Search Engine Optimisation
Our SEO company is based in the UK

Guaranteed top positions
We place Google Adwords for you

We do SEO, SEM, PPC to increase your ROI
We do not want you to know what we do

We stick to the Google Webmaster Guidelines
We only break them in a way that we assume Google won’t notice

We tell you how to make money online
We want you to click on our ads

10 ways of making money online
Those are our 10 affiliates, please click on the respective undisclosed ads

We offer social media optimization
We got several accounts banned at Digg

We offer link baiting services
We want to put those drunk naked ladies video on your site

Our network
Our link farm

Authority sites
Sites that do no SEO

Black hat SEO
We do anything to get rich quick, even if your site gets banned

White hat SEO
We only cheat Google where we have to, others do it too, come on!

We optimize for Google, Yahoo, MSN, Ask
If we fail in Google you still have to pay

Search Engine Submission
We need your mail address, those guys offered us $$$ for each 1000 verified addresses

Partners
People we never heard of until we exchanged links

PageRank optimization
Sorry, we just started doing SEO and do not have a clue

SEO India
We offer 1000 links for 30$

Alexa optimization
All our employees have the Alexa Toolbar installed, it really works!

Filed Under (Wordpress) by Mani Karthik on 04-08-2007

Did you know that your WordPress blog is not completely SEO friendly?

Of course, WordPress is 90% SEO friendly with its title tags, SEO-friendly templates, and the like. But there is one weak point in WordPress that spoils the whole show.

I'm talking about duplicate content. This matters most with Google: it does not like duplicate content on any site. And Google's handcuff for this problem is the supplemental index.

Let's see how WordPress is responsible for creating duplicate content on your blog, ultimately landing your blog in the supplemental index.

Villain No.1 - Archive pages

Villain No.2 - Categories

Archive pages
Have you set your default archive settings to daily, weekly, or monthly? If yes, you are in trouble. When the Google bot visits your site, it sees the same content first on the post page, then on the archive page, and a third time on the index page. That is the same content duplicated three times.

Categories
Do you have the habit of tagging a particular post in more than one category? Trouble again!

How to overcome this?

Solution 1
- Do not archive pages. If you must archive pages, make sure that robots/crawlers do not crawl those pages.
- How do you ensure robots don't crawl archived pages?
- Use a noindex robots meta tag on the archive template. Here's how you do it -

<meta name="robots" content="noindex,follow">
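If you prefer to edit the theme yourself, here is a minimal sketch of how the tag could be emitted only on archive pages, using WordPress's standard is_archive() conditional tag. Place it inside the head section of your theme's header.php (the exact markup around it depends on your theme):

<?php if (is_archive()) { // true on date, category, and other archive views ?>
<meta name="robots" content="noindex,follow" />
<?php } ?>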

- Or use this Duplicate Content Cure plugin - it automatically adds the noindex robots meta tag to the archive pages, so that archived pages are not indexed by Google.


Solution 2
- Do not categorize posts in more than one category. One post, one category. Mixing categories creates duplicate content for the crawlers and makes for an unfriendly navigation structure, since users are likely to see the same posts in each category.

Filed Under (Blogging) by Mani Karthik on 31-07-2007

Google penalizes blogs and sites for a number of reasons. John Chow is the most recent victim. He was taken off the Google index recently for doing something that violated the Google webmaster guidelines.

John had been running his "link to me to get linked back" scheme, whereby if you wrote a small review of his blog and linked to him with the anchor text "Make money online", he would link back to you. Many bloggers linked to him and got a link back from John.

Now that John Chow has been kicked out of the Google index, what does it mean for the other bloggers who linked to him?

Google says it clearly in its guidelines -

Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.

So that means anyone who linked to John Chow is going to get penalized in this Google PR update. Yikes! Are you one of them?

Does it affect John Chow?

Nope. The man has made the blog hugely popular and enjoys a steady source of traffic. He has even declared that the ban from Google has had no detrimental effect on his income.

Does it affect you?

If you've linked to John, and he's linked back - well, yes, maybe. Only this Google PR update will tell.

So what are the things that I shouldn’t do to get a bad reputation with Google?

Yeah, the guidelines are a bit boring to read through, so let me list the things most people overlook.

  1. Choose a good server with no downtime - Google does not like servers that have frequent downtime. Worse, if your server is down when the Google bot visits your site, you get flagged!
  2. Don't do black-hat SEO stuff like keyword stuffing, cloaking, doorway pages, etc.
  3. Don't use hosting that shares an IP with a user banned by Google - Google may think you're friends.
  4. Too much similar anchor text linking to you - Google figures out that you've done something artificial to get those links, and bans you.
  5. Stupid link-building campaigns like John's - stay away from them. They are temporary.

Since the Google PR update is nearing, make sure that your blog is free of these troubles. Remember, no matter how huge or popular you are, if you are not white-hat and genuine, Google is going to get you one day - once and for all!

Jun 26

Today, let's see more about sitemaps. Every webmaster should have a sitemap ready for his site and submit it to Google in order to get all of its pages listed on Google. Sitemaps are of two types: as you know, the HTML sitemap you use to navigate a site, and second, the sitemap used to help crawlers crawl the pages more effectively.

Why are they necessary?

Sitemaps are not necessary. (Yep, I said that.) Even if you don't have a sitemap, the crawlers will crawl your pages and find the content. But it is like letting them crawl in a dark room. What if you had a well-lit room with all the navigation and helpers around to take them to each room? It would be more effective, right? Sitemaps serve this purpose.

A sitemap lays the site structure out, indicating to the crawlers which folders/files are important and which are not, which folders/files should be visited frequently, and which need to be visited only once. This helps the crawlers understand your site more effectively.
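For reference, a minimal entry in a Google sitemap, following the sitemaps.org protocol, looks roughly like this (the URL and values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/index.html</loc>
    <!-- changefreq hints how often crawlers should revisit the page -->
    <changefreq>daily</changefreq>
    <!-- priority marks the page's importance relative to your other pages -->
    <priority>1.0</priority>
  </url>
</urlset>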

Now, how do you build a sitemap for Blogger?

It's very simple in Blogger. You only need to go to Google Webmaster Central and add your site feed, and the sitemap is created automatically. You can get detailed instructions on this here. Make sure that you submit your full feed and not a partial one.

Which is the best sitemap generator program around?
There are lots of free online and downloadable sitemap generators.
Here's a simplified listing of the best.

1 - Python script - This is the most difficult one to install, but if you are familiar with Python, it is the best one around. It's automated and requires no additional support. I don't recommend it for a beginner; it requires technical knowledge.

2 - Online sitemaps - This is best for small websites. It's easy, simple, and online. Just go to the site and submit your URL, fill in some basic details like time and priority settings for the files, and click go! The whole sitemap is generated online. You will get both a ROR file and the Google sitemap XML file; if you are interested only in Google, use the XML sitemap. The format follows the Google sitemap protocol and is faultless.
The best choice for beginners and small websites of fewer than 500 pages.

3 - GSite Crawler - This is a downloadable application. If your website is a bit large, you have time to tweak some settings, and you are serious about sitemaps, then I would recommend this one.
You give it the website URL, then select the types of files to scan; priority settings are detected automatically, and you can create both a Google sitemap and a Yahoo URL list.
It generates reports as well, giving you an idea of how many URLs were crawled, broken links, etc. This is very useful when handling large sites.

How do you make a sitemap for large sites?

If you have a really large website, for instance a one-million-page one, then it's really going to be tough creating a sitemap. Practically, this is possible with the Python script, but if you are not okay with the technical stuff, then you have to depend on sitemap generator programs. (If you don't have a really large website, the following piece of information may not help you.)

Step 1 - Download a free sitemap generator program like GSite Crawler.
Step 2 - Use it to crawl each folder of your website as a separate project. Make sure that you create a new database each time a new project is opened.
Step 3 - Now you have separate sitemaps for each folder.
Ex:- yourdomain.com/folder1 has a sitemap called folder1.xml and yourdomain.com/folder2 has a sitemap called folder2.xml
Step 4 - Download this simple index generator program.
Step 5 - Copy all the folders (containing the sitemaps) from the projects folder of GSite Crawler (C:\Program Files…) and put them into one single folder.
Step 6 - Run the index generator program against this parent folder.
Step 7 - A sitemap index is now created with links to all the child sitemaps (a sketch of the finished index follows these steps). But there is one problem: in GSite Crawler's projects folder (C:\Program Files…), each crawled folder is named with underscores replacing the forward slashes.
Ex:- yourdomain.com/folder will be named yourdomain.com_folder
So the links in the generated sitemap index are written this way too.
Step 8 - Open the sitemap index file in Notepad/WordPad. Find and replace all the underscores with forward slashes.
Step 9 - Upload the child sitemaps to their respective folders online.
Ex:- yourdomain.com/folder1, folder2, etc.
Step 10 - Upload the sitemap index file to the root folder and submit it to Google.
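Once the underscores are fixed, the finished sitemap index should look roughly like this - a sketch following the sitemaps.org index format, with illustrative domain and file names:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each entry points to one child sitemap uploaded in step 9 -->
  <sitemap>
    <loc>http://yourdomain.com/folder1/folder1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://yourdomain.com/folder2/folder2.xml</loc>
  </sitemap>
</sitemapindex>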

Bingo! There you go - you have now created a sitemap index and child sitemaps for a large website. Now submit it through the Webmaster Central window and keep waiting!

Filed Under (Blogging) by Mani Karthik on 07-06-2007

You might have come across this topic on various other tech/SEO blogs, but what prompted me to write this article is that they all seem too techie to me, so I'm guessing the topic hasn't reached all the bloggers yet. This is my attempt to explain to all the non-techie bloggers what Google supplemental results are and how they affect an ordinary blogger.

By ordinary bloggers (no offense, please) I mean passionate bloggers who wouldn't want to tweak HTML code for search engine optimization, but who don't want to get into trouble through ignorance of SEO either.

So, what are the Google Supplemental results?

When you search for a term (keyword) on Google, you get results from many websites on the same topic/term, right?
Suppose there is a website called cats-dogs.com. It has information on "cats" as well as "dogs". When you search for cats, the pages on the site with information on cats are shown on the results page, but when you search for dogs, the site does not come up at all.

Or, simpler yet: Google sometimes thinks that some of the pages on your website (here, the dogs pages) are not as important as other pages (the cats pages) on the same website. They are given different weightage, and pages with little or no importance are kept in a separate "folder" called the supplemental results, which may not be shown on Google's results page at all.

So in the above example, if there are two pages, cats.html and dogs.html, on the same website, dogs.html will not be considered at all when someone searches for "dogs" on Google, while cats.html will be shown for searches on "cats".

So there are two types of pages on a site: 1) the normal index and 2) the supplemental index. Get the idea?

Now, if you want to see how many pages on your site are in the supplemental results, just use this query in Google search:

site:www.yourblog.com *** -view (note that there is a blank space before and after those three stars)

Every search result that appears on the results page now has a green tag on it that says "Supplemental Result".
Now you have an idea of which pages on your site will not be considered for Google's regular search results.

Now, there's nothing to panic about. Having supplemental results on Google is only natural. For example, if you run the above query to see how many pages are supplemental, even on Google itself, you will be surprised. Search for site:www.google.com *** -view

What causes supplementary pages on your blog?

1 - Google thinks there is duplicate content on your blog.
It needn't be true, but if Google thinks that some of the pages on your site carry the same content, it may place them in the supplemental results.

2 - Short posts or posts with less content.
If you have posts that are very short, it is likely that Google cannot make out the real content of the page. And when you have several short posts, it is more likely that they get labelled as supplemental pages, because Google simply doesn't know what your pages' content is.

3 - Template generated pages.
This applies to websites that generate many pages using the same template, with very little change in content among them. Since the same template is used, the content is repeated on every single page.

4 - Non-usage of meta tags.
If you haven't used meta tags properly on your blog, bots will have trouble detecting your pages' content from the body alone, so they may push those pages into the supplemental index (see the example after this list).

5 - Poor linking structure.
This may be the most significant point of all.
If you have a chunk of template-generated pages, or pages you never bothered to link to after posting them, they may be ignored because they are redundant and have no links (external or internal). For this reason, Google thinks these pages are less important and pushes them into the supplemental index.

6 - Unequal distribution of PR.
Another important one. As I have said in another post, internal linking is very important because you don't want your valuable content ignored just because you don't promote it yourself. It is quite natural that the wonderful piece of information you wrote in your early blogging days has not been read by anyone, so it's your responsibility to link to it in one of your new posts, so that people read it.
This also distributes your blog's PageRank evenly across all the posts, new and old. If you fail to do this, certain pages will have higher PR and some less.
Since Google judges a page's importance on the basis of PR, make sure that none of your pages is deprived of PageRank.
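On point 4, here is a minimal example of the basic meta tags a page's head section could carry; the values are placeholders you would replace with a page-specific summary and keywords:

<meta name="description" content="A one or two sentence summary of what this page is about" />
<meta name="keywords" content="keyword1, keyword2, keyword3" />

These give the bots a hint about a page's content even when the body text is thin.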

So those six points show us that, yes, supplemental pages are quite natural and any site can get them (even Google).

Let's see how we can get out of this, or keep pages from landing in the supplemental index in the first place.

1 - Practice good internal linking.
Make sure even your oldest post is linked at least once from another post, and read by users. Maintain a column of the best posts from your archives in the sidebar for quick access.

2 - Get quality inbound links to your blog from other sites.
This is the most repeated statement in the blogosphere these days: write quality, content-rich articles and gain incoming links. Well, nobody mentions how much patience this requires, and nobody can guarantee you anything. But hey, it is an important point.

3 - Categories
When you post articles, make sure that you don't tag them into more than one category. Doing so creates a duplicate entry in each category, and the chances of those pages landing in the supplemental index are high.

4 - Don’t encourage duplicate content
When you post an article, make sure that it's not a repetition of what you posted earlier. If it is, make the necessary changes to its text so that it differs from the old article; maybe work in a different keyword.

So that brings us to the end of this article: what the supplemental index is, why it happens, and how you can get out of it or stay away from it. I hope this was an easy-to-understand article and not as techie as many of the other articles on the subject. So check today whether you have any supplemental pages on your blog, and pull your socks up to fight them out.

References: Matt Cutts, Search Engine Guide, Google Groups, SEOBook
