Read further only if you are serious about SEO.
Have you got a footer text on your blog? Most of you do.
What’s a footer text?
The small block of text you see below your blog template.
Well, it isn’t always small text. That was the old days. These days, we have WordPress templates that have large footers.
You thought this was just an “out of the box” thing, eh? Not really.
In SEO, footer text carries some importance. As much as the link density on that page. If you have read my earlier post on what’s the best SEO friendly layout, then you might have noticed what I said there.
- Make sure that the last and the first thing a crawler sees is the most important thing to you.
So the first thing a crawler sees is your header image/links/about page URL? Well, I’ve also talked about how to keep things in place and, without re-arranging anything, get crawlers to crawl the best portion of your page.
So, let’s talk more about footers and how they will help you rank high.
Footers are of two types (as of now).
1 - The traditional small-text footers (like you see here), mostly with the copyright text, a few email links etc.
2 - The bigger, evident Web 2.0 style footers with category links and content.
If you have the first type of footer - nothing wrong. In fact, it’s easier to tweak for a crawler and get our stuff done. But if you have the second type (Web 2.0 style), then you have an advantage - you attract crawlers as well as human visitors - the golden rule of SEO!
Be it the first or second type, what you should be bothered about are the following things.
- Is your keyword present in your footer text?
- Is there a link with your keyword?
- Is there the right keyword density within your footer text?
- Is it clear and evident to your human visitors?
- Does it look spammy?
Now you should have an idea of what’s important in a footer and what’s not.
In the footer text, please avoid special characters and unwanted content, like the copyright notice. Instead, please add more relevant content. A footer text is as important as your body text content.
So I would not recommend a footer text like this - “Yourdomain.com Copyright - 2007, Your Address, and your Signature damn it!”
Instead I would recommend this - “Yourdomain.com, your keyword or your blog description with your keyword, with a link to one of the most important pages on the blog.”
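To make that concrete, here’s what a footer of the recommended kind might look like in plain HTML. The domain, keyword and page URL below are placeholders, not anything from a real site:

```html
<div id="footer">
  <!-- keyword-rich description instead of a bare copyright line -->
  Yourdomain.com - SEO Tips and Tutorials |
  <a href="http://www.yourdomain.com/seo-tips.htm">SEO Tips</a>
</div>
```

Swap in your own keyword and link to your most important page.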
So you get the idea. If possible, keep the category links at the footer; this will help the crawler know what the stuff on your page is. This is because crawlers give more or the same importance to the footer text (or whatever they see last in the code) as to the content on the page.
Now it explains why a Web 2.0 styled large footer is preferred to a traditional-style copyright statement, right? But if you have one, panic not. Just change it to a well-drafted statement embedding your targeted keywords and links, or even categories.
So much for an evil tip on How to rank high on search engines.
Today, let’s see more about sitemaps. Every webmaster must have a sitemap ready for his site and submit it to Google in order to get all the pages listed on Google. Sitemaps are of two types, as you know: the HTML sitemap you use to navigate a site, and the sitemap used to help crawlers crawl the pages more effectively.
Why are they necessary?
Sitemaps are not necessary. (Yep, I said that.) Even if you don’t have sitemaps, the crawlers will crawl your pages and find the content. But it is like letting them crawl in a dark room. What if you had a well-lit room with all the navigation and helpers around to take them to each room? It would be more effective, right? Sitemaps serve this purpose.
It has the site structure ready, giving indication to the crawlers as to which folders/files are important and which are not, which folders/files are to be visited frequently, and which are to be visited only once. This helps the crawlers understand your site more effectively.
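For those who have never seen one, a single entry in a Google-protocol XML sitemap looks roughly like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/</loc>
    <lastmod>2007-06-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

The `changefreq` and `priority` fields are exactly the “visit frequently / visit once” hints described above.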
Now, how to build a sitemap for blogger?
It’s very simple in Blogger. It only requires you to go to Google Webmaster Central and add your site feed, and the sitemap is automatically created. You can get detailed instructions on this here. Make sure that you submit your full feed and not a partial one.
Which is the best sitemap generator program around?
There are a lot of free online and downloadable sitemap generators.
Here’s a simplified listing of what is best.
1 - Python Script - This is the most difficult one to install. But if you are familiar with Python, then this is the best one around. It’s automated and requires no additional support. I don’t recommend it for a beginner; it requires technical knowledge.
2 - Online sitemaps - This is best for small websites. It’s easy, simple and online. Just go to this site and submit your URL. Fill in some basic details like time and priority settings for the files and click go! The whole sitemap will be generated online. You will get both a ROR file and the Google sitemap XML file. If you are interested only in Google, use the XML sitemap. The format is according to the Google sitemap protocol and is faultless.
Best choice for beginners and small websites of less than 500 pages.
3 - Gsite Crawler - This is a downloadable application. If your website is a bit large, you have time to tweak some settings, and you are serious about sitemaps, then I would recommend this guy for you.
It requires you to give the website URL and then select the types of files to be scanned; priority settings are automatically detected, and you can create both the Google sitemap and the Yahoo URL list.
It has report generation as well, which will give you an idea of how many URLs were crawled, broken links etc. This is very useful while handling large sites.
How to make sitemap for large sites?
If you have a really large website, for instance a one-million-page one, then it’s really going to be tough creating a sitemap. Practically, this is possible with the Python script, but if you are not okay with the technical stuff, then you’ve got to depend on sitemap generator programs. (If you don’t have a really large website, the following piece of information may not help you.)
Step 1 - Download a free sitemap generator program like Gsite crawler.
Step 2 - Use it to crawl each folder of your website as separate projects. Make sure that you create a new database each time a new project is opened.
Step 3 - Now you have separate sitemaps for each folder.
Ex:- yourdomain.com/folder1 has a sitemap called folder1.xml and yourdomain.com/folder2 has a sitemap called folder2.xml
Step 4 - Download this simple index generator program.
Step 5 - Copy-paste all the folders (containing the sitemaps) from the projects folder of Gsite crawler (C:program files…) and put them into one single folder.
Step 6 - Run the index generator program against this parent folder.
Step 7 - Now a sitemap index will be created with links to all the child sitemaps. But there’s one problem: in Gsite Crawler’s projects folder (C:Program Files), each crawled folder is named with an underscore replacing the forward slash.
Ex:- yourdomain.com/folder will be named as yourdomain.com_folder
Therefore the sitemap index produced will have the links written this way too.
Step 8 - Use Notepad/WordPad to open the sitemap index file. Find and replace all the underscores with forward slashes.
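If you’d rather script Step 8 than do the find-and-replace by hand, a minimal Python sketch could look like this. It assumes (as described above) that the only underscores in the index are the ones Gsite Crawler substituted for slashes:

```python
def fix_sitemap_index(text: str) -> str:
    """Replace the underscores Gsite Crawler left in the index URLs
    with the forward slashes the real URLs use.

    Assumes no legitimate underscores appear elsewhere in the file,
    which holds for the standard sitemap-index tags (sitemapindex,
    sitemap, loc, lastmod)."""
    return text.replace("_", "/")


entry = "<loc>http://yourdomain.com_folder1/folder1.xml</loc>"
print(fix_sitemap_index(entry))
# <loc>http://yourdomain.com/folder1/folder1.xml</loc>
```

Read your index file in, run it through `fix_sitemap_index`, and write it back out.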
Step 9 - Upload the child sitemaps in the respective folders online.
Ex: - yourdomain.com/folder1..folder2 etc.
Step 10 - Upload the sitemap index file to the root folder and submit it to google.
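After Step 8’s cleanup, the finished sitemap index should look something like this (folder names are placeholders matching the examples above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://yourdomain.com/folder1/folder1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://yourdomain.com/folder2/folder2.xml</loc>
  </sitemap>
</sitemapindex>
```

Each `<loc>` must point to a child sitemap that is actually reachable at that URL, which is why Step 9 matters.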
Bingo! There you go - you have now created a sitemap index and child sitemaps for a large website. Now submit it through the Webmaster Central window and keep waiting!
Not getting traffic even after writing good content on your blog?
It may be because you have not optimized your blog for Google. If you didn’t know, optimizing a blog is different from optimizing a normal website. The best part is that it’s easier.
Before reading further, please refer to these articles.
Implement these tips today and I can assure you that your statistics will improve in a few days.
Is your blog generating dynamic numbered permalinks? Like "www.yourblog.com/entry.php?id=3"?
Big mistake - you should change this to "www.yourblog.com/your-post-title.htm"
If you are using WordPress, you can simply change this from the publish settings. On Blogger, you don’t have to worry much, as by default the generated pages are named based on your title. In this case you have to be careful with your title. How to write titles that attract traffic?
This is because search engines can understand what your content is based on your url/permalinks. So a url that is generated out of the title makes more sense than a randomly generated numbered page.
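To show what “a url that is generated out of the title” means in practice, here is a small Python sketch of turning a post title into a slug. Blogger and WordPress each do this their own way internally; this is just an illustration of the idea:

```python
import re


def slugify(title: str) -> str:
    # Lowercase the title, keep only runs of letters and digits,
    # and join the words with hyphens - the classic permalink style.
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)


print(slugify("How To Write Titles That Attract Traffic?"))
# how-to-write-titles-that-attract-traffic
```

A crawler can read the keywords straight out of `your-post-title.htm`, which it can never do from `entry.php?id=3`.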
This may be a little complicated to understand for those who are not familiar with web design. But let me put it the simple way. Make sure that you have a template that meets the following criteria.
- In your source code, make sure that there is less junk (CSS and HTML) and more content.
- In your basic template structure, the links (categories or links to your best posts etc.) are the first thing that bots will crawl. For example, in a three-column template, if your links are included on the left-hand side, we can ensure that bots crawl them first and then move on to the body. Check out this post LINK where I’ve clearly stated how to select a perfect SEO friendly template for your blog.
Bots remember the first and last things they see on your site with a certain level of importance. Take, for instance, the footer. Suppose you have a footer that says "copyright - yourblog.com". Change it to "copyright - yourblog.com - SEO tips and Tutorials"; this way bots will remember the "SEO tips and tutorials" keyword they read on your blog. Make sure you don’t spam your footer, though, with too many keywords.
In free blogging platforms like Blogger or WordPress, unfortunately image ALT tags are not used by default. Make sure that in every image you upload (in every article, or even the bullet images on the template), you use the keywords as ALT tags. Make sure you don’t spam it, again; optimum usage is recommended.
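An ALT tag is just the `alt` attribute on the image markup. The filename and keyword below are placeholders:

```html
<!-- keyword in the alt attribute, readable by bots even though
     the image itself is not -->
<img src="my-bullet.gif" alt="SEO tips and tutorials" />
```

One short, relevant phrase per image is enough; stuffing every image with the same keyword string is the spamming warned against above.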
Extremely important point. This is a painstaking process, honestly, because you’ve got to dig out your old posts, get their URLs, remember them - ah! Pain in the arse! But let me tell you, it’s a good habit to maintain a spreadsheet with all your old and present post URLs, kept updated, so that you can hand-pick them any time. Using this, post links to your old posts within new posts. This will ensure that none of your pages is brushed beneath the carpet; all are given equal importance, and the Google juice will fill them equally.
These 5 tweaks will ensure that none of the articles you wrote with much pain is left unnoticed, and will help you get more traffic from the engines.
Google Sitemaps is an excellent way to submit your site to Google. If you are not sure of what a site map is -
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
See here for what data are stored in a sitemap and why.
Google provides the option to tell it, through the robots.txt file, where the sitemap file is located.
Just add this into the robots.txt file with the link to your sitemap file -
Sitemap: http://www.mysite.com/sitemap.xml
You can submit sitemaps (in XML format) in the Google Webmaster Tools, but in case you are worried about how to make one, here’s a quicky!
Go to XML-Sitemaps and submit your site URL, give in the settings (updated hourly, monthly etc.) and allow it to make a sitemap. It may take some time, but it’ll also show how many files have been processed and how many are yet to be. A maximum of 500 files will be taken care of - not more (’twas free, remember?).
You can get the sitemap done in ROR format, text, compressed XML format, uncompressed XML format, or even HTML format (for site use).