Filed Under (optimization strategies, seo review) by Mani Karthik on 22-12-2007

Digital Photography School is a blog that’s more famous for its author than for its awesome content. It’s owned by the most influential blogger on the planet - Darren Rowse.

Darren has been a very good mentor to me and has often given me valuable advice, knowingly or unknowingly. If you are a regular reader of DailySEOblog, you might have spotted Darren’s comments here and there on some articles. The most interesting of them is this one, where he was a bit miffed about an article I wrote which, in my view, was wrongly interpreted. We cleared things up soon after, but that’s a different story.

Well, recently, I came across this interesting post by Daniel, where he discusses the factors that decide a good domain name. And today I came across this little discussion between Daniel and Darren, which provoked me to write this article.

Daniel thinks (and we agree) that good domain names should not have hyphens in them. According to Daniel,

Domain names containing hyphens and numbers are cheaper for a reason. They suffer from the same problems as domains not using a .com extension or with complex spelling.

Daniel raised this doubt to Darren over here, and Darren replied over here -

Daniel - yeah it (Digital-Photography-School) does well on an SEO front (has really increased in the last 6 months) but not so great on a memorability front.

So when Darren says that DPS is doing well in terms of SEO, I guess this is what he means.

For the keyword “Digital Photography”, a Google rank of 6. (Regional ranks may differ.)

Not bad for a blog like DPS, right?

Here’s a look at some metrics.

Age of domain - Almost 2 years
Pages indexed on Google - 30,900
Pages indexed on Yahoo - 62,337
Incoming links (Google) - 231
Incoming links (Yahoo) - 295,000
No. of pages in the main index - 2,150
No. of pages in the supplemental index - 28,750 (Pages Indexed minus Main Index)
Page Rank - 6/10
Alexa Rank - 19,382 (as of 23rd Dec, 2007)
Home page size - 34,181 Bytes = 33 KB
Code to content ratio - 35.03%
Incoming .edu links - 5
Incoming .gov links - None
Issues encountered - Canonicalization. My guess is that Darren has set the domain name preference in Google Webmaster Tools to http://digital… rather than http://www.digital-pho…, which is why a search for site:http://digi… returns results while site:http://www.digi… does not. Darren should perhaps fix this and standardize all URLs on the www.url.com format rather than the bare http://url format. For ProBlogger, he has used the www.url.com format.

Why should this be fixed?
Though both formats point to the same site, Google treats them as separate URLs and prefers to use one format for a site, which is why it gives you the option to make a selection in Webmaster Tools. If some people link to http://url and some to http://www.url, the link value your site earns gets split between the two versions, and you lose some of it.
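If the blog runs on Apache (an assumption on my part), the usual companion fix is a 301 redirect in the root .htaccess, forcing every request onto the www version so that all link value accumulates in one place. A sketch, assuming mod_rewrite is available:

```apache
# Permanently redirect non-www requests to the www version.
# Assumes Apache with mod_rewrite enabled.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^digital-photography-school\.com$ [NC]
RewriteRule ^(.*)$ http://www.digital-photography-school.com/$1 [R=301,L]
```

With this in place, the Webmaster Tools preference and the actual server behaviour agree, and incoming links to either form count toward the same URL.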

Supplemental pages
DPS has quite a huge number of pages in the supplemental results. Though Google removed the supplemental results label along with the operator (site:www.yoursite.com ***-view) last July, you can still estimate the number of pages through a simple calculation, and I found that DPS has almost 30k of its pages in the S-Index.
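The calculation itself is simple subtraction, using the figures from the metrics above:

```python
# Estimate the supplemental index size: pages Google reports as
# indexed minus pages that appear in the main index.
total_indexed = 30900  # pages indexed on Google (site: query)
main_index = 2150      # pages in the main index
supplemental = total_indexed - main_index
print(supplemental)    # 28750
```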

Now, you know the problem with having a huge supplemental index, right? Google is doubtful about the relevance of those pages and may keep them out of the normal organic search results. So here, DPS has not been able to convince Google that 30,000 of its pages are relevant and original in content.

How to get out of the supplemental index?
- Keep archives out of the sight of robots using nofollow tags.
- Don’t file articles under more than one category.
- Create distinctive titles and content for every article.
- Get deep links from external sites.
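The first point can be implemented with a robots meta tag in the archive templates - a sketch only; where exactly it goes depends on the theme, and the tag should be emitted on archive pages alone:

```html
<!-- In the <head> of archive/category pages only -->
<meta name="robots" content="noindex,nofollow" />
```

This keeps the duplicate archive views from competing with the individual article pages for a place in the main index.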

Other than the canonicalization issue and the number of supplemental results, DPS is in good shape.

If Darren would like to do something about it immediately, I’d suggest that, from his network of blogs/websites, he link to individual posts in the various categories instead of the homepage. We all know that Darren has been linking to DPS from many of his ProBlogger articles, but almost all of those links point to the blog homepage. Had they pointed to the internal article pages instead, he could have reduced the number of supplemental pages.

Let’s do some very basic SEO checks

Meta Tags

<meta name="keywords" content="Digital Photography School, Digital Photography Tips,
Digital Photography Training, Digital Camera Tips, Digital Camera Advice, Advice,
tips, photography, digital camera, training,"/>
<meta name="description" content="Digital Photography School -
Digital Photography Tips for You" />

Looking at the meta tags, I get the impression that Darren and his team have not been working on them lately. It’s a very basic set of meta tags, with the bare essentials, and the meta description is just not impressive. As you might already know, the purpose of the meta description is not to attract search engine crawlers but human visitors.

The meta description is the text that appears beneath your site name in the SERPs. Only if it is attractive enough will people click through to your site. If you asked me, I’d rephrase both the meta keywords and the meta description as below.

<meta name="keywords" content="Digital Photography, Digital Photography School, Digital Photography Tips, Study Photography, Digital Photo, Digital Photography Training, Digital Camera Tips, Digital Camera Advice,
Advice, tips, photography, digital camera, training"/>
<meta name="description" content="Take stunning photos with your digital camera using our digital photography tips and tricks - Digital Photography School" />

The title tag could also be changed to something attractive to both search engines and visitors. As of now, it looks like this.

<title>Digital Photography School &#8212; Digital Photography Tips for You</title>

Another grave mistake I found is that the robots.txt file is placed in the blog subdirectory (www.digital-photography-school.com/blog/robots.txt). Yikes! This simply won’t work. The robots.txt file must be placed in the root directory, and if the blog lives in a subdirectory, the rules should use the subdirectory URLs to control the crawlers. Even if it did work, the syntax is wrong. Here’s how a healthy robots.txt might look (only a suggestion):

Sitemap: http://www.digital-photography-school.com/blog/sitemap.xml
User-agent: *
Disallow: /wp-content/
Disallow: /wp-admin
Disallow: /wp-includes/
Disallow: /wp-
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /feed/
Disallow: /trackback/
Disallow: /cgi-bin/
User-agent: Googlebot
Disallow: /*.php*
Disallow: */trackback*
Disallow: /*?*
Disallow: /z/
Disallow: /wp-*
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.txt$

I couldn’t find a sitemap file anywhere, so that’s something Darren could work on to make sure the great content is spotted by the crawlers. It may or may not help in the fight to pull pages back out of the supplemental index too.
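For reference, a minimal sitemap.xml follows the sitemaps.org protocol. This is only a sketch with one hypothetical entry - a WordPress plugin can generate the real thing automatically:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.digital-photography-school.com/blog/</loc>
    <lastmod>2007-12-22</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <!-- one <url> entry per post or page -->
</urlset>
```

Note that the Sitemap: line in the suggested robots.txt above would then point crawlers at this file.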

The bottom line is that even though there is great content on the DPS website, a major part of those pages is in the supplemental index and thus will not show up in search engine results. Keeping in mind that the blog has a domain name that’s not memorable, the major share of its traffic is likely to come from organic search results, so getting these pages out of the S-Index may be the first thing Darren should work on.
Also, please note that I’ve only checked the SEO basics here; only after this basic stuff is fixed can the rest be analysed and worked on. Here’s wishing all the best to Darren and DPS.

If you enjoyed this post, make sure you subscribe to my RSS feed!
