So you’ve signed up to twitter, you’re twittering away, and now you want to track your number of followers. Well, you can use twittercounter, which tracks the number of followers over a period of time.

I’ve hit the grand total of 20 followers on my twitter account (@johnpcampbell), and I’m looking at trying to increase that over the next few months. I’ve not actively tried to increase followers before, and before I do it’s important to work out which methods of gaining followers actually work.

twittercounter creates a page for your account and then generates a graph to show follower levels over time.


So over the next few weeks I’m going to be trying different methods of increasing followers, and twittercounter should act as the tracker to see which method is the most successful.

I’ve been using Spotify this week and I have to say that I’m really impressed by the quality of the tracks on there, with no load time for the streaming of the music either. It’s basically iTunes but it streams music from the internet, with 20-second adverts every 10 songs or so. It’s a great way to check out the back catalog of artists too. There are some tracks missing for one or two artists, but on the whole you can listen to whole albums in full.

What I really like is the ability to download playlists in one click, and there are lots of Spotify playlist sites popping up. It’s a great way to discover new music. Head over to Spotify, download it and start listening.

An increasing issue facing some webmasters in the UK seems to be Google not ranking their sites in the UK results. The trend seems to be that sites with a non-country-specific top level domain (e.g. .com, .net, .org) struggle to rank for key phrases in the UK, yet the same site will rank for those key phrases when searching from a UK IP address (which is different to searching from a US IP).

It can be rather frustrating for webmasters, as they might have a perfectly working site with a good amount of inbound links, but Google has made the decision that the site isn’t 100% focused on a UK audience, so it classes the site as a US / international one and excludes it from the results.

So if you find your site in this predicament, you should take your site through this flowchart to make sure you’ve done everything possible to tell Google that your site is based in the UK.


Remember, more help can be found over at Google Webmaster Help.

The Manchester WordPress User Group (MWUG) pretty much does what it says on the tin: it’s a group for WordPress users based in Manchester. As a platform WordPress is pretty much everywhere, so it makes sense that a group should be started. There’s already a WordPress UK North group, but that does include members from Yorkshire.

manchester wordpress user group

The group has sprouted from the MDDA, and MWUG have a site which is currently being worked on. There’s a Google Group too, and they are on twitter as well at @mwug. So if you like WordPress and you’re based in Manchester, get signed up to the group and get down to a meeting!

Google, Yahoo and MSN Live have introduced a new tag that all three search engines will recognise. The tag, which can appear in the head of a document, will point the search engines back to the canonical URL of a page.

This means there is now a new option available to SEOs and webmasters when fixing duplicate content. Most importantly, this will help sites hosted on IIS servers where re-directs are not possible. For more information on how to implement it, and what the tag can and can’t do, head over to the post on the Google Webmasters Blog.
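As a quick sketch of what it looks like (the URLs here are made-up examples, not from any real site), the tag goes in the head of the duplicate page and points at the version you want indexed:

```html
<!-- In the <head> of a duplicate page, e.g. a print view -->
<!-- example.com is a made-up URL for illustration -->
<head>
  <link rel="canonical" href="http://www.example.com/article" />
</head>
```

The search engines then treat the duplicate as a hint towards the canonical URL rather than a separate page.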

The credit crunch is hitting us all (well, internet marketing seems to be doing well), so Google is trying to help businesses by promoting some of their products which help you do more with less.

google do more with less

Now, it’s great that Google are trying to help businesses by pointing people to their “free products”, but at the end of the day it all comes back to them generating more money through advertising. I sometimes forget that they are all geared up for people spending money on AdWords. The IPA warned today that Google’s rising ad costs could make some businesses reduce budgets or look to spend on other networks.

Although that might make a slight dent, products like Google Insights, Webmaster Tools and Analytics, as useful as they are for SEO, are also set up with the goal of getting you to spend more money on AdWords.

It will be interesting to see if Google releases credit-crunching products over the next few months that, deep down, will drive more traffic to AdWords.

In the past few weeks I’ve been playing around with twitter and I must say I’ve been converted. I signed up for the site a few months ago for another site of mine but there’s not much twittering about Blackpool FC.

The basic concept seems to be: write and post interesting stuff, get followers, and then you pull traffic to what you post. Hopefully some of that traffic will link to some of the stuff you post, so it’s a great tool for SEO. You might also be able to get in contact with ‘influential’ people, which again can help you gain links.

Having an iPhone seems to help, as you can post on the go, but I’ve not done any drunken twittering yet, which is probably for the best.

Something that I spend a lot of time doing at work is finding content management and e-commerce platforms that create duplicate content. It’s generally generated from print views, PDFs, differences in URL generation, or clients ripping off other sites.

It can lead to big problems if it affects your entire site. Often getting these problems sorted out can increase long tail traffic by 10 to 20 percent. I’ve noticed in the past that the BBC site does spurt out duplicate pages. For example:

Andrei Arshavin signed for Arsenal today, and the BBC have a nice article with a video at the top. The URL for that page is:

You also have a second URL; the difference is that it’s in the folder sport2 and not sport1.

That URL 302-redirects to the first one, but Google does cache the second URL. This can be seen at

That’s not all; soon the low graphics version will get cached too.

And under the sport2 folder:

So it’s duplicate content, and Google says that you should try to make sure that you only have one version of a page on your site. The reasons why you should sort this out are pretty simple:

– It splits the flow of link juice
– It splits the possibility of inbound links
– Search engines spend time caching pages they’ve already seen rather than picking up your new pages

So should the BBC block Google from caching the duplicates? Well, yes, but for a site of that size, with the speed that Google caches the content and the inbound links it generates, it’s not going to cause a problem.

If you see similar problems on new sites then you do need to get fixes in place: 301 the pages that have been cached already and then block the search engines!
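As a rough sketch on an Apache server (the folder names just mirror the sport1 / sport2 example above, and example.com is a made-up domain), the 301 for the already-cached duplicates could look like this:

```apache
# .htaccess: 301 any URL in the duplicate folder to the canonical one
# (sport2 -> sport1 mirrors the BBC example; example.com is made up)
RedirectMatch 301 ^/sport2/(.*)$ http://www.example.com/sport1/$1
```

Then, once the redirects have been picked up, the robots.txt stops the engines wasting time on any new duplicates:

```
# robots.txt
User-agent: *
Disallow: /sport2/
```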
