As I’m changing my domain from .com to .co.uk sometime next week, I thought I’d document the process from an SEO point of view.

1. Change all internal links to point to the new domain. If you are doing this on a test server, block Google until you are ready, as you don’t want the new domain to be indexed containing duplicate content.
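For example, a minimal robots.txt on the test server will keep well-behaved crawlers out until you’re ready (it blocks the whole site, so remember to remove it at launch):

# Block all crawlers from the whole test site
User-agent: *
Disallow: /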

2. Add in your 301 re-direct from the old domain to the new. Make sure you also re-direct inner pages from the old domain to the new domain.
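As a rough sketch, assuming an Apache server with mod_rewrite and using example.com and example.co.uk as placeholder domains, the rule in the old domain’s .htaccess could look something like this:

# Turn on the rewrite engine
RewriteEngine On
# Match any request arriving on the old .com domain
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
# 301 every URL, inner pages included, to the same path on the new domain
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]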

3. Re-run your XML sitemap and update the sitemap reference in the robots.txt:

Sitemap: http://www.aukseo.co.uk/sitemap.xml

4. Submit the new site to Webmaster Tools, verify it and submit the new sitemap.

5. If possible, point some links at the new domain; this should help with the indexing of the site. Also change any external links to point directly at the new domain where you can. If you can’t, don’t worry too much, as the links will still have an effect due to the 301.

6. Make sure you change anything that depends on your domain: analytics, FeedBurner, ranking software and payment systems.

EDIT: 14/07/09
7. In Google Webmaster Tools, verify the new site and use the Change of Address tool under Site Configuration to notify Google that you’re changing to a new domain.

Give Google a couple of days and then check back by using the site: command in Google on the old domain. You should start to see the pages reduce on the old domain as they move over to the new domain.

If you have some straggling pages that don’t re-direct to the new domain, use an HTML sitemap and point links at the old URLs; this gives Google a reason to recrawl them and pick up the 301s (see the sketch below). From experience this works, but don’t worry too much if the odd old page doesn’t move over, as it will still receive the traffic it used to. Once the pages are re-indexed, remove the links from the sitemap.
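The HTML sitemap itself only needs to be a plain page of links, something along these lines; the URLs here are purely hypothetical examples:

<!-- Temporary list of old URLs so Google recrawls them and picks up the 301s -->
<ul>
  <li><a href="http://www.example.com/services.html">Services</a></li>
  <li><a href="http://www.example.com/contact.html">Contact</a></li>
</ul>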

Any other ideas? Please add them in the comments below!


Slipping into TV and print advertising over the last six months are requests to search online for … Asking the user to search online means agencies can achieve almost direct conversions from TV ads. The first occurrence of such a request was from mobile phone company Orange, who asked people to search for “I am”, so nothing too difficult for the SEO team to implement!

[Screenshot: the Orange “I am” search results]

With no natural listings in sight, they had to go with PPC, paying for the traffic that followed the instruction to search online for “I am”. The idea was good but the implementation was slated in the SEO community.

Similar to Orange, the computer-animated film Monsters vs Aliens asked people to search for “MVA”. With www.mvaconsultancy.com firmly holding down the number one organic spot, PPC was used again to grab the traffic. I made a post with more details about MVA; the PPC campaign wasn’t well implemented and will have cost a small fortune.

It seemed that most agencies were not getting the grasp of asking people to search online, but then along came Tales of the Road from direct.gov, Think! and the Department for Transport.

The advert teaching kids about road safety asked the viewer to search online for “tales of the road”. Not too hard to remember and relatively easy to optimise for. The search result for “tales of the road” is owned by the website, with a PPC ad, the number one ranking, site links, the number two ranking and a video ranking.

[Screenshot: the “tales of the road” search results]

Here a subdomain of direct.gov matches the search term. Using a subdomain allows for site links (site links can appear for an inner page of a site, but are easier to gain on a subdomain) and a second listing for an inner page. Placing the video on YouTube, a trusted domain, then allows the video to rank easily on the first page.

One small point is that you can place a nofollowed HTML link in the description of a video on YouTube.

[Screenshot: the Tales of the Road video on YouTube]

For any agency implementing a “search for” campaign, Tales of the Road is an excellent blueprint to use.

It looks like Google Webmaster Tools is having a few issues at the moment, listing a 404 error on /index.html from external links pointing at that URL; but when you look at the external links, they all point to the root of your site, e.g. http://www.domain.com. I saw such an instance this morning on a client’s site.

This has been noticed by Fizzarotti, who posted on YouMoz.

The fix implemented by Fizzarotti was to 301 /index.html back to the root. This can be done with a rule in the .htaccess file if you’re on an Apache server.

# Enable the rewrite engine
RewriteEngine On
# Only match direct client requests for /index.html, not internal rewrites
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html
# 301 /index.html back to the root of the site
RewriteRule ^index\.html$ http://www.domain.com/ [R=301,L]

This screenshot from Google Webmaster Tools shows the 404 error for /index.html with 100 links pointing to the page.

[Screenshot: the /index.html 404 error in Google Webmaster Tools]

A second screenshot lists the pages that link to /index.html. Checking over them, they all point to the root of the site and not /index.html.

[Screenshot: the pages reported as linking to /index.html]

In this example the site’s rankings haven’t been poor in recent weeks; in Fizzarotti’s case on SEOmoz, the site had dropped several rankings. I’m guessing there is another problem with Fizzarotti’s website and that this ‘issue’ in Google Webmaster Tools is a reporting problem.

If you’ve seen the same problem add a comment below.

Search engine rankings move, and they move a lot. There’s pretty much nothing you can do about it but keep on track building content and links, which should reduce the movement.

How much can they move? Well, in the matter of 12 hours my Blackpool FC blog moved from 3 to 9 on the first page for the key phrase “blackpool fc”.

So happy happy, joy joy with a number three ranking; the official club site takes up the number 1 and 2 spots, so no room for improvement there. That was at about 11 last night.

[Screenshot: the Blackpool FC blog ranking third]

This morning at 11: whammy, all the way down to 9 at the bottom of the first page. I was signed out of my Google account, which can otherwise skew the results.

[Screenshot: the Blackpool FC blog ranking ninth]

It seems more and more TV commercials are now using “search for …” rather than giving out website addresses. The latest is DreamWorks’ 3D movie Monsters vs Aliens. I went to see the movie yesterday and have to say it was really good, especially with the 3D glasses.

Anyway, the TV advert promoting the movie this afternoon asked the viewer to search for “MVA”, which is quite precise. It’s working according to the stats from Google Insights, with a recent surge in search volume.

[Screenshot: Google Insights data for “MVA”]

Amazingly, this is the SERP you see when searching for MVA.

[Screenshot: the search results for “MVA”]

So you get one PPC advert about the Monsters vs Aliens movie, and the top organic spot is taken by www.mvaconsultancy.com. Nothing in the top ten is related to the movie. This type of campaign must be costing DreamWorks a fair amount of money, and maybe bringing a fair amount of traffic to www.mvaconsultancy.com.


I’ve pointed out a few sites in the past that have working black hat techniques; another one cropped up the other day, which I mentioned on Twitter.

Showforce.com are a company that provide staff for events, promotions etc. I’m sure they are a great company, but it seems that the SEO company they are using is not too great (well, they are ranking in Google…). In this case plenty of key phrases have been hidden in an h1 tag at the bottom of Showforce’s homepage.

[Screenshot: the hidden text on the Showforce homepage]

So how is this text hidden? Well, the text is placed in an h1 inside a div called “SEO”, and the style is set to hidden.
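From that description it would look something along these lines; this is an illustration rather than the exact markup, and the keyword list is a hypothetical stand-in:

<!-- Keyword-stuffed heading left in the HTML for search engines but hidden from visitors -->
<div id="SEO" style="visibility: hidden;">
  <h1>event staff, promotional staff, exhibition staff, crew hire</h1>
</div>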

Does this work? Well, yes: the text is hidden and the Showforce site ranks number one for “event staff”.

[Screenshot: the search results for “event staff”]

Will it work for longer? We’ll have to see. It seems that Google need to fix up and look sharp on their spam detection.

An increasing issue facing some webmasters in the UK seems to be Google not ranking sites in google.co.uk. The trend seems to be that sites with a non-country-specific top-level domain, e.g. .com, .net or .org, struggle to rank in google.co.uk for key phrases, yet the same site will rank for those key phrases on Google.com when searching from a UK IP address (which is different to Google.com searched from a US IP).

It can be rather frustrating for webmasters, as they might have a perfectly working site with a good amount of inbound links, but Google has made the decision that the site isn’t 100% focused on a UK audience, so it will class the site as a US/international site, thus excluding it from the google.co.uk results.

So if you find your site in this predicament, you should take your site through this flowchart to make sure you’ve done everything possible to advise Google that your site is based in the UK.

[Flowchart: advising Google that your site is based in the UK]

Remember, more help can be found over at Google Webmaster Help.

Google, Yahoo and MSN Live have introduced a new tag that all three search engines will recognise. The tag, which can appear in the head of a document, will point the search engines back to the canonical URL of a page.

This means there is now a new option available to SEOs and webmasters when fixing duplicate content. Most importantly, this will help sites hosted on an IIS server where re-directs are not possible. For more information on how to implement the tag and what it can and can’t do, head over to the post on the Google Webmasters Blog.
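Implementation is a single line in the head of the duplicate page, pointing at the version you want treated as canonical (the URL here is just a placeholder):

<!-- Tells Google, Yahoo and MSN Live which URL is the preferred version of this page -->
<link rel="canonical" href="http://www.example.com/product.html" />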

Something that I spend a lot of time doing at work is finding content management and e-commerce platforms that create duplicate content. It’s generally generated from print views, PDFs, differences in URL generation or clients ripping off other sites.

It can lead to big problems if it affects your entire site. Often getting these problems sorted out can lead to a good increase in long-tail traffic of 10 to 20 percent. I’ve noticed in the past that the BBC site does spurt out duplicate pages. For example:

Andrei Arshavin signed today for Arsenal, and the BBC have a nice article with a video at the top. The URL for that page is:

http://news.bbc.co.uk/sport1/hi/football/teams/a/arsenal/7831046.stm

You also have a second URL; the difference is that it’s in the folder sport2 and not sport1:

http://news.bbc.co.uk/sport2/hi/football/teams/a/arsenal/7831046.stm

That URL 302 re-directs to the first one, but Google does cache the second URL. This can be seen at http://tinyurl.com/bdsj3q.

That’s not all; soon the low-graphics version will get cached too:

http://news.bbc.co.uk/sport1/low/football/teams/b/blackpool/7831046.stm

And under the sport2 folder:

http://news.bbc.co.uk/sport2/low/football/teams/b/blackpool/7831046.stm

So it’s duplicate content, and Google says that you should try to make sure you only have one version of a page on your site. The reasons why you should sort this out are pretty simple:

– Splits the flow of link juice
– Splits the possibility of inbound links
– Search engines spend time caching pages they’ve already seen rather than picking up your new pages

So should the BBC block Google from caching the duplicates? Well, yes, but for a site of that size, given the speed that Google caches the content and the inbound links it generates, it’s not going to cause a problem.

If you see similar problems on new sites then you do need to get fixes in place: 301 the pages that have been cached already and then block the search engines!
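As a rough sketch only, using the BBC folder structure above purely as an illustration and assuming an Apache server, a rewrite rule could collapse the duplicate folder onto the original:

# Turn on the rewrite engine
RewriteEngine On
# Send anything under /sport2/ to the same path under /sport1/ with a 301
RewriteRule ^sport2/(.*)$ http://news.bbc.co.uk/sport1/$1 [R=301,L]

A Disallow: /sport1/low/ line in robots.txt would then keep the crawlers away from the low-graphics copies.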
