I’ve seen scraper sites cause problems for sites in Google in the past, and this blog is currently being scraped and sometimes outranked by brokencontrollers.com. A quick spam report to Google and the offending site soon goes away. But it seems that the page to report spam to Google has been removed recently, and you can now only report spam while signed into a Google account…

The page used to be located at http://www.google.com/contact/spamreport.html. Known as the “unauthenticated” form, it let you report spam anonymously, with no ties to the Google account that links to your site.

[Screenshot: the old Google spam report form]

That page now 301 re-directs to https://www.google.com/webmasters/tools/spamreport?hl=en, which does load up the form, but only when you’re logged in to a Google account with Webmaster Tools.

The landing URL from the natural ranking for “report spam” takes you to just the dashboard of Webmaster Tools, which is pretty useless; I’m guessing they haven’t added the correct re-direct.

[Screenshot: “report spam” search results]

This seems a strange move by Google, as it may decrease the number of spam reports:

1. Not everyone has a Google account.
2. Even if you have a Google account you might not have Webmaster Tools.
3. You might not want to report from a Webmaster Tools account that contains your site if you use any black hat methods, e.g. reporting your competitor may draw attention to your own site.

Google’s best method of detecting spam and paid links is reports from webmasters, SEOs and users. This move to remove the unauthenticated / anonymous reporting method may have made their jobs a little harder.

Another strange result from Google, this time with the Wikipedia result for Sherlock Holmes (currently on at the cinema).

The listing has an address which expands to a map centred on Matt Barker Road in Kentucky, USA. It should be centred on 221B Baker Street, London, UK.

[Screenshot: Sherlock Holmes listing mapped to Kentucky]

Google grabs that data from the Wikipedia page.

As long as Google continues to use user-generated content they’ll continue to produce poor results.

I’ve blogged before about problems with Google Maps, with See Tickets spamming. This morning I found another strange result on a map search for “Blackpool Pleasure Beach”: next to Blackpool Pleasure Beach is Lightwater Valley, which isn’t in Blackpool.

[Screenshot: Lightwater Valley next to Blackpool Pleasure Beach on the map]

How did that get there? Easy.

Click on “more information” on the map pop-up, and under the user content section there is a custom map by Srab. Remember, you can make a custom map of anything and position anything anywhere.

[Screenshot: custom map under the user content section]

Go to his map and you’ll see that he’s put Lightwater Valley in Blackpool; it’s actually near Ripon in Yorkshire.

[Screenshot: the custom map placing Lightwater Valley in Blackpool]

Nothing major, but it’s a little dangerous for Google to be using custom map data in a product people use to get directions from A to B.

This week Matt Cutts confirmed that natural search will start to look at the speed of your site as a ranking factor. It’s something that’s been in the pipeline, and an obvious change if you’ve been following the “let’s make the web faster” drive from Google.


I think it’s a great move; there’s nothing more frustrating than a slow Internet connection or a slow site. If you head over to Google’s “let’s make the web faster” pages there are instructions on how to make your site faster, though some are a little complicated, such as compressing JavaScript and CSS, HTTP caching, minimizing browser reflow etc.

If you know what you’re doing then that’s great, but if you’re an average webmaster here are six simple tips to speed your site up for Google and users.

1. Optimizing Images

Optimizing images is a simple process, and there are two methods. First, reduce white space: rather than giving your image a border or space, crop the image tight and then use CSS to create the border and positioning.

Second, save in the correct format: rather than using JPGs, GIFs can sometimes reduce file sizes for logos and simple images. Using the “save for web” setting in programs such as Paint Shop Pro or Photoshop will enable you to get the best size without compromising on quality. To get old images into a compressed format use a bulk process; remember to keep the file names the same, and if you’re changing extensions add re-directs to preserve image rankings.
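Here’s a minimal sketch of the first method (the file name and class are just placeholders): crop the logo tight and let CSS recreate the spacing.

<!-- tightly cropped image: no border or padding baked into the file -->
<img src="/images/logo.gif" alt="Site logo" class="logo" />

<style type="text/css">
/* recreate the border and spacing in CSS instead of in the image itself */
.logo {
    border: 1px solid #cccccc;
    padding: 10px;
}
</style>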

2. Don’t use tables

When building the HTML output of pages don’t be tempted into using tables for layout. With a bit of skill you can recreate the same thing in CSS. A great article gives 13 reasons why CSS is better than tables, number 1 being faster load times.
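As a rough illustration (markup invented for this post), a two-column layout that might tempt you into a table can be done with a couple of floated divs:

<!-- two columns with floats instead of a layout table: far less markup -->
<div style="float: left; width: 68%;">Main content goes here…</div>
<div style="float: right; width: 30%;">Sidebar goes here…</div>
<div style="clear: both;"></div>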

3. Navigation

There are two points with navigation. a) The number of items in the navigation can affect load time: listing every single category and subcategory would, on some sites, create a huge navigation of 100+ items. Only link to the top levels on every page, plus other relevant pages; a good example of this in action is the BBC site, go into the sport sections to see.

b) The coding of the navigation is also important. Building it entirely in Flash or JavaScript is a no go; CSS with a little JavaScript can create efficient drop downs, and even better, skip drop downs altogether, as a plain navigation will keep load times down. Remember that the navigation is on every page, so improving it improves load time on every page in the site.
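Here’s a minimal sketch of a CSS-only drop down (the structure, IDs and links are invented for illustration):

<ul id="nav">
    <li><a href="/sport/">Sport</a>
        <ul>
            <li><a href="/sport/football/">Football</a></li>
            <li><a href="/sport/cricket/">Cricket</a></li>
        </ul>
    </li>
</ul>

<style type="text/css">
#nav li ul { display: none; }        /* hide sub-menus by default */
#nav li:hover ul { display: block; } /* show on hover */
</style>

Note that older versions of Internet Explorer don’t support :hover on list items, which is where the “little JavaScript” comes in.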

4. Reduce loads from external sites

Each HTTP request adds time to the load of your site, and this includes loading items from a different site. Where possible host images on your own site, don’t have multiple tracking codes, and try not to pull Twitter / RSS feeds into every page.
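A trivial before-and-after sketch (URLs invented): every external host adds a DNS lookup and another HTTP connection, so serve the file yourself where you can.

<!-- before: badge fetched from a third-party server on every page view -->
<img src="http://widgets.example.com/badge.gif" alt="Badge" />

<!-- after: the same badge copied to, and served from, your own site -->
<img src="/images/badge.gif" alt="Badge" />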

5. Move JavaScript and CSS to external files

By placing the JavaScript and CSS in external files, the browser can cache them, so users and search engines don’t have to download that code again on each page load. You place the code in a .js or .css file and link to it in the head of the page, e.g.
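Something along these lines (the file names are just placeholders):

<head>
    <!-- styles and scripts in external, cacheable files -->
    <link rel="stylesheet" type="text/css" href="/css/style.css" />
    <script type="text/javascript" src="/js/script.js"></script>
</head>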

6. Get a decent server

Yes, it costs money, but if you’ve spent time on the last five points they might have no effect if your server is slow. To get a good indication of speed, ask the company for some high traffic example sites and visit them at peak traffic times. Ask if any are “Digg proof” and what the specifications of the server are.

Overall, each of these six points might only make a small difference on its own, but the objective is to make improvements to lots of different areas; added together they will make a difference to the speed of your site.

I’ve recently started working in Manchester city centre, so I’ve been looking for local places to eat at dinner. As ever, the first place to look is Google Maps. After a few clicks on locations I noticed that, strangely, See Tickets, a ticket sales company, seems to have claimed a lot of local listings that have nothing to do with them, thus gaining a link to their site from each one.

Each one links to seetickets.com
[Screenshot: See Tickets claiming local Google Maps listings]

Looking at the locations, it’s possible that See Tickets could sell tickets for the Palace Theatre, but not for 5th Ave, a nightclub, or Retro Bar.

A search for “see tickets” with the map in Manchester brings up some great results:

Piccadilly Gardens (Bus stop and open space)

[Screenshot: Piccadilly Gardens listed under See Tickets]

The old ODEON on Oxford Road, which shut down about five years ago.

[Screenshot: the old ODEON listed under See Tickets]

The big wheel in the Triangle

[Screenshot: the Manchester big wheel listed under See Tickets]

and Long Legs, a gentlemen’s club.

[Screenshot: Long Legs listed under See Tickets]

All of the links go to http://www.seetickets.com/see/index.asp? and it looks like See Tickets haven’t stopped there; they’ve done the same for locations in London too!

[Screenshot: Parliament Square listed under See Tickets]

You can imagine tourists trying to buy a ticket from See Tickets to go to Parliament for the day!

Anyone else getting this for their location? Go to your local area on Google Maps and search “see tickets”.

Yes, yes, we know that Google doesn’t use the meta keywords tag. It was mentioned in a post back in December 2007 that they ignore it. Today’s video and blog post are just for anyone that missed it.

Yoast had a great idea to get the new post ranking for “meta keywords” so SEOs don’t need to answer the question from clients. We can just say “Google meta keywords” for an explanation.

So if you have a blog, link to the post using the anchor text meta keywords. It’s at 27 in the UK rankings at the moment, being outranked by a few blogs published today: mashable, searchenginejournal and malcolmcoles. So if we all link together…

It looks like Channel 4 has a little duplication problem at the moment with their IP address http://83.98.28.10/. It’s resulting in double listings, with both the domain and the IP ranking for some queries, which we know Google doesn’t like.

[Screenshot: Channel 4 domain and IP double listing]

In total around 8,000 of the 6,390,000 pages on Channel 4’s site have this problem, which I’m sure isn’t having a massive negative effect in the grand scheme of things.

[Screenshot: site: command showing indexed IP pages]

All that’s needed is a 301 re-direct from the IP address to the domain. On Apache servers this can be done with something along the lines of:

# Send any request that arrives on the bare IP to the canonical domain
# (999.999.999.999 is a placeholder; use the server's real IP and domain)
RewriteEngine on
RewriteCond %{HTTP_HOST} =999.999.999.999
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

I noticed at the start of the month that Google had moved around some of the links on the results pages; it hadn’t gone unnoticed, as Pete Young at holisticsearch also spotted the changes.

It seems that the My Account button has now come back under the Settings link as “Google Account settings”.

[Screenshot: “Google Account settings” link]

Google were playing around all this weekend; at one point all the search options had gone.

Although it’s a little change, it shows that people do pick up on it, as it’s a button that’s used often. It reminds me of the time all the door knobs at my house were changed to open the other way; it took some time to get used to!