Since I’ve been away, a few tools have been built in Google Docs using the importXML function; I’ve linked to some useful blog posts at the bottom of this post if you’re interested.

Anyway, here’s a tool I put together quickly that takes up to 25 key phrases and checks which SERPs display review / rating results. It’s useful if you need to decide whether implementing rating and review rich snippets would be a worthwhile exercise, e.g. none of the SERPs have them = a good opportunity to be the only one. Or, if certain key phrases have a low CTR, you can check whether those SERPs are saturated with rich snippets.

The rich snippets it looks for are the ones with star ratings such as the ones below.

It’s pretty simple to use.

Add up to 25 key phrases on the first sheet, in cells B2 to B26.

That’s all you need to do; it grabs the top 10 results for each SERP and then looks for the div in the search results that contains the star ratings. The function in the Google spreadsheet is:

=importxml(C2,"//div[@class='f']")

You can change the search engine version, language, etc. by changing the query URLs in column C of the first sheet.
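The URLs in column C are just standard Google search result URLs, so if you want to rebuild one from a key phrase yourself, a formula along these lines should work (a rough sketch only – the parameters shown, like &num=10 and &hl=en, are illustrative and not necessarily the exact ones the sheet uses):

=CONCATENATE("http://www.google.co.uk/search?q=", SUBSTITUTE(B2, " ", "+"), "&num=10&hl=en")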

Another sheet tots up the numbers with percentages. It looks for all reviews, reviews that contain ratings and reviews that contain price information (used for restaurants). I’ve stuck a couple of graphs in the last sheet to show totals.
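If you want to adapt the totals sheet, the counting behind it is simple spreadsheet work. As a hedged sketch (the sheet name and ranges here are placeholders – the real ones depend on where the imported results land), counting the non-empty ratings results for one key phrase and turning that into a percentage of the top 10 could look like:

=COUNTA(Results!B2:B11)

=COUNTA(Results!B2:B11)/10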

 

Something I learnt when using importxml: when importxml runs, it pulls back the code from the URL you supplied, in this case a Google search results page, and it does so without running JavaScript. For example, author pictures in search results are added by JavaScript as the page renders, so you can’t see them when using importxml, and some div names are changed by JavaScript too. When deciding which element to grab, look at the source code of the page you are importing, and open that page in your browser with JavaScript turned off to simulate the way importxml sees it.
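A quick habit that helps with this (not part of the tool itself, just a sanity check) is to pull back something simple like the page title first, so you can confirm importxml is fetching the page you expect before you start hunting for divs:

=importxml(C2,"//title")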

This is in no way a finished tool, and it will most likely break at some point. Please feel free to use, copy, and improve it. If you make any improvements, leave a comment with a link to the file URL and I’ll repost with updated versions.

Just open the link below and make a copy to use.

–> Bulk Review / Rating SERP Checker Tool in Google Docs

 

Note: it’s limited to 25 key phrases because a spreadsheet has a maximum of 50 importxml calls, and I use two for each key phrase.

 

Have a read of the posts below to find out more about importxml; the links open in new tabs.

http://www.distilled.net/blog/distilled/guide-to-google-docs-importxml

https://seogadget.co.uk/playing-around-with-importxml-in-google-spreadsheets/

http://www.seerinteractive.com/blog/importxml-cookbook

http://zoomspring.com/learn-importxml-tutorial/

From around 10:00am GMT yesterday, the Google Keyword Tool seems to have stopped outputting monthly data. That monthly breakdown was great for looking at the seasonality of search data: you could download a CSV with a search figure for each month, for example:

From that data you can easily graph when search volumes peak, allowing you to build links ready for the key months. You can see similar data in Google Insights, but you can’t download it as a CSV from there.

Over at the AdWords forum there are a few posts discussing the issue: http://www.google.com/support/forum/p/AdWords/thread?tid=4c7cb60cb3650b7d&hl=en&fid=4c7cb60cb3650b7d00049f4afdbb923a and http://www.google.com/support/forum/p/AdWords/thread?tid=5c267a93386d6d6a&hl=en and http://www.google.com/support/forum/p/AdWords/thread?tid=764de08bfc286a17&hl=en

Anyone else not getting any monthly data? Anyone else use this data often for search campaigns?

One of the core services of any SEO campaign is reporting back to the client on search rankings. Over the past five years reporting has shifted more towards sales and conversion tracking, and we now see another shift towards reporting on the touch points of user journeys. No doubt in the next five years we’ll see more changes in how SEO campaigns are reported; however, reporting on key phrase positions will always be a key aspect of SEO reporting.

In my last role I was part of a team that developed a web ranking tool that scraped search engine results, calculated rankings and reported back to a client interface. Having your own tool has its benefits, but you are restricted by the need for internal resource to create the tool in the first place and the continuous work to maintain it. How many companies are able to pull programmers off paid work for internal projects? It doesn’t happen often!

Companies often cost programming time at an hourly or daily rate. A quick estimate for building a ranking tool could be calculated as:

£70 per hour * 7.5 hours a day = £525 per day.

You may need around 6 days of planning, requirement gathering and meetings to get things started – £3,150

3 weeks of development time (15 days) – £7,875

1 day of testing – £525

3 days of tweaks – £1,575

So you’re looking at around £13,125 just to create the tool.

The alternative is to buy an existing tool, and for that I would go with Caphyon’s Advanced Web Ranking.

To start with, the cost is minimal compared to building your own tool: $399, which is about £257, buys you the Enterprise version. Even with an annual cost it’s still cheaper over many years than building your own ranking software.

As for features, it has everything: different types of reports, no limit on key phrases, every search engine you could need for recording a site’s ranking, scheduled reports, scheduled ranking checks, uploading reports to FTP, saving locally, emailing to a client, reports in multiple formats, customisable reports, and so on – everything you could think of from an SEO software package.

There’s even a keyword research tool that uses the Webmaster Tools API and the SEMrush API to generate key phrase suggestion lists. Those lists can then easily be imported into a project.

Anything and everything you can think of, the tool has it: proxies – yes, multiple users – yes, graphs – yes, on-page reports – yes.

As for scalability, I’ve had it running 24/7 on a dedicated server for 7 months without a single crash, covering just under 1,000 clients and around 10,000 key phrases.

If you have your own client login centre, you can easily feed into it by having CSV or TXT reports sent to a server; your server then picks up the data and adds it to your client centre. This is a setup I’ve used in the past: Advanced Web Ranking grabs the data and sends it off, and your own software displays that data directly in your own reporting suite. There’s even an export rank data feature which will export all the data in a friendly format that can then be uploaded into your own tool.

Accuracy is spot on, and most importantly, if Google changes its interface an update will be on its way pronto.

One concern, if you’re planning on switching to Advanced Web Ranking, is the time taken to migrate over from an old system. Features such as importing key phrases from a TXT list make the process quick and painless.

There’s a 30-day trial with no commitment to buy, so give it a try!

Here’s a neat solution for finding the PageRank score of multiple domains / pages. In Google Docs you can use an add-on script, made by dr.nailz, to pull out PageRank; all you do is reference the cell that contains the domain:

=iferror(pageRank(A2),"N/A")

The iferror just displays N/A if no result is returned.
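To check multiple domains, list them down column A (starting at A2) and copy the formula down column B alongside them – the relative reference moves with each row, so the next rows simply become:

=iferror(pageRank(A3),"N/A")

=iferror(pageRank(A4),"N/A")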

Just head over to this public spreadsheet to try it out:

>> Multiple Domain Page Rank Checking Tool >>

When conducting key phrase research, little tips, tricks and tools can save time. Being a whizz at Excel helps, and being able to create scraping tools is also great, but sometimes you just need to copy and paste from sites to generate key phrase lists.

Google Docs

I’ve come across a neat method of using the import functions in Google Docs to extract lists and tables from a web page into a spreadsheet (from there you can paste into the Google Keyword Tool to get volumes). Credit goes to this post for the how-to: http://ouseful.wordpress.com/2008/10/14/data-scraping-wikipedia-with-google-spreadsheets/.

First you need a page with data on it. It could be a list of products or a table of locations – for example a list of dog breeds, a list of football teams, or a list of settlements in the UK ordered by population.

In Google Docs you then use the ImportHtml function, specifying the URL, whether you want to grab a list or a table, and which one on the page, e.g. the 1st, 2nd, etc.:

=ImportHtml(URL, "list" | "table", index)

For example:

=ImportHtml("http://en.wikipedia.org/wik…..m_settlements_by_population","table",1)

A full explanation can be found in the help files at http://docs.google.com/support/bin/answer.py?hl=en&answer=75507, which also show other uses and variations.

Hit enter and, hey presto, you have a nice table with all the figures separated out.
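The "list" option works in exactly the same way for bulleted or numbered lists rather than tables. For example (the URL here is just a placeholder for whichever page you are scraping):

=ImportHtml("http://example.com/dog-breeds", "list", 1)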

[Screenshot: Google Docs ImportHtml output]

If you look at the “ouseful” blog linked above, you can then publish from Google Docs as RSS, run that through Yahoo Pipes into another RSS feed, and use the RSS on your site as content.

You can also import from RSS and product feeds.
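For RSS there’s a sister function, ImportFeed. A minimal example (the feed URL is a placeholder; the second argument asks for item titles only, and the last limits it to 10 items):

=ImportFeed("http://example.com/feed", "items title", FALSE, 10)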

With Chrome coming out on the Mac, I’ve moved away from Firefox at home and at work; it’s just too slow and has the odd crash. There is still no extension support in the Mac version, but these all work with no problems on Windows XP SP3.

[Photo: Chrome billboard – photo credit iVinay]

Chrome Flags by josorek – https://chrome.google.com/extensions/detail/jhejngphiacapbgllhagbpdkkdieeaej

PageRank by kalehrishi – https://chrome.google.com/extensions/detail/pneoplpmnpjoioldpodoljacigkahohc

Firebug Lite by andrei.pervychine – https://chrome.google.com/extensions/detail/bnbbfjbeaefgipfjpdabmpadaacmafkj – Or use the one built into chrome Ctrl + Shift + I

IE Tab Classic by josorek – https://chrome.google.com/extensions/detail/miedgcmlgpmdagojnnbemlkgidepfjfi

No Follow Checker by Dan – http://www.chromeextensions.org/webmaster-seo/nofollow-checker/

Anyone else with any good extensions for Chrome?

PHP Weby is a free directory script that can create an SEO-friendly directory for you. It’s pretty SEO friendly, but it does lack an XML sitemap generator. So I’ve brushed off some PHP skills last used a few years ago and made a script that generates an XML sitemap for the PHP Weby directory.

All you need to do is add your database details and the URL of the site, then upload the PHP file. It then generates a sitemap which you can submit to Google and reference in your robots.txt.

For instructions and to download the file, follow the link:

PHP Weby Directory XML Sitemap Generator

The most popular question from clients to SEOs is: how many links do I have? My answer: it depends.

There’s a little more to it than that. What works well for me is to show the difference in the number of links that different tools report for my Blackpool FC blog.

There are multiple websites you can use to get a total for the number of links your website has; I can think of six in total:

Google Link Command – 6
Yahoo Site Explorer – (Inlinks, except from domain) – 1,297
Google Webmaster Tools – 197
Majestic SEO – 2,702
Linkscape – 516
Alta Vista – 319


Which is the best site to use for checking links? For a full list, Yahoo Site Explorer or Majestic SEO will give you a comprehensive report. If you need to check new links coming in, it’s much harder, as you need to compare one month to another.

To get an indication of what is happening with your links, track the trend by keeping a total every 2–3 weeks. Track the numbers from all of the different tools above and you’ll get a good indication of the general movement.

Something I read this morning on the iCrossing blog is that the Google Insights tool is now showing predictions for some search terms. By looking at previous years’ movement, the next year’s traffic level is predicted; for example, SEO.

[Screenshot: Google Insights prediction for “SEO”]

Obviously it’s not going to be able to predict trends that pop up out of nowhere, e.g. Susan Boyle. Google has also stuck on a disclaimer saying to treat the data as an estimate rather than a prediction.