On Friday it was the #MancSEO Mini Conference at the Hive Building in the Northern Quarter, a free event thanks to sponsorship from theEword and Manual Link Building, organised by Pete Young. There were four speakers in total, all with interesting views and data to share.

There are a few reviews of the presentations on other blogs, so I'm not going to write another one. Drinks afterwards at The Walrus bar were free for a while, thanks to theEword. Overall, a great afternoon and evening!

The last session of the day (and of the event) was targeted at people managing multiple Twitter accounts, or at people wanting to spam the hell out of Twitter. Predictably, all the members of the panel agreed that a level of automation could be achieved, but that there would always be a need for a human element. The 80/20 rule was brought up by Tracy Falke from Freestyle Interactive: 80% can be automated, but 20% still needs someone to add input.

The panellists presented in turn, with questions at the end, and Cindy Krum in charge. First up was Ralph Tagtmeier, aka Fantomaster (http://fantomaster.com/, http://twitter.com/FantoMaster), who came straight out with the claim that he's earning $500 a day from gaining customers with a long lifetime value.

The main idea was that the automation isn't what makes you money; it saves you time, allowing you to concentrate on the bits that do make money.

His first tool was SocialOomph (http://www.socialoomph.com/), which is free to use with a paid option for more features.

SocialOomph can ping new tweets, send them to Facebook, schedule tweets for specific times, and repeat the sending of a tweet a set number of times over a period.

These features are really useful for tweeting when you have timezones to take into consideration. The main killer feature is that when you resend a tweet you can change the wording around.

Twitter doesn't really like you sending the same tweet over and over again. This tool can change variables before the tweet is sent, e.g.

"Wow SES London was {amazing|great fun|interesting}, can't wait to go again."

That way different tweets go out but with the same meaning.
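
As an aside, that spinning syntax is easy to reproduce yourself. Here's a minimal Python sketch (my own illustration, not SocialOomph's actual code) that picks a random variant from a template like the one above:

```python
import random
import re

def spin(template):
    """Replace each {a|b|c} group with one randomly chosen option."""
    return re.sub(
        r"\{([^{}]+)\}",
        lambda m: random.choice(m.group(1).split("|")),
        template,
    )

tweet = "Wow SES London was {amazing|great fun|interesting}, can't wait to go again."
print(spin(tweet))  # e.g. "Wow SES London was great fun, can't wait to go again."
```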

The scheduling of the tweets sometimes doesn't work, but the flexibility is great for randomising the frequency of your tweets, e.g. 30 tweets over a three-week period.
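
To give a feel for that randomisation, here's a rough sketch (my assumption being that "randomise" just means uniformly random send times) of spreading 30 tweets over a three-week window:

```python
import random
from datetime import datetime, timedelta

def random_schedule(n_tweets, start, days):
    """Pick n uniformly random send times within the window, sorted."""
    window = timedelta(days=days).total_seconds()
    offsets = sorted(random.uniform(0, window) for _ in range(n_tweets))
    return [start + timedelta(seconds=s) for s in offsets]

for when in random_schedule(30, datetime.now(), 21):
    print(when.strftime("%Y-%m-%d %H:%M"))
```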

The second tool was TweetMiner (http://tweetminer.net/). Ralph didn't go into this one much, but it's a paid service that lets you find relevant tweets without having to follow lots of people. I'm not really sure what the USP of this tool is; signing up for the free account may be the best way to find out.

Ralph felt that this tool was better organised than TweetDeck for managing lists of what he's been following.

The third tool was zi.ma (http://zi.ma/). It wasn't working at the time, but Ralph explained that it can take a batch of long URLs and shorten them. I'm guessing that you can also add tracking code at the same time.
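
Since zi.ma was down I can't confirm how it works, but bulk shortening generally follows a pattern like the sketch below; the endpoint, parameter name and tracking step are all hypothetical stand-ins:

```python
import urllib.parse
import requests  # third-party: pip install requests

API = "http://zi.ma/api/shorten"  # hypothetical endpoint; the real one was down

def shorten_all(long_urls, campaign=None):
    """Optionally tag each URL with a tracking parameter, then shorten it."""
    short_urls = []
    for url in long_urls:
        if campaign:  # my guess at the "add tracking code" feature
            sep = "&" if "?" in url else "?"
            url = f"{url}{sep}utm_campaign={urllib.parse.quote(campaign)}"
        resp = requests.get(API, params={"url": url})  # hypothetical API call
        short_urls.append(resp.text.strip())
    return short_urls
```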

One question for Ralph was about mistakes made so far; he felt that the misleading avatars were a downfall.

Up next was Paul Madden, aka SEOIdiot (http://www.crea8.co.uk/, http://twitter.com/seoidiot), who is from near my home town of Blackpool. He works for a company called Crea8 New Media, and I remember downloading some of his Twitter software a while ago.

Paul's starting problem was that you need decent coverage of people. To get more coverage you need more people to manage accounts, all of which costs money. This is where automation comes into play.

Paul's methods are a little more on the "spammy" side, but always with safety in mind, so nothing permanent can happen. He also felt bans are hard to come by, as most people at Twitter are working on keeping the service running rather than banning people.

Paul explained that he uses a brand account and then creates supporting accounts. The brand account is the safe one: no automation and no links to the supporting accounts.

To build accounts you can try the follow/unfollow campaign, but the 2,000-following threshold causes problems. This is why he's gone for the hybrid model.

The Hybrid Model

1. A server-based app that will upload the tweets en masse / on a schedule.
2. Set up accounts (not on Gmail) with a different email for each one; get pictures via Gumtree ads for a few dollars. Get normal-looking people. Women, but not stunners.
3. Then outsource the tweets to be created in a spreadsheet. He does this by giving a description of the person: "Lives in London, has a dog, likes running."
4. The spreadsheet comes back, you load the tweets in, and away you go.
5. Post lots of standard tweets, then make one in ten a link. Maybe fewer.
6. If people reply, handle that manually.
7. Rather than using a cron job to send the tweets, he uses traffic rises to trigger them: traffic goes up, a tweet goes out. That way the timing is random (see the sketch after this list).
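
Paul didn't show code, so this is just my sketch of point 7; the analytics and tweet-queue hooks are hypothetical placeholders:

```python
import random
import time

def current_visitors():
    """Placeholder: hook into your analytics for the live visitor count."""
    return random.randint(0, 100)  # stand-in so the sketch runs

def send_next_queued_tweet():
    """Placeholder: hook into the server-based app's queue of loaded tweets."""
    print("tweet sent")

def traffic_triggered_sender(baseline, check_every=60):
    """Send a queued tweet on traffic rises rather than on a cron schedule."""
    while True:
        if current_visitors() > baseline * 1.5:  # traffic is up, tweet now
            send_next_queued_tweet()
            time.sleep(random.uniform(600, 1800))  # back off; keeps timing random
        else:
            time.sleep(check_every)
```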

One other point: he sometimes reuses other people's tweets, not retweeting but copying and using them. Maybe not a great idea at times, but it can work.

Paul then went over the rules, most of which the previous process broke. The main thing was not to be unsociable, and not to risk the main account.

Best of all, they have a site live with some of this software, which runs as a WordPress plug-in you can download: http://automatingtwitter.com/

I think the main aim is to build up 50+ accounts with a good number of followers, so you can easily drive traffic to a site en masse. Building up such accounts would take one person ages; this is where the automation helps.

Third to take the podium was Pierre Far (http://www.pierrefar.com/, http://twitter.com/pierrefar).

Pierre makes tools, one of which he managed to sell. Following on from a point Jim Sterne made in the morning: if you're watching a phrase, e.g. SES, there are going to be hundreds of mentions, and you need to cut through the crap to get to the good stuff and then engage with that.

He had a model to follow:

1. Monitor
2. Engage
3. Analytics

So watch what is happening, engage where you need to, but then track the clicks and what happens when.

A few tools: BackType (http://www.backtype.com/) looks at all the tweets that link to your domain. Best of all, it can cut through URL shorteners to find the links.
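
I don't know how BackType does this internally, but cutting through a shortener generally just means following the redirect chain; a minimal sketch:

```python
import requests  # third-party: pip install requests

def resolve(short_url):
    """Follow a shortened URL's redirect chain to its final target."""
    # Some shorteners don't answer HEAD requests; fall back to GET if needed.
    resp = requests.head(short_url, allow_redirects=True, timeout=10)
    return resp.url

print(resolve("http://bit.ly/example"))  # hypothetical short link
```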

To watch a keyword / brand / person / competitor, he likes to use http://search.twitter.com/ and then grab the RSS feed. From there, use a parser to store the results and do what you want with them, keeping them for later analysis.
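
A minimal version of that fetch-and-parse step might look like the sketch below, using the feedparser library; the search.twitter.com Atom URL format is from memory, so treat it as an assumption:

```python
import urllib.parse
import feedparser  # third-party: pip install feedparser

def fetch_mentions(query):
    """Pull the Twitter search feed for a query as (author, title, link) tuples."""
    # URL format as I remember it from the old search.twitter.com service
    url = f"http://search.twitter.com/search.atom?q={urllib.parse.quote(query)}"
    feed = feedparser.parse(url)
    return [(e.get("author", ""), e.get("title", ""), e.get("link", ""))
            for e in feed.entries]

for author, title, link in fetch_mentions("SES London"):
    print(author, "-", title, "-", link)  # or store them for later analysis
```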

The popular http://tweetmeme.com/ can help you look at all the links to one particular landing page, and you can create reports from there for any site; this is a paid service.

All of the tools are realtime, so if you get a spike in traffic, think about the page it's going to: has that page got a call to action?

A big mistake, he feels, is the auto-DM sent when a word is mentioned or when you're followed; he goes for a simple "hi, thanks for the follow" instead.

Tracking: you need to fill out the tracking on URLs so the data appears correctly in Google Analytics. The big problem is that a large percentage of people use tools like TweetDeck, so traffic from there shows up as direct unless the tracking is working.
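
"Filling out the tracking" here means tagging links with Google Analytics campaign parameters before you shorten and tweet them. A small sketch (the parameter values are just examples):

```python
import urllib.parse

def tag_for_analytics(url, campaign, source="twitter", medium="social"):
    """Append Google Analytics utm_* parameters so clicks aren't logged as direct."""
    params = urllib.parse.urlencode(
        {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    )
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}{params}"

print(tag_for_analytics("http://example.com/landing", "ses-london"))
# http://example.com/landing?utm_source=twitter&utm_medium=social&utm_campaign=ses-london
```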

He's also made a tool that looks at your followers and maps them out. That's great for a bigger brand working out where its followers are. If they are all in foreign countries, maybe schedule tweets for them, tweet in a different language, or set up a new account for them.

New things coming to Twitter will include split A/B testing of Twitter traffic. I've noticed some people, e.g. Distilled, land their Twitter traffic on a specific Twitter landing page. This leads on to the fact that Twitter is big on mobile, so segment referrals from Twitter on mobile and look at how that traffic performs.

Basically, some nice tools, tips and thoughts about using Twitter.

The final presentation was from Tracy Falke of Freestyle Interactive (http://www.freestyleinteractive.co.uk/, http://twitter.com/tracy_falke).

Her company is looking at making a tool to wrap the whole process up in one. A telling point was that the brands they work with are only now looking at accounts, even though people have been banging on about it for the last two years.

The problem, they feel, is the man-hours it takes to create accounts, which is why tools and automation are needed to help people.

She gave an example of bad press being created about John Deere tractors, which looks like it was done with bots creating negative content. Another example was toyotabzz; maybe companies are using these techniques to create bad press?

Her main point was about a system that creates a custom content database, interacts with the platforms, and uses custom parameters and rule triggers to alert you to tweets you need to look at. The CMS then makes sure that the tracking on any links is done correctly.
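
Tracy didn't show the system itself, so the sketch below is only my guess at what a rule trigger boils down to: match incoming tweets against configured keywords and flag the ones a human should look at.

```python
RULES = [  # hypothetical configuration: keyword plus why it matters
    {"keyword": "broken", "reason": "possible complaint"},
    {"keyword": "refund", "reason": "customer service needed"},
    {"keyword": "love", "reason": "praise worth amplifying"},
]

def triggered_alerts(tweets):
    """Yield (tweet, reason) pairs for tweets matching a rule keyword."""
    for tweet in tweets:
        for rule in RULES:
            if rule["keyword"] in tweet.lower():
                yield tweet, rule["reason"]

for tweet, reason in triggered_alerts(["My tractor is broken again!"]):
    print(f"ALERT ({reason}): {tweet}")
```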

They then want to find the important people: the influencers, the people that count.

A good point from that presentation, also raised in a presentation from Ciaran Norris of Mindshare (http://vimeo.com/7536535), was that the content is the important part, because the platform keeps changing: blogs, MySpace, Twitter, whatever comes next. It might be the case that in the next few years we're not using Twitter, but if you have the process in place to monitor, engage and track, that should be applicable to the next platform.

Overall, a good little session. As expected, you can't automate everything, but tools are going to help you cut through the crap and find the important bits. If you want to experiment with some of the dodgier points, don't use your own account.

Edit: Cindy Krum (http://twitter.com/suzzicks) from http://www.rank-mobile.com/ was moderating the session.

I've been lucky enough for my company to pay for me to come to SES London for day three. It's dinner time and it's been an enjoyable morning so far.

The morning started off with Jim Sterne, who gave a great presentation on social media metrics. The takeaway was that there are plenty of tools to gather data and record what's happening, but pulling context out of them is the hard part. To put it in his words, you have to try and find the pony in a load of horse shit.

The conclusion was a calculation to work out the value of a lead from social media, which could be implemented as a high-level ROI for a social media campaign.

I headed over to the smaller room for the Crossing Borders: Global Site Clinic with Motoko Hunt and Andy Atkins-Krüger. A few sites had a going-over, but the main message was that you have to approach each international site with the same care and effort you put into your own local site.

Andy has his own company with 36 employees, all native speakers; a person who has merely learned a language still won't produce a correct translation.

One tip from the session was that although the subfolder method with geotargeting in Google Webmaster Tools works for Google, it won't work for engines like Baidu in China and Yandex in Russia, so there you'll need the local domain.

One interesting example from the session was James from ASOS.com, the big online fashion shop, one of the biggest in the UK. They are going to be launching in the US and other countries but haven't sorted out the domains to use. One of the other problems is that the ASOS.com site has hundreds of products added every day, all of which need to be translated; they were looking at computer-aided translation…

After that we had Dave Naylor and Erica Schmidt rip some sites apart. Some of the sites on offer were really poor and hadn't had any optimisation. Dave came up with some great link "hooks" for a Devon site and a poetry competition site.

There were loads of sites reviewed, and most needed their SEO, build, design and PPC looking at. It was also good to see a few members of the audience invited to be part of a sub-panel. One of the lads (a Yank) had some really good points on the sites.

Overall a good morning and an enjoyable first search conference for me.