Spy on competitor AdWords

February 27th, 2008

I came across a very interesting site recently - SpyFu. It lets you “research” your competitors’ AdWords campaigns and guesstimates their ad spend. Interesting stuff.

I’m doing the website for Hook Norton’s Music at the Crossroads. It’s a great music festival that supports several causes close to my heart, and the misconfigured website needs a visit from Google now it’s fixed. Here, Googlebot!

Average ecommerce site conversion rates are about 2.5%; however, according to the Nielsen/NetRatings Retail Report for January 2008, many retailers, in the US at least, do far better than that.

The top 10 online retailers in the U.S., based on conversion rate, were:

proflowers.com, 14.1%
Coldwater Creek, 13.3%
FTD.com, 13.0%
QVC, 12.8%
Office Depot, 12.4%
eBay, 11.5%
Lands’ End, 11.5%
Tickets.com, 11.2%
1800flowers.com, 10.0%
Amazon, 9.6%

Although, as I’ve mentioned before, QVC site visitors have already been pre-sold the item by watching TV and are coming deliberately to buy rather than to browse. In which case it’s maybe not such a good stat, as 87.2% don’t buy…

Web Analytics. It’s all about interpretation…

The UK’s most popular ecommerce sites over Christmas

eBay and Amazon are givens, but it’s interesting to see HMV hanging in there at number nine…

Log analysis with Unix

February 3rd, 2008

Sometimes I have to dig deep into a site’s server logs to find the information that will let me do the analysis I need to do. I have a variety of log analysers which between them can process most logs I come across, but sometimes I get something in a custom format which the normal tools baulk at.

The answer? Use the Unix tools on my Mac to process the server logs and extract the information I need.

Say I have a logfile from a client which is in a custom/weird/old format. I usually want to analyse the path to purchase. If I know that the five steps in the path to purchase are called shop1.php, shop2.php, up to shop5.php, I can use grep to find the lines in the logfile where someone requested one of the shop*.php pages (escaping the dot so it matches a literal full stop rather than any character):

grep -i 'shop[1-5]\.php' serverlog.log
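
For illustration, assuming the log looks something like Apache’s common log format, a matching line (entirely made up here) might be:

192.0.2.10 - - [03/Feb/2008:14:21:09 +0000] "GET /shop2.php?item=42&ref=email HTTP/1.1" 200 5120

The -i makes the match case-insensitive, so a request for Shop2.PHP would be caught too.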

This pulls out whole lines, but they contain information I don’t need if I’m just interested in how often a page in the purchase path is called. In this case I want the first 20 characters of the requested URL, which in this log is the seventh whitespace-separated field (awk numbers fields from 1, so I ask it for $7). Truncating to 20 characters stops long query strings from splitting one page across lots of distinct lines:

awk '{print substr($7,1,20)}'
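
Running that over the made-up line above shows what comes out - $7 is /shop2.php?item=42&ref=email, and substr keeps its first 20 characters:

echo '192.0.2.10 - - [03/Feb/2008:14:21:09 +0000] "GET /shop2.php?item=42&ref=email HTTP/1.1" 200 5120' | awk '{print substr($7,1,20)}'

which prints:

/shop2.php?item=42&r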

Then I want to sort the extracted paths so that identical lines end up next to each other (within the pipeline below, sort reads them from standard input),

sort

count how many occurrences of each line there are (uniq -c only collapses adjacent duplicates, which is why the sort comes first),

uniq -c

then sort the lines in descending order of count, with -n so the counts are compared as numbers rather than as text:

sort -rn
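
A quick sanity check of those three stages on some made-up input:

printf '/shop1.php\n/shop2.php\n/shop1.php\n' | sort | uniq -c | sort -rn

which prints something like:

   2 /shop1.php
   1 /shop2.php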

I can pull all of this together, using the pipe character “|” to join up the commands:

grep -i 'shop[1-5]\.php' serverlog.log | awk '{print substr($7,1,20)}' | sort | uniq -c | sort -rn
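
Laid out one stage per line with comments (the shell happily continues a pipeline after a trailing |), the same command reads:

grep -i 'shop[1-5]\.php' serverlog.log |  # requests for purchase-path pages
awk '{print substr($7,1,20)}' |           # first 20 chars of the request path
sort |                                    # group identical paths together...
uniq -c |                                 # ...so uniq -c can count each one
sort -rn                                  # most-requested page first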

Using Unix I can pretty much get any info I need. Cool, eh?

If you really want to get into this kind of stuff, which you may, given you’re still reading this, you should check out the Unix text processing bible: “Unix Text Processing” by Dale Dougherty and Tim O’Reilly.