ia play

the good life in a digital age

e-commerce project: competitive review

This article is part of a (rather drawn-out) series about our e-commerce redesign.

Competitive reviews do what they say on the tin: they review what your competitors are doing. They are particularly useful in a busy, well-developed marketplace where you can find good matches for your site/product.

With our e-commerce project, my first step was to identify what I meant by competitors. The definition is much wider than just other charities for blind and partially sighted people with online shops. You are looking for sites that your audience will be familiar with, sites with similar product sets, sites with similar challenges, and sites that may be interesting or innovative in general. They don’t have to be all of these things.

Some are easy to identify. If you are looking for market-leading commerce sites, you can probably reel them off yourself.

You can also:

  • ask your colleagues
  • ask your network (Twitter is pretty good for this)
  • do some Google searches (try searching for all the sites you’ve already thought of; this often brings up other people’s lists)
  • look for market reports from Nielsen, Forrester, etc.

I then bookmark the websites using delicious. This means I have quick access to the set, as I can reopen all the websites in one go (or in smaller tagged sub-sets) by selecting “open all in tabs” (I think you need the Firefox plugin to do this; I can’t see a way to do it from the main site).

My four main sub-sets for the e-commerce project were:

  • mainstream shops
  • charity shops
  • alternative format bookstores
  • disability/mobility stores

1. Mainstream shops (link to delicious tag)

These are sites that UK web users are likely to be familiar with, e.g. Amazon, Argos and John Lewis. I chose some for the breadth of their catalogue (a problem we knew we were facing) and some for specific range matches, e.g. Phones4U or WHSmiths.

Where these sites consistently treat functionality or layout in a particular way, I considered that to be a standard pattern and therefore something the users might well be familiar and comfortable with.

(It is worth noting that we don’t have definitive data on the extent to which RNIB shop customers also use other online shops. On one hand, their motivation to shop online may be stronger than that of average UK users, as they may face more challenges in physical shops; on the other hand, the accessibility of mainstream shops may discourage them.)

2. Charity shops

These sites are slightly less useful as competitors than it might appear at first. They were useful when considering elements like donations, but in many cases the shops were targeted at supporters rather than beneficiaries and carried much narrower ranges. There are, however, some very high-quality sites where it is clear that a lot of thought, time and effort has been invested.

3. Alternative format bookstores

This included mass-market audiobook stores and some that are targeted particularly at people with sight loss. Most of these sites were dated and a little awkward to use. I reviewed them briefly but mostly didn’t return to them.

4. Disability/mobility stores

There are quite a number of these sites. They often feel like a print catalogue slung onto a website and weren’t very sophisticated from an IA perspective. I did look in detail at the language they used to describe products, as there was likely to be a heavy overlap with our product set.

I had a number of initial questions that I wanted to research:
1. The number of categories on the homepage
2. Other elements on the homepage
3. How they handled customer login

I created a spreadsheet and went through the sites one by one, recording what I found. It took me about 2 hours to review 60 sites against this limited set of criteria.
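
(For what it’s worth, the spreadsheet doesn’t need to be anything clever. If you’d rather script the skeleton than build it by hand, a rough Python sketch might look like the one below; the site names and answers are made-up placeholders rather than my actual findings.)

    import csv

    # the criteria from the first pass of the review
    criteria = ["homepage categories", "other homepage elements", "customer login"]

    # placeholder entries: one row per competitor site
    sites = [
        {"site": "example-mainstream-shop.co.uk",
         "homepage categories": 12,
         "other homepage elements": "search, basket, offers carousel",
         "customer login": "link in header"},
        {"site": "example-charity-shop.org.uk",
         "homepage categories": 6,
         "other homepage elements": "donation appeal, search",
         "customer login": "at checkout only"},
    ]

    # write the skeleton out as a CSV that opens straight into a spreadsheet
    with open("competitive-review.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["site"] + criteria)
        writer.writeheader()
        writer.writerows(sites)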

I did the original review ages ago, but I went back to the sites reasonably regularly during our design phase, usually when we couldn’t reach agreement and needed more evidence to help make a decision. Sometimes I would just add a column to an existing spreadsheet, e.g. when checking which sites had a separate business login. At other times I created whole new spreadsheets, e.g. when auditing how the search function worked.
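
(Continuing the same rough sketch above, and again with a made-up column name, adding something like a separate “business login” column to the existing file is only a few lines.)

    import csv

    # read the existing review back in, add one extra column, and rewrite the file
    with open("competitive-review.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    for row in rows:
        row["separate business login"] = ""  # filled in by hand as each site is checked

    with open("competitive-review.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)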

These later reviews took less time, either because I was checking fewer criteria or because I had dropped less relevant or lower-quality sites. I’m still going back to the competitive review even during testing, as various testers find their own favourite websites and ask “why doesn’t it work like this?”. It is always useful to know whether they are right that “normal” websites do X. The competitive review saves a lot of argument time.

Written by Karen

March 2nd, 2010 at 6:54 am

Posted in e-commerce, rnib