Archive for the ‘analytics’ Category
e-commerce: google keywords
This article is part of a series about our e-commerce redesign.
Analysing your search referrals only tells you about the traffic you were successful in attracting. Even if you are getting lots of traffic for a particular keyword, that might be a tiny fraction of the number of people searching for that keyword. And the referrer data says nothing about what you missed out on completely.
So it helps to look at search engine traffic for keywords in the kind of space your website sits in. Free tools like the Google AdWords keyword tool have generated lots of debate about how useful they are, but I tend to see them as worth a look if you’re just after rough ideas about language and relative popularity.
With our shop research, I didn’t get much data for easy to see, easy to read, giant print, big print, canes, liquid level indicators, and (my favourite) bumpons. I couldn’t find information about Moon (the alphabet) because it was drowned out by references to the satellite and all the other things called moon.
What I’ve learnt:
Generally people refer to concrete properties of the product rather than their condition. So it is ‘big button phone’ rather than ‘easy to see phone’ or ‘low vision phone’.
Singular is much more important than plural for objects like clocks and watches, but the opposite is true for book formats, e.g. large print books. Which is kind of obvious… you only want one watch but you may want many books. This might have a bit of an effect on our labelling policy, but not much, as Google doesn’t seem to make a huge deal of singular versus plural.
There’s clearly a big opportunity around low vision products. The interest in products for blind people (like Braille) is less significant, which makes perfect sense when you compare the size of the audiences.
And loads of people are interested in magnifiers.
webinar on SEOMoz tools
I often refer back to the SEOMoz ranking factors article when I think teams are getting hung up on minor SEO issues.
Will from Distilled just ran a free webinar about the SEOMoz tools, so it seemed a good opportunity to learn more about what’s available from SEOMoz.
Will says that SEO tools (some free) give you three things:
- Quick research (basic understanding)
- Deep dive research (actionable insights)
- Making things pretty for boss/client (ever important)
The Pro tools aren’t particularly cheap, so it was useful to have someone talk you through what the return on that investment would actually be. In places the data looks a lot like the stuff you get from your web analytics tool, e.g. Google Analytics. But remember this is data on your competitors as well as your own site.
Using AutoTrader as an example, Will talked about:
- SEOToolbox: free tools. Will uses Firefox plugins instead of some of these, but still likes and uses the Domain Age tool
- Term Target: free, aggregates data on a given page, identifies keyphrases
- Term Extractor tool: free; Will uses it for competitor and keyword research. Three-word phrases might give you something new.
- Geotarget. Get Listed is an alternative.
- Popular searches. Particularly likes the Amazon content.
- Trifecta: useful aggregator. Also compares your site to the rest of the web as a whole (possibly unique data).
- CrawlTest: pro-tool. Xenu is an alternative.
- JuicyLinkfinder: finds linking opportunities
- Keyword Difficulty: how hard a keyword is going to be to rank for, regardless of domain.
- Rank Tracker: Will was keen to stress that individual keyword ranking isn’t the important thing, but often your boss will demand it. Makes little graphs and will export to CSV. Can combine with analytics data, e.g. using the Google Analytics API
- Firefox toolbar: Will loves this and uses it more than any other SEO tool; the Pro version is better. Shows some PageRank-esque data for page and domain. Going up one MozRank point is equivalent to being roughly 8x stronger, so decimal points are important. MozTrust is similar but restricted to links from trusted sites. Page Analysis also part of the toolbar? Alternative is Bronco tools.
- Linkscape: the tool SEOMoz are heavily investing in. Web graph of which pages link to each other on the web. Will doesn’t see an alternative to this. Free version does basic stuff. Pro version produces more data and prettier data. Will recommends the Adv Link Intelligence Report. You can get data on who links with “nofollow” which Will thinks is unique data.
- Labs: Online Non-Linear Regression is scary. Visualizing Link Data is more mortal-friendly. Link Acquisition Assistant helps you construct queries for search engines to find link opportunities. Other tools include Social Media Monitoring and Blogscape.
(As a side point, Will recommends learning Excel functions MATCH and LOOKUP. And pivot tables.)
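The LOOKUP-plus-pivot-table workflow Will recommends translates readily outside Excel too. Here is a minimal Python sketch of the same two ideas, using made-up keyword rows purely for illustration (none of these numbers come from real analytics data):

```python
from collections import defaultdict

# Hypothetical sample rows: (keyword, category, visits) — the kind of
# export you might pull from an analytics tool into a spreadsheet.
rows = [
    ("big button phone", "phones", 120),
    ("large print books", "books", 300),
    ("talking watch", "watches", 80),
    ("large print bible", "books", 150),
]

# LOOKUP equivalent: a dict keyed on the keyword column.
lookup = {keyword: visits for keyword, _, visits in rows}
print(lookup["talking watch"])  # 80

# Pivot-table equivalent: group rows by category and sum the visits.
pivot = defaultdict(int)
for _, category, visits in rows:
    pivot[category] += visits
print(dict(pivot))  # {'phones': 120, 'books': 450, 'watches': 80}
```

The dict comprehension stands in for MATCH/LOOKUP (exact-match key retrieval) and the grouped sum is the simplest possible pivot: one row dimension, one aggregated value.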
Distilled are going to do more conference calls, including one on keyword research tactics. Could be useful. Free webinars are another useful alternative to conferences when budgets are tight but you need to keep learning.
NTEN redesign: bounce rates
NTEN continue to share lots of useful information about their redesign process, including this insight into their web analytics:
“Our bounce rate is pretty darn high for folks who find our site through search: about 68%. New visitors also bounce at a high rate: about 67%. Our blog, which gives us the most traffic from search, has a bounce rate above 75%.
Friend of NTEN Avinash Kaushik says that organizations should aim for a bounce rate under 50%. We don’t expect our new visitor bounce rate will get THAT low, but there’s some work we can do to make sure people find MORE great content and stick around our site.”
via Wireframe Testing: Failing Informatively | NTEN: The Nonprofit Technology Network.
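For anyone wanting to reproduce NTEN’s figures from raw session data, bounce rate is simply the share of sessions that viewed only one page. A minimal sketch, with illustrative numbers rather than NTEN’s actual session counts:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Share of sessions that left after viewing only one page."""
    return single_page_sessions / total_sessions

# Illustrative only: 680 one-page sessions out of 1,000 total
# gives the kind of ~68% figure NTEN reports for search traffic.
rate = bounce_rate(680, 1000)
print(f"{rate:.0%}")  # 68%
```

Avinash Kaushik’s sub-50% target in the quote above would mean fewer than half of sessions leaving after a single page.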
use of Google Analytics
Where search analytics is concerned it appears the RNIB is actually doing what everyone else is doing i.e. using Google Analytics:
“The use of Google Analytics is very much on the increase. Just under a quarter of responding organisations (23%) now use Google Analytics exclusively compared to only 14% a year ago.
A further 57% of respondents are using Google Analytics in conjunction with another tool (up from 52% in 2008), which means that 80% of companies are now using Google for analytics compared to 66% last year…The majority of responding companies believe that they have set up Google Analytics properly.
There is more doubt among those who do not use Google exclusively, with 23% of these respondents saying they don’t know if it has been properly configured”
And I’m firmly in the latter 46% camp these days:
“since 2008 there has been an increase from 8% to 15% of companies who have two dedicated web analysts and a decrease in the proportion of companies who have one analyst (from 32% to 26%).
But while this is a positive development, it can also be seen that exactly the same proportion of companies (46%) report that they do not have any web analysts.”
search log analysis
This article is part of a series about search log analysis which includes what people are searching for, bounce rates, spotting real opportunities and the geographical element.
I’ve been rooting around in the search logs for RNIB.org.uk. We use Google Analytics, which isn’t accessible, so most data has to be exported and shared in Excel.
So far I’ve got my hands on:
- the top 500 keyword referrers from external search engines (2008)
- top 500 keywords used on site search (last six months of 2008)
- top referring search engines
But that’s plenty to be getting on with.
I have to remind myself I’m only looking at the most popular terms and there’s a whole long tail I have no visibility of. There are also some clearly dubious queries in the logs.
So far I’ve gone through the top 500 from external search engines and loosely categorised them. The categories aren’t particularly scientific; I’ve grouped all eye conditions into one category and all queries about Helen Keller into another. Those don’t seem particularly equivalent categories, but they are similar in query volume. I’m following my instincts a bit at this stage.
For each category I’ve added up the total visits, and then worked up the average bounce, time on site and new visits per query type. I’ve also started adding information about whether the query is likely to be answered with a quick fact or should generate a longer journey.
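The per-category roll-up described above (total visits plus visit-weighted averages of bounce rate and time on site) can be sketched in a few lines of Python. The keyword rows here are hypothetical stand-ins, not real RNIB log data:

```python
from collections import defaultdict

# Hypothetical rows: (category, visits, bounce_rate, avg_time_on_site_secs)
keywords = [
    ("eye conditions", 500, 0.70, 45),
    ("eye conditions", 300, 0.60, 80),
    ("helen keller", 200, 0.80, 30),
]

totals = defaultdict(lambda: {"visits": 0, "bounce_w": 0.0, "time_w": 0.0})
for category, visits, bounce, time_on_site in keywords:
    t = totals[category]
    t["visits"] += visits
    # Weight each query's metrics by its visits, so popular queries
    # dominate the category average rather than each query counting once.
    t["bounce_w"] += bounce * visits
    t["time_w"] += time_on_site * visits

for category, t in totals.items():
    avg_bounce = t["bounce_w"] / t["visits"]
    avg_time = t["time_w"] / t["visits"]
    print(f"{category}: {t['visits']} visits, "
          f"{avg_bounce:.0%} bounce, {avg_time:.0f}s on site")
```

The weighting matters: a simple average of the per-query bounce rates would let a tiny query skew a whole category.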
Some of the questions I am trying to answer:
- Which queries should influence navigation design?
- Where should we be encouraging further/longer journeys?
- What content isn’t represented in the logs? We might need to work on optimising those.
- Which queries are a poor opportunity because the referral was accidental or mis-directed?
As a side benefit I’ve already learnt what Bump-ons are.