GetYourGuide told me many visitors were looking for tickets for popular POIs (aka points of interest... think Eiffel Tower in Paris).
And although many visitors landed on the appropriate page, far too many were leaving the page without buying.
My task was to find out what was wrong, and suggest a fix for the problem.
GetYourGuide's SEM was quite sophisticated and they bid cleverly on a PPC basis on specific search terms on Google (e.g. "things to do in Paris" or "tickets for the Eiffel Tower").
Those tourists were looking for things to do on holiday, and were considered to be far along their purchase decision (jargon: they demonstrated high intent).
Although tens of thousands of visitors visited the website each and every day, the bounce rate was way too high at over 54%.
Everything was tracked that could be tracked and frequent a/b tests were run, yet no one knew why so many visitors were leaving.
Data-driven is not the same as evidence-driven and although large amounts of click data are important, clicks seldom reveal the why of users' actions.
So why were so many visitors leaving?
What led to the high bounce rate was the listing of all activities on a very long page (duh!).
GetYourGuide's algorithm highlighted one activity at the top of the long page with a "Top Pick" label.
Testing made it clear that no visitors understood why GetYourGuide's "Top Pick" was indeed top, and this was exacerbated by the fact that it didn't look any different from all the other activities on the page.
As a consequence of the lack of transparency, everyone I interviewed felt they had to look at every single possible activity on that page (phew!).
It turns out that scrolling up and down a long page full of similar looking items requires a great deal of mental effort (jargon: cognitive load) and is hard work for all users, irrespective of their age or savviness with technology & screens.
The key words here are "similar looking".
Where too many things are similar looking, there's a natural tendency to blend all of them into one big blur.
Further research, interviews and tests revealed a consistent need, desire and even mental model amongst almost all participants: a desire to have some kind of pre-selection performed for them... which was really just another way of saying "it's too hard to find something".
Having performed more-or-less all of the research, I was in an advantageous position to suggest a solution.
At first I suggested smarter up-front use of the existing filters.
Then I refined the filter idea further and came up with an automated curated list.
I presented my idea to the team.
I found some pictures online (cheekily ignoring copyright issues) to demonstrate why similar presentation of similar products means hard work for those viewing the list (jargon: cognitive load).
I talked about signposts… used to help indicate the best way to reach something.
And I presented a reworked page with a fully automatable, yet apparently curated, list which was easy to read, inspiring in its presentation, and which every test participant responded to highly positively.
A further test round provided confirmation that I was on my way to cracking the bounce rate problem.
At the time of writing, this solution for the various landing pages was awaiting its a/b testing... to add quantitative validation to the qualitative validation.
Lean UX & iterative design
Data-driven
Evidence-based
Build - measure - learn