Meet The Authors of The Customer Onboarding Handbook: Caroline Jarrett

In this series, we’re getting to know the authors of The Customer Onboarding Handbook, our ebook on how to create experiences that lead to meaningful long-term engagement with a product or service, available to download for free directly from The CX Lead. 

This time around we chatted with Caroline Jarrett, the forms specialist, who advises organizations on how to make forms easier to fill in and how to improve websites and business processes that include forms.

Caroline’s research on topics like “How do people answer questions?” led her to explore survey methodology and to write her book, Surveys That Work: A Practical Guide for Designing and Running Better Surveys, published by Rosenfeld Media in 2021. 

Here she explains why the rule “less is more” also applies to survey design, why she’s not a fan of the Net Promoter Score, and what you can do to get more from customer experience surveys. 

What role do surveys play in the customer onboarding process?

That’s a really interesting question because my instinct was to say none! Onboarding is a fraught process for the customer anyway—they just want to get on with it. They want to get some value out of the product or service they're trying to use. 

My area of specialism is forms, and I’ve found that every extra question you ask is a bit of a barrier. It's another thing the customer has to do before they get to the experience of actually using the product or service. So the first thing I would want to do is to challenge any survey question: do we really need to ask this during onboarding? Could we ask it beforehand as part of market assessment, or could we ask it later, when the customer has been using the product or service for a bit and might feel more relaxed and more reflective?

So I asked myself if there is any value at all [in an onboarding survey]. And yes, very lightly asking someone how their experience is going, perhaps with just one question that we especially want to answer, and one space that’s open to anything they want to tell us at this stage, can add some value.

What's your chapter in The Customer Onboarding Handbook about?

It’s about using and abusing surveys but I also wove in some ideas about forms because a certain amount of asking customers questions is pretty much inevitable in the onboarding process. 

I introduced my concept of whether something is a form question or a survey question. My theory is that it's a form question if you intend to use the answer individually. Let's say you ask me for my name. You might then be going to write to me or use my name somewhere in the interface, so it's very personal to me. If you ask me a question like “how did you find out about our product”, I would expect it to be used more in a survey way. You'd be aggregating all of that information together. 

So one of the things that I've encouraged people to do in my chapter is to really think very hard about how they're going to use the answers. Obviously there's some overlap, some things you might be using both in aggregate and individually but my advice is to really forensically examine how you're going to use every answer. Have a good hard think about whether it’s really sufficiently valuable or whether it’s another barrier for your customer who's excited to use your product or service. 

Each hurdle you put in front of them is another hurdle for them to jump over, so you want to be pretty well-assured that it’s really going to deliver enough value to be worth asking the question at that time. Remember that we want the customer relationship to be a long one, and we may have opportunities to add in extra questions at a later date when perhaps that initial anxiety or excitement of onboarding is a little bit further in the past.

What kind of mistakes do you see organizations make with the surveys that they use to improve the customer experience?

One of the things that comes up a lot is thoughtlessly using predetermined questions. For example, many of us have now become a little bit cynical—to be polite—about the well-known Net Promoter Score question, “Would you recommend this to a friend or family member?”

It has become a cliché, it’s greatly overused, and there’s nothing fresh about it from a customer point of view. Everyone is using that method. It's the opposite of differentiating yourself. 

I’ve also learned that very few of us actually see a really good survey because well-designed surveys use much smaller samples. Generally [in a good survey], you only use the smallest sample you can to get the quality result that you need. I think our colleagues and clients can be really surprised at how small a sample can be effective and worthwhile. 

Using a very small sample also protects customer goodwill, because most customers never see the questions, so they haven't been bothered by them. But communicating to a small sample of customers how special they are is a way of building their trust and encouraging them to actually respond to that survey, which gives you better quality data.

What’s the ideal length of a survey?

One question is good. When I’ve been testing a survey or form, I've never had a user say “the form would have been so much better if it had asked another 20 or 30 questions”.

Sometimes you do need to ask slightly more than one question, though. It could be that you need to create a little bit of context for your customer around the question, so you might want to ask two or three lead-in questions.

I saw a very jargony question recently, not in an onboarding survey to be fair but in a post-experience survey, where the organization was asking something like “have you seen our xyz jargon promotion deal?”. A slightly gentler introduction, like “have you seen any promotional material from us at all recently?”, before hitting the customers with a load of jargon would have been a better way of asking that one important question. It would have been easier for a random customer to answer.

Generally these days I'm trying to persuade my colleagues and clients to boil their surveys down to one thing that they really want to ask. It’s one opportunity for the respondent to tell you something you really want to know. Then add perhaps a couple of very gentle, easy-to-answer questions, which I call representativeness questions. For example, you might ask a customer, “Is this your first experience with our product?”. That would give you an idea of whether they're a brand new customer or whether they're rejoining.

Limit it to one to five questions maximum, though. It's the modern world. It’s pretty easy to send surveys to people via the internet. Make them really really short if you possibly can and do more of them but send them to a smaller sample.

How difficult is it to convince clients to ask just one question when we’re so obsessed with gathering data these days?

It varies. The other day I was pleading with one client to shorten a survey that had 300 questions. To be fair, it was quite complicated: there were 15 topics, and they were asking a number of questions per topic, so there were reasons to believe that the respondents really would be genuinely interested in the 15 separate topics. But any of us who work in user experience know that it's always an ongoing process.

I've been working in form and survey user experience for 30 years, and I think there have been just three occasions when I've told a client to do something and they simply did it. It's always much more complicated, and sometimes the most exciting and interesting clients are the ones where we work together to arrive at a better solution by iterating my ideas and theirs.

Can you suggest a few easy things people can do straight away to get more from their customer experience surveys? 

The number one thing is to actually test them. If you're planning on sending out a survey, find a small number of customers who are willing to try it for you before it goes out. By small I really mean small: five customers, perhaps.

In the world of usability and user experience we're pretty used to this these days. A lot of people do some kind of usability testing on their websites or apps. They're not quite as good at realizing they need to apply those techniques to surveys, too, and yet it's so quick and easy to do, and it's actually the most fun part of the work. These tests help you learn the most.

Test individual questions, but also test the whole end-to-end process, including how the link is sent to people. Sometimes you’ll find that the most important question has accidentally been left out because the process of creating the survey has been a long one.

I recently received a survey from a brand I really like. It was sent to all their customers, and it had a very lovely and generous incentive: 10 percent off the next order plus free delivery. Due to a technical error, however, the discount code had expired the week before. Clearly no one had actually done the end-to-end testing to make sure it all hangs together. They’re a lovely small business, and they dealt with it very well by writing to everybody who'd received the survey invitation with a generous extension of the discount code, but it's an embarrassment that you really don't want. Proper end-to-end testing would have sorted it out.

I also wish the survey tool vendors would pay more attention to the accessibility of their products. They're getting better, but they're still lagging a long way behind where they ought to be in making sure that the product itself is accessible. I have colleagues in user experience who are blind, or screen reader users, or need various assistive technologies, and I can't yet recommend any survey tool to them, which is quite disappointing.

Learn more about all our authors and download The Customer Onboarding Handbook for free!

By Hannah Clark

Hannah Clark is the Editor of The CX Lead. Her goal is to bring together a community of CX professionals to learn, interact, and voice their own opinions on this ever-evolving industry.
