I just finished reading Forms that Work: Designing Web Forms for Usability by Caroline Jarrett and Gerry Gaffney. Wow! I’ve been thinking a lot about forms over the last 5 or so years. Typically, I take court forms (many of which are challenging to use on paper) and try to breathe life into them as interactive tools, like MADE, UpToCode, and domestic violence protection forms in Massachusetts and Washington State. Some of my thoughts can be found in the Assembly Line documentation that I help edit. Reading this take on the topic was refreshing and helped crystallize a lot of vague concepts. I’ll certainly be coming back to this book again to improve my own work.
1. Not all answers are created alike
Jarrett and Gaffney talk about 4 different kinds of answers that your forms might require:
- Slot-in: answers that your user will be able to type without thinking, like name, address, etc.
- Gathered: answers that might come from looking around the user’s house, like an ID number.
- Third-party: answers the user will need to ask someone else to collect for them.
- Created: any answer to a question that your user needs to apply judgment, imagination, or synthesis to provide. Narrative answers are “created”, but even yes/no questions like “Do you want a jury trial?” qualify as created.
I think I would rename slot-in to “fill-in-the-blank”, but that might be a language quirk between the UK and US. This framework seems very useful. I especially like the concept of “created” answers, which helpfully shifts the focus away from the length of a field to the fact that the user doesn’t have an answer in their head. A lot of legal forms require “created” answers, even if they are short.
The authors recommend tailoring labels to the answer type. One or two-word labels work for slot-in answers. Gathered answers may require a little help. Created answers require thought about structure to constrain the answer. Third-party answers might justify rethinking your form altogether. Perhaps it’s being sent to the wrong audience?
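To make that concrete in my own context, here is a rough sketch of what tailoring a question to its answer type could look like in Docassemble YAML. The variable names, wording, and help text are my own invented examples, not something from the book:

```yaml
---
# Slot-in answer: a short, one-or-two-word label is enough
question: |
  What is your name?
fields:
  - First name: user_first_name
  - Last name: user_last_name
---
# Gathered answer: add a hint about where the user can find the information
question: |
  What is your case number?
fields:
  - Case number: case_number
    help: |
      The case number is printed in the top right corner of the court
      papers you received in the mail.
---
# Created answer: give enough context and structure that the user can
# form a judgment instead of staring at an unexplained question
question: |
  Do you want a jury trial?
subquestion: |
  A jury trial means a group of people from your community, instead of
  the judge alone, decides the facts of your case.
yesno: wants_jury_trial
---
```

The slot-in questions get bare labels, the gathered answer gets a pointer to where it lives, and the created answer gets the context the user needs before they can decide.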
2. Keeping plain language simple
Plain language is a slippery concept, one that I’ve struggled to teach my students in the past. Some get it right away, but teaching it effectively is an ongoing task for me. In the past I’ve relied on the content at PlainLanguage.gov, but I’m going to reconsider it in light of this brief and pleasant chapter.
Jarrett and Gaffney discuss plain language primarily through the lens of what they call “instructions”, which can run the gamut from landing pages to in-context help bubbles.
They boil the plain language movement’s recommendations down to 5 specific goals to improve your writing (a rough sketch of the last two follows the list):
- Use familiar words in familiar ways (they suggest limiting yourself to roughly the top 2,000 words; I sometimes have my students try using the top 1,000)
- Use short, affirmative, active sentences
- Get rid of walls of words (and I would add: use tables and lists)
- Put choices before actions
- Use helpful headings to group your instructions
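The last two goals map neatly onto the kind of interview screens I build. Here is a rough Docassemble sketch, with a scenario and wording of my own invention: the headings group the instructions, and each choice is stated before the action it governs:

```yaml
question: |
  How do you want to file your answer?
subquestion: |
  #### If you have a printer

  Print the completed form and bring it to the clerk's office.

  #### If you do not have a printer

  We can email the completed form to the clerk for you.
field: filing_method
buttons:
  - Print it myself: print
  - Email it to the clerk: email
```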
3. Group questions into topics
Making a form flow easily requires organizing the questions into topics that relate them to one another. We all do this naturally in conversation, where we get immediate feedback; in a form, it takes more thought and effort to get the grouping right up front.
Jarrett and Gaffney’s suggestions (a rough interview-order sketch follows the list):
- Keep to one topic at a time. Don’t jump around.
- Ask expected questions before surprising ones. Example: if a user clicks a “contact us” link, the first field should be about contacting the user, not “How did you learn about us?”
- Ask less intrusive questions before more intrusive questions.
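In a Docassemble interview, this grouping can be made explicit with an interview order block. The sketch below uses made-up variable names and simply illustrates the ordering advice: on-topic contact questions first, less intrusive before more intrusive, and the surprising marketing question last:

```yaml
mandatory: True
code: |
  # Expected, on-topic questions first: the user clicked "contact us"
  user_name
  user_email
  user_message
  # Less intrusive before more intrusive
  user_phone
  # Surprising questions last
  how_user_heard_about_us
  # Final screen shown after everything is gathered
  final_screen
```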
4. Put labels where users will see them
This is one I’ve certainly noticed on my own. You can’t count on your users to read all of the information on the page, no matter how obvious you think you’ve made it. Users’ eyes will jump right to empty fields, and then they will usually look to the left, in the well-known “F-pattern” of eye tracking.
Help your users out by putting key information in the right place: put labels on the left of the fields that your users will fill in. Do your best to include enough information in the label so that they can answer the questions correctly if they speed through without reading headings or explanatory text.
5. Some scripted prompts for usability testing
I loved reading Steve Krug‘s Usability Test Script, but testing a form can require a more specific approach. For one thing, the “task” when testing a form can be much more singular than the “tasks” you might want to test on a website. Usually, when testing a form you are most interested in knowing what happens when a user goes straight through to the end.
Here are their suggestions for some specific prompts you can use to get your users to “think out loud” during the usability test of a form:
| To find out whether the form is… | Ask questions like… |
| --- | --- |
| Meaningful | Could you tell me what that question is asking you? What sort of information do you think that question is asking for? |
| Relevant | Did you expect to be asked that question? Does it explain why it asked that question? Did it leave out a question you expected? |
| Easy or difficult | How would you work out the answer to that question? Where would you look for the answer to that question? |
| Asking appropriate questions | Is it okay for the company to ask that question? Is that the sort of information you’d be willing to provide? |
It’s by a totally different author, but this is a good place to share Steve Krug’s helpful walkthrough of a usability test. I can easily envision combining Jarrett and Gaffney’s prompts with the script and approach outlined in Krug’s demo.
Does it translate to court forms?
Court forms certainly have unique challenges. In addition, wizard-like interactive forms, like the ones that I create on the Docassemble platform, are not exactly the same as the kind of web forms that Jarrett and Gaffney spend the most time discussing. Most of the forms the book discusses fit easily on one page. There’s often more room to decide which questions are necessary in a typical web form project, or to convince the stakeholder to get the information on their own from a third party.
In contrast, when working on court forms, I have found that we are usually working with a legal aid group or nonprofit that is trying to improve the experience of litigants. Courts are less interested, for now, in providing their own simple, easy-to-use tools for litigants. The paper forms, and the information they make the litigant provide, are often a given, with no room for anything other than minor changes.
Yet, the advice that Jarrett and Gaffney give certainly applies across the board. If you’re interested in designing usable tools for attorneys and unrepresented litigants, this is one you should definitely add to your library.