User-Centered Design for VoiceXML Applications (Continued from Part 1)
Creating a Mental Model

A mental model is a user's perception of what an application does and how it works. Users form a mental model of a VoiceXML application as they listen to its prompts and messages, provide input, navigate among the application's functions, and encounter errors. A task-oriented design will help you understand users' mental models for the current workflow, and you can leverage that knowledge by cueing users to similarities between your application and the "old" task model.
Why It's Important

Users always develop mental models of an application. If their mental models match an application's design model, they will perceive the application as easy to use. Otherwise the application will seem illogical and hard to use.
How to Do It

Building a mental model is as much art as it is science. A mental model is based on the tasks an application supports. The task analysis step identifies tasks and task dependencies that should be represented in the application, and by the mental model. You can create a mental model by:

- Using terms in the application that match users' terms.
- Maintaining the expected sequence of events that represents task dependencies during prompt design.
- Mimicking observed error recovery processes or resources in help and error recovery routines.
- Mimicking accent, speaking patterns, or culture-specific slang to match the target user population (only for narrowly focused applications).
How to Use the Results - Inputs to Design and Development

Creating a mental model overlaps work performed in user and task analysis, application design, and prompt design. Implement the mental model in the form of requirements or goals for:

- Prompt and message design
- Voice and speaking style selection
- Grammar terms
- Tasks supported by the application
Designing Prompts and Messages

Prompts and messages are the "spoken" outputs of a VoiceXML application that you control (as opposed to text your application retrieves and delivers to users as synthesized speech). Prompt and message design includes developing effective message structures and developing message contents that match target users' terminology and speaking style.
Why It's Important

Prompts and messages are the user interface. They communicate an application's purpose and navigation to users. Users will quickly abandon applications that don't effectively communicate their purpose and their navigational model. Prompt and message design is also important because it is closely tied to grammar selection, and grammar directly affects voice input recognition.
How to Do It

In my experience, effective prompt and message design is the most difficult aspect of VoiceXML application development for software developers. Yet in many cases developers must write "spoken" content because there's no one else to do it. Here are some tips that should help you get started, or improve work in progress:

- Use established guidelines and follow them consistently within your application. For example:
  - Use a welcome prompt that clearly indicates the purpose of the application and the fact that it is an automated system. ("Welcome to Rick's online bike shop.")
  - Follow a goal->action structure for prompts ("To review your shopping cart, say 'review.'").
- Streamline: keep prompts and messages short but clear, provide shortcuts for frequent users, and use barge-in to enhance navigation.
- Keep the "best-case" path to success short. I suggest starting with a usability goal of no more than one minute to primary task success. For example, if your application sells books, a person should be able to find and purchase a book in one minute or less, assuming a best-case path through the application.
- Design prompts that help users provide correct responses - those that match application grammars.
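As an illustrative sketch of the first two guidelines (the shop name, dialog IDs, and menu choices are invented for this example, and the `<goto>` targets would be dialogs elsewhere in the application), a welcome prompt followed by goal->action menu prompts might look like this in VoiceXML:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <!-- Welcome prompt: states the application's purpose and signals
       that the caller has reached an automated system -->
  <form id="welcome">
    <block>
      <prompt>Welcome to Rick's online bike shop, an automated
        ordering system.</prompt>
      <goto next="#mainMenu"/>
    </block>
  </form>

  <!-- Goal -> action prompts; barge-in lets frequent callers
       interrupt the prompt and answer immediately -->
  <menu id="mainMenu">
    <prompt bargein="true">
      To review your shopping cart, say "review."
      To search for a part, say "search."
    </prompt>
    <choice next="#reviewCart">review</choice>
    <choice next="#searchParts">search</choice>
  </menu>
</vxml>
```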
It's beyond the scope of this article to cover prompt and message design in detail. For additional information see the book and web resources listed at the end of this article.
How to Use the Results - Test Them

When you complete the design of a module's prompts and messages, you have an application prototype that should be usability tested without delay:

- Read the prompts aloud, preferably to another person.
- Simulate the application by reading application outputs to another person and acting on their input to select the next prompt or message to read.
- Enhance the prototype by inserting the prompts and messages into a simple VoiceXML container, and test it on a simulator.
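For that last step, a bare container is enough: a sketch like the one below (the prompt text is illustrative) simply plays each draft prompt in sequence so it can be heard on a simulator before any real dialog logic exists.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <!-- A minimal container for prompt walkthroughs: no grammars,
       no navigation, just the draft prompts read back to back -->
  <form id="promptWalkthrough">
    <block>
      <prompt>Welcome to Rick's online bike shop.</prompt>
      <prompt>To review your shopping cart, say "review."</prompt>
      <prompt>Sorry, I didn't understand that.</prompt>
    </block>
  </form>
</vxml>
```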
Error Reduction and Recovery

Error reduction and recovery includes:

- The steps you take to ensure that users encounter as few error conditions as possible, and
- The prompts and messages you design to enable users to recover from error conditions.
Why It's Important

Designing to avoid errors is a critical aspect of human factors for VoiceXML applications. Error conditions cause users to hang up. In usability tests of the applications I've worked on, many users hung up immediately when they encountered an error. Even motivated users typically hang up if the first attempt at error recovery fails.
How to Do It

You will have ideas about possible error-causing conditions as you begin to design an application. As you develop the detailed design you will identify other potential error conditions. If you want a popular application, avoid error conditions. When an error condition is truly unavoidable, offer layered assistance that helps users recover from possible errors.
For example, consider the prompt, "Say your credit card's expiration date."

- Avoid input errors by recognizing a variety of correct input formats ("oh seven oh one," "July two thousand one," "July oh one," etc.).
- If a response is not recognized, avoid cryptic error messages ("Invalid response. Say the expiration date of your credit card now.").
- Provide recovery assistance ("Sorry, I didn't understand that. Say the expiration date like this: 'July, 2001.'").
- Don't disconnect your user. Offer options to continue resolving an error condition, get help, or start over, rather than forcing a disconnect.
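One way to layer this assistance in VoiceXML is with counted `<nomatch>` handlers on the field, as in the sketch below. The form and field names are invented for this example; the built-in `date` field type accepts a variety of spoken date formats, which addresses the first bullet.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="payment">
    <!-- The built-in date grammar recognizes several input formats -->
    <field name="expdate" type="date">
      <prompt>Say your credit card's expiration date.</prompt>

      <!-- First failure: a brief, plain-language reprompt -->
      <nomatch count="1">
        Sorry, I didn't understand that. Please say the expiration date.
      </nomatch>

      <!-- Second failure: recovery assistance with a concrete example -->
      <nomatch count="2">
        Say the expiration date like this: July, two thousand one.
      </nomatch>

      <!-- Third failure: offer a way forward rather than disconnecting -->
      <nomatch count="3">
        You can say "help" for assistance, or "start over" to begin again.
      </nomatch>

      <filled>
        <prompt>Got it: <value expr="expdate"/>.</prompt>
      </filled>
    </field>
  </form>
</vxml>
```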
How to Use the Results

Rigorously assess the points in your application that are supported by error messages. Try to eliminate the error condition through improved prompts, grammar modifications, redesigned navigation, or improvements to checks performed before input is submitted to a database or external system. Conduct usability tests that examine the quality of your error recovery designs.
Summary

This article scratched the surface of human factors issues associated with VoiceXML applications by describing how human factors knowledge can be applied through the tasks of a user-centered design process.
To learn more, see the resources below, check for user-centered design resources or training on your favorite VoiceXML developers' site, and try out a UCD method at your next opportunity.
Resources

Books

- Human Factors and Voice Interactive Systems, edited by Daryle Gardner-Bonneau, ISBN 0792384679
- Usability Engineering, Jakob Nielsen, ISBN 0125184069
- The Usability Engineering Lifecycle, Deborah Mayhew, ISBN 1558605614
Web Training Resources
Copyright © 2001 VoiceXML Forum. All rights reserved. The VoiceXML Forum is a program of the IEEE Industry Standards and Technology Organization (IEEE-ISTO).