4.4 A TASK TREE EXAMPLE

A portion of the example’s task tree might look like this:

Build an Application
  Get a component
    Locate candidate components
      Locate in Group (or)
        Turn to section of catalog that might contain similar components
        Turn to subsection — repeat for nested subsections
      Locate in Index, if name is known (or)
        Turn to Index
        Scan names — may be more than one found
      Find by description
        Indicate attribute — enumerate set of attributes
        Enter value
        Query — may find more than one
    Find a suitable component among the candidates
      Select a component
      Show its description
    Copy selected component to application

At this point it is a good idea to review the task tree and reduce marginal functionality, either by folding it into other tasks or by eliminating it. In our example, the “Locate in Index” task could be folded into “Find by description” if one of the supported search attributes is a component’s name. This is a good choice because it eliminates an object, the Index, while adding only another simple use of an existing attribute, the component’s name. A user would likely rather query by name than scroll through a potentially long list.
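The task tree above can be sketched as ordinary nested data, which makes reviews like the one just described easy to mechanize. This is only an illustrative sketch, not part of the original design notation: the dict-and-list representation and the `atomic_tasks` helper are assumptions, with subtask names taken from the tree.

```python
# A hypothetical encoding of the example's task tree: dicts hold
# composite tasks, lists hold atomic subtask names, and an empty
# value marks a task that is itself atomic.
task_tree = {
    "Build an Application": {
        "Get a component": {
            "Locate candidate components": {
                "Locate in Group (or)": [
                    "Turn to section of catalog that might contain similar components",
                    "Turn to subsection (repeat for nested subsections)",
                ],
                "Locate in Index, if name is known (or)": [
                    "Turn to Index",
                    "Scan names (may be more than one found)",
                ],
                "Find by description": [
                    "Indicate attribute (enumerate set of attributes)",
                    "Enter value",
                    "Query (may find more than one)",
                ],
            },
            "Find a suitable component among the candidates": [
                "Select a component",
                "Show its description",
            ],
            "Copy selected component to application": {},
        },
    },
}

def atomic_tasks(tree):
    """Yield the atomic (leaf) tasks of a nested task tree."""
    for name, children in tree.items():
        if not children:            # no subtasks: the task itself is atomic
            yield name
        elif isinstance(children, dict):
            yield from atomic_tasks(children)
        else:                       # a list of atomic subtask names
            yield from children
```

Enumerating `atomic_tasks(task_tree)` lists exactly the operations that must later be attached to user-model objects; folding “Locate in Index” into “Find by description” would then be a small edit to this structure.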

4.5 OPERATIONS ON OBJECTS

Each task defines an operation involving one or more objects. As the task tree is refined, the operation of each atomic task and its description should be added to the list of operations for the appropriate object. The user model objects are the nouns that appear in the task tree; the operations are the verbs. These operations will appear in the user interface as some form of command or gesture. From the example:

Objects       Operations
-----------   ------------------------------------------------------------
Section       Turn (displays the candidates)
Attribute     Indicate kind, type-in value, query (displays the candidates)
Candidates    Select one (a component)
Component     Show description, copy to application
Description
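The noun-to-verb table above can be sketched as a plain mapping from objects to their operations. The representation and the `add_operation` helper are assumptions made for illustration; the object and operation names follow the table.

```python
# Hypothetical sketch: user-model objects (nouns) mapped to the
# operations (verbs) collected from the task tree's atomic tasks.
operations = {
    "Section":     ["Turn (displays the candidates)"],
    "Attribute":   ["Indicate kind", "Type-in value",
                    "Query (displays the candidates)"],
    "Candidates":  ["Select one (a component)"],
    "Component":   ["Show description", "Copy to application"],
    "Description": [],   # an object with no operations listed yet
}

def add_operation(obj, verb):
    """Record an atomic task's operation under the appropriate object,
    creating the object's entry if this is its first operation."""
    operations.setdefault(obj, []).append(verb)
```

As the task tree is refined, each newly identified atomic task’s verb would be appended under its object with `add_operation`, keeping the object list and the tree in step.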

As mentioned above, it is likely that some of the objects and operations may not occur in the metaphor. When possible, these should be cast as plausible extensions to the metaphor. As these extensions are not part of the metaphor and prior user experience, they must be clearly delineated and defined.

4.6 THE USER INTERFACE

A user model can be used to generate many different user interfaces, and this step from user model to user interface can be one of the most intimidating. How are the concepts embodied in the user model to be represented, and how is the designer to choose among the many possibilities?

The basis for any user interface designed to fit within an existing environment will be the platform-specific guidelines for that environment, e.g., Macintosh (Apple, 1987), Windows 95 (Microsoft, 1995), or OS/2 (IBM, 1992) for desktop computers, or PenPoint (GO, 1992) or Newton (Apple, 1996) for pen-based computers. These guidelines describe, in varying levels of detail, both general design principles and the presentation and interaction elements for their respective platforms, and they are used to map elements of the user model into views, windows, controls, and various standardized data representations. While the guidelines are broadly similar, they differ in their emphasis on general design principles and in their use of different input and output devices.

Necessary as the platform-specific guidelines are, they are not sufficient for mapping from user model to user interface. Designers also need a deep understanding of how people perceive, think about, and interact with the real world. This knowledge is used in all phases of the design process, from initial interview to final usability studies and redesign. The author structures this knowledge for his own work as a set of models used to guide the transformations as a design progresses from step to step as shown in Figure 4.1. These models and the areas they affect include:

  Perceptual — Grouping of visual elements; task layout; use of color, sound, and animation.
  Cognitive — Response times; short-term memory; task flow.
  Learning — On-line reference materials; use of prompting cues.
  Interaction — Use of direct manipulation vs. language; social patterns.
  Work — Purposeful activity, collaboration.

Perceptual, learning, and cognitive processing models suggest how information is best presented so that people can recognize and utilize it most easily, while models of work and interaction with the physical world and with other people provide guidelines for direct manipulation and the use of social cues during collaboration. These models provide a number of patterns that can be used to structure the overall design, fill gaps in the platform-specific guidelines, design for nonstandard environments, and anticipate the consequences of design decisions.

The design of the user interface proceeds in three overlapping phases based primarily on the representations used in each phase: rough layout, focused on transforming task flow into simple sketches of proximate window layouts; interaction design, which transforms the task tree and objects into interactive prototypes; and detailed design specification, which defines the final graphics, terminology, menus, messages, and dialogues. Just as with the earlier design stages, these phases can be pipelined, and the completed portions checked for usability and communicated to the engineering and documentation teams as the design elements firm up.

