Re: [IxDA Discuss] Formal design review process

2008-04-25 Thread Chauncey Wilson
Hi,

Here are a few more references on formal inspections.


Gunn, C. (1995). An Example of Formal Usability Inspections in
Practice at Hewlett-Packard Company. CHI'95 Proceedings Interactive
Posters.

IEEE Std 1028-1997. (1999). IEEE Standard for Software Reviews. IEEE
Standards Software Engineering. New York: The Institute of Electrical
and Electronics Engineers, Inc.

Kahn, M. K., &amp; Prail, A. (1994). Formal usability inspections. In
J. Nielsen and R. L. Mack (Eds.), Usability Inspection Methods (pp.
141-171). New York: John Wiley &amp; Sons, Inc. This chapter has a
reasonably complete write-up on how to conduct formal usability inspections.

Freedman, D. P., &amp; Weinberg, G. M. (1990). Handbook of walkthroughs,
inspections, and technical reviews: Evaluating programs, projects, and
products (Third Edition). New York, NY: Dorset House Publishing.


On Thu, Apr 24, 2008 at 1:18 PM, Chauncey Wilson
[EMAIL PROTECTED] wrote:
 Hi,

 You are welcome.  I'm writing a book on UCD methods and have sections
 on each of the methods I listed. I have a chapter on Formal Usability
 Inspections that combines some work in UCD with software development
 inspections as well as a few other chapters. I would be happy to kick
 around ideas with you.  One thing that comes up a lot in the
 literature is a task-based approach: have the product team
 prioritize scenarios of use, have the reviewers walk through those,
 and then examine other aspects of the system.  Most of the methods
 call for people to do some independent work and then come
 together as a group.

 Thanks for the kind words.
 Chauncey


 On Thu, Apr 24, 2008 at 12:23 PM, Elise Edson [EMAIL PROTECTED] wrote:
 Thank you so much!  I never thought about doing heuristic evals as part of
 the design review - let alone having participants act out the part of a
 user type!  Thanks for the great ideas and also for taking the time to list
  these detailed references - I'm looking forward to reading about each one!
 
  Elise
 
 
 
[IxDA Discuss] Formal design review process

2008-04-25 Thread Suba Periyasami
How do you conduct design reviews at your company?  Who generally signs off
on the designs?  What are the inputs/outputs for the review (wireframes,
full interaction specification, etc.)?  How about recommended
books/templates/resources on this topic?

--
Design reviews with the product development team are the most interesting
and challenging activity. We use low-fidelity prototypes (that demonstrate
a sequence of interactions) during design reviews. A short review is first
conducted with the design team or with a couple of co-designers to get
feedback on the designs. This also gives the designer a chance to check
whether he has convincing explanations for why the designs/interactions are
designed in a certain way. During the design review with the product team,
the designer leads the session, telling the story: explaining what the product
requirements are, the different concepts he came up with, scenarios where
some concepts wouldn't work, and what his final concept is. The team might
accept the design or propose changes, and the designer should be prepared to
explain why the changes would work best, or wouldn't work for the user in a
given scenario. A final verbal agreement is made between the product lead
and the designer at the end of the session. There might be a list of changes
to be made to the design, or the design might be accepted as final.
The meeting notes, agreements, or proposals are documented at the
bottom of the wireframe so the team can look back at any stage during
the cycle and recall why such decisions were made.
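The decision notes described above can be kept in a lightweight structured form. The sketch below shows one way to log review outcomes against a wireframe; every field name and value here is illustrative, not from any real template:

```python
# Illustrative decision log for design-review outcomes; the field
# names and sample values are assumptions, not a standard format.
from dataclasses import dataclass
from datetime import date

@dataclass
class ReviewDecision:
    wireframe: str       # which wireframe the note is attached to
    decided_on: date     # when the review happened
    decision: str        # "accepted", or the agreed change
    rationale: str       # why the team decided this way

log = [
    ReviewDecision("checkout-v3", date(2008, 4, 25),
                   "accepted with changes",
                   "Single-page flow confused the novice-user reviewer"),
]

# Later in the cycle, the team can look back at why a decision was made.
for entry in log:
    print(entry.wireframe, entry.decision)
```

Keeping the rationale next to the decision is the point: it lets the team recall, months later, why a design was settled a certain way.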

-Suba Periyasami

Welcome to the Interaction Design Association (IxDA)!
To post to this list ... [EMAIL PROTECTED]
Unsubscribe  http://www.ixda.org/unsubscribe
List Guidelines  http://www.ixda.org/guidelines
List Help .. http://www.ixda.org/help


[IxDA Discuss] Formal design review process

2008-04-23 Thread Elise Edson
Hi folks,

My Human Factors team colleagues and I are working to formalize our
processes at our company, and one of the areas that is new for us is the
formal design review.  Right now, our UX designs are reviewed as part of the
software requirements specification (SRS), but we've found that reviewing
them at this level elicits requirements rather than design feedback (big
surprise :-)).

How do you conduct design reviews at your company?  Who generally signs off
on the designs?  What are the inputs/outputs for the review (wireframes,
full interaction specification, etc.)?  How about recommended
books/templates/resources on this topic?

Thanks!
Elise



Re: [IxDA Discuss] Formal design review process

2008-04-23 Thread Daniel Szuc
Hi Elise:

A few starters -

* Walking Through Your Product Design With Stakeholders - 
http://www.uxmatters.com/MT/archives/000199.php

* Cognitive Walkthrough and Heuristic Evaluation in the Contemporary  
Design Process - 
http://www.apogeehk.com/articles/Cognitive_Walkthrough_and_Heuristic_Evaluation_in_the_Contemporary_Design_Process.html

rgds,
Dan
-- 
Daniel Szuc
Principal Usability Consultant
Apogee Usability Asia Ltd
www.apogeehk.com
Usability in Asia

The Usability Kit - www.theusabilitykit.com



Re: [IxDA Discuss] Formal design review process

2008-04-23 Thread Chauncey Wilson
Hello Elise,

There are a number of methods that go by various names: walkthroughs,
inspections, design reviews, and expert reviews. Some approaches that
you might consider, in addition to those that Daniel listed, are
described below. These vary in their degree of formality, with the
formal usability inspection following many of the ground rules of
software inspections.  There are some very good books on how to
conduct software inspections that I've found useful.  Several of the
approaches described below ask reviewers to adopt different
perspectives; I've found this approach quite powerful.  You might,
for example, ask one person to adopt the perspective of a brand-new
user, another to be the consistency czar, and another to be the
work-flow-efficiency inspector.

Chauncey

Participatory Heuristic Evaluation (Muller, Matheson, Page, &amp; Gallup,
1995). This is a variation on the heuristic evaluation that includes
users as well as members of the product team. Muller and his
colleagues also added heuristics that dealt with task and work support
issues.

Cooperative Evaluation (Monk, Wright, Haber, &amp; Davenport, 1993)
Monk and his colleagues (1993) published a procedural guide to a
technique they called cooperative evaluation. Cooperative evaluation
involves pairing a user and designer in an evaluation of a working
version of a product. In the cooperative evaluation, users can freely
ask questions of the designer and the designer can ask questions of
the user.

Heuristic Walkthrough (Sears, 1997)
Sears (1997) developed a technique called a heuristic walkthrough
that had some of the attributes of three UCD methods: a heuristic
evaluation, a perspective-based inspection and a cognitive
walkthrough. In Sears' method, the evaluators were given a prioritized
list of user tasks, a set of heuristics, and thought-provoking
questions derived from the cognitive walkthrough method.

Persona-based inspections.
In this type of inspection, the reviewers take on the perspective of
the key personas and work through a set of tasks.  This approach may
yield different problems for different personas.

Perspective-based inspections.
A perspective-based user interface inspection requires that one or
more individuals evaluate a product's user interface from different
perspectives. The use of multiple perspectives (similar to
role-playing) is meant to broaden the problem-finding ability of
evaluators (Virzi, 1997), especially those colleagues with little or
no background in usability or user interface design.
In perspective-based inspections, inspectors are generally given
descriptions of one or more perspectives that they are to focus on, a
list of user tasks, a set of questions related to the perspective, and
possibly a set of heuristics related to the perspective. Inspectors
are asked to work through tasks from the assigned perspectives.
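If it helps to keep the assignments and findings organized, the bookkeeping for a perspective-based inspection can be sketched in a few lines of code. Everything named below (the perspectives, tasks, and example findings) is illustrative, not taken from any published inspection kit:

```python
# Minimal sketch of bookkeeping for a perspective-based inspection.
# The perspectives, tasks, and findings are illustrative examples.
from dataclasses import dataclass, field

@dataclass
class Perspective:
    name: str
    guiding_questions: list

@dataclass
class Finding:
    task: str
    perspective: str
    note: str

@dataclass
class Inspection:
    tasks: list
    perspectives: list
    findings: list = field(default_factory=list)

    def record(self, task, perspective, note):
        # Each finding is tagged with the perspective that produced it.
        self.findings.append(Finding(task, perspective.name, note))

    def findings_by_perspective(self):
        grouped = {}
        for f in self.findings:
            grouped.setdefault(f.perspective, []).append(f)
        return grouped

novice = Perspective("brand-new user", ["Is the next step obvious?"])
czar = Perspective("consistency czar", ["Do labels match across screens?"])

inspection = Inspection(
    tasks=["Create an account", "Recover a password"],
    perspectives=[novice, czar],
)
inspection.record("Create an account", novice,
                  "No visible entry point for signing up")
inspection.record("Create an account", czar,
                  "'Sign up' and 'Register' used interchangeably")

for name, items in inspection.findings_by_perspective().items():
    print(name, len(items))
```

Grouping findings by perspective also makes it easy to notice when an assigned perspective is under-reporting.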

The Structured Heuristic Evaluation Method
Kurosu, Matsuura, and Sugizaki (1997) proposed a variation on the
heuristic evaluation called the structured heuristic evaluation method
(sHEM).  The sHEM involves multiple evaluation sessions with each
session focusing on one category of usability and a set of associated
heuristics that defined each category. The categories of usability in
Kurosu's sHEM were:
1. Ease of cognition (part 1).
2. Ease of operation.
3. Ease of cognition (part 2).
4. Pleasantness.
5. Novice versus expert users.
6. Users with special care (this category dealt with very young and
elderly users; users who had visual, hearing, or physical
disabilities; left-handed users; and color-blind users).

Cognitive efficiency was the rationale for focusing on only one
category of usability during a session. Kurosu and his colleagues felt
that trying to keep many heuristics in mind while reviewing a product
was difficult for evaluators.
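The one-category-per-session idea can be written down as a simple session plan. The category names come from the list above; the session-plan code itself is just an illustration:

```python
# Sketch: schedule one sHEM session per usability category, so that
# evaluators only hold one set of heuristics in mind at a time.
categories = [
    "Ease of cognition (part 1)",
    "Ease of operation",
    "Ease of cognition (part 2)",
    "Pleasantness",
    "Novice versus expert users",
    "Users with special care",
]

# One session per category, in order.
sessions = [{"session": i + 1, "category": c}
            for i, c in enumerate(categories)]
for s in sessions:
    print(s["session"], s["category"])
```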

Cognitive Walkthrough
The cognitive walkthrough (CW) is a usability inspection technique
that focuses primarily on the ease of learning of a product. The
cognitive walkthrough is based on a theory that users often learn how
to use a product through a process of exploration, not through formal
training courses (Polson &amp; Lewis, 1990). The cognitive walkthrough was
originally designed to evaluate walk-up-and-use interfaces (for
example, museum kiosks, postage machines, and ATMs), but has
been applied to more complex products (CAD systems, operating
procedures, software development tools) that support new and
infrequent users (Wharton, Bradford, Jeffries, &amp; Franzke, 1992;
Novick, 1999).  The cognitive walkthrough is based on the concept of a
hypothetical user and does not require any actual users.

Streamlined Cognitive Walkthrough
Rick Spencer developed a simplified version of the cognitive
walkthrough that was more applicable to fast-paced development
environments.
Spencer, R. (2000). The streamlined cognitive walkthrough method,
working around social constraints encountered in a software
development company. Proceedings of ACM