In this assignment, you will conduct heuristic evaluations (HEs) of another team's paper prototype. The heuristic evaluations will be a way to highlight usability issues in your rapid electronic prototypes. Heuristic evaluations will follow the "How to Conduct a Heuristic Evaluation" and "Ten Usability Heuristics" readings by Jakob Nielsen. For a thorough explanation of the heuristic evaluation process, see the course videos. You will be given time both in studio (week 3) and in class on Tuesday to do Steps 1 and 2, but it is also recommended that you complete this over the weekend, if possible. As you develop your design, focus on satisfaction, and iterate through alternatives to get there. Refrain from advocating for any one particular design. To foreshadow: This course focuses on Interaction Design; you are not required to implement a full database (even if a production version of your design would).
This part will be completed individually. You will be assigned a group to evaluate. Meet with the group and go through their prototype (i.e. pressing buttons) in any order you prefer, jotting down as many problems as you can. Don't try to be "nice" by not reporting problems--everything you write will help the group improve their interface. Also give comparative feedback by noting whether certain problems appear in both prototypes or in just one. If a problem exists in only one prototype, note which one by using the labels Prototype #1 and Prototype #2. You should highlight enough similarities and differences between the two prototypes so that the group receiving your feedback will understand the advantages and drawbacks of each design and know which designs they should implement in the coming weeks.
Use Nielsen's heuristics as a guide and specify what heuristic(s) each problem is related to. If you come across problems that don't fall neatly into particular heuristics, mark that no heuristics apply. Be sure to also use Nielsen's Severity Ratings for Usability Problems to add a severity rating for every problem discussed in your heuristic evaluation.
As an evaluator, you should spend about 20 minutes reviewing both prototypes. Do not wait until the last day to do your evaluations. Your submission will not be accepted if you fail to find a team to evaluate by the deadline.
This part will be completed as a team, though only two team members need to be present. Your team will conduct three Heuristic Evaluation sessions with three different assigned evaluators who will test out your prototypes. Before meeting the evaluators, your team should know the flow of your prototype and be able to quickly find/move the pieces of your prototype to simulate the experience of using an app.
For each HE session, your team members should have specific roles:
Have a copy of Nielsen's heuristics ready for the evaluator to refer to. You may provide evaluators with this worksheet to fill out while evaluating. You may also want to create a Google spreadsheet containing the heuristics with three tabs (1 for each evaluator) so the evaluator can write their feedback directly to you. A spreadsheet shared with your team and the evaluators is recommended because it makes it easier for the evaluator to do their assignment and for you to improve your design.
Each HE session, in which one evaluator evaluates both prototypes, should take 20 minutes. Therefore, the three HE sessions with three evaluators should take about one hour in total.
Think about the general characteristics of the UI you evaluated, and suggest potential improvements to address major usability problems that you identified. Decide which problems are most important, then brainstorm potential solutions for the group you evaluated.
Finally, distill all of this down to one paragraph where you address the major problems you all identified, as well as the potential solutions. Include a sentence or two reflecting on what kinds of things you found heuristic evaluation valuable for, and what kinds of things it's not very useful for. This is done individually.
In addition, write another paragraph, or make a list of changes, that you would make to your prototypes based on your experience evaluating another team's prototypes, and the feedback you received from other teams evaluating your prototypes. Each individual submits a paragraph, but also take the opportunity to discuss your findings with your team.
Create a 1-minute video demonstrating a compelling scenario where your design might be used, and a user persona that might use it. The video is forward-looking: it should illustrate the interactions that your team plans to implement, and how they will be used. Like a storyboard, the video should go from setting to satisfaction: it demonstrates a core usage through an example; it's not about abstractly clicking on all the buttons. The beauty of sketch-level screens is that you can fold in some of the HE feedback before making your video. We'll play each group's video prototypes during studio!
The Walkabout video is a good example of a video prototype that highlights all of the team's intended features and the ways users can interact with those features.
Examples are a great way to learn, but they are not a gold standard for this class, especially as it evolves over the years. Use these examples as a reference, see where they succeed or break down, and make sure your final submissions adhere to the rubric items.
Example 1 - This student did a thorough heuristic evaluation with comparative feedback.
Example 2 - This is a weak example of a heuristic evaluation, as it wasn't very thorough. Problems are listed for each prototype, and the heuristics and severity scores are identified. This example would have benefited from more comparison between the prototypes.
NEW Example - This is an example of a video prototype from this year. Although the first half is not clearly different from Google Maps, the second half does an excellent job at distinguishing itself from other existing apps.
This assignment will be submitted individually, in a single formatted PDF file with the following items:
Students earn one point for each binary criterion met.