A7: User testing and redesign - due Thursday, 11/17, 11:59pm


The goal of this assignment is to test your app prototype with two people to further streamline your app and to inform the online A/B testing that you will do for A8.


Step 1: In-person user testing

Observe at least two different people testing your app prototype in-person. Try to find representative testers whom you would expect to use your app in real life.

One person will facilitate the test and interact with the user, and the rest of the team will be in charge of taking notes/photos/video/etc. The facilitator should use the protocol that you developed in A6. Don't forget to have your user sign a printed consent form before beginning the test (some user testing may be unethical without informed consent). Unlike in A3's heuristic evaluations, this time around your user will not be writing down the problems they find for you. It's your job to learn what the people testing your prototype are thinking, and what mental models they are forming; the feedback they provide you will be very valuable for your next iteration. Your main goal here is to find ways to improve your interface.

Submit your testing protocol and signed consent form for each participant. Immediately after each test, do a quick debrief with your team and write down any reactions or thoughts that came up during the testing session.

Take a photo or draw a sketch of each tester using your prototype. As with the A1 needfinding assignment, these photos with captions should show breakdowns and design opportunities. Contextualize them by capturing the action, e.g. by using over-the-shoulder photos of the user operating your app, and by showing the setting. Look for other breakdowns and pain points in your interface and try to understand what the problems are and how you might fix them. When possible, modify/update your app prototype before testing the next participant so that each session yields fresh data.

Watch the In-Person Experiments lecture video for practical tips on running these sorts of experiments.


Step 2: Compile your findings

After testing, take some time with your team to reflect on your findings. Discuss them as a team and identify general patterns in people's behavior. When you find interesting points, dig into them: ask each other questions, replay the different tests, and analyze the decisions people made and the other paths they could have taken.

Submit a detailed and understandable list of changes that you will implement as a result of your testing and discussion, with justifications. Fix the bugs that are either small and easy to fix, or too severe to ignore. Make sure that you do this before moving on to the next step of this assignment.

Step 3: Create a Meaningful Redesign

Choose ONE non-trivial component of your prototype to redesign in order to either resolve a breakdown or provide a potentially better solution than what was created before, as informed by user testing. To do this, make a duplicate of the specific webpage where the change will take place and change the URL route. For example, if you are redesigning a component of the homepage, keep the current homepage ("/homepage") and create a second page with a route "/homepage2". Do not overwrite the current page and do not create an entirely new repository for this redesign! You will be submitting both the original URL and the redesign URL.
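How you duplicate the page depends entirely on your stack. As a minimal sketch, if your prototype happened to be served by a Flask app (your framework, page names, and markup will differ; the button markup below is purely hypothetical), the two routes could look like this:

```python
# Minimal sketch of serving an original page and its redesign at two routes.
# Assumes a Flask prototype; adapt the idea to whatever framework you use.
from flask import Flask

app = Flask(__name__)

# Hypothetical page contents, differing only in the ONE redesigned component.
ORIGINAL_PAGE = "<h1>Homepage</h1><button id='signup'>Sign up</button>"
REDESIGN_PAGE = "<h1>Homepage</h1><a id='signup' href='/join'>Join now</a>"

@app.route("/homepage")
def homepage():
    # Original design: left untouched while your TA grades this assignment.
    return ORIGINAL_PAGE

@app.route("/homepage2")
def homepage_redesign():
    # Duplicated page with only the redesigned component swapped in.
    return REDESIGN_PAGE
```

Both routes live in the same app and repository, so you submit two URLs that differ only in the path, exactly as the example routes above suggest.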

The redesign will be used for online A/B testing in the next assignment (A8), which requires the use of the chi-squared statistical test. Therefore, the aspect you choose to redesign must be testable with a chi-squared test, which usually means an outcome that can be classified binarily (e.g. the user clicked or didn't click a button). The redesigned component must also differ from the original design in a substantial, noticeable way; changing the background color alone is not enough. We highly suggest that you read A8 in detail to understand what kinds of redesigns can be measured by chi-squared tests during online A/B testing. See the lecture video on "comparing rates" for more information.
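As a concrete illustration of what "testable with a chi-squared test" means, suppose your redesign changes a sign-up button and you count how many visitors to each version click it. With made-up counts (the numbers below are invented for the sketch, not targets for your own test), the analysis next week could look like this:

```python
# Sketch of the chi-squared test used in online A/B testing (A8).
# The counts below are hypothetical; you will use your own logged data.
from scipy.stats import chi2_contingency

# 2x2 contingency table.
# Rows: original page, redesigned page. Columns: clicked, did not click.
table = [
    [30, 70],  # original:  30 of 100 visitors clicked
    [45, 55],  # redesign:  45 of 100 visitors clicked
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
# With the conventional alpha = 0.05, a p-value below 0.05 would suggest
# the click rates genuinely differ between the two designs.
```

Note that the measured behavior is binary per visitor (clicked or not), which is what makes the chi-squared test applicable; a continuous measure such as time-on-page would call for a different test.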

Step 4: Description of Online A/B Test

Submit a description of the online A/B test that you plan to run for next week's assignment (A8); you do NOT need to run the online A/B test yet. The description should explain how your original and redesigned versions of the component differ (from Step 3), and how the effects of these changes will be measured in your online A/B test next week. Also include your best guess at all possible outcomes of next week's test and how you would interpret each, in terms of its implications for the design of the prototype.

Step 5: Update Development Plan

Just as we've been doing in the previous weeks, update your development plan. Add new tasks for this week and the following weeks while marking when existing tasks have been completed. Add stretch goals that you find feasible and adjust other tasks that may be out of reach.

Student Examples

Here are three student examples (note that the assignment might have differed slightly in prior years):

  • Example 1 - This is an example of an A+ level assignment. This group clearly put a lot of thought into their in-person test, and motivated their redesign with the conclusions they drew from it.
  • Example 2 - This is an example of a B-level assignment. This group lost points for not including their consent form for the in-person test. We also wished the feedback was more substantive beyond obvious usability bugs (one of which had been mentioned by the TA in a previous assignment). For the online test description, we were not convinced that measuring click rates was the right metric to measure success.
  • Example 3 - This is an example of an A-level assignment. We liked the clean and well-captioned photos for each participant testing their app. They also tested more than the required two users.

What’s this for? A UX agency perspective

By Mike Davison, Community TA and UX Project Manager

Testing your high-fidelity prototype with users closes the loop. It is vital to ensure your solution meets the needs identified during the first assignment, and that the agency has not simply spent months drifting further from the problem.

It also gives you third-party reflection and suggestions for tweaks to the design. Every problem we learn about and correct here is one we won't have to live with later, after it has been coded and become too costly to change. It's a valuable phase of the process.

Remember this - feedback is not criticism, and feedback is not personal. User-centered design works best when pride is set aside and the feedback of others is incorporated into your design thinking!

Team Submission

For this assignment, ONE person will submit the assignment for their team, listing every team member's name and student ID number (PID) in the assignment submission.

Your write-up will contain the following in one single PDF:

  • Your testing protocol and signed consent forms, as well as any materials you gave to the user as part of your tests (either as text, PDF, or a scanned image). (Testing Protocol & Documentation)
  • Notes taken from user studies with at least TWO users (User Study Notes)
  • Captioned photos for each participant testing your prototype. (Photo Documentation)
  • A detailed list of changes you will implement in your next iteration, with justifications. (Planned Changes Based on Test)
  • The URL of the original prototype you tested and the URL of the implemented alternative redesign of one non-trivial interface element. Very important: the contents of both of these URLs must not change in the upcoming week while your TA is grading this assignment; changing them is a violation of the academic honesty policy. Test your new changes at a different URL. (Meaningful Design)
  • Description of your online A/B test, as well as your guess of all possible outcomes of the A/B test, interpreted in terms of their implications for the design of your prototype. (Description of Planned Online A/B Test)
  • A copy of last week's and this week's development plan embedded in a single PDF in a readable and easy-to-compare format. (Update Development Plan)

Submit your single formatted PDF in Gradescope.

Evaluation criteria & Grading rubric

The rubric below contains criteria that are worth one point each and will be graded independently and in a binary fashion.

  1. Signed consent forms & photos/sketches of each participant are submitted.
  2. Notes taken from performing the user testing protocol on at least TWO users are submitted.
  3. Notes from the first user include breakdowns, errors, or design improvements inspired by that user's testing session.
  4. Notes from the second user include breakdowns, errors, or design improvements inspired by that user's testing session.
  5. List of changes demonstrates changes that your team plans to implement, based off the user testing observations and subsequent discussions.
  6. Redesign stems from user testing, and properly reflects an underlying design breakdown or design opportunity that arose from such testing.
  7. The original and alternative redesign webpage URLs are submitted. (i.e. myapp.herokuapp.com/homepage, myapp.herokuapp.com/homepage2)
  8. The redesigned webpage is fully interactive, functional, and ready for online A/B testing next week.
  9. The redesign only changed a single feature of one page of the app, thus making it possible to test with A/B testing.
  10. The planned A/B test will involve measuring a quantifiable user behavior suitable for a chi-squared test.
  11. All possible outcomes of the A/B test are identified and interpreted in terms of their implications for the design of the prototype.
  12. Development plan is included as a PDF, is easy for your TA to read and understand, and has been updated to show progress and additional tasks in the new version compared to the previous week.