i213 Spring 2011: UI Design and Development

March 15, 2011

Experiment Design Assignment

by Tapan Parikh

Due: Thursday, April 19th, 2011 at Noon

Objective: In this assignment, you will develop a plan for testing your user interface. This includes deciding the tasks to support, the users to test, where to test, the method of testing and the metrics used to evaluate success.

What to do:

  1. Decide 3-4 tasks that you would like to support during your user testing. The tasks should provide the user with a complete and realistic feel for how the eventual application will work (or, for large projects, how some aspects will work). This should include both high-level user tasks (decide where you want to eat dinner using this interface), as well as specific, low-level tasks (use the interface to search for a restaurant open at 10PM).
  2. Decide the context where you will test each of these tasks. The location should be as realistic as possible, while allowing for controlled measurement and evaluation.
  3. Decide how you will recruit participants for your final user study. Aim to test each task with 5-15 users. Each task should be tested by appropriate users fitting the characteristics outlined in your persona descriptions. Make sure to conduct a pilot test with design team members or other students before testing with real users.
  4. Determine whether it is possible to include a control condition for the testing. For example, if you have designed a new interface for web search, a control condition could be using Google search. This allows you to measure how your new interface compares with current alternatives. Ideally, you should test each task using both your own prototype and at least one control condition. Those without a suitable control condition should compensate by including more users and/or more performance measures (see below).
  5. Decide the methods you will use to evaluate performance for each task you want to test. This could include quantitative methods for metrics such as efficiency and error rate, as well as qualitative methods for metrics such as subjective satisfaction. Ideally, you will test each task using at least two quantitative and one qualitative method. Decide how you will measure user performance (for example, task execution time or number of errors), behavior and/or responses (for example, using audio, video, screen recorders, interview transcripts and/or detailed notes).
  6. Decide how you will address any learning and/or fatigue effects, whether between tasks or between the control and experimental conditions (a counterbalancing sketch follows this list).
  7. Decide the conditions of success. For experiments with a control, this may be some relative improvement over the control condition. For experiments without a control, this may be some absolute measure of efficiency, accuracy, subjective satisfaction, etc.
  8. Decide the materials you will need to conduct the usability test. This includes the script you will use to conduct the test, the prototype itself, and any questionnaires you will administer before and/or after testing. This should include a demographic questionnaire capturing important user details (age, gender, education, experience with your proposed kind of technology and/or application, etc), as well as informed consent and records release forms allowing you to conduct the test with the user and/or capture audio, images and video. (Example forms are provided here and here).
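
To make steps 4 and 6 more concrete, here is a minimal sketch (in Python, purely as an illustration) of one common way to counterbalance condition order and task order across participants, so that learning and fatigue effects do not systematically favor one condition. The condition names, task names, and participant count are placeholders, not part of the assignment.

```python
# A hypothetical sketch of counterbalancing, assuming two conditions
# (a control and your prototype) and a small set of test tasks.
# All names and counts below are placeholders for illustration only.
from itertools import permutations

conditions = ["control", "prototype"]
tasks = ["decide where to eat dinner", "find a restaurant open at 10PM", "save a favorite"]
num_participants = 10

# Rotate through all task orderings, and alternate which condition comes
# first, so no task is always done last and no condition always benefits
# from practice.
task_orders = list(permutations(tasks))

for p in range(num_participants):
    condition_order = conditions if p % 2 == 0 else list(reversed(conditions))
    task_order = task_orders[p % len(task_orders)]
    print(f"P{p + 1}: conditions {' then '.join(condition_order)}; "
          f"tasks {' -> '.join(task_order)}")
```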

What to turn in:

The preferred method of turn-in is a PDF document, including each of the following components. To avoid a late penalty, e-mail a link to your group’s submission to the professor and TA before 12:00 PM (Noon) on Thursday, April 19th:

  1. Cover sheet including your and your partners’ names, and your chosen focus. Note the time, duration and attendance of each brainstorming session. Include a paragraph describing what each person contributed to the assignment [1/2 page].
  2. A list of tasks that you intend to test during your usability testing.
  3. Description of your control condition(s) and experimental variants, if any (it is highly recommended to include a control if possible) [1/2 page].
  4. Details about how many users you will test for each task, and how you plan to recruit participants, including for the pilot test. If the same persona applies to multiple tasks, it is fine to test the same user on all of them, as long as you account for any learning/fatigue effects [1 page].
  5. Description of the methods that you will use to evaluate the usability of each task, and how you will measure and document user behavior and responses. It is recommended to evaluate each task using at least two quantitative methods and one qualitative method (see the analysis sketch after this list). Mention how you will address any learning and/or fatigue effects – either between tasks, or between the control and experimental conditions. Provide the location where you will test each task [2-3 pages].
  6. A list of conditions of success for each of the tasks [1/2 page].
  7. A description of the supporting materials that you will need for conducting your user test – including forms, testing scripts, questionnaires, props, and the prototype itself [1 page].
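
As a companion to item 5 above, the sketch below shows one way the quantitative measures could be summarized after the sessions: mean task time and mean error count per condition, computed from logged trials. The trial data and the 20% threshold are invented for illustration; your own metrics and success criteria (item 6) may differ.

```python
# A hypothetical sketch of summarizing quantitative usability measures.
# The trial records are invented; in practice they would come from your
# timing sheets, logs, or screen recordings.
from statistics import mean

# Each record: (condition, task, completion time in seconds, error count)
trials = [
    ("control",   "find a restaurant open at 10PM", 94.0, 2),
    ("control",   "find a restaurant open at 10PM", 81.5, 1),
    ("prototype", "find a restaurant open at 10PM", 62.0, 0),
    ("prototype", "find a restaurant open at 10PM", 70.5, 1),
]

def summarize(condition):
    rows = [t for t in trials if t[0] == condition]
    return mean(t[2] for t in rows), mean(t[3] for t in rows)

control_time, control_errors = summarize("control")
proto_time, proto_errors = summarize("prototype")
print(f"control:   mean time {control_time:.1f}s, mean errors {control_errors:.1f}")
print(f"prototype: mean time {proto_time:.1f}s, mean errors {proto_errors:.1f}")

# One possible condition of success (item 6): the prototype is at least 20%
# faster than the control on average. The threshold here is only an example.
improvement = (control_time - proto_time) / control_time
print(f"relative time improvement: {improvement:.0%}")
```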

The total length of your report should be less than 8 pages. Brevity, clarity and focus on the goals of the assignment will be rewarded.

E-mail a link to your group’s project page to the professor and TA before 12:00 PM (Noon) on Thursday, April 19th.

Please contact the professor or the class TA if you have any questions about this assignment.


Functional Prototype Assignment

by Tapan Parikh

Due: Thursday, April 7th, 2011 at Noon

Objective: In this assignment, you will refine your interactive prototype based on the heuristic evaluation. You will also provide enough functionality so that your prototype can be tested by users in a realistic way.

What to do:

  1. Review the formative evaluation you have collected, and that provided by the other group. Decide the points that you plan to address in your next prototype, and any general changes you would like to make.
  2. Decide the tasks that you would like to support during your user testing (see the next assignment). The tasks should provide the user with a complete and realistic feel for how the eventual application will work (or, for large projects, how some specific aspects of the future application will work). The interface should support both high-level user tasks (decide where you want to eat dinner tonight), as well as specific, low-level tasks (search for a restaurant open at 10PM).
  3. Revise and/or re-implement your interactive prototype based on the tasks that you intend to support. It is not essential that you implement all of the back-end functionality, or that you provide a completely refined graphic presentation (polished icons, visual design, etc.). However, your prototype should be complete from an interaction perspective – the user should be able to use all of the functions that he/she needs to perform the tasks that you have outlined. Moreover, the interface should provide sufficient functionality to gracefully support any exploration the user might do while performing these tasks. Any potential user errors should also be handled gracefully (a minimal stubbing sketch follows this list). In short, the system should provide a realistic and complete experience for the user while performing the tasks you have described.
  4. In class on April 7th, be ready with your interactive prototype. Prepare a presentation describing its functionality, and providing an overview of your experimental design. You will receive feedback from the rest of the class, including the TA and instructor, on your prototype and experimental design, including whether or not you have provided enough functionality to support testing with real users.
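
As a hypothetical illustration of item 3, the sketch below shows one way to make a prototype feel complete without a real back end: a stubbed data layer returns canned results, and user errors are caught and answered with a friendly message instead of a crash. The data, function names, and restaurant example are invented; use whatever prototyping tool and language your team has already chosen.

```python
# A hypothetical sketch of stubbing back-end functionality in a prototype.
# Canned data stands in for a real service, and user errors are handled
# gracefully. Everything here is invented for illustration.

CANNED_RESTAURANTS = [
    {"name": "Late Night Noodles", "closes": 23},  # closes at 11 PM
    {"name": "Campus Diner", "closes": 2},         # closes at 2 AM
    {"name": "Soup Spot", "closes": 21},           # closes at 9 PM
]

def search_open_after(hour):
    """Stubbed 'back end': filter canned data instead of querying a server."""
    if not 0 <= hour <= 23:
        raise ValueError("please enter an hour between 0 and 23")
    # Treat closing times before 6 as after midnight.
    return [r for r in CANNED_RESTAURANTS if r["closes"] > hour or r["closes"] < 6]

def handle_search(raw_input):
    """What the interface calls when the user submits a search."""
    try:
        results = search_open_after(int(raw_input))
    except ValueError as err:
        # Show a friendly message rather than letting the error propagate.
        return f"Sorry, that didn't work ({err}). Please try again."
    if not results:
        return "No restaurants found. Try an earlier time."
    return "Open late: " + ", ".join(r["name"] for r in results)

print(handle_search("22"))    # normal case
print(handle_search("late"))  # user typo handled gracefully
```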

What to turn in:

The preferred method of turn-in is a PDF document, including each of the following components. To avoid a late penalty, e-mail a link to your group’s submission to the professor and TA before 12:00 PM (Noon) on Thursday, April 7th:

  1. Cover sheet including your and your partners’ names, and your chosen focus. Note the time, duration and attendance of each brainstorming session. Include a paragraph about what each person contributed to the assignment [1/2 page].
  2. The list of changes that you decided to make based on the heuristic evaluation. Include the original comment, the severity rating provided by the other group, your own assessment, and what you did to address it [1-2 pages].
  3. Describe the tools you used to develop your prototype, and how they helped and/or created additional obstacles [1/2 page].
  4. Provide a link to your second interactive prototype, as well as directions about how to install and/or run it. Supplement with screen shots [1-2 pages].
  5. Be ready with your presentation, supporting material and your interactive prototype to be presented in class on April 7th.

The total length of your report should be less than 4 pages (not including any prototype screen shots). Brevity, clarity and focus on the goals of the assignment will be rewarded.

Please contact the professor or the class TA if you have any questions about this assignment.


March 10, 2011

Formative Evaluation Assignment

by Tapan Parikh

Due: Thursday, March 17th, 2011 before Class

Objective: In this assignment, you will perform a formative evaluation of another group’s Balsamiq prototype. You will start by conducting a “think aloud” exercise. Then, you will conduct a heuristic evaluation, integrating your results with other evaluators to generate an evaluation report.

What to do:

  1. Arrange yourselves into your groups, sitting with 1-2 other groups (see the whiteboard).
  2. Each group should write down 2-3 tasks to be completed by evaluators using your interface.
  3. Pick one person to evaluate the other group’s interface, while the rest of your group conducts the evaluation of your own interface.
  4. Ask the test user from the other group to step through the tasks using the Balsamiq prototype. Utilize the “Think-Aloud” protocol. Take detailed notes of the pilot user’s observations.
  5. Switch roles – the “Think-Aloud” evaluator in your group should now conduct the test; and vice versa.
  6. The remaining group members should now conduct a heuristic evaluation of the other group’s prototype. You are encouraged to use Nielsen’s and Norman’s heuristics. The demonstrator should begin by outlining a task or scenario. After that, the evaluators are free to ask questions. When the evaluators are finished assessing one scenario, the demonstrator should restart the conversation by demonstrating the next task or scenario. Each evaluator should work independently, making sure to take detailed notes. Any evaluations that cannot be completed during class time must be done after class.
  7. Use the following format to keep track of your observations: HE.xls. Each observation requires a numeric index, a heuristic that was violated, a location on the user interface, a description of the problem, a severity rating, and a possible fix (use the 0-4 scale presented in lecture for severity ratings). Each evaluator should aim to document a minimum of 10 usability problems, covering 5 distinct heuristics. Some usability problems may not violate an established heuristic – in that case you can label them “misc” for miscellaneous. However, please make sure that an existing heuristic does not cover what you are describing.
  8. For each problem, you should suggest a possible fix. This is not a technical description, but a simple recommendation of how to fix the problem. Try to be concise – for example, for a button that needs to be changed, simply mentioning that “Button ‘X’ should be renamed ‘Y'” is sufficient.
  9. The next step is to combine your report with others in your group. If you do not get to it in class, you should plan a time to meet to consolidate your individual evaluation reports. The group whose project you evaluated need not be present, but you will probably find it helpful to have a version of their prototype available for reference. You will create one master Excel spreadsheet that contains each unique problem found. Remember – a duplicate is a violation of the same heuristic, in the same location. A violation of a different heuristic in the same location is considered distinct. For each unique problem found, you will need to discuss amongst yourselves to decide on a final severity rating and possible fix. Before turning in your final list, please prioritize the most severe and fixable problems (a consolidation sketch follows this list).
  10. In collaboration with the other evaluators, write a short (less than one page) executive summary that outlines the major problems that you found, and possible solutions for the same, prioritizing those with the highest severity + fixability ratings.
  11. Before planning your next prototype, you should conduct another “Think-Aloud” exercise with your real prospective users. Use the task descriptions you used in class (refined if necessary) to obtain initial feedback and recommendations from users for the next iteration. (This doesn’t need to be completed until after Spring Break.)
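
If your group decides to consolidate the individual spreadsheets with a script rather than by hand (step 9), the sketch below illustrates the core rule: observations that cite the same heuristic at the same location are duplicates, different heuristics at the same location are distinct, and the unique problems are then sorted by severity. The sample rows and field ordering are invented; the HE.xls columns remain the authoritative format, and the final severity and fix should still come from group discussion.

```python
# A hypothetical sketch of consolidating individual heuristic evaluations.
# Duplicates share the same (heuristic, location); the rows are invented and
# the real column layout is defined by HE.xls.

individual_observations = [
    # (heuristic, location, problem, severity 0-4, possible fix)
    ("visibility of system status", "search results page",
     "no indication that results are loading", 3, "add a progress indicator"),
    ("visibility of system status", "search results page",
     "nothing visibly happens after clicking Search", 2, "show a spinner"),
    ("error prevention", "search results page",
     "easy to clear all filters by accident", 2, "add an undo option"),
]

consolidated = {}
for heuristic, location, problem, severity, fix in individual_observations:
    key = (heuristic, location)
    # Keep one row per (heuristic, location). The highest individual severity
    # is kept here only as a starting point for the group's discussion.
    if key not in consolidated or severity > consolidated[key][2]:
        consolidated[key] = (f"{heuristic} @ {location}", problem, severity, fix)

# Prioritize the most severe problems first, as step 9 asks.
for label, problem, severity, fix in sorted(consolidated.values(), key=lambda r: -r[2]):
    print(f"[severity {severity}] {label}: {problem} (possible fix: {fix})")
```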

What to turn in:

The preferred method of turn-in is a PDF document, including each of the following components. To avoid a late penalty, e-mail a link to your group’s submission to the professor, TA and the other group before class on Thursday, March 17th:

  1. Cover sheet including your name, the other evaluators’ names, and the project(s) you evaluated.
  2. Each of your individual evaluation reports, preferably in the following Excel format: HE.xls.
  3. The final consolidated evaluation report, using the same format.
  4. A short executive summary that outlines the major problems you found, and possible solutions for the same, prioritizing those with the highest severity ratings [1 page].

You should be prepared to discuss your report with the other group(s) in class next Thursday. Please note that the other group is depending on your timely feedback. For this reason, late submissions will not receive credit. Please contact the professor or the class TA if you have any questions about this assignment.
