= Measuring Instruments =

== Basics ==
=== DECIDE framework ===
==== DETERMINE THE GOALS ====

* What are the high-level goals of the evaluation?
* Who wants it and why?
* The goals influence the approach used for the study.

==== EXPLORE THE QUESTIONS ====

Define (sub)goals & (sub)research questions.

==== CHOOSE EVALUATION APPROACH & METHODS ====

The evaluation approach influences the methods used and, in turn, how data is collected, analyzed and presented.

==== IDENTIFY PRACTICAL ISSUES ====

For example:

* Select users.
* Stay on budget.
* Stay on schedule.
* Find participants.
* Select equipment.

Perform a pilot (trial) study!

==== DECIDE ABOUT ETHICAL ISSUES ====

For example: obtain informed consent, protect participants' privacy, and handle collected data confidentially.

==== EVALUATE, ANALYZE, INTERPRET AND PRESENT THE DATA ====

The approach and methods used influence how data is evaluated, analyzed, interpreted and presented.

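A minimal sketch of what the quantitative side of this step can look like: descriptive statistics and a simple effect size for hypothetical task-completion times of two design variants. The variant names and numbers below are invented for illustration, and only Python's standard-library statistics module is used; adapt the analysis to your own metrics and study design.

{{code language="python"}}
from statistics import mean, stdev

# Hypothetical task-completion times (seconds) for two design variants.
# All numbers are invented purely for illustration.
variant_a = [38.2, 41.5, 35.9, 44.1, 39.7, 42.3]
variant_b = [31.4, 29.8, 34.6, 30.2, 33.1, 28.9]

def summarize(name, times):
    """Print simple descriptive statistics for one condition."""
    print(f"{name}: n={len(times)}, mean={mean(times):.1f} s, sd={stdev(times):.1f} s")

def cohens_d(a, b):
    """Cohen's d effect size based on the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled_sd

summarize("Variant A", variant_a)
summarize("Variant B", variant_b)
print(f"Effect size (Cohen's d): {cohens_d(variant_a, variant_b):.2f}")
{{/code}}
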
=== IMPACT framework ===

* Intention: Clarify objectives and hypotheses/claims
* Metrics & measures: What, how and why
* People: Target group & participants
* Activities: Derive activities from use cases
* Context: Social, ethical, physical, etc. aspects
* Technologies: Hardware and software

==== Formative evaluation ====

Focuses on the various processes of the human-technology interaction. Derive open questions from your design specification.

==== Summative evaluation ====

Focuses on the overall effects of the human-technology interaction.

==== Qualitative data ====

* Words
* Drawings

==== Quantitative data ====

* Numbers
* Statistics

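To complement the sketch above, a minimal example of summarizing qualitative data: open-ended comments that have already been coded into themes are tallied with Python's collections.Counter. The comments and theme names are invented for illustration.

{{code language="python"}}
from collections import Counter

# Hypothetical interview comments, already coded into themes by the evaluator.
coded_comments = [
    ("I never found the export button", "navigation"),
    ("The colours feel calm and friendly", "aesthetics"),
    ("I was unsure whether my changes were saved", "feedback"),
    ("Menus are where I expected them to be", "navigation"),
    ("The confirmation message reassured me", "feedback"),
]

# Count how often each theme occurs across all comments.
theme_counts = Counter(theme for _comment, theme in coded_comments)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} comment(s)")
{{/code}}
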
== GOOD EVALUATION ==

* Establish convincing arguments for your design solution by conducting complementary and regular evaluations at different stages of your design process, using appropriate evaluation methods (e.g. summative, formative, expert-based, observational, ...).
* Evaluations should result in insights into possible problems and their causes, in order to support refinement of your design specification.
* Look at user experience in its full breadth: effectiveness, efficiency, satisfaction, learnability, mood, connectedness, ...