Measuring Instruments

In the lecture on Evaluation, we learned about evaluation frameworks, including the DECIDE and IMPACT frameworks, as well as experimental design and the different types of data. These frameworks and methods can be used in our evaluation to gain useful insights into the prototype.

Evaluation is an important part of product design and can run from the very beginning of a project to its end. In the field of Human-Computer Interaction, product evaluation helps researchers identify good and bad designs, determine how usable features are, discover features that were previously overlooked, and compare design choices to support decision-making.

Frameworks

DECIDE Framework

Determine the goals

  • What are the high-level goals of the evaluation?
  • Who wants it and why?
  • The goals influence the approach used for the study.
    In our evaluation, our goals are to check whether the different stakeholders are able to use our prototype smoothly, to investigate how Pepper affects the stakeholders' lives, and to use the evaluation results to improve our prototype.

Explore the questions
    Break the high-level goals down into concrete research questions. Our research questions are:

  • Are the different stakeholders able to use our prototype smoothly?
  • Does the prototype allow the PwD greater autonomy in their day-to-day life? 
  • Does the prototype improve the emotional state of the PwD and their relatives?

Choose the evaluation approach and methods
    The evaluation approach influences the methods used, and in turn, how data is collected, analyzed, and presented.

Identify the practical issues
    In our case, the most important practical issue is recruiting our classmates as participants, since we do not have access to real people with dementia for the evaluation. In addition, we have to set up a schedule for when to evaluate our prototype.

Decide how to deal with ethical issues
    Ethical considerations underpin the whole evaluation. We will inform all participants about the procedure and obtain their consent before starting the evaluation. Participants have the right to know their tasks, to know what will happen to the collected data, and to stop participating and leave whenever they wish.

Evaluate, analyze, interpret and present the data
    This step covers how the data is evaluated, analyzed, interpreted, and presented. To make the results reliable and valid, we have to consider biases, reliability, validity, scope, and ecological validity.
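
One aspect that can be checked numerically is the reliability of a post-test questionnaire, for instance through Cronbach's alpha as an estimate of internal consistency. The snippet below is only a minimal Python sketch; the Likert scores and item count are invented for illustration and are not data from our evaluation.

import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a participants-by-items matrix of questionnaire scores."""
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item across participants
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of the participants' total scores
    k = item_scores.shape[1]                          # number of items
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert answers: 4 participants x 3 questionnaire items
scores = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")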

IMPACT Framework

Intention: Clarify objectives and hypotheses/claims
Metrics & Measures: What, how and why
People: Target group & participants
Activities: Derive activities from use cases
Context: Social, ethical, physical, etc. aspects
Technologies: Hardware and software

Formative Evaluation

Focuses on the various processes of the human-technology interaction.
Open questions are derived from the design specification.

Summative Evaluation

Focuses on the overall effects of the human-technology interaction.
Research questions and hypotheses are specified based on claims.

Data

Qualitative Data

Explore, discover, instruct

  • Understand and interpret interactions
  • Gain insight into views and perspectives
  • Open-ended, like interviews and participant observations
  • Try to identify patterns, features, themes
  • Study groups tend to be smaller
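
Once interview fragments have been coded by hand, a small script can help tally how often each theme recurs. The sketch below is purely illustrative; the participant labels and theme codes are hypothetical.

from collections import Counter

# Hypothetical codes assigned by hand to interview fragments during analysis
coded_fragments = [
    ("P1", "ease_of_use"), ("P1", "trust"),
    ("P2", "ease_of_use"), ("P2", "autonomy"),
    ("P3", "autonomy"),    ("P3", "trust"), ("P3", "ease_of_use"),
]

theme_counts = Counter(code for _, code in coded_fragments)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} fragments")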

Quantitative Data

Describe, explain, predict

  • Measure outcomes, test hypotheses, and make predictions
  • Precise measurements
  • Identify statistical relationships
  • Larger number of participants
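
For instance, quantitative measurements such as task completion times and satisfaction ratings can be summarized and checked for statistical relationships. The numbers below are invented for illustration only, and the sketch assumes Python with SciPy is used for the analysis.

from statistics import mean, stdev
from scipy.stats import pearsonr

# Invented measurements from a hypothetical usability session with the prototype
task_time_s  = [42, 55, 38, 61, 47, 50]   # task completion time per participant (seconds)
satisfaction = [4, 3, 5, 2, 4, 3]         # 1-5 rating per participant

print(f"mean time: {mean(task_time_s):.1f} s (SD {stdev(task_time_s):.1f})")
r, p = pearsonr(task_time_s, satisfaction)
print(f"time vs. satisfaction: r = {r:.2f}, p = {p:.3f}")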

Experimental Design

Within-subjects

Each participant goes through all conditions

  • Few subjects needed
  • Reduced variability
  • More statistical power
  • Risk of practice/fatigue effects
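
Because every participant experiences all conditions, paired statistics apply. A minimal sketch with invented task times, assuming SciPy is used:

from scipy.stats import ttest_rel

# Hypothetical task times (seconds): the same five participants try both prototype versions,
# ideally in counterbalanced order to limit practice/fatigue effects
times_version_a = [52, 47, 60, 55, 49]
times_version_b = [45, 44, 51, 50, 46]

t, p = ttest_rel(times_version_a, times_version_b)
print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")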

Between-subjects

Each participant is assigned to only one condition

  • Simplicity
  • Less chance of practice/fatigue effects
  • More time, effort and participants
  • Individual variability
  • Environmental factors
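
Because each participant sees only one condition, the groups are independent and an unpaired test applies. Again a minimal sketch with invented numbers, assuming SciPy:

from scipy.stats import ttest_ind

# Hypothetical task times (seconds): each group of participants sees only one condition
group_without_pepper = [58, 63, 55, 60, 66, 59]
group_with_pepper    = [49, 54, 47, 52, 57, 50]

t, p = ttest_ind(group_without_pepper, group_with_pepper, equal_var=False)  # Welch's t-test
print(f"independent t-test: t = {t:.2f}, p = {p:.3f}")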