b. Use Case with Claims
Objective | Helping PwD reinforce their long-term memory using reminiscence games. |
TDP | |
IDP | |
Actors | Family members, the Robot, and the PwD. |
Pre-Condition | |
Post-Condition | |
Action Sequence | 1. The Robot waits at its base. 2. The Robot drives to the PwD at a time when the PwD has no scheduled activities and is relaxing, e.g., watching TV. 3. The Robot starts the interaction sequence described in the 2d Interaction Design Pattern. 4. The Robot drives back and rests at its base position. |
Claims (title) | Function | Effect(s) | Action Sequence Step(s) |
CL1: PwD will engage more with the use of emotion adaptability. The robot may display any emotion; the claim concerns purely the quantity of engagement. | • Enable emotional communication for Persons with Dementia (PwD) through multimodal interaction (reading, listening, seeing). • Support quantity-based emotional expression via robot interaction. • Act as a responsive interface for emotional feedback. | • Increased emotional engagement for PwD. • Improved attention and participation in the interaction due to adaptable emotions. • Enhanced communication through visual, auditory, and textual feedback. | 1. The Robot detects the presence of the PwD. 2. The Robot initiates interaction using an adaptive emotional tone (e.g., happy, calm). 3. The PwD engages by reading, listening, or observing the emotion. 4. The PwD responds through verbal, physical, or emotional cues. 5. The Robot adjusts its emotion in real time to maintain engagement. 6. The interaction logs the engagement level based on the frequency/quantity of responses. |
CL2: The emotions will be more positive with the use of emotion adaptability. Duration does not matter; only the emotions registered with the AffectButton before and after the interaction are compared. | • Allow the PwD or caregiver to use the AffectButton to register emotional feedback. • Use emotion adaptability to improve pre- and post-interaction emotion quality. | • Clearer emotional input from the user before and after the interaction. • The system adapts based on the emotional shift, improving tailored responses. • Emotional states become trackable for care improvement over time. | 1. The user presses the AffectButton to indicate the pre-interaction emotion. 2. The Robot adjusts its emotional tone accordingly. 3. The interaction (e.g., conversation, storytelling) takes place. 4. The user presses the AffectButton again to indicate the post-interaction emotion. 5. The system compares pre/post feedback to compute an adaptability score. 6. Emotional-response parameters are updated for future sessions. |
CL3: The emotions will be more positive with the use of AI-generated voices of relatives. | • Use an AI-generated voice to convey a wide range of emotional tones. • Support natural, adaptive dialogue for improved user comfort. | • More human-like, emotionally resonant interaction. • Improved clarity and engagement for auditory learners or visually impaired users. • Increased trust and connection between the PwD and the robot. | 1. The Robot initiates speech using an AI voice cloned from family members, with a selected emotional tone (e.g., soothing). 2. The PwD responds verbally or non-verbally. 3. The AI voice adjusts in real time based on detected cues (e.g., tone of voice, facial expression). 4. The emotional tone enhances rapport and responsiveness during the activity. 5. The session ends with the AI voice summarizing or providing emotional closure. 6. Voice-tone data is logged for adaptation in future sessions. |
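The quantity-based engagement loop in CL1 (adapt the emotion when the PwD stops responding, count responses as the engagement measure) can be sketched as follows. This is a minimal illustration only: the emotion palette, the boolean response cues, and the adaptation rule are hypothetical assumptions, not part of the specification.

```python
# Minimal sketch of the CL1 loop. The emotion labels and cue
# representation are illustrative placeholders, not from the spec.
EMOTIONS = ["happy", "calm", "surprised"]  # hypothetical tone palette


def run_session(cues):
    """Cycle to the next emotion whenever the PwD does not respond,
    and count responses as the engagement measure (quantity-based)."""
    emotion_idx = 0
    engagement = 0
    for responded in cues:  # one boolean per interaction step
        if responded:
            engagement += 1  # CL1: engagement is counted, not rated
        else:
            # no response: adapt by switching to another emotion
            emotion_idx = (emotion_idx + 1) % len(EMOTIONS)
    return engagement, EMOTIONS[emotion_idx]


# Example: three responses, one missed step that triggers an adaptation.
count, final_emotion = run_session([True, True, False, True])
```

The returned engagement count would be what step 6 of CL1 logs for the session.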
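The pre/post comparison in CL2 (steps 5 and 6) can be sketched as a small scoring routine. This assumes the AffectButton reports emotion as pleasure/arousal/dominance values; the class names, the score definition (change in pleasure), and the tone-update rule are illustrative assumptions, not the project's actual method.

```python
from dataclasses import dataclass


@dataclass
class AffectSample:
    """One AffectButton reading; fields assumed to lie in [-1, 1]."""
    pleasure: float
    arousal: float
    dominance: float


def adaptability_score(pre: AffectSample, post: AffectSample) -> float:
    """Signed change in pleasure: a positive value means the interaction
    left the PwD in a more positive emotional state (CL2)."""
    return post.pleasure - pre.pleasure


def update_tone(current_tone: str, score: float, threshold: float = 0.1) -> str:
    """Step 6 sketch: keep the emotional tone if it helped, otherwise
    switch to an alternative. The tone labels are hypothetical."""
    if score >= threshold:
        return current_tone  # tone worked, reuse it next session
    return "calm" if current_tone != "calm" else "happy"


# Example session: pleasure rises from -0.2 to 0.4, so the tone is kept.
pre = AffectSample(pleasure=-0.2, arousal=0.1, dominance=0.0)
post = AffectSample(pleasure=0.4, arousal=0.2, dominance=0.1)
next_tone = update_tone("happy", adaptability_score(pre, post))
```

Keeping the score signed (rather than absolute) matters here: CL2 is specifically about emotions becoming *more positive*, so a large negative shift must count against the current parameters, not for them.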