Wiki source code of b. Use Case with Claims

Version 13.1 by Trustin Ang on 2025/04/10 19:57

1 (% style="background-color:#ffffff" %)
2 |(% style="width:166px" %)**Objective**|(% style="width:653px" %)(((
3 Help the PwD reinforce their long-term memory using reminiscence games.
4 )))
5 |(% style="width:166px" %)**TDP**|(% style="width:653px" %)
6 |(% style="width:166px" %)**IDP**|(% style="width:653px" %)
7 |(% style="width:166px" %)**Actors**|(% style="width:653px" %)Family members, the Robot, and the PwD.
8 |(% style="width:166px" %)**Pre-Condition**|(% style="width:653px" %)(((
9 * The family members have distinctive happy memories with the PwD to share.
10 * The family members can speak and are willing to let the AI model be trained on their voices.
11 * The PwD can see, hear, and has the motor ability to interact with the robot.
12 * In the early stages of dementia, the PwD is willing to share memories.
13 * All actors are willing to share their data with the AI models.
14 )))
15 |(% style="width:166px" %)**Post-Condition**|(% style="width:653px" %)(((
16 * The PwD has reinforced their happy long-term memories.
17 * The PwD has a better memory of their family members.
18 * The Robot has learned more about interacting with the PwD, which will be used to fine-tune its input and output.
19 )))
20 |(% style="width:166px" %)**Action Sequence**|(% style="width:653px" %)(((
21 ~1. The Robot stands at its base.
22
23 2. The Robot drives to the PwD at a time when the PwD has no activities and is relaxing (e.g., watching TV).
24
25 3. The Robot activates the interaction sequence described in the 2d Interaction Design Pattern.
26
27 4. The Robot drives away and rests at its base position.
28 )))
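The action sequence above can be sketched as a simple control routine. This is a minimal illustration only: `at_base`, `pwd_is_idle`, `drive_to_pwd`, `run_interaction_pattern`, and `return_to_base` are hypothetical hooks standing in for whatever API the robot platform actually exposes.

```python
class ReminiscenceRobot:
    """Minimal sketch of the four-step action sequence; all hooks are hypothetical."""

    def __init__(self, platform):
        self.platform = platform  # stand-in for the real robot API

    def run_once(self):
        # 1. The Robot stands at its base (starting state).
        if not self.platform.at_base():
            self.platform.return_to_base()
        # 2. Drive to the PwD only when they have no activities and are relaxing.
        if self.platform.pwd_is_idle():
            self.platform.drive_to_pwd()
            # 3. Activate the interaction sequence (2d Interaction Design Pattern).
            self.platform.run_interaction_pattern("2d")
        # 4. Drive away and rest at the base position.
        self.platform.return_to_base()
```

The idle check in step 2 is the key design choice: the robot approaches only during unstructured time, so the session never competes with scheduled activities.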
29
30 |(% style="width:180px" %)**Claims (title)**|(% style="width:289px" %)**Function**|(% style="width:247px" %)**Effect(s)**|(% style="width:348px" %)**Action Sequence Step(s)**
31 |(% style="width:180px" %)(((
32 CL1: The PwD will engage more with the use of emotion adaptability.
33
34 //The robot can express any emotion; this claim is purely about the quantity of engagement.//
35 )))|(% style="width:289px" %)(((
36 • Enable emotional communication for Persons with Dementia (PwD) through multimodal interaction (read, listen, see).
37
38 • Support quantity-based emotional expressions via robot interaction.
39
40 • Act as a responsive interface for emotional feedback.
41 )))|(% style="width:247px" %)(((
42 • Increased emotional engagement for PwD.
43
44 • Improved attention and participation in interaction due to adaptable emotions.
45
46 • Enhanced communication through visual, auditory, and textual feedback.
47
48
49 )))|(% style="width:348px" %)(((
50 ~1. Robot detects presence of PwD.
51
52 2. Robot initiates interaction using adaptive emotional tone (e.g., happy, calm).
53
54 3. PwD engages through reading, listening, or observing the emotion.
55
56 4. PwD responds through verbal, physical, or emotional cues.
57
58 5. Robot adjusts emotion in real-time to maintain engagement.
59
60 6. Interaction logs engagement level based on frequency/quantity of response.
61 )))
62 |(% style="width:180px" %)(((
63 CL2: The emotions will be more positive with the use of emotion adaptability.
64
65 //Time does not matter; only the AffectButton readings before and after the interaction are compared.//
66 )))|(% style="width:289px" %)(((
67 • Allow PwD or caregiver to use AffectButton to register emotional feedback.
68
69 • Use emotion adaptability to enhance pre- and post-interaction emotion quality.
70
71
72
73 )))|(% style="width:247px" %)(((
74 • Clearer emotional input from user before and after interaction.
75
76 • System adapts based on emotional shift, improving tailored response.
77
78 • Emotional states become trackable for care improvement over time.
79 )))|(% style="width:348px" %)(((
80 ~1. User presses AffectButton to indicate pre-interaction emotion.
81
82 2. Robot adjusts emotional tone accordingly.
83
84 3. Interaction (e.g., conversation, storytelling) occurs.
85
86 4. User presses AffectButton again to show post-interaction emotion.
87
88 5. System compares pre/post feedback for adaptability scoring.
89
90 6. Emotional response parameters updated for future sessions.
91 )))
92 |(% style="width:180px" %)(((
93 CL3: The emotions will be more positive with the use of AI voices of relatives.
94
95
96
97 )))|(% style="width:289px" %)(((
98 • Use AI-generated voice to convey a wide range of emotional tones.
99
100 • Support natural, adaptive dialogue for improved user comfort.
101 )))|(% style="width:247px" %)(((
102 • More human-like, emotionally resonant interaction.
103
104 • Improved clarity and engagement for auditory learners or visually impaired users.
105
106 • Boosted trust and connection between the PwD and the robot.
107 )))|(% style="width:348px" %)(((
108 ~1. Robot initiates speech using AI voice from family members with selected emotional tone (e.g., soothing).
109
110 2. PwD responds verbally or non-verbally.
111
112 3. AI voice adjusts in real time based on detected cues (e.g., tone, facial expression).
113
114 4. Emotional tone enhances rapport and responsiveness during activity.
115
116 5. Session ends with AI voice summarizing or giving emotional closure.
117
118 6. Voice tone data logged for adaptation in future sessions.
119 )))
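Step 6 of the CL1 sequence logs engagement by the frequency/quantity of responses. A sketch of that logging, under the assumption that each PwD response is reported as a cue label ("verbal", "physical", or "emotional", per step 4) and that engagement is measured as responses per minute; both choices are illustrative, not part of the documented design.

```python
from dataclasses import dataclass, field

@dataclass
class EngagementLog:
    """Counts PwD responses per session, as in CL1 step 6 (quantity-based)."""
    responses: list = field(default_factory=list)

    def record(self, cue: str):
        # cue is "verbal", "physical", or "emotional" (CL1 step 4).
        self.responses.append(cue)

    def engagement_level(self, session_seconds: float) -> float:
        # Engagement as responses per minute, regardless of which emotion
        # was shown ("any emotion, purely the quantity").
        return len(self.responses) / (session_seconds / 60.0)

log = EngagementLog()
for cue in ["verbal", "physical", "verbal"]:
    log.record(cue)
print(log.engagement_level(90.0))  # 3 responses in 1.5 minutes -> 2.0 per minute
```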
120
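CL2's pre/post comparison (steps 1, 4, and 5) can be sketched as follows. The AffectButton reports emotion as a (pleasure, arousal, dominance) triple; reducing the comparison to the change in pleasure is an illustrative assumption for the adaptability score, not the system's actual scoring rule.

```python
def adaptability_score(pre: tuple, post: tuple) -> float:
    """Compare pre- and post-interaction AffectButton readings (CL2 steps 1-5).

    Each reading is a (pleasure, arousal, dominance) triple in [-1, 1].
    The score here is simply the change in pleasure: a positive score
    means the registered emotion became more positive.
    """
    return post[0] - pre[0]

# Example session: slightly negative before, clearly positive after.
pre = (-0.2, 0.1, 0.0)   # AffectButton pressed before the interaction
post = (0.5, 0.3, 0.1)   # pressed again after the interaction
score = adaptability_score(pre, post)
print(round(score, 2))  # 0.7 -> the emotion shifted in the positive direction
```

Because only the two button presses matter, session length never enters the score, matching the note that "time does not matter".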
121
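Step 3 of the CL3 sequence (the AI voice adjusting in real time to detected cues) could look like the sketch below. The cue labels and the tone-transition table are assumptions for illustration; a real system would drive the transitions from its affect-recognition output.

```python
def adjust_voice_tone(current_tone: str, detected_cue: str) -> str:
    """Real-time tone adjustment for the relative's AI voice (CL3 step 3).

    The transition table is illustrative: unknown (tone, cue) pairs
    leave the tone unchanged, so the voice never changes abruptly.
    """
    transitions = {
        ("soothing", "agitated"): "soothing",  # stay calm if the PwD is agitated
        ("soothing", "smiling"):  "happy",     # mirror positive affect
        ("happy",    "confused"): "soothing",  # slow down when confusion is detected
    }
    return transitions.get((current_tone, detected_cue), current_tone)

tone = "soothing"
for cue in ["smiling", "confused", "neutral"]:
    tone = adjust_voice_tone(tone, cue)
print(tone)  # soothing -> happy -> soothing -> soothing
```

Logging each (tone, cue) pair per session would then supply the data for step 6, where voice tone parameters are adapted for future sessions.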