Changes for page b. Use Case with Claims
Last modified by Tonny Chen on 2025/04/24 18:23
From version 8.1
edited by Tonny Chen
on 2025/04/01 01:29
Change comment:
There is no comment for this version
To version 14.1
edited by Tonny Chen
on 2025/04/24 18:22
Change comment:
There is no comment for this version
Summary
- Page properties (1 modified, 0 added, 0 removed)
- Objects (0 modified, 1 added, 0 removed)
Details
- Page properties
- Content
... ... @@ -1,6 +1,10 @@
1 1 (% style="background-color:#ffffff" %)
2 2 |(% style="width:166px" %)**Objective**|(% style="width:653px" %)(((
3 3 Helping PwD reinforce their long-term memory using reminiscence games.
4 +
5 +Remember their relatives voice with AI-voice learning.
6 +
7 +Increasing positive emotions through Face Recognition
4 4 )))
5 5 |(% style="width:166px" %)**TDP**|(% style="width:653px" %)
6 6 |(% style="width:166px" %)**IDP**|(% style="width:653px" %)
... ... @@ -18,33 +18,65 @@
18 18 * The Robot has learned more about the interaction with the PwD, which will be used to finetune its input and output.
19 19 )))
20 20 |(% style="width:166px" %)**Action Sequence**|(% style="width:653px" %)(((
21 -~1. The Robot stands in its base
25 +1. The Robot needs input from the PwD relatives about their past and their voice too.
26 +1. The Robot stands in its base
27 +1. The Robot drives to the PwD during a time where the PwD has no activities and is just chilling. Watching TV etc.
28 +1. The Robot starts with the interaction with the PwD
29 +1. The Robot starts the timer on how long the session is going.
30 +1. The Robot notes the starting emotion.
31 +1. Robot initiates interaction using adaptive emotional tone (e.g., happy, calm) using the voice of the relatives.
32 +1. PwD responds through verbal, physical, or emotional cues.
33 +1. Robot adjusts emotion in real-time to maintain engagement.
34 +1. Interaction logs engagement level based on frequency/quantity of response.
35 +1. When the PwD, is not interested the robot drives back to its base.
36 +)))
22 22 
23 -2. The Robot drives to the PwD during a time where the PwD has no activities and is just chilling. Watching TV etc.
38 +|(% style="width:180px" %)**Claims (title)**|(% style="width:206px" %)**Function**|(% style="width:338px" %)**Effect(s)**|(% style="width:348px" %)**Action Sequence Step(s)**
39 +|(% style="width:180px" %)(((
40 +CL1: PwD will engage more with the usage of emotion adaptability
24 24 
25 -2. The Robot activates the interaction sequence in 2d Interaction Design Pattern.
42 +
43 +)))|(% style="width:206px" %)(((
44 +The robot tracks the time of how long
45 +)))|(% style="width:338px" %)(((
46 +• Increased emotional engagement for PwD.
26 26 
27 -3. The Robot drives away and rests in its base position.
48 +• Improved attention and participation in interaction due to adaptable emotions.
49 +
50 +• Enhanced communication through visual, auditory, and textual feedback.
51 +
52 +
53 +)))|(% style="width:348px" %)(((
54 +3
28 28 )))
56 +|(% style="width:180px" %)(((
57 +CL2: The emotions will be more positive with the usage of emotion adaptability.
58 +)))|(% style="width:206px" %)(((
59 +The robot tracks the emotional state of the PwD, before starting the reminiscence therapy vs after reminiscence therapy
60 +)))|(% style="width:338px" %)(((
61 +• Clearer emotional input from user before and after interaction.
29 29 
30 -|(% style="width:232px" %)**Claims (title)**|(% style="width:118px" %)**Function**|(% style="width:238px" %)**Effect(s)**|(% style="width:201px" %)**Action Sequence Step(s)**
31 -|(% style="width:232px" %)(((
32 -CL1: PwD will engage more with the usage of emotion adaptability
63 +• System adapts based on emotional shift, improving tailored response.
33 33 
34 -//It can have any emotion, is purely focused on the quantity. //
35 -)))|(% style="width:118px" %)The elder can read, listen and see|(% style="width:238px" %)It can interact with the Robot|(% style="width:201px" %)
36 -|(% style="width:232px" %)(((
37 -CL2: The emotions will be enhanced with the usage of emotion adaptability.
65 +• Emotional states become trackable for care improvement over time.
66 +)))|(% style="width:348px" %)(((
67 +4, 5, 6, 7, 8
68 +)))
69 +|(% style="width:180px" %)(((
70 +CL3: The emotions will be more positive with the usage of AI-voice of relatives.
38 38 
39 -//Time does not matter, only the interaction before and interaction after with the AffectButton//
40 -)))|(% style="width:118px" %)(((
72 +
41 41 
74 +)))|(% style="width:206px" %)(((
75 +The robot needs to learn about the relatives voice first and then can use it as output.
76 +)))|(% style="width:338px" %)(((
77 +• More human-like, emotionally resonant interaction.
42 42 
79 +• Improved clarity and engagement for auditory learners or visually impaired users.
43 43 
44 -
45 -)))|(% style="width: 238px" %)|(% style="width:201px" %)(((
46 -
81 +• Boosts trust and connection between PwD and robot.
82 +)))|(% style="width:348px" %)(((
83 +1
47 47 )))
48 -|(% style="width:232px" %)CL2: The emotions will be enhanced with the usage of AI-voice |(% style="width:118px" %) |(% style="width:238px" %) |(% style="width:201px" %)
49 49 
50 50 
- XWiki.XWikiComments[0]
- Author
... ... @@ -1,0 +1,1 @@
1 +xwiki:XWiki.MarkNeerincx
- Comment
... ... @@ -1,0 +1,1 @@
1 +The idea is that the first "main table" includes the action sequence (and possible one or more alternative sequences). The table below shows the underlying claims of the use, referring to the relevant steps of the action sequence. Now, the overall structure is not clear.
- Date
... ... @@ -1,0 +1,1 @@
1 +2025-04-21 16:24:05.368
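
Illustrative sketch (not part of the page history above): the new action sequence and claims refer to a session timer (step 5), a start/end emotion (step 6, CL2/CL3), and an engagement level based on response frequency (step 10, CL1). The minimal Python sketch below shows one way such session data could be logged; all names (ReminiscenceSessionLog, record_response, finish) are hypothetical and are not taken from the wiki page or any particular robot platform.

{{code language="python"}}
# Sketch only: hypothetical names, not from the wiki page or any robot platform.
from dataclasses import dataclass, field
import time


@dataclass
class ReminiscenceSessionLog:
    """Per-session data named in the action sequence: timer, emotions, responses."""
    start_emotion: str                                          # noted before the session (step 6)
    start_time: float = field(default_factory=time.monotonic)   # session timer (step 5)
    responses: int = 0                                          # verbal/physical/emotional cues (steps 8, 10)

    def record_response(self) -> None:
        # Engagement is approximated by how often the PwD responds.
        self.responses += 1

    def finish(self, end_emotion: str) -> dict:
        # Returns the measures the claims refer to: engagement rate (CL1)
        # and emotion before vs. after the session (CL2/CL3).
        duration = time.monotonic() - self.start_time
        return {
            "duration_s": round(duration, 1),
            "responses_per_min": self.responses / max(duration / 60.0, 1e-6),
            "emotion_before": self.start_emotion,
            "emotion_after": end_emotion,
        }


# Example: a short session with three recorded responses.
log = ReminiscenceSessionLog(start_emotion="neutral")
for _ in range(3):
    log.record_response()
print(log.finish(end_emotion="happy"))
{{/code}}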