Changes for page Test
Last modified by Clara Stiller on 2022/04/05 13:44
From version 69.1
edited by Vishruty Mittal
on 2022/04/02 13:01
Change comment: There is no comment for this version
To version 57.1
edited by Vishruty Mittal
on 2022/04/02 11:46
Change comment: There is no comment for this version
Summary
Details
Page properties

Content
@@ -23,43 +23,83 @@

= Method =

-A between-subject study with students who play the role of having dementia. Data will be collected with a questionnaire that participants fill out before and after interacting with Pepper. The questionnaire captures different aspects of the conversation along with their mood before and after the interaction with Pepper.
+A between-subject study with students who play the role of having dementia. Data will be collected with a questionnaire (before and after participation) and by observing the participant's body language and the way they respond to Pepper.

-For our between-subject study, our independent variable is Pepper trying to distract the users by mentioning different activities along with the corresponding music. Through this, we want to measure the effectiveness of music and activities in preventing people from leaving the care home, which is thereby our dependent variable. We therefore developed two prototype designs:
-
-Design X: the full interaction flow, in which Pepper suggests activities and uses music to distract people from leaving.
-Design Y: the control condition, in which Pepper simply tries to stop people from leaving by physically keeping its hand on the door.
-
== Participants ==

-17 students who play the role of having dementia. They will be divided into two groups. One group (11 participants) will interact with the design X robot (group 1) while the other group (6 students) will interact with the design Y robot (group 2).
+18 students who play the role of having dementia. They will be divided into two groups. One group (11 participants) will interact with the intelligent robot (group 1) while the other group (7 students) will interact with the unintelligent robot (group 2).
It is assumed that all participants are living at the same care center.
+Before they start, they can choose how stubborn they want to be and where they want to go.

== Experimental design ==

-**Before Experiment:**
+All questions collect quantitative data, using a 5-point Likert scale wherever applicable.

-We will explain to the participants the goal of this experiment and what they need to do, to prevent ambiguity. As our participants are students and only playing the role of having dementia, we will give them a level of stubbornness/willpower with which they are trying to leave the care home.
-Participants will also be given a reason to leave, from the list below:
+1. Observe the participant's mood and see how the conversation goes. Observe the level of aggression (tone, volume, pace).
+1. Observe whether the mood has improved and the decision has been changed.
+1. Observe how natural the conversation is (whether the conversation makes sense).
+1. Participants fill out questionnaires.

-* going to the supermarket
-* going to the office
-* going for a walk
+== Tasks ==

-After this preparation, the participant fills out part of the questionnaire.
+Because our participants only play the role of having dementia, we will give them a level of stubbornness/willpower with which they are trying to leave. We try to detect this level with the robot.
+Participants from group 1 (using the intelligent robot) will also be given one of the reasons to leave listed below:

-**Experiment:**
-The participant begins interacting with Pepper, who is standing near the exit door. The participant and robot have an interaction in which the robot tries to convince him/her to stay inside.
+1. going to the supermarket
+1. going to the office
+1. going for a walk

-**After Experiment:**
-After the participant finishes interacting with Pepper, he/she will be asked to fill out the remaining questionnaire. Almost all the questions in the questionnaire collect quantitative data, using a 5-point Likert scale. The questionnaire also uses images from the Self-Assessment Manikin (SAM) so that users can self-attest to their mood before and after their interaction with Pepper.
+After this preparation, the participant is told to (try to) leave the building. The participant and robot have an interaction in which the robot tries to convince the participant to stay inside.

+== Measures ==
+
+We will measure this both physically and emotionally.
+Physically: whether the participant was stopped from leaving the building or not.
+Emotionally: evaluate their responses to the robot and observe their mood before and after the interaction.
+
+== Procedure ==
+
+{{html}}
+<table width='100%'>
+<tr>
+<th width='50%'>Group 1</th>
+<th width='50%'>Group 2</th>
+</tr>
+<tr>
+<td>intelligent robot</td>
+<td>unintelligent robot</td>
+</tr>
+<tr>
+<td>
+1. Start with a short briefing on what we expect from the participant<br>
+2. Let them fill out the informed consent form<br>
+3. Tell them their level of stubbornness and reason to leave<br>
+4. Let them fill out a question about their current mood (in their role)<br>
+5. Let the participant interact with the robot<br>
+6. While the participant is interacting, observe the conversation with the robot<br>
+7. Let the participant fill out the questionnaire about their experience after the interaction
+</td>
+<td>
+1. Start with a short briefing on what we expect from the participant<br>
+2. Let them fill out the informed consent form<br>
+3. Let them fill out a question about their current mood (in their role)<br>
+4. Let the participant interact with the robot<br>
+5. Let the participant fill out the questionnaire about their experience after the interaction
+</td>
+</tr>
+</table>
+{{/html}}
+
== Material ==

Pepper, laptop, door, and music.

-= Results =

+= Results =
{{html}}
<!--=== Comparison between intelligent (cond. 1) and less intelligent (cond. 2) prototype ===
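For the analysis, the between-group comparison (11 vs. 7 participants) and the before/after mood ratings described above could be tested roughly as in the sketch below. This is an illustrative outline only, not the analysis actually used on this page; the file name responses.csv and the column names group, convinced, mood_before and mood_after are assumptions.

{{code language="python"}}
# Illustrative sketch only (assumed data layout, not the project's actual pipeline).
# Expected columns in the hypothetical "responses.csv":
#   group        -> 1 (intelligent robot) or 2 (unintelligent robot)
#   convinced    -> 5-point Likert item, e.g. "I was convinced to stay inside"
#   mood_before  -> SAM mood rating collected before the interaction
#   mood_after   -> SAM mood rating collected after the interaction
import pandas as pd
from scipy import stats

df = pd.read_csv("responses.csv")

# Between-group comparison: the Mann-Whitney U test is a common choice for
# small, ordinal (Likert) samples of unequal size.
group1 = df[df["group"] == 1]["convinced"]
group2 = df[df["group"] == 2]["convinced"]
u_stat, p_between = stats.mannwhitneyu(group1, group2, alternative="two-sided")
print(f"Convinced to stay (group 1 vs. group 2): U={u_stat:.1f}, p={p_between:.3f}")

# Within-participant mood change: Wilcoxon signed-rank test on the paired
# SAM ratings taken before and after the interaction with Pepper.
w_stat, p_within = stats.wilcoxon(df["mood_before"], df["mood_after"])
print(f"Mood before vs. after: W={w_stat:.1f}, p={p_within:.3f}")
{{/code}}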
@@ -117,7 +117,6 @@
{{/html}}

=== RQ1: Are people convinced not to go out unsupervised? ===
-
{{html}}
<table style="width: 100%">
<tr>

@@ -132,7 +132,6 @@
{{/html}}

=== RQ2: How does the interaction change the participant's mood? ===
-
{{html}}
<table style="width: 100%">
<tr>

@@ -147,7 +147,6 @@
{{/html}}

=== RQ3: Can the robot respond appropriately to the participant's intention? ===
-
{{html}}
<table style="width: 100%">
<tr>

@@ -162,7 +162,6 @@
{{/html}}

=== RQ4: How do the participants react to the music? ===
-
{{html}}
<table style="width: 100%">
<tr>

@@ -177,7 +177,6 @@
{{/html}}

=== RQ5: Does the activity that the robot suggests prevent people from wandering/leaving? ===
-
{{html}}
<table style="width: 100%">
<tr>

@@ -192,7 +192,6 @@
{{/html}}

=== RQ6: Can Pepper identify and catch the attention of the PwD? ===
-
{{html}}
<table style="width: 100%">
<tr>

@@ -207,7 +207,6 @@
{{/html}}

=== Reliability Scores ===
-
{{html}}
<table style="width: 100%">
<tr>
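The reliability values themselves sit in the HTML table above and are not shown in this excerpt. Assuming they are internal-consistency estimates such as Cronbach's alpha (the page does not state which measure was used), a minimal sketch of the computation could look like this; the example data is invented.

{{code language="python"}}
# Illustrative only: Cronbach's alpha as one possible reliability score for a
# set of related Likert items. Which measure the study actually used is not
# stated on this page, so treat this as an assumption.
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array with one row per participant and one column per item."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the (sub)scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each individual item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Hypothetical example: 6 participants answering 3 related 5-point items.
example = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
    [3, 2, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(example):.2f}")
{{/code}}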
@@ -221,7 +221,7 @@
</table>
{{/html}}

-= Limitation
+= Limitations =

* **Lab Environment**: The lab environment is different from a care home, and the participants found it difficult to process the suggestions made by Pepper. For example, if Pepper asked someone to visit the living room, it created confusion among the participants about their next action.

@@ -232,7 +232,6 @@
* **Face Detection**: The face recognition module within Pepper is also rudimentary in nature. It cannot detect half faces or faces of participants approaching from the side. Adding to the problem, the lighting conditions in the lab were not sufficient for the reliable functioning of the face recognition module. Hence Pepper failed to notice the participant in some cases and did not start the dialogue flow.

= Conclusions =
-
* People who liked the activity tend to stay in
* People who knew the music found it more fitting
* People are more convinced to stay in with the intelligent prototype

@@ -241,8 +241,8 @@
* Experiment with personalization

= Future Work =
-
* **Personalisation**: Personalise music and activity preferences according to the person interacting with Pepper.
* **Robot Collaboration**: Collaborate with other robots, such as Miro, to assist a person with dementia while going for a walk instead of the caretaker.
* **Recognise Person**: For a personalised experience, it is essential that Pepper is able to identify each person based on an internal database.
* **Fine Tune Speech Recognition**: Improvements are necessary for the speech recognition module before the actual deployment of the project in a care home. Additionally, support for multiple languages can be considered to engage with non-English speaking people.
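In connection with the Face Detection limitation and the Recognise Person and Fine Tune Speech Recognition items above, the sketch below shows roughly how Pepper's face-detection event and speech language can be hooked up through the NAOqi Python SDK. It is a minimal illustration, not the project's code: the robot address, the subscriber name WanderingGuard and the callback behaviour are assumptions.

{{code language="python"}}
# Minimal sketch (assumptions throughout): connect to Pepper via the NAOqi Python
# SDK, set the speech language and react to the "FaceDetected" event, which is the
# point where the dialogue flow near the door could be started.
import time

import qi

PEPPER_URL = "tcp://192.168.1.10:9559"  # assumption: replace with the robot's address


def on_face_detected(value):
    # NAOqi passes the FaceDetected payload; a non-empty value means a face was seen.
    if value:
        print("Face detected, starting the dialogue flow")


def main():
    session = qi.Session()
    session.connect(PEPPER_URL)

    # Language of speech output; multi-language support is listed as future work.
    tts = session.service("ALTextToSpeech")
    tts.setLanguage("English")

    # Start the face-detection extractor and listen for its ALMemory event.
    face_detection = session.service("ALFaceDetection")
    face_detection.subscribe("WanderingGuard")  # "WanderingGuard" is a made-up name

    memory = session.service("ALMemory")
    subscriber = memory.subscriber("FaceDetected")
    subscriber.signal.connect(on_face_detected)

    try:
        while True:  # keep the script alive so the event callback keeps firing
            time.sleep(1)
    except KeyboardInterrupt:
        face_detection.unsubscribe("WanderingGuard")


if __name__ == "__main__":
    main()
{{/code}}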