Changes for page Use Cases
Last modified by Cesar van der Poel on 2022/04/05 14:31
From version 43.1


edited by Cesar van der Poel
on 2022/03/29 01:51
Change comment:
There is no comment for this version
Details

Page properties

Author
@@ -1,1 +1,1 @@
-XWiki.Cesarvanderpoel
+XWiki.xinwan

Content
@@ -1,14 +1,14 @@
  {{html}}
  <!-- Your HTML code here -->
-<h2>UC001: User distracted from going out with other activity</h2>
+<h2>UC001: User prevented from going out</h2>
  <table width='100%'>
  <tr>
-<td style="font-size:16px">
-<table border='1px'>
-<tr border='1px'><td bgcolor='gainsboro'>
+<td width='30%' style="font-size:16px">
+<table border='1px' width='50%'>
+<tr border='1px' width='30%'><td bgcolor='gainsboro'>
  <b>Objective</b>
-</td><td>
-  OB01.3: Prevent user from going out by providing an indoor activity instead
+</td><td width='70%'>
+  OB01: Prevent user from going out
  </td></tr>
  <tr><td bgcolor='gainsboro'>
  <b>TDP</b>
@@ -18,10 +18,6 @@
  <tr><td bgcolor='gainsboro'>
  <b>Actors</b>
  </td><td>
-  <ul>
-  <li>Pepper</li>
-  <li>Person with Dementia</li>
-  </ul>
  </td></tr>
  <tr><td bgcolor='gainsboro'>
  <b>Pre-condition</b>
@@ -39,16 +39,15 @@
  Figure<br><br>
  UC steps:<br>
  1. User walks to door to go out<br>
-  2. Robot navigates to door<br>
-  3. Robot asks user what they are doing<br>
-  4. User responds to robot that they are restless and want to take a walk<br>
-  5. Robot invites user to calm down by doing a puzzle<br>
-  6. User complies<br>
+  2. User opens door<br>
+  3. Door sensor is triggered<br>
+  4. In response to door sensor trigger, robot navigates to door<br>
+  5. Robot attempts to get attention of user through movement and auditory signals<br>
+  6. User focuses on robot<br>
+  7. Robot guides user away from door<br>
  </td></tr>
  </table>
  </td>
-</tr>
-
  <td width='50%' style="font-size:16px">
  <table border='1px' width='50%'>
  <tr><td bgcolor='gainsboro'>
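The revised UC001 action sequence (door sensor fires, robot navigates to the door, attracts the user's attention, and guides them away) can be sketched as a simple event handler. This is a minimal illustrative sketch only: the `Robot` class and its method names are hypothetical and are not part of Pepper's actual NAOqi SDK.

```python
# Hypothetical sketch of the UC001 flow; not a real Pepper/NAOqi API.
class Robot:
    def __init__(self):
        self.log = []  # records actions, standing in for real actuator calls

    def navigate_to(self, location):
        self.log.append(f"navigate:{location}")

    def attract_attention(self):
        # UC001 step 5: movement plus auditory signals
        self.log.append("move")
        self.log.append("chime")

    def guide_away_from(self, location):
        self.log.append(f"guide-away:{location}")


def on_door_sensor_triggered(robot):
    """UC001 steps 4-7, run whenever the door sensor fires."""
    robot.navigate_to("front door")
    robot.attract_attention()
    robot.guide_away_from("front door")
    return robot.log
```

In a real deployment the door sensor would publish an event to which this handler is subscribed; here the trigger is modeled as a direct function call so the sequence of steps stays visible.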
@@ -60,15 +60,18 @@
  </td><td bgcolor='gainsboro'>
  <b>IDP</b>
  </td></tr>
-<tr><td>1</td><td>RQ001: Detect movement towards door</td><td>-</td><td>-</td></tr>
+<tr><td>1</td><td>RQ001: Title</td><td>CL001: Title</td><td>IDP: Title</td></tr>
  <tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td>3</td><td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home)</td><td>-</td><td>-</td></tr>
-<tr><td>4</td><td>RQ004: Process spoken natural language</td><td>-</td><td>-</td></tr>
-<tr><td>5</td><td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td><td>CL02: PwD's mood is improved</td><td>-</td></tr>
-<tr><td>6</td><td>-</td><td>CL01: The PwD does not leave the care home, CL05: No caretaker is needed to prevent PwD from wandering</td><td>-</td></tr>
+<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
+<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
+<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
  </table>
+
+</td></tr>
+
+</table>
+
  <h2>UC002: User not prevented from going out</h2>
  <table width='100%'>
  <tr>
@@ -77,7 +77,7 @@
  <tr border='1px' width='30%'><td bgcolor='gainsboro'>
  <b>Objective</b>
  </td><td width='70%'>
-  OB01.4: Alert caretakers of user that user has gone out
+  OB02: Alert caretakers of user that the user has gone out
  </td></tr>
  <tr><td bgcolor='gainsboro'>
  <b>TDP</b>
@@ -104,11 +104,13 @@
  Figure<br><br>
  UC steps:<br>
  1. User walks to door to go out<br>
-  2. Robot navigates to door<br>
-  3. Robot asks user what they are doing<br>
-  4. User ignores robot and walks outside<br>
-  5. Robot alerts caretakers that the user has done so<br>
-  6. Caretakers take actions necessary to protect and/or locate user<br>
+  2. User opens door<br>
+  3. Door sensor is triggered<br>
+  4. In response to door sensor trigger, robot navigates to door<br>
+  5. Robot attempts to get attention of user through movement and auditory signals<br>
+  6. User ignores robot and walks outside<br>
+  7. Robot alerts caretakers that the user has done so<br>
+  8. Caretakers take actions necessary to protect and/or locate user<br>
  </td></tr>
  </table>
  </td>
@@ -123,163 +123,16 @@
  </td><td bgcolor='gainsboro'>
  <b>IDP</b>
  </td></tr>
-<tr><td>1</td><td>RQ001: Detect movement towards door</td><td>-</td><td>-</td></tr>
+<tr><td>1</td><td>RQ001: Title</td><td>CL001: Title</td><td>IDP: Title</td></tr>
  <tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td>3</td><td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home)</td><td>-</td><td>-</td></tr>
-<tr><td>4</td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td>5</td><td>RQ007: Alert caretakers</td><td>-</td><td>-</td></tr>
-<tr><td>6</td><td>-</td><td>-</td><td>-</td></tr>
+<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
+<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
+<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
  </table>
-</td></tr>
-</table>
-
-
-<h2>UC003: Guide user when going out</h2>
-<table border='1px' width='50%'>
-<tr border='1px' width='30%'><td bgcolor='gainsboro'>
-  <b>Objective</b>
-</td><td width='70%'>
-  OB02: Alert caretakers when user wants to take a walk
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>TDP</b>
-</td><td>
-  TDP: TITLE
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Actors</b>
-</td><td>
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Pre-condition</b>
-</td><td>
-  User is restless and wants to take a walk
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Post-condition</b>
-</td><td>
-  Robot alerts caretaker and caretaker accompanies the user to a familiar location
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Action sequence</b>
-</td><td>
-  Figure<br><br>
-  UC steps:<br>
-  1. User walks to door to go out<br>
-  2. Robot asks user what they are doing<br>
-  3. User responds to robot that they are restless and want to take a walk<br>
-  4. Robot starts to play some sounds corresponding to a familiar location<br>
-  5. Robot alerts caretaker that the user wants to go out<br>
-  6. User listens, recognizes the sound, and remembers the route to that location<br>
-  7. Caretaker arrives<br>
-  8. Caretaker accompanies the user on his/her stroll<br>
-</td></tr>
-</table>
-</td>
-<td width='50%' style="font-size:16px">
-<table border='1px' width='50%'>
-<tr><td bgcolor='gainsboro'>
-  <b>UC step</b>
-</td><td bgcolor='gainsboro'>
-  <b>Requirements</b>
-</td><td bgcolor='gainsboro'>
-<b>Claims</b>
-</td><td bgcolor='gainsboro'>
-<b>IDP</b>
-</td></tr>
-<tr><td>1</td><td>RQ001: Detect movement towards door, (RQ002: Recognize people in the care home)</td><td>-</td><td>-</td></tr>
-<tr><td>2</td><td>RQ003: Speak in a human-like way</td><td>-</td><td>-</td></tr>
-<tr><td>3</td><td>RQ004: Process spoken natural language</td><td>-</td><td>-</td></tr>
-<tr><td>4</td><td>RQ006: Play music</td><td>CL04: The music reminds the PwD of the intended situation or place</td><td>-</td></tr>
-<tr><td>5</td><td>RQ007: Alert caretakers</td><td>-</td><td>-</td></tr>
-<tr><td>6</td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td>7</td><td>RQ002: Recognize people in the care home</td><td>-</td><td>-</td></tr>
-<tr><td>8</td><td>-</td><td>CL02: PwD's mood is improved, CL06: PwD leaves care home with caretaker</td><td>-</td></tr>
-</table>
-</td></tr>
-</table>
-</tr>
-
-</table>
-
-
-<h2>UC004: User is reminded of their current situation</h2>
-<table width='100%'>
-<tr>
-<td style="font-size:16px">
-<table border='1px'>
-<tr border='1px'><td bgcolor='gainsboro'>
-  <b>Objective</b>
-</td><td>
-  OB01.1: Prevent user from going out by objecting to their reason for going out
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>TDP</b>
-</td><td>
-  TDP: TITLE
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Actors</b>
-</td><td>
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Pre-condition</b>
-</td><td>
-  User is under the impression they need to be somewhere and intends to go there
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Post-condition</b>
-</td><td>
-  User remembers they need to stay here
-</td></tr>
-<tr><td bgcolor='gainsboro'>
-  <b>Action sequence</b>
-</td><td>
-  Figure<br><br>
-  UC steps:<br>
-  1. User walks to door to go out<br>
-  2. Robot navigates to door<br>
-  3. Robot asks user what they are doing<br>
-  4. User argues they need to go to work and they are already late<br>
-  5. Robot tries to convince the user that they are already retired<br>
-  6. User feels agitated and insists on their idea<br>
-  7. Robot sings or plays a song associated with retirement<br>
-  8. User is reminded of their retirement and realises they do not have to go to work<br>
-  9. Robot invites user to stay home and user complies<br>
-</td></tr>
-</table>
-</td>
-<td width='50%' style="font-size:16px">
-<table border='1px' width='50%'>
-<tr><td bgcolor='gainsboro'>
-  <b>UC step</b>
-</td><td bgcolor='gainsboro'>
-  <b>Requirements</b>
-</td><td bgcolor='gainsboro'>
-<b>Claims</b>
-</td><td bgcolor='gainsboro'>
-<b>IDP</b>
-</td></tr>
-<tr><td>1</td><td>RQ001: Detect movement towards door, (RQ002: Recognize people in the care home)</td><td>-</td><td>-</td></tr>
-<tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td>3</td><td>RQ003: Speak in a human-like way</td><td>-</td><td>-</td></tr>
-<tr><td>4</td><td>RQ004: Process spoken natural language</td><td>-</td><td>-</td></tr>
-<tr><td>5</td><td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td><td>-</td><td>-</td></tr>
-<tr><td>6</td><td>RQ009: Recognize emotions</td><td>-</td><td>-</td></tr>
-<tr><td>7</td><td>RQ006: Play music</td><td>CL02: PwD's mood is improved</td><td>-</td></tr>
-<tr><td>8</td><td>-</td><td>CL04: The music reminds the PwD of the intended situation or place</td><td>-</td></tr>
-<tr><td>9</td><td>RQ003: Speak in a human-like way</td><td>CL01: The PwD does not leave the care home</td><td>-</td></tr>
-</table>
-</td></tr>
-</table>
-</tr>
-
-</table>
-
  {{/html}}
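UC001 and UC002 in the revised content share the same trigger (door sensor) and attention-getting steps, and diverge only on the user's reaction: if the user does not focus on the robot, the robot falls back to alerting the caretakers. That branching can be sketched in a few lines; this is an illustrative sketch only, and `handle_door_event` and its parameters are hypothetical names, not part of any robot SDK.

```python
# Hypothetical sketch of the UC001/UC002 branch point after the robot has
# tried to get the user's attention at the door.
def handle_door_event(user_focuses_on_robot, alert_caretakers):
    """Decide the follow-up action after the attention-getting attempt.

    user_focuses_on_robot: outcome of UC001 step 6 / UC002 step 6.
    alert_caretakers: callback used to notify caretakers (UC002 step 7).
    """
    if user_focuses_on_robot:
        return "guide user away from door"  # UC001 step 7
    alert_caretakers("user has left through the front door")  # UC002 step 7
    return "caretakers alerted"
```

Modeling the caretaker alert as a callback keeps the decision logic testable independently of whatever messaging channel (pager, app notification) the care home actually uses.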
XWiki.XWikiComments[0]

Author
@@ -1,1 +1,0 @@
-Anonymous

Comment
@@ -1,2 +1,0 @@
-This could be specified in one or two use cases (which is OK/enough). One use case can include alternative action sequences. I am not sure about the role of music in UC4: the robot sings or plays a song associated with retirement,
-and the user is reminded of his retirement and realises he does not have to go to work. Is there music that reminds each resident about his/her retirement, and what is being remembered about the retirement? Probably changing the mood with music and simply distracting the resident from his/her intention to pass the door would be more practical. It would be good to formulate a claim on this topic.

Date
@@ -1,1 +1,0 @@
-2022-03-20 23:15:39.745