Changes for page Use Cases

Last modified by Cesar van der Poel on 2022/04/05 14:31

From version 3.1
edited by Cesar van der Poel
on 2022/02/08 12:28
Change comment: There is no comment for this version
To version 46.1
edited by Cesar van der Poel
on 2022/04/05 14:09
Change comment: There is no comment for this version

Summary

Details

Page properties
Content
... ... @@ -1,134 +1,517 @@
1 1  {{html}}
2 2  <!-- Your HTML code here -->
3 -<h2>UC001: User prevented from going out</h2>
4 -<table width='100%'>
5 -<tr>
6 -<td width='30%' style="font-size:16px">
7 -<table border='1px' width='50%'>
8 -<tr border='1px' width='30%'><td bgcolor='gainsboro'>
9 - <b>Objective</b>
10 -</td><td width='70%'>
11 - OB01: Prevent user from going out
12 -</td></tr>
13 -<tr><td bgcolor='gainsboro'>
14 - <b>TDP</b>
15 -</td><td>
16 - TDP: TITLE
17 -</td></tr>
18 -<tr><td bgcolor='gainsboro'>
19 - <b>Actors</b>
20 -</td><td>
21 -</td></tr>
22 -<tr><td bgcolor='gainsboro'>
23 - <b>Pre-condition</b>
24 -</td>User is restless and wants to take a walk<td>
25 -</td></tr>
26 -<tr><td bgcolor='gainsboro'>
27 - <b>Post-condition</b>
28 -</td>User stays home<td>
29 -</td></tr>
30 -<tr><td bgcolor='gainsboro'>
31 - <b>Action sequence</b>
32 -</td><td>
33 - Figure<br><br>
34 - UC steps:<br>
35 - 1. User walks to door to go out<br>
36 - 2. User opens door<br>
37 - 3. Door sensor is triggered<br>
38 - 4. In response to door sensor trigger, robot navigates to door<br>
39 - 5. Robot attempts to get attention of user through movement and auditory signals<br>
40 - 6. User focusses on robot<br>
41 - 7. Robot guides user away from door<br>
42 -</td></tr>
43 -</table>
44 -</td>
45 -<td width='50%' style="font-size:16px">
46 -<table border='1px' width='50%'>
47 -<tr><td bgcolor='gainsboro'>
48 - <b>UC step<b>
49 -</td><td bgcolor='gainsboro'>
50 - <b>Requirements</b>
51 -</td><td bgcolor='gainsboro'>
52 -<b>Claims</b>
53 -</td><td bgcolor='gainsboro'>
54 -<b>IDP</b>
55 -</td></tr>
56 -<tr><td>1</td><td>RQ001: Title</td><td>CL001: Title</td><td>IDP: Title</td></tr>
57 -<tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
58 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
59 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
60 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
61 -</table>
3 +<h2>UC001: User distracted from going out</h2>
4 + <table border='1px' width='50%'>
5 + <tr border='1px' width='30%'>
6 + <td bgcolor='gainsboro'>
7 + <b>Objective</b>
8 + </td><td width='70%'>
9 + OB01.1: Remove user's incentive for going out<br>
10 + OB01.3: Keep user occupied inside
11 + </td>
12 + </tr><tr>
13 + <td bgcolor='gainsboro'>
14 + <b>TDP</b>
15 + </td><td>
16 + TDP: scene A
17 + </td>
18 + </tr><tr>
19 + <td bgcolor='gainsboro'>
20 + <b>Actors</b>
21 + </td><td>
22 + <ul>
23 + <li>Person with dementia</li>
24 + <li>Pepper</li>
25 + </ul>
26 + </td>
27 + </tr><tr>
28 + <td bgcolor='gainsboro'>
29 + <b>Pre-condition</b>
30 + </td><td>
31 + User is bored and wants to go shopping
32 + </td>
33 + </tr><tr>
34 + <td bgcolor='gainsboro'>
35 + <b>Post-condition</b>
36 + </td><td>
37 + User entertains themselves inside
38 + </td>
39 + </tr><tr>
40 + <td bgcolor='gainsboro'>
41 + <b>Action sequence</b>
42 + </td><td>
43 + UC steps:<br>
44 + 1. User walks to door to go out<br>
45 + 2. Robot asks user what they are doing<br>
46 + 3. User responds that they are bored and want to go to the mall<br>
47 +      4. Robot suggests the user entertain themselves with a puzzle instead<br>
48 + 5. User complies<br>
49 + </td>
50 + </tr>
51 + </table>
62 62  
53 + <table border='1px' width='50%'>
54 + <tr>
55 + <td bgcolor='gainsboro'>
56 +      <b>UC step</b>
57 + </td><td bgcolor='gainsboro'>
58 + <b>Requirements</b>
59 + </td><td bgcolor='gainsboro'>
60 + <b>Claims</b>
61 + </td><td bgcolor='gainsboro'>
62 + <b>IDP</b>
63 + </td>
64 + </tr>
65 + <tr>
66 + <td>1</td>
67 + <td>RQ001: Detect movement towards door</td>
68 + <td>-</td>
69 + <td>-</td>
70 + </tr>
71 + <tr>
72 + <td>2</td>
73 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
74 + <td>CL07: The user notices the system</td>
75 + <td>IDP1</td>
76 + </tr>
77 + <tr>
78 + <td>3</td>
79 + <td>RQ004: Process spoken natural language</td>
80 + <td>-</td>
81 + <td>IDP1</td>
82 + </tr>
83 + <tr>
84 + <td>4</td>
85 + <td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
86 + <td>-</td>
87 + <td>IDP1</td>
88 + </tr>
89 + <tr>
90 + <td>5</td>
91 + <td>-</td>
92 + <td>CL06: The user entertains themselves inside, CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
93 + <td>IDP1</td>
94 + </tr>
95 + </table>
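<p>
  As a rough illustration only: the sketch below glues the UC001 action sequence above (steps 2&ndash;5) together in plain Python. Every name in it (handle_door_approach, ACTIVITIES, the ask/listen/say callbacks) is a hypothetical placeholder standing in for the RQ001/RQ003/RQ004 capabilities, not an existing Pepper or NAOqi API.
</p>
<pre>
# Hypothetical sketch of UC001: distract the user with an indoor activity.
# None of these names exist in a real robot SDK; they only mirror the UC steps.

ACTIVITIES = {"bored": "a jigsaw puzzle", "restless": "a short exercise video"}

def handle_door_approach(ask, listen, say):
    """ask/listen/say are stand-ins for the RQ003/RQ004 speech capabilities."""
    ask("Hello, what are you up to?")                        # UC step 2
    reply = listen()                                         # UC step 3
    if "bored" in reply and "mall" in reply:
        say("How about %s instead?" % ACTIVITIES["bored"])   # UC step 4
        return "distracted"                                  # UC step 5: user complies
    return "needs_follow_up"                                 # handled by UC002-UC005

if __name__ == "__main__":
    # Scripted stand-ins so the sketch runs without robot hardware.
    outcome = handle_door_approach(
        ask=print,
        listen=lambda: "I am bored and want to go to the mall",
        say=print,
    )
    print("outcome:", outcome)
</pre>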
63 63  
64 -</td></tr>
97 +<h2>UC002: User not prevented from going out</h2>
98 + <table border='1px' width='50%'>
99 + <tr border='1px' width='30%'>
100 + <td bgcolor='gainsboro'>
101 + <b>Objective</b>
102 + </td><td width='70%'>
103 + OB01.4: Allow for quick intervention from the caretaker
104 + </td>
105 + </tr><tr>
106 + <td bgcolor='gainsboro'>
107 + <b>TDP</b>
108 + </td><td>
109 + TDP: scene A, scene B
110 + </td>
111 + </tr><tr>
112 + <td bgcolor='gainsboro'>
113 + <b>Actors</b>
114 + </td><td>
115 + <ul>
116 + <li>Person with dementia</li>
117 + <li>Pepper</li>
118 + <li>(Caretaker)</li>
119 + </ul>
120 + </td>
121 + </tr><tr>
122 + <td bgcolor='gainsboro'>
123 + <b>Pre-condition</b>
124 + </td><td>
125 + User is restless and wants to take a walk
126 + </td>
127 + </tr><tr>
128 + <td bgcolor='gainsboro'>
129 + <b>Post-condition</b>
130 + </td><td>
131 +      Caretakers are alerted that the user has left
132 + </td>
133 + </tr><tr>
134 + <td bgcolor='gainsboro'>
135 + <b>Action sequence</b>
136 + </td><td>
137 + UC steps:<br>
138 + 1. User walks to door to go out<br>
139 + 2. Robot asks user what they are doing<br>
140 + 3. User ignores robot and walks outside<br>
141 + 4. Robot alerts caretakers that the user has done so<br>
142 +      5. Caretakers take the actions necessary to protect and/or locate the user<br>
143 + </td>
144 + </tr>
145 + </table>
65 65  
66 -</table>
147 + <table border='1px' width='50%'>
148 + <tr>
149 + <td bgcolor='gainsboro'>
150 +      <b>UC step</b>
151 + </td><td bgcolor='gainsboro'>
152 + <b>Requirements</b>
153 + </td><td bgcolor='gainsboro'>
154 + <b>Claims</b>
155 + </td><td bgcolor='gainsboro'>
156 + <b>IDP</b>
157 + </td>
158 + </tr>
159 + <tr>
160 + <td>1</td>
161 + <td>RQ001: Detect movement towards door</td>
162 + <td>-</td>
163 + <td>-</td>
164 + </tr>
165 + <tr>
166 + <td>2</td>
167 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
168 + <td>-</td>
169 + <td>IDP3</td>
170 + </tr>
171 + <tr>
172 + <td>3</td>
173 + <td>-</td>
174 + <td>-</td>
175 + <td>IDP3</td>
176 + </tr>
177 + <tr>
178 + <td>4</td>
179 + <td>RQ007: Alert caretakers</td>
180 + <td>CL14: Caretakers are alerted once a user leaves the home</td>
181 + <td>IDP3</td>
182 + </tr>
183 + <tr>
184 + <td>5</td>
185 + <td>-</td>
186 + <td>-</td>
187 + <td>-</td>
188 + </tr>
189 + </table>
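<p>
  A minimal sketch of the UC002 escalation path (steps 4&ndash;5), assuming a hypothetical Alert record and notify_caretakers helper; the actual alerting channel required by RQ007 (pager, app, nurse station display) is abstracted behind the send callback and is not specified here.
</p>
<pre>
# Hypothetical sketch of UC002: the user ignores the robot and leaves,
# so caretakers are alerted (RQ007 / CL14). Illustrative names only.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Alert:
    resident: str
    event: str
    timestamp: str

def notify_caretakers(alert, send):
    # 'send' abstracts whatever channel the care home actually uses.
    send("ALERT [%s] %s at %s" % (alert.resident, alert.event, alert.timestamp))

def on_user_left_without_response(resident, send):
    alert = Alert(
        resident=resident,
        event="left the home through the front door (UC002 step 4)",
        timestamp=datetime.now().isoformat(timespec="seconds"),
    )
    notify_caretakers(alert, send)   # UC step 4; step 5 is carried out by staff

if __name__ == "__main__":
    on_user_left_without_response("resident 12", send=print)
</pre>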
67 67  
68 -<h2>UC002: User not prevented from going out</h2>
69 -<table width='100%'>
70 -<tr>
71 -<td width='30%' style="font-size:16px">
72 -<table border='1px' width='50%'>
73 -<tr border='1px' width='30%'><td bgcolor='gainsboro'>
74 - <b>Objective</b>
75 -</td><td width='70%'>
76 - OB02: Alert caretakers of user that the user has gone out
77 -</td></tr>
78 -<tr><td bgcolor='gainsboro'>
79 - <b>TDP</b>
80 -</td><td>
81 - TDP: TITLE
82 -</td></tr>
83 -<tr><td bgcolor='gainsboro'>
84 - <b>Actors</b>
85 -</td><td>
86 -</td></tr>
87 -<tr><td bgcolor='gainsboro'>
88 - <b>Pre-condition</b>
89 -</td>User is restless and wants to take a walk<td>
90 -</td></tr>
91 -<tr><td bgcolor='gainsboro'>
92 - <b>Post-condition</b>
93 -</td>Caretakers are alerted of the fact that the user has left<td>
94 -</td></tr>
95 -<tr><td bgcolor='gainsboro'>
96 - <b>Action sequence</b>
97 -</td><td>
98 - Figure<br><br>
99 - UC steps:<br>
100 - 1. User walks to door to go out<br>
101 - 2. User opens door<br>
102 - 3. Door sensor is triggered<br>
103 - 4. In response to door sensor trigger, robot navigates to door<br>
104 - 5. Robot attempts to get attention of user through movement and auditory signals<br>
105 - 6. User ignores robot and walks outside<br>
106 - 7. Robot alerts caretakers that the user has done so<br>
107 - 8. Caretakers take actions necessary to protect and/or locate user<br>
108 -</td></tr>
109 -</table>
110 -</td>
111 -<td width='50%' style="font-size:16px">
112 -<table border='1px' width='50%'>
113 -<tr><td bgcolor='gainsboro'>
114 - <b>UC step<b>
115 -</td><td bgcolor='gainsboro'>
116 - <b>Requirements</b>
117 -</td><td bgcolor='gainsboro'>
118 -<b>Claims</b>
119 -</td><td bgcolor='gainsboro'>
120 -<b>IDP</b>
121 -</td></tr>
122 -<tr><td>1</td><td>RQ001: Title</td><td>CL001: Title</td><td>IDP: Title</td></tr>
123 -<tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
124 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
125 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
126 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
127 -</table>
191 +<h2>UC003: User is reminded of their current situation</h2>
192 + <table border='1px' width='50%'>
193 + <tr border='1px' width='30%'>
194 + <td bgcolor='gainsboro'>
195 + <b>Objective</b>
196 + </td><td width='70%'>
197 + OB01.2: Bring user back to reality
198 + </td>
199 + </tr><tr>
200 + <td bgcolor='gainsboro'>
201 + <b>TDP</b>
202 + </td><td>
203 + TDP: scene A
204 + </td>
205 + </tr><tr>
206 + <td bgcolor='gainsboro'>
207 + <b>Actors</b>
208 + </td><td>
209 + <ul>
210 + <li>Person with dementia</li>
211 + <li>Pepper</li>
212 + </ul>
213 + </td>
214 + </tr><tr>
215 + <td bgcolor='gainsboro'>
216 + <b>Pre-condition</b>
217 + </td><td>
218 + User thinks they need to get to work quickly
219 + </td>
220 + </tr><tr>
221 + <td bgcolor='gainsboro'>
222 + <b>Post-condition</b>
223 + </td><td>
224 + User remembers they are retired and in a care home
225 + </td>
226 + </tr><tr>
227 + <td bgcolor='gainsboro'>
228 + <b>Action sequence</b>
229 + </td><td>
230 + Figure<br><br>
231 + UC steps:<br>
232 + 1. User walks to door to go out<br>
233 + 2. Robot asks user what they are doing<br>
234 +      3. User argues they need to go to work and are already late<br>
235 + 4. Robot tries to convince the user they are free today<br>
236 +      5. User feels agitated and insists on their idea<br>
237 + 6. Robot sings or plays a song associated with retirement<br>
238 +      7. User is reminded of their retirement, realises they do not have to go to work, and decides to stay in<br>
239 + </td>
240 + </tr>
241 + </table>
128 128  
243 + <table border='1px' width='50%'>
244 + <tr>
245 + <td bgcolor='gainsboro'>
246 +      <b>UC step</b>
247 + </td><td bgcolor='gainsboro'>
248 + <b>Requirements</b>
249 + </td><td bgcolor='gainsboro'>
250 + <b>Claims</b>
251 + </td><td bgcolor='gainsboro'>
252 + <b>IDP</b>
253 + </td>
254 + </tr>
255 + <tr>
256 + <td>1</td>
257 + <td>RQ001: Detect movement towards door</td>
258 + <td>-</td>
259 + <td>-</td>
260 + </tr>
261 + <tr>
262 + <td>2</td>
263 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
264 + <td>CL07: The user notices the system</td>
265 + <td>IDP2</td>
266 + </tr>
267 + <tr>
268 + <td>3</td>
269 + <td>RQ004: Process spoken natural language</td>
270 + <td>-</td>
271 + <td>IDP2</td>
272 + </tr>
273 + <tr>
274 + <td>4</td>
275 + <td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
276 + <td>-</td>
277 + <td>IDP2</td>
278 + </tr>
279 + <tr>
280 + <td>5</td>
281 + <td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td>
282 + <td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
283 + <td>IDP2</td>
284 + </tr>
285 + <tr>
286 + <td>6</td>
287 + <td>RQ006: Play music</td>
288 + <td>CL02: The user's mood is improved, CL04: The music fits the situation or place, CL08: The user is subtly brought back to reality</td>
289 + <td>IDP2</td>
290 + </tr>
291 + <tr>
292 + <td>7</td>
293 + <td>-</td>
294 + <td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
295 + <td>IDP2</td>
296 + </tr>
297 + </table>
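<p>
  A sketch of how UC003 steps 5&ndash;6 might combine emotion recognition (RQ009), concept association (RQ005) and music playback (RQ006). The SONGS_BY_CONCEPT table and the play/say callbacks are invented for illustration; in practice the song choice would have to be personalised per resident (see the comment at the bottom of this page).
</p>
<pre>
# Hypothetical sketch of UC003: if the user stays agitated, play a song
# associated with the relevant concept. Illustrative names and songs only.

SONGS_BY_CONCEPT = {
    "retirement": "When I'm Sixty-Four",
    "home": "Take Me Home, Country Roads",
}

def respond_to_agitation(detected_emotion, concept, play, say):
    if detected_emotion != "agitated":
        say("Good, shall we sit down together?")
        return False
    song = SONGS_BY_CONCEPT.get(concept)
    if song is None:
        say("Let's take a moment before you go.")   # fallback: no matching song
        return False
    play(song)                                      # UC step 6
    return True                                     # UC step 7 is the user's reaction

if __name__ == "__main__":
    played = respond_to_agitation("agitated", "retirement", play=print, say=print)
    print("music played:", played)
</pre>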
129 129  
130 -</td></tr>
299 +<h2>UC004: User is not convinced by robot</h2>
300 + <table border='1px' width='50%'>
301 + <tr border='1px' width='30%'>
302 + <td bgcolor='gainsboro'>
303 + <b>Objective</b>
304 + </td><td width='70%'>
305 + OB01.4: Allow for quick intervention from the caretaker
306 + </td>
307 + </tr><tr>
308 + <td bgcolor='gainsboro'>
309 + <b>TDP</b>
310 + </td><td>
311 + TDP: scene A, scene B
312 + </td>
313 + </tr><tr>
314 + <td bgcolor='gainsboro'>
315 + <b>Actors</b>
316 + </td><td>
317 + <ul>
318 + <li>Person with dementia</li>
319 + <li>Pepper</li>
320 + <li>Caretaker</li>
321 + </ul>
322 + </td>
323 + </tr><tr>
324 + <td bgcolor='gainsboro'>
325 + <b>Pre-condition</b>
326 + </td><td>
327 + User thinks they need to get to work quickly
328 + </td>
329 + </tr><tr>
330 + <td bgcolor='gainsboro'>
331 + <b>Post-condition</b>
332 + </td><td>
333 + User is calmed down by caretaker
334 + </td>
335 + </tr><tr>
336 + <td bgcolor='gainsboro'>
337 + <b>Action sequence</b>
338 + </td><td>
339 + Figure<br><br>
340 + UC steps:<br>
341 + 1. User walks to door to go out<br>
342 + 2. Robot asks user what they are doing<br>
343 +      3. User argues they need to go to work and are already late<br>
344 + 4. Robot tries to convince the user they are free today<br>
345 +      5. User feels agitated and insists on their idea<br>
346 + 6. Robot sings or plays a song associated with retirement<br>
347 + 7. User does not respond to music and insists on going out<br>
348 + 8. Robot calls caretaker and requests user to wait for them<br>
349 +      9. Caretaker takes over and calms the user<br>
350 + </td>
351 + </tr>
352 + </table>
131 131  
132 -</table>
354 + <table border='1px' width='50%'>
355 + <tr>
356 + <td bgcolor='gainsboro'>
357 +      <b>UC step</b>
358 + </td><td bgcolor='gainsboro'>
359 + <b>Requirements</b>
360 + </td><td bgcolor='gainsboro'>
361 + <b>Claims</b>
362 + </td><td bgcolor='gainsboro'>
363 + <b>IDP</b>
364 + </td>
365 + </tr>
366 + <tr>
367 + <td>1</td>
368 + <td>RQ001: Detect movement towards door</td>
369 + <td>-</td>
370 + <td>-</td>
371 + </tr>
372 + <tr>
373 + <td>2</td>
374 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
375 + <td>CL07: The user notices the system</td>
376 + <td>IDP2</td>
377 + </tr>
378 + <tr>
379 + <td>3</td>
380 + <td>RQ004: Process spoken natural language</td>
381 + <td>-</td>
382 + <td>IDP2</td>
383 + </tr>
384 + <tr>
385 + <td>4</td>
386 + <td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
387 + <td>-</td>
388 + <td>IDP2</td>
389 + </tr>
390 + <tr>
391 + <td>5</td>
392 + <td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td>
393 + <td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
394 + <td>IDP2</td>
395 + </tr>
396 + <tr>
397 + <td>6</td>
398 + <td>RQ006: Play music</td>
399 + <td>CL04: The music fits the situation or place</td>
400 + <td>IDP2</td>
401 + </tr>
402 + <tr>
403 + <td>7</td>
404 + <td>-</td>
405 + <td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
406 + <td>IDP2</td>
407 + </tr>
408 + <tr>
409 + <td>8</td>
410 + <td>RQ007: Alert caretakers</td>
411 + <td>CL05: The user is willing to wait for the caretaker, CL14: Caretakers are alerted once a user leaves the home</td>
412 + <td>IDP2</td>
413 + </tr>
414 + <tr>
415 + <td>9</td>
416 +      <td>-</td>
417 + <td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
418 + <td>-</td>
419 + </tr>
420 + </table>
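<p>
  A sketch of the UC004 escalation ladder: the interventions from steps 4 and 6 are tried in order and, if none convinces the user, the caretaker is called as in step 8 (RQ007, CL05). The try_interventions function and its callbacks are hypothetical placeholders, not an existing API.
</p>
<pre>
# Hypothetical sketch of UC004: try each intervention in order and fall
# through to a caretaker call when nothing convinces the user.

def try_interventions(interventions, user_convinced, call_caretaker):
    for name, intervene in interventions:
        intervene()                          # UC steps 4 and 6
        if user_convinced(name):
            return name                      # user stays in, no call needed
    call_caretaker("Please wait here, someone is coming to help you.")  # UC step 8
    return "caretaker"

if __name__ == "__main__":
    steps = [
        ("reassure", lambda: print("You are free today, there is no work.")),
        ("music", lambda: print("Playing a familiar song...")),
    ]
    result = try_interventions(
        steps,
        user_convinced=lambda name: False,   # UC004: neither attempt works
        call_caretaker=print,
    )
    print("resolved by:", result)
</pre>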
421 +
422 +<h2>UC005: User takes a supervised walk</h2>
423 + <table border='1px' width='50%'>
424 + <tr border='1px' width='30%'>
425 + <td bgcolor='gainsboro'>
426 + <b>Objective</b>
427 + </td><td width='70%'>
428 + OB02: Allow user to take supervised walks
429 + </td>
430 + </tr><tr>
431 + <td bgcolor='gainsboro'>
432 + <b>TDP</b>
433 + </td><td>
434 + TDP: scene A, scene C
435 + </td>
436 + </tr><tr>
437 + <td bgcolor='gainsboro'>
438 + <b>Actors</b>
439 + </td><td>
440 + <ul>
441 + <li>Person with dementia</li>
442 + <li>Pepper</li>
443 + <li>Caretaker</li>
444 + </ul>
445 + </td>
446 + </tr><tr>
447 + <td bgcolor='gainsboro'>
448 + <b>Pre-condition</b>
449 + </td><td>
450 + User is restless and wants to take a walk
451 + </td>
452 + </tr><tr>
453 + <td bgcolor='gainsboro'>
454 + <b>Post-condition</b>
455 + </td><td>
456 + User is accompanied by caretaker
457 + </td>
458 + </tr><tr>
459 + <td bgcolor='gainsboro'>
460 + <b>Action sequence</b>
461 + </td><td>
462 + Figure<br><br>
463 + UC steps:<br>
464 + 1. User walks to door to go out<br>
465 + 2. Robot asks user what they are doing<br>
466 + 3. User requests to take a walk<br>
467 + 4. Robot calls the caretaker and requests the user to wait<br>
468 + 5. Caretaker arrives and takes a walk with the user<br>
469 + </td>
470 + </tr>
471 + </table>
472 +
473 + <table border='1px' width='50%'>
474 + <tr>
475 + <td bgcolor='gainsboro'>
476 +      <b>UC step</b>
477 + </td><td bgcolor='gainsboro'>
478 + <b>Requirements</b>
479 + </td><td bgcolor='gainsboro'>
480 + <b>Claims</b>
481 + </td><td bgcolor='gainsboro'>
482 + <b>IDP</b>
483 + </td>
484 + </tr>
485 + <tr>
486 + <td>1</td>
487 + <td>RQ001: Detect movement towards door</td>
488 + <td>-</td>
489 + <td>-</td>
490 + </tr>
491 + <tr>
492 + <td>2</td>
493 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
494 + <td>CL07: The user notices the system</td>
495 + <td>IDP4</td>
496 + </tr>
497 + <tr>
498 + <td>3</td>
499 + <td>RQ004: Process spoken natural language</td>
500 + <td>-</td>
501 + <td>IDP4</td>
502 + </tr>
503 + <tr>
504 + <td>4</td>
505 + <td>RQ007: Alert caretakers, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
506 + <td>CL05: The user is willing to wait for the caretaker, CL10: The user feels dependent on others, (CL12, CL15)</td>
507 + <td>IDP4</td>
508 + </tr>
509 + <tr>
510 + <td>5</td>
511 +      <td>-</td>
512 + <td>CL02: The user's mood is improved</td>
513 +      <td>-</td>
514 + </tr>
515 + </table>
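<p>
  A sketch of UC005 step 4, assuming a hypothetical in-memory task queue that stands in for whatever planning tool the caretakers actually use. The point it illustrates is that the walk request is forwarded rather than blocked (OB02).
</p>
<pre>
# Hypothetical sketch of UC005: queue the walk as a caretaker task
# instead of refusing it. The queue is an illustrative stand-in.

from collections import deque

caretaker_tasks = deque()

def handle_walk_request(resident, say):
    say("Of course, let me ask someone to join you. Please wait a moment.")  # UC step 4
    caretaker_tasks.append({"task": "accompany walk", "resident": resident})

if __name__ == "__main__":
    handle_walk_request("resident 12", say=print)
    print("pending caretaker tasks:", list(caretaker_tasks))  # UC step 5 is done by staff
</pre>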
133 133  {{/html}}
134 134  
XWiki.XWikiComments[0]
Author
... ... @@ -1,0 +1,1 @@
1 +Anonymous
Comment
... ... @@ -1,0 +1,2 @@
1 +This could be specified in one or two use cases (which is ok/enough). One use case can include alternative action sequences. I am not sure about the role of music in UC4: the robot sings or plays a song associated with retirement,
2 +and the user is reminded of his retirement and realises he does not have to go to work. Is there music that reminds each resident of his/her retirement, and what exactly is being remembered about the retirement? Changing the mood with music and simply distracting the resident from his/her intention to pass the door would probably be more practical. It would be good to formulate a claim on this topic.
Date
... ... @@ -1,0 +1,1 @@
1 +2022-03-20 23:15:39.745