Wiki source code of Use Cases

Version 52.1 by Cesar van der Poel on 2022/04/05 14:29

{{html}}
<h2>UC001: User distracted from going out</h2>
<table border='1px'>
<colgroup>
<col>
<col style="width: 70%">
</colgroup>
<tr border='1px'>
<td bgcolor='gainsboro'>
<b>Objective</b>
</td><td>
OB01.1: Remove user's incentive for going out<br>
OB01.3: Keep user occupied inside
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>TDP</b>
</td><td>
TDP: scene A
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Actors</b>
</td><td>
<ul>
<li>Person with dementia</li>
<li>Pepper</li>
</ul>
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Pre-condition</b>
</td><td>
User is bored and wants to go shopping
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Post-condition</b>
</td><td>
User entertains themselves inside
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Action sequence</b>
</td><td>
UC steps:<br>
1. User walks to door to go out<br>
2. Robot asks user what they are doing<br>
3. User responds that they are bored and want to go to the mall<br>
4. Robot suggests the user entertain themselves with a puzzle instead<br>
5. User complies<br>
</td>
</tr>
</table>

<table border='1px'>
<colgroup>
<col style="width: 5%">
<col>
<col>
<col style="width: 10%">
</colgroup>
<tr>
<td bgcolor='gainsboro'>
<b>UC step</b>
</td><td bgcolor='gainsboro'>
<b>Requirements</b>
</td><td bgcolor='gainsboro'>
<b>Claims</b>
</td><td bgcolor='gainsboro'>
<b>IDP</b>
</td>
</tr>
<tr>
<td>1</td>
<td>RQ001: Detect movement towards door</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>2</td>
<td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
<td>CL07: The user notices the system</td>
<td>IDP1</td>
</tr>
<tr>
<td>3</td>
<td>RQ004: Process spoken natural language</td>
<td>-</td>
<td>IDP1</td>
</tr>
<tr>
<td>4</td>
<td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
<td>-</td>
<td>IDP1</td>
</tr>
<tr>
<td>5</td>
<td>-</td>
<td>CL06: The user entertains themselves inside, CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
<td>IDP1</td>
</tr>
</table>
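<p>The action sequence above can be read as a small detect-and-redirect loop: RQ001 triggers the interaction, RQ003/RQ004/RQ005 carry the dialogue, and step 5 closes the use case (CL06). The Python sketch below only illustrates that flow; the robot interface, the FakeRobot stub and the keyword list are assumptions for illustration, not Pepper's actual API.</p>
<pre>
# Minimal sketch of UC001 (hypothetical robot interface, not the NAOqi API).
# Step numbers refer to the UC steps in the tables above.

ACTIVITY_SUGGESTIONS = {    # RQ005: associate certain concepts with related concepts
    "bored": "a puzzle",
    "shopping": "a puzzle",
    "mall": "a puzzle",
}

def handle_door_approach(robot):
    """Run once RQ001 has detected movement towards the door (step 1)."""
    robot.say("What are you up to?")                          # step 2, RQ003
    answer = robot.listen()                                   # step 3, RQ004
    for keyword, activity in ACTIVITY_SUGGESTIONS.items():
        if keyword in answer.lower():
            robot.say("How about " + activity + " instead?")  # step 4, RQ003 + RQ005
            return True                                       # step 5: user complies (CL06)
    return False                                              # fall through to UC002/UC003

class FakeRobot:
    """Stand-in so the flow can be exercised without a real robot."""
    def say(self, text):
        print("Pepper:", text)
    def listen(self):
        return "I am bored, I want to go to the mall"

if __name__ == "__main__":
    handle_door_approach(FakeRobot())
</pre>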

<h2>UC002: User not prevented from going out</h2>
<table border='1px'>
<colgroup>
<col>
<col style="width: 70%">
</colgroup>
<tr border='1px'>
<td bgcolor='gainsboro'>
<b>Objective</b>
</td><td width='70%'>
OB01.4: Allow for quick intervention from the caretaker
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>TDP</b>
</td><td>
TDP: scene A, scene B
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Actors</b>
</td><td>
<ul>
<li>Person with dementia</li>
<li>Pepper</li>
<li>(Caretaker)</li>
</ul>
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Pre-condition</b>
</td><td>
User is restless and wants to take a walk
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Post-condition</b>
</td><td>
Caretakers are alerted that the user has left
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Action sequence</b>
</td><td>
UC steps:<br>
1. User walks to door to go out<br>
2. Robot asks user what they are doing<br>
3. User ignores robot and walks outside<br>
4. Robot alerts caretakers that the user has left<br>
5. Caretakers take the actions necessary to protect and/or locate the user<br>
</td>
</tr>
</table>

<table border='1px'>
<colgroup>
<col style="width: 5%">
<col>
<col>
<col style="width: 10%">
</colgroup>
<tr>
<td bgcolor='gainsboro'>
<b>UC step</b>
</td><td bgcolor='gainsboro'>
<b>Requirements</b>
</td><td bgcolor='gainsboro'>
<b>Claims</b>
</td><td bgcolor='gainsboro'>
<b>IDP</b>
</td>
</tr>
<tr>
<td>1</td>
<td>RQ001: Detect movement towards door</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>2</td>
<td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
<td>-</td>
<td>IDP3</td>
</tr>
<tr>
<td>3</td>
<td>-</td>
<td>-</td>
<td>IDP3</td>
</tr>
<tr>
<td>4</td>
<td>RQ007: Alert caretakers</td>
<td>CL14: Caretakers are alerted once a user leaves the home</td>
<td>IDP3</td>
</tr>
<tr>
<td>5</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
</table>
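<p>Compared with UC001, the only new automated behaviour is step 4: when the user does not respond and walks outside, RQ007 requires the robot to alert the caretakers (CL14, OB01.4). The sketch below shows one possible shape for such an alert; the notify callback and the field names are assumptions, since the wiki does not prescribe how caretakers are reached.</p>
<pre>
# Sketch of UC002 step 4 (RQ007: Alert caretakers). The notify callback is a
# hypothetical stand-in for whatever channel the care home uses (pager, app, ...).
import json
import time

def build_alert(user_id, location="main entrance"):
    """Assemble the information a caretaker needs for a quick intervention (OB01.4)."""
    return {
        "type": "user_left_home",     # maps to CL14
        "user": user_id,
        "last_seen": location,
        "timestamp": time.time(),
    }

def alert_caretakers(notify, user_id):
    notify(json.dumps(build_alert(user_id)))   # step 4: robot alerts caretakers
    # Step 5 (protecting and locating the user) is done by the caretakers, not the robot.

if __name__ == "__main__":
    alert_caretakers(print, user_id="resident-042")
</pre>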

<h2>UC003: User is reminded of their current situation</h2>
<table border='1px'>
<colgroup>
<col>
<col style="width: 70%">
</colgroup>
<tr border='1px'>
<td bgcolor='gainsboro'>
<b>Objective</b>
</td><td width='70%'>
OB01.2: Bring user back to reality
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>TDP</b>
</td><td>
TDP: scene A
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Actors</b>
</td><td>
<ul>
<li>Person with dementia</li>
<li>Pepper</li>
</ul>
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Pre-condition</b>
</td><td>
User thinks they need to get to work quickly
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Post-condition</b>
</td><td>
User remembers they are retired and in a care home
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Action sequence</b>
</td><td>
Figure<br><br>
UC steps:<br>
1. User walks to door to go out<br>
2. Robot asks user what they are doing<br>
3. User argues they need to go to work and are already late<br>
4. Robot tries to convince the user they are free today<br>
5. User feels agitated and insists on their idea<br>
6. Robot sings or plays a song associated with retirement<br>
7. User is reminded of their retirement and realises they do not have to go to work, deciding to stay in<br>
</td>
</tr>
</table>

<table border='1px'>
<colgroup>
<col style="width: 5%">
<col>
<col>
<col style="width: 10%">
</colgroup>
<tr>
<td bgcolor='gainsboro'>
<b>UC step</b>
</td><td bgcolor='gainsboro'>
<b>Requirements</b>
</td><td bgcolor='gainsboro'>
<b>Claims</b>
</td><td bgcolor='gainsboro'>
<b>IDP</b>
</td>
</tr>
<tr>
<td>1</td>
<td>RQ001: Detect movement towards door</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>2</td>
<td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
<td>CL07: The user notices the system</td>
<td>IDP2</td>
</tr>
<tr>
<td>3</td>
<td>RQ004: Process spoken natural language</td>
<td>-</td>
<td>IDP2</td>
</tr>
<tr>
<td>4</td>
<td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
<td>-</td>
<td>IDP2</td>
</tr>
<tr>
<td>5</td>
<td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td>
<td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
<td>IDP2</td>
</tr>
<tr>
<td>6</td>
<td>RQ006: Play music</td>
<td>CL02: The user's mood is improved, CL04: The music fits the situation or place, CL08: The user is subtly brought back to reality</td>
<td>IDP2</td>
</tr>
<tr>
<td>7</td>
<td>-</td>
<td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
<td>IDP2</td>
</tr>
</table>
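<p>Steps 5 and 6 are the core of this use case: RQ009 detects that the user is still agitated after the verbal attempt, and RQ006 falls back to music associated with retirement to bring the user back to reality subtly (CL08). The sketch below illustrates only that decision; the emotion labels, the song library and the file paths are illustrative assumptions.</p>
<pre>
# Sketch of UC003 steps 4-6: keep the verbal reminder as the first attempt and
# switch to a retirement-associated song once the user stays agitated.
# The emotion labels and song paths are assumptions for illustration.

SONG_LIBRARY = {     # concept mapped to a locally stored track (RQ006, hypothetical paths)
    "retirement": "/music/retirement_party.mp3",
    "care_home": "/music/home_sweet_home.mp3",
}

def choose_intervention(emotion, verbal_attempts):
    """Decide between another verbal reminder and a musical cue (steps 4-6)."""
    if emotion == "agitated" and verbal_attempts >= 1:        # step 5, RQ009
        return ("play_music", SONG_LIBRARY["retirement"])     # step 6, RQ006, CL08
    return ("speak", "You are free today, there is no need to rush.")  # step 4, RQ003

if __name__ == "__main__":
    print(choose_intervention("agitated", verbal_attempts=1))  # expect the musical cue
    print(choose_intervention("calm", verbal_attempts=0))      # expect the verbal reminder
</pre>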

<h2>UC004: User is not convinced by robot</h2>
<table border='1px'>
<colgroup>
<col>
<col style="width: 70%">
</colgroup>
<tr border='1px'>
<td bgcolor='gainsboro'>
<b>Objective</b>
</td><td width='70%'>
OB01.4: Allow for quick intervention from the caretaker
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>TDP</b>
</td><td>
TDP: scene A, scene B
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Actors</b>
</td><td>
<ul>
<li>Person with dementia</li>
<li>Pepper</li>
<li>Caretaker</li>
</ul>
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Pre-condition</b>
</td><td>
User thinks they need to get to work quickly
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Post-condition</b>
</td><td>
User is calmed down by caretaker
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Action sequence</b>
</td><td>
Figure<br><br>
UC steps:<br>
1. User walks to door to go out<br>
2. Robot asks user what they are doing<br>
3. User argues they need to go to work and are already late<br>
4. Robot tries to convince the user they are free today<br>
5. User feels agitated and insists on their idea<br>
6. Robot sings or plays a song associated with retirement<br>
7. User does not respond to music and insists on going out<br>
8. Robot calls the caretaker and requests the user to wait for them<br>
9. Caretaker takes over and calms the user<br>
</td>
</tr>
</table>

<table border='1px'>
<colgroup>
<col style="width: 5%">
<col>
<col>
<col style="width: 10%">
</colgroup>
<tr>
<td bgcolor='gainsboro'>
<b>UC step</b>
</td><td bgcolor='gainsboro'>
<b>Requirements</b>
</td><td bgcolor='gainsboro'>
<b>Claims</b>
</td><td bgcolor='gainsboro'>
<b>IDP</b>
</td>
</tr>
<tr>
<td>1</td>
<td>RQ001: Detect movement towards door</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>2</td>
<td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
<td>CL07: The user notices the system</td>
<td>IDP2</td>
</tr>
<tr>
<td>3</td>
<td>RQ004: Process spoken natural language</td>
<td>-</td>
<td>IDP2</td>
</tr>
<tr>
<td>4</td>
<td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
<td>-</td>
<td>IDP2</td>
</tr>
<tr>
<td>5</td>
<td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td>
<td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
<td>IDP2</td>
</tr>
<tr>
<td>6</td>
<td>RQ006: Play music</td>
<td>CL04: The music fits the situation or place</td>
<td>IDP2</td>
</tr>
<tr>
<td>7</td>
<td>-</td>
<td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
<td>IDP2</td>
</tr>
<tr>
<td>8</td>
<td>RQ007: Alert caretakers</td>
<td>CL05: The user is willing to wait for the caretaker, CL14: Caretakers are alerted once a user leaves the home</td>
<td>IDP2</td>
</tr>
<tr>
<td>9</td>
<td>-</td>
<td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
<td>-</td>
</tr>
</table>
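<p>UC004 extends UC003 with an escalation path: the robot tries its own interventions a bounded number of times and then hands over to the caretaker (steps 7-9, RQ007, CL05). The sketch below shows that control flow; calm_down_attempt, notify_caretaker and the attempt limit are hypothetical names chosen for illustration.</p>
<pre>
# Sketch of the UC004 escalation: attempt the UC003 interventions a bounded number
# of times; if the user still insists on going out, call in the caretaker (RQ007).
# calm_down_attempt() and notify_caretaker() are hypothetical placeholders.

MAX_ATTEMPTS = 2     # one verbal attempt (step 4) plus one musical attempt (step 6)

def deescalate_or_escalate(robot, notify_caretaker):
    for attempt in range(MAX_ATTEMPTS):
        if robot.calm_down_attempt(attempt):    # steps 4-7: speech first, then music
            return "resolved"                   # UC003 outcome: user stays in
    notify_caretaker("User insists on leaving, please come to the entrance.")  # step 8, RQ007
    robot.say("A caretaker is on their way. Could you wait here with me?")     # step 8, CL05
    return "escalated"                          # step 9 is handled by the caretaker

class FakeRobot:
    """Stand-in simulating a user who is not convinced (UC004)."""
    def calm_down_attempt(self, attempt):
        return False
    def say(self, text):
        print("Pepper:", text)

if __name__ == "__main__":
    print(deescalate_or_escalate(FakeRobot(), notify_caretaker=print))
</pre>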

<h2>UC005: User takes a supervised walk</h2>
<table border='1px'>
<colgroup>
<col>
<col style="width: 70%">
</colgroup>
<tr border='1px'>
<td bgcolor='gainsboro'>
<b>Objective</b>
</td><td width='70%'>
OB02: Allow user to take supervised walks
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>TDP</b>
</td><td>
TDP: scene A, scene C
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Actors</b>
</td><td>
<ul>
<li>Person with dementia</li>
<li>Pepper</li>
<li>Caretaker</li>
</ul>
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Pre-condition</b>
</td><td>
User is restless and wants to take a walk
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Post-condition</b>
</td><td>
User is accompanied by caretaker
</td>
</tr><tr>
<td bgcolor='gainsboro'>
<b>Action sequence</b>
</td><td>
Figure<br><br>
UC steps:<br>
1. User walks to door to go out<br>
2. Robot asks user what they are doing<br>
3. User requests to take a walk<br>
4. Robot calls the caretaker and requests the user to wait<br>
5. Caretaker arrives and takes a walk with the user<br>
</td>
</tr>
</table>

<table border='1px'>
<colgroup>
<col style="width: 5%">
<col>
<col>
<col style="width: 10%">
</colgroup>
<tr>
<td bgcolor='gainsboro'>
<b>UC step</b>
</td><td bgcolor='gainsboro'>
<b>Requirements</b>
</td><td bgcolor='gainsboro'>
<b>Claims</b>
</td><td bgcolor='gainsboro'>
<b>IDP</b>
</td>
</tr>
<tr>
<td>1</td>
<td>RQ001: Detect movement towards door</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>2</td>
<td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
<td>CL07: The user notices the system</td>
<td>IDP4</td>
</tr>
<tr>
<td>3</td>
<td>RQ004: Process spoken natural language</td>
<td>-</td>
<td>IDP4</td>
</tr>
<tr>
<td>4</td>
<td>RQ007: Alert caretakers, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
<td>CL05: The user is willing to wait for the caretaker, CL10: The user feels dependent on others, (CL12, CL15)</td>
<td>IDP4</td>
</tr>
<tr>
<td>5</td>
<td>-</td>
<td>CL02: The user's mood is improved</td>
<td>-</td>
</tr>
</table>
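<p>What separates UC005 from the preceding use cases is step 3: the user's answer at the door expresses a walk request rather than confusion or boredom, so the robot calls the caretaker instead of redirecting the user (RQ004, RQ007). The sketch below stands in for that routing decision with a deliberately simple keyword check; the keyword table and the use-case mapping are assumptions, and real spoken-language processing would be far more robust.</p>
<pre>
# Sketch of how step 3 of UC005 could be told apart from the other use cases:
# a deliberately simple keyword check standing in for RQ004 (process spoken
# natural language), routing the conversation to the use case that handles it.

INTENTS = {
    "walk": "UC005",     # supervised walk: call the caretaker and ask the user to wait
    "work": "UC003",     # user believes they must go to work
    "bored": "UC001",    # user is bored and can be redirected to an indoor activity
    "mall": "UC001",
}

def classify_answer(answer):
    """Map the user's answer at the door to the use case that handles it."""
    lowered = answer.lower()
    for keyword, use_case in INTENTS.items():
        if keyword in lowered:
            return use_case
    return "UC002"       # no usable answer: treat it as ignoring the robot

if __name__ == "__main__":
    print(classify_answer("I would like to take a walk"))   # expected: UC005
    print(classify_answer(""))                              # expected: UC002
</pre>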
{{/html}}