Changes for page Use Cases

Last modified by Cesar van der Poel on 2022/04/05 14:31

From version 17.1
edited by Simran Karnani
on 2022/03/07 12:11
Change comment: There is no comment for this version
To version 49.1
edited by Cesar van der Poel
on 2022/04/05 14:23
Change comment: There is no comment for this version

Summary

Details

Page properties
Author
... ... @@ -1,1 +1,1 @@
1 -XWiki.SimranKarnani
1 +XWiki.Cesarvanderpoe
Content
... ... @@ -1,214 +1,521 @@
1 1  {{html}}
2 2  <!-- Your HTML code here -->
3 -<h2>UC001: User prevented from going out</h2>
4 -<table width='100%'>
5 -<tr>
6 -<td style="font-size:16px">
7 -<table border='1px'>
8 -<tr border='1px'><td bgcolor='gainsboro'>
9 - <b>Objective</b>
10 -</td><td>
11 - OB01: Prevent user from going out
12 -</td></tr>
13 -<tr><td bgcolor='gainsboro'>
14 - <b>TDP</b>
15 -</td><td>
16 - TDP: TITLE
17 -</td></tr>
18 -<tr><td bgcolor='gainsboro'>
19 - <b>Actors</b>
20 -</td><td>
21 -</td></tr>
22 -<tr><td bgcolor='gainsboro'>
23 - <b>Pre-condition</b>
24 -</td><td>
25 - User is restless and wants to take a walk
26 -</td></tr>
27 -<tr><td bgcolor='gainsboro'>
28 - <b>Post-condition</b>
29 -</td><td>
30 - User stays home
31 -</td></tr>
32 -<tr><td bgcolor='gainsboro'>
33 - <b>Action sequence</b>
34 -</td><td>
35 - Figure<br><br>
36 - UC steps:<br>
37 - 1. PwD walks to door to go out<br>
38 - 2. PwD meets Pepper who is standing in front of the door<br>
39 - 3. Pepper asks PwD where he/she is going<br>
40 - 4. PwD interacts with Pepper<br>
41 - 5. Pepper plays music and distracts user<br>
42 - 6. Robot invites user to stay inside to perform an activity<br>
43 - 7. User complies and turns away from door<br>
44 -</td></tr>
45 -</table>
46 -</td>
47 -</tr>
3 +<h2>UC001: User distracted from going out</h2>
4 + <table border='1px'>
5 + <colgroup>
6 + <col>
7 + <col style="width: 70%">
8 + </colgroup>
9 + <tr border='1px'>
10 + <td bgcolor='gainsboro'>
11 + <b>Objective</b>
12 + </td><td>
13 + OB01.1: Remove user's incentive for going out<br>
14 + OB01.3: Keep user occupied inside
15 + </td>
16 + </tr><tr>
17 + <td bgcolor='gainsboro'>
18 + <b>TDP</b>
19 + </td><td>
20 + TDP: scene A
21 + </td>
22 + </tr><tr>
23 + <td bgcolor='gainsboro'>
24 + <b>Actors</b>
25 + </td><td>
26 + <ul>
27 + <li>Person with dementia</li>
28 + <li>Pepper</li>
29 + </ul>
30 + </td>
31 + </tr><tr>
32 + <td bgcolor='gainsboro'>
33 + <b>Pre-condition</b>
34 + </td><td>
35 + User is bored and wants to go shopping
36 + </td>
37 + </tr><tr>
38 + <td bgcolor='gainsboro'>
39 + <b>Post-condition</b>
40 + </td><td>
41 + User entertains themselves inside
42 + </td>
43 + </tr><tr>
44 + <td bgcolor='gainsboro'>
45 + <b>Action sequence</b>
46 + </td><td>
47 + UC steps:<br>
48 + 1. User walks to door to go out<br>
49 + 2. Robot asks user what they are doing<br>
50 + 3. User responds that they are bored and want to go to the mall<br>
51 + 4. Robot suggests user entertains themselves with a puzzle instead<br>
52 + 5. User complies<br>
53 + </td>
54 + </tr>
55 + </table>
48 48  
49 -</table>
57 + <table border='1px'>
58 + <tr>
59 + <td bgcolor='gainsboro'>
60 + <b>UC step</b>
61 + </td><td bgcolor='gainsboro'>
62 + <b>Requirements</b>
63 + </td><td bgcolor='gainsboro'>
64 + <b>Claims</b>
65 + </td><td bgcolor='gainsboro'>
66 + <b>IDP</b>
67 + </td>
68 + </tr>
69 + <tr>
70 + <td>1</td>
71 + <td>RQ001: Detect movement towards door</td>
72 + <td>-</td>
73 + <td>-</td>
74 + </tr>
75 + <tr>
76 + <td>2</td>
77 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
78 + <td>CL07: The user notices the system</td>
79 + <td>IDP1</td>
80 + </tr>
81 + <tr>
82 + <td>3</td>
83 + <td>RQ004: Process spoken natural language</td>
84 + <td>-</td>
85 + <td>IDP1</td>
86 + </tr>
87 + <tr>
88 + <td>4</td>
89 + <td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
90 + <td>-</td>
91 + <td>IDP1</td>
92 + </tr>
93 + <tr>
94 + <td>5</td>
95 + <td>-</td>
96 + <td>CL06: The user entertains themselves inside, CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
97 + <td>IDP1</td>
98 + </tr>
99 + </table>
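  <p>The action sequence above is essentially a detect-ask-redirect loop (RQ001, RQ003/RQ004, RQ005). The sketch below is purely illustrative and not part of the specification: the <code>Robot</code> class and its methods are hypothetical placeholders, not the actual Pepper/NAOqi API.</p>
  <pre>
# Hypothetical sketch of the UC001 detect-ask-redirect loop (illustrative only).
# The Robot class below is a stand-in, not the real Pepper/NAOqi API.

class Robot:
    def say(self, text):
        print("Robot:", text)                 # RQ003: speak in a human-like way

    def listen(self):
        return input("User: ")                # RQ004: process spoken language

def handle_door_approach(robot):
    """UC001 steps 2-5: ask, interpret, and redirect to an indoor activity."""
    robot.say("Hello! Where are you off to?")            # step 2
    answer = robot.listen()                              # step 3
    if "bored" in answer.lower() or "mall" in answer.lower():
        # step 4: propose an indoor alternative (RQ005)
        robot.say("How about we do a puzzle together instead?")

if __name__ == "__main__":
    handle_door_approach(Robot())   # step 1 (RQ001) would trigger this call
  </pre>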
50 50  
51 51  <h2>UC002: User not prevented from going out</h2>
52 -<table width='100%'>
53 -<tr>
54 -<td width='30%' style="font-size:16px">
55 -<table border='1px' width='50%'>
56 -<tr border='1px' width='30%'><td bgcolor='gainsboro'>
57 - <b>Objective</b>
58 -</td><td width='70%'>
59 - OB02: Alert caretakers of user that user has gone out
60 -</td></tr>
61 -<tr><td bgcolor='gainsboro'>
62 - <b>TDP</b>
63 -</td><td>
64 - TDP: TITLE
65 -</td></tr>
66 -<tr><td bgcolor='gainsboro'>
67 - <b>Actors</b>
68 -</td><td>
69 -</td></tr>
70 -<tr><td bgcolor='gainsboro'>
71 - <b>Pre-condition</b>
72 -</td><td>
73 - User is restless and wants to take a walk
74 -</td></tr>
75 -<tr><td bgcolor='gainsboro'>
76 - <b>Post-condition</b>
77 -</td><td>
78 - Caretakers are alerted of the fact that the user has left
79 -</td></tr>
80 -<tr><td bgcolor='gainsboro'>
81 - <b>Action sequence</b>
82 -</td><td>
83 - Figure<br><br>
84 - UC steps:<br>
85 - 1. User walks to door to go out<br>
86 - 2. User opens door<br>
87 - 3. Door sensor is triggered<br>
88 - 4. In response to door sensor trigger, robot navigates to door<br>
89 - 5. Robot attempts to interact with user<br>
90 - 6. User ignores robot and walks outside<br>
91 - 7. Robot alerts caretakers that the user has done so<br>
92 - 8. Caretakers take actions necessary to protect and/or locate user<br>
93 -</td></tr>
94 -</table>
95 -</td>
96 -<td width='50%' style="font-size:16px">
97 -<table border='1px' width='50%'>
98 -<tr><td bgcolor='gainsboro'>
99 - <b>UC step<b>
100 -</td><td bgcolor='gainsboro'>
101 - <b>Requirements</b>
102 -</td><td bgcolor='gainsboro'>
103 -<b>Claims</b>
104 -</td><td bgcolor='gainsboro'>
105 -<b>IDP</b>
106 -</td></tr>
107 -<tr><td>1</td><td>RQ001: Title</td><td>CL001: Title</td><td>IDP: Title</td></tr>
108 -<tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
109 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
110 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
111 -<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
112 -</table>
113 -</td></tr>
114 -</table>
102 + <table border='1px'>
103 + <tr border='1px'>
104 + <td bgcolor='gainsboro'>
105 + <b>Objective</b>
106 + </td><td width='70%'>
107 + OB01.4: Allow for quick intervention from the caretaker
108 + </td>
109 + </tr><tr>
110 + <td bgcolor='gainsboro'>
111 + <b>TDP</b>
112 + </td><td>
113 + TDP: scene A, scene B
114 + </td>
115 + </tr><tr>
116 + <td bgcolor='gainsboro'>
117 + <b>Actors</b>
118 + </td><td>
119 + <ul>
120 + <li>Person with dementia</li>
121 + <li>Pepper</li>
122 + <li>(Caretaker)</li>
123 + </ul>
124 + </td>
125 + </tr><tr>
126 + <td bgcolor='gainsboro'>
127 + <b>Pre-condition</b>
128 + </td><td>
129 + User is restless and wants to take a walk
130 + </td>
131 + </tr><tr>
132 + <td bgcolor='gainsboro'>
133 + <b>Post-condition</b>
134 + </td><td>
135 + Caretakers are alerted that the user has left
136 + </td>
137 + </tr><tr>
138 + <td bgcolor='gainsboro'>
139 + <b>Action sequence</b>
140 + </td><td>
141 + UC steps:<br>
142 + 1. User walks to door to go out<br>
143 + 2. Robot asks user what they are doing<br>
144 + 3. User ignores robot and walks outside<br>
145 + 4. Robot alerts caretakers that the user has done so<br>
146 + 5. Caretakers take actions necessary to protect and/or locate user<br>
147 + </td>
148 + </tr>
149 + </table>
115 115  
151 + <table border='1px'>
152 + <tr>
153 + <td bgcolor='gainsboro'>
154 + <b>UC step</b>
155 + </td><td bgcolor='gainsboro'>
156 + <b>Requirements</b>
157 + </td><td bgcolor='gainsboro'>
158 + <b>Claims</b>
159 + </td><td bgcolor='gainsboro'>
160 + <b>IDP</b>
161 + </td>
162 + </tr>
163 + <tr>
164 + <td>1</td>
165 + <td>RQ001: Detect movement towards door</td>
166 + <td>-</td>
167 + <td>-</td>
168 + </tr>
169 + <tr>
170 + <td>2</td>
171 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
172 + <td>-</td>
173 + <td>IDP3</td>
174 + </tr>
175 + <tr>
176 + <td>3</td>
177 + <td>-</td>
178 + <td>-</td>
179 + <td>IDP3</td>
180 + </tr>
181 + <tr>
182 + <td>4</td>
183 + <td>RQ007: Alert caretakers</td>
184 + <td>CL14: Caretakers are alerted once a user leaves the home</td>
185 + <td>IDP3</td>
186 + </tr>
187 + <tr>
188 + <td>5</td>
189 + <td>-</td>
190 + <td>-</td>
191 + <td>-</td>
192 + </tr>
193 + </table>
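  <p>The critical addition relative to UC001 is step 4 (RQ007: Alert caretakers). The sketch below shows one possible way to raise such an alert; the HTTP endpoint and payload format are assumptions made for illustration and are not defined anywhere in this specification.</p>
  <pre>
# Illustrative sketch of RQ007 (alert caretakers).
# The endpoint and payload below are assumptions, not part of the specification.
import json
import urllib.request
from datetime import datetime

CARETAKER_ALERT_URL = "http://example.invalid/alerts"   # placeholder endpoint

def alert_caretakers(user_id, location="front door"):
    """UC002 step 4: notify caretakers that the user has left the home."""
    payload = {
        "user": user_id,
        "event": "user_left_home",
        "location": location,
        "time": datetime.now().isoformat(),
    }
    request = urllib.request.Request(
        CARETAKER_ALERT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)   # caretakers can then intervene (step 5)
  </pre>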
116 116  
195 +<h2>UC003: User is reminded of their current situation</h2>
196 + <table border='1px'>
197 + <tr border='1px'>
198 + <td bgcolor='gainsboro'>
199 + <b>Objective</b>
200 + </td><td width='70%'>
201 + OB01.2: Bring user back to reality
202 + </td>
203 + </tr><tr>
204 + <td bgcolor='gainsboro'>
205 + <b>TDP</b>
206 + </td><td>
207 + TDP: scene A
208 + </td>
209 + </tr><tr>
210 + <td bgcolor='gainsboro'>
211 + <b>Actors</b>
212 + </td><td>
213 + <ul>
214 + <li>Person with dementia</li>
215 + <li>Pepper</li>
216 + </ul>
217 + </td>
218 + </tr><tr>
219 + <td bgcolor='gainsboro'>
220 + <b>Pre-condition</b>
221 + </td><td>
222 + User thinks they need to get to work quickly
223 + </td>
224 + </tr><tr>
225 + <td bgcolor='gainsboro'>
226 + <b>Post-condition</b>
227 + </td><td>
228 + User remembers they are retired and in a care home
229 + </td>
230 + </tr><tr>
231 + <td bgcolor='gainsboro'>
232 + <b>Action sequence</b>
233 + </td><td>
234 + Figure<br><br>
235 + UC steps:<br>
236 + 1. User walks to door to go out<br>
237 + 2. Robot asks user what they are doing<br>
238 + 3. User argues they need to go to work and they are already late<br>
239 + 4. Robot tries to convince the user they are free today<br>
240 + 5. User feels agitated and insists on their idea<br>
241 + 6. Robot sings or plays a song associated with retirement<br>
242 + 7. User is reminded of their retirement and realises they do not have to go to work, deciding to stay in<br>
243 + </td>
244 + </tr>
245 + </table>
117 117  
247 + <table border='1px'>
248 + <tr>
249 + <td bgcolor='gainsboro'>
250 + <b>UC step</b>
251 + </td><td bgcolor='gainsboro'>
252 + <b>Requirements</b>
253 + </td><td bgcolor='gainsboro'>
254 + <b>Claims</b>
255 + </td><td bgcolor='gainsboro'>
256 + <b>IDP</b>
257 + </td>
258 + </tr>
259 + <tr>
260 + <td>1</td>
261 + <td>RQ001: Detect movement towards door</td>
262 + <td>-</td>
263 + <td>-</td>
264 + </tr>
265 + <tr>
266 + <td>2</td>
267 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
268 + <td>CL07: The user notices the system</td>
269 + <td>IDP2</td>
270 + </tr>
271 + <tr>
272 + <td>3</td>
273 + <td>RQ004: Process spoken natural language</td>
274 + <td>-</td>
275 + <td>IDP2</td>
276 + </tr>
277 + <tr>
278 + <td>4</td>
279 + <td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
280 + <td>-</td>
281 + <td>IDP2</td>
282 + </tr>
283 + <tr>
284 + <td>5</td>
285 + <td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td>
286 + <td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
287 + <td>IDP2</td>
288 + </tr>
289 + <tr>
290 + <td>6</td>
291 + <td>RQ006: Play music</td>
292 + <td>CL02: The user's mood is improved, CL04: The music fits the situation or place, CL08: The user is subtly brought back to reality</td>
293 + <td>IDP2</td>
294 + </tr>
295 + <tr>
296 + <td>7</td>
297 + <td>-</td>
298 + <td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
299 + <td>IDP2</td>
300 + </tr>
301 + </table>
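  <p>Steps 4-6 depend on RQ005 (associate certain concepts with related concepts) and RQ006 (play music). The sketch below illustrates one way such an association table could be organised; the concept-to-song mapping and the callback names are made-up examples, and, as the comment at the bottom of this page points out, which music actually evokes retirement for a given resident is still an open question.</p>
  <pre>
# Illustrative sketch of RQ005/RQ006: map what the user says to a related
# concept and a piece of music. The mapping is a made-up example; in practice
# it would have to be personalised per resident.

CONCEPT_ASSOCIATIONS = {
    "work": {"related": "retirement", "song": "goodbye_to_work.mp3"},
    "school": {"related": "graduation", "song": "graduation_march.mp3"},
}

def respond_to_intent(utterance, say, play_song):
    """UC003 steps 4-6: gently remind the user of their current situation."""
    for concept, info in CONCEPT_ASSOCIATIONS.items():
        if concept in utterance.lower():
            say("You are free today, remember?")     # step 4 (RQ003/RQ005)
            play_song(info["song"])                  # step 6 (RQ006)
            return info["related"]
    return None

# Example usage with stand-in callbacks:
# respond_to_intent("I have to get to work", say=print, play_song=print)
  </pre>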
118 118  
119 -<h2>UC001: Guide user when going out</h2>
120 -<table border='1px' width='50%'>
121 -<tr border='1px' width='30%'><td bgcolor='gainsboro'>
122 - <b>Objective</b>
123 -</td><td width='70%'>
124 - OB03: Guide the user when going out
125 -</td></tr>
126 -<tr><td bgcolor='gainsboro'>
127 - <b>TDP</b>
128 -</td><td>
129 - TDP: TITLE
130 -</td></tr>
131 -<tr><td bgcolor='gainsboro'>
132 - <b>Actors</b>
133 -</td><td>
134 -</td></tr>
135 -<tr><td bgcolor='gainsboro'>
136 - <b>Pre-condition</b>
137 -</td><td>
138 - User is restless and wants to take a walk
139 -</td></tr>
140 -<tr><td bgcolor='gainsboro'>
141 - <b>Post-condition</b>
142 -</td><td>
143 - Robot accompanies the user to a familiar location
144 -</td></tr>
145 -<tr><td bgcolor='gainsboro'>
146 - <b>Action sequence</b>
147 -</td><td>
148 - Figure<br><br>
149 - UC steps:<br>
150 - 1. User walks to door to go out<br>
151 - 2. User opens door<br>
152 - 3. Door sensor is triggered<br>
153 - 4. In response to door sensor trigger, robot navigates to door<br>
154 - 5. Robot starts to play some sounds corresponding to a familiar location<br>
155 - 6. User listens and recognizes to sound and remembers the route tot that location<br>
156 - 7. Robot accompanies the user on his/her stroll<br>
157 -</td></tr>
303 +<h2>UC004: User is not convinced by robot</h2>
304 + <table border='1px'>
305 + <tr border='1px'>
306 + <td bgcolor='gainsboro'>
307 + <b>Objective</b>
308 + </td><td width='70%'>
309 + OB01.4: Allow for quick intervention from the caretaker
310 + </td>
311 + </tr><tr>
312 + <td bgcolor='gainsboro'>
313 + <b>TDP</b>
314 + </td><td>
315 + TDP: scene A, scene B
316 + </td>
317 + </tr><tr>
318 + <td bgcolor='gainsboro'>
319 + <b>Actors</b>
320 + </td><td>
321 + <ul>
322 + <li>Person with dementia</li>
323 + <li>Pepper</li>
324 + <li>Caretaker</li>
325 + </ul>
326 + </td>
327 + </tr><tr>
328 + <td bgcolor='gainsboro'>
329 + <b>Pre-condition</b>
330 + </td><td>
331 + User thinks they need to get to work quickly
332 + </td>
333 + </tr><tr>
334 + <td bgcolor='gainsboro'>
335 + <b>Post-condition</b>
336 + </td><td>
337 + User is calmed down by caretaker
338 + </td>
339 + </tr><tr>
340 + <td bgcolor='gainsboro'>
341 + <b>Action sequence</b>
342 + </td><td>
343 + Figure<br><br>
344 + UC steps:<br>
345 + 1. User walks to door to go out<br>
346 + 2. Robot asks user what they are doing<br>
347 + 3. User argues they need to go to work and they are already late<br>
348 + 4. Robot tries to convince the user they are free today<br>
349 + 5. User feels agitated and insists on their idea<br>
350 + 6. Robot sings or plays a song associated with retirement<br>
351 + 7. User does not respond to music and insists on going out<br>
352 + 8. Robot calls caretaker and requests user to wait for them<br>
353 + 9. Caretaker takes over and calms the user<br>
354 + </td>
355 + </tr>
356 + </table>
158 158  
159 -</table>
358 + <table border='1px'>
359 + <tr>
360 + <td bgcolor='gainsboro'>
361 + <b>UC step</b>
362 + </td><td bgcolor='gainsboro'>
363 + <b>Requirements</b>
364 + </td><td bgcolor='gainsboro'>
365 + <b>Claims</b>
366 + </td><td bgcolor='gainsboro'>
367 + <b>IDP</b>
368 + </td>
369 + </tr>
370 + <tr>
371 + <td>1</td>
372 + <td>RQ001: Detect movement towards door</td>
373 + <td>-</td>
374 + <td>-</td>
375 + </tr>
376 + <tr>
377 + <td>2</td>
378 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
379 + <td>CL07: The user notices the system</td>
380 + <td>IDP2</td>
381 + </tr>
382 + <tr>
383 + <td>3</td>
384 + <td>RQ004: Process spoken natural language</td>
385 + <td>-</td>
386 + <td>IDP2</td>
387 + </tr>
388 + <tr>
389 + <td>4</td>
390 + <td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
391 + <td>-</td>
392 + <td>IDP2</td>
393 + </tr>
394 + <tr>
395 + <td>5</td>
396 + <td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td>
397 + <td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
398 + <td>IDP2</td>
399 + </tr>
400 + <tr>
401 + <td>6</td>
402 + <td>RQ006: Play music</td>
403 + <td>CL04: The music fits the situation or place</td>
404 + <td>IDP2</td>
405 + </tr>
406 + <tr>
407 + <td>7</td>
408 + <td>-</td>
409 + <td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td>
410 + <td>IDP2</td>
411 + </tr>
412 + <tr>
413 + <td>8</td>
414 + <td>RQ007: Alert caretakers</td>
415 + <td>CL05: The user is willing to wait for the caretaker, CL14: Caretakers are alerted once a user leaves the home</td>
416 + <td>IDP2</td>
417 + </tr>
418 + <tr>
419 + <td>9</td>
420 + <td>-</td>
421 + <td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td>
422 + <td>-</td>
423 + </tr>
424 + </table>
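  <p>UC004 differs from UC003 only in the escalation at steps 7-8: if the music does not calm the user, the robot falls back to the caretaker. The sketch below isolates that fallback decision; <code>user_still_agitated</code> and <code>call_caretaker</code> are hypothetical stand-ins for RQ009 and RQ007.</p>
  <pre>
# Illustrative escalation logic for UC004 steps 6-8.
# All callbacks passed in are hypothetical stand-ins for RQ006, RQ009 and RQ007.

def deescalate_or_escalate(play_song, user_still_agitated, call_caretaker, say):
    """Try music first (as in UC003); escalate to the caretaker if it fails."""
    play_song("retirement_song.mp3")           # step 6 (RQ006)
    if user_still_agitated():                  # step 7 (RQ009)
        call_caretaker()                       # step 8 (RQ007)
        say("A caretaker is on their way, could you wait here with me?")
        return "escalated"
    return "calmed"
  </pre>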
160 160  
161 -<h2>UC004: User is reminded of their current situation</h2>
162 -<table width='100%'>
163 -<tr>
164 -<td style="font-size:16px">
165 -<table border='1px'>
166 -<tr border='1px'><td bgcolor='gainsboro'>
167 - <b>Objective</b>
168 -</td><td>
169 - OB01: Prevent user from going out
170 -</td></tr>
171 -<tr><td bgcolor='gainsboro'>
172 - <b>TDP</b>
173 -</td><td>
174 - TDP: TITLE
175 -</td></tr>
176 -<tr><td bgcolor='gainsboro'>
177 - <b>Actors</b>
178 -</td><td>
179 -</td></tr>
180 -<tr><td bgcolor='gainsboro'>
181 - <b>Pre-condition</b>
182 -</td><td>
183 - User is under the impression they need to be somewhere and intends to go there
184 -</td></tr>
185 -<tr><td bgcolor='gainsboro'>
186 - <b>Post-condition</b>
187 -</td><td>
188 - User remembers they need to stay here
189 -</td></tr>
190 -<tr><td bgcolor='gainsboro'>
191 - <b>Action sequence</b>
192 -</td><td>
193 - Figure<br><br>
194 - UC steps:<br>
195 - 1. User walks to door to go out<br>
196 - 2. User opens door<br>
197 - 3. Door sensor is triggered<br>
198 - 4. In response to door sensor trigger, robot navigates to door<br>
199 - 5. Robot interacts with user<br>
200 - 6. User focusses on robot<br>
201 - 7. Robot invites user to stay inside<br>
202 - 8. User argues they need to go to work and they are already late<br>
203 - 9. Robot sings or plays a song associated with retirement<br>
204 - 10. User is reminded of their retirement and realises they do not have to go to work<br>
205 - 11. Robot invites user to stay home and user complies<br>
206 -</td></tr>
207 -</table>
208 -</td>
209 -</tr>
426 +<h2>UC005: User takes a supervised walk</h2>
427 + <table border='1px'>
428 + <tr border='1px'>
429 + <td bgcolor='gainsboro'>
430 + <b>Objective</b>
431 + </td><td width='70%'>
432 + OB02: Allow user to take supervised walks
433 + </td>
434 + </tr><tr>
435 + <td bgcolor='gainsboro'>
436 + <b>TDP</b>
437 + </td><td>
438 + TDP: scene A, scene C
439 + </td>
440 + </tr><tr>
441 + <td bgcolor='gainsboro'>
442 + <b>Actors</b>
443 + </td><td>
444 + <ul>
445 + <li>Person with dementia</li>
446 + <li>Pepper</li>
447 + <li>Caretaker</li>
448 + </ul>
449 + </td>
450 + </tr><tr>
451 + <td bgcolor='gainsboro'>
452 + <b>Pre-condition</b>
453 + </td><td>
454 + User is restless and wants to take a walk
455 + </td>
456 + </tr><tr>
457 + <td bgcolor='gainsboro'>
458 + <b>Post-condition</b>
459 + </td><td>
460 + User is accompanied by caretaker
461 + </td>
462 + </tr><tr>
463 + <td bgcolor='gainsboro'>
464 + <b>Action sequence</b>
465 + </td><td>
466 + Figure<br><br>
467 + UC steps:<br>
468 + 1. User walks to door to go out<br>
469 + 2. Robot asks user what they are doing<br>
470 + 3. User requests to take a walk<br>
471 + 4. Robot calls the caretaker and requests the user to wait<br>
472 + 5. Caretaker arrives and takes a walk with the user<br>
473 + </td>
474 + </tr>
475 + </table>
210 210  
211 -</table>
212 -
477 + <table border='1px'>
478 + <tr>
479 + <td bgcolor='gainsboro'>
480 + <b>UC step</b>
481 + </td><td bgcolor='gainsboro'>
482 + <b>Requirements</b>
483 + </td><td bgcolor='gainsboro'>
484 + <b>Claims</b>
485 + </td><td bgcolor='gainsboro'>
486 + <b>IDP</b>
487 + </td>
488 + </tr>
489 + <tr>
490 + <td>1</td>
491 + <td>RQ001: Detect movement towards door</td>
492 + <td>-</td>
493 + <td>-</td>
494 + </tr>
495 + <tr>
496 + <td>2</td>
497 + <td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td>
498 + <td>CL07: The user notices the system</td>
499 + <td>IDP4</td>
500 + </tr>
501 + <tr>
502 + <td>3</td>
503 + <td>RQ004: Process spoken natural language</td>
504 + <td>-</td>
505 + <td>IDP4</td>
506 + </tr>
507 + <tr>
508 + <td>4</td>
509 + <td>RQ007: Alert caretakers, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td>
510 + <td>CL05: The user is willing to wait for the caretaker, CL10: The user feels dependent on others, (CL12, CL15)</td>
511 + <td>IDP4</td>
512 + </tr>
513 + <tr>
514 + <td>5</td>
515 + <td>-</td>
516 + <td>CL02: The user's mood is improved</td>
517 + <td>-</td>
518 + </tr>
519 + </table>
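  <p>Taken together, UC001-UC005 all start from the same trigger and branch on what the user says (or does not say) at the door, which matches the comment at the bottom of this page that one use case with alternative action sequences could cover them. The sketch below is a hypothetical dispatcher over those branches; every handler name is a placeholder.</p>
  <pre>
# Hypothetical dispatcher over the five use-case branches (all names are placeholders).

def handle_user_at_door(utterance, handlers):
    """Route the user's answer at the door to the matching action sequence."""
    text = (utterance or "").lower()
    if not text:
        return handlers["ignored"]()    # UC002: no response, user walks out
    if "walk" in text:
        return handlers["walk"]()       # UC005: supervised walk
    if "work" in text:
        return handlers["work"]()       # UC003/UC004: reminder, then escalation
    return handlers["bored"]()          # UC001: distract with an indoor activity

# Example usage with stand-in handlers:
# handle_user_at_door("I want to take a walk",
#                     {"ignored": lambda: "alert", "walk": lambda: "call",
#                      "work": lambda: "remind", "bored": lambda: "distract"})
  </pre>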
213 213  {{/html}}
214 214  
XWiki.XWikiComments[0]
Author
... ... @@ -1,0 +1,1 @@
1 +Anonymous
Comment
... ... @@ -1,0 +1,2 @@
1 +This could be specified in one or two use cases (which is ok/enough). One use case can include alternative action sequences. I am not sure about the role of music in UC4: Robot sings or plays a song associated with retirement,
2 +and the user is reminded of his retirement and realises he does not have to go to work. Is there music that reminds each resident of his/her retirement, and what exactly is being remembered about the retirement? Would changing the mood with music and simply distracting the resident from his/her intention to pass the door be more practical? It would be good to formulate a claim on this topic.
Date
... ... @@ -1,0 +1,1 @@
1 +2022-03-20 23:15:39.745