Changes for page Use Cases
Last modified by Cesar van der Poel on 2022/04/05 14:31
From version 2.1
edited by Cesar van der Poel on 2022/02/08 12:28
Change comment: There is no comment for this version

To version 51.1
edited by Cesar van der Poel on 2022/04/05 14:27
Change comment: There is no comment for this version

Summary
Details
Page properties

Content

@@ -1,134 +1,527 @@
{{html}}
<!-- Your HTML code here -->
-<h2>UC001: User prevented from going out</h2>
-<table width='100%'>
-<tr>
-<td width='30%' style="font-size:16px">
-<table border='1px' width='50%'>
-<tr border='1px' width='30%'><td bgcolor='gainsboro'><b>Objective</b></td><td width='70%'>OB01: Prevent user from going out</td></tr>
-<tr><td bgcolor='gainsboro'><b>TDP</b></td><td>TDP: TITLE</td></tr>
-<tr><td bgcolor='gainsboro'><b>Actors</b></td><td></td></tr>
-<tr><td bgcolor='gainsboro'><b>Pre-condition</b></td><td>User is restless and wants to take a walk</td></tr>
-<tr><td bgcolor='gainsboro'><b>Post-condition</b></td><td>User stays home</td></tr>
-<tr><td bgcolor='gainsboro'><b>Action sequence</b></td><td>
-  Figure<br><br>
-  UC steps:<br>
-  1. User walks to door to go out<br>
-  2. User opens door<br>
-  3. Door sensor is triggered<br>
-  4. In response to door sensor trigger, robot navigates to door<br>
-  5. Robot attempts to get attention of user through movement and auditory signals<br>
-  6. User focusses on robot<br>
-  7. Robot guides user away from door<br>
-</td></tr>
-</table>
-</td>
-<td width='50%' style="font-size:16px">
-<table border='1px' width='50%'>
-<tr><td bgcolor='gainsboro'><b>UC step</b></td><td bgcolor='gainsboro'><b>Requirements</b></td><td bgcolor='gainsboro'><b>Claims</b></td><td bgcolor='gainsboro'><b>IDP</b></td></tr>
-<tr><td>1</td><td>RQ001: Title</td><td>CL001: Title</td><td>IDP: Title</td></tr>
-<tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
-</table>
-</td></tr>
-</table>
-
-<h2>UC002: User not prevented from going out</h2>
-<table width='100%'>
-<tr>
-<td width='30%' style="font-size:16px">
-<table border='1px' width='50%'>
-<tr border='1px' width='30%'><td bgcolor='gainsboro'><b>Objective</b></td><td width='70%'>OB01: Alert caretakers of user that the user has gone out</td></tr>
-<tr><td bgcolor='gainsboro'><b>TDP</b></td><td>TDP: TITLE</td></tr>
-<tr><td bgcolor='gainsboro'><b>Actors</b></td><td></td></tr>
-<tr><td bgcolor='gainsboro'><b>Pre-condition</b></td><td>User is restless and wants to take a walk</td></tr>
-<tr><td bgcolor='gainsboro'><b>Post-condition</b></td><td>Caretakers are alerted of the fact that the user has left</td></tr>
-<tr><td bgcolor='gainsboro'><b>Action sequence</b></td><td>
-  Figure<br><br>
-  UC steps:<br>
-  1. User walks to door to go out<br>
-  2. User opens door<br>
-  3. Door sensor is triggered<br>
-  4. In response to door sensor trigger, robot navigates to door<br>
-  5. Robot attempts to get attention of user through movement and auditory signals<br>
-  6. User ignores robot and walks outside<br>
-  7. Robot alerts caretakers that the user has done so<br>
-  8. Caretakers take actions necessary to protect and/or locate user<br>
-</td></tr>
-</table>
-</td>
-<td width='50%' style="font-size:16px">
-<table border='1px' width='50%'>
-<tr><td bgcolor='gainsboro'><b>UC step</b></td><td bgcolor='gainsboro'><b>Requirements</b></td><td bgcolor='gainsboro'><b>Claims</b></td><td bgcolor='gainsboro'><b>IDP</b></td></tr>
-<tr><td>1</td><td>RQ001: Title</td><td>CL001: Title</td><td>IDP: Title</td></tr>
-<tr><td>2</td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
-<tr><td></td><td>-</td><td>-</td><td>-</td></tr>
-</table>
-</td></tr>
-</table>
+<h2>UC001: User distracted from going out</h2>
+<table border='1px'>
+<colgroup><col><col style="width: 70%"></colgroup>
+<tr border='1px'><td bgcolor='gainsboro'><b>Objective</b></td><td>OB01.1: Remove user's incentive for going out<br>OB01.3: Keep user occupied inside</td></tr>
+<tr><td bgcolor='gainsboro'><b>TDP</b></td><td>TDP: scene A</td></tr>
+<tr><td bgcolor='gainsboro'><b>Actors</b></td><td><ul><li>Person with dementia</li><li>Pepper</li></ul></td></tr>
+<tr><td bgcolor='gainsboro'><b>Pre-condition</b></td><td>User is bored and wants to go shopping</td></tr>
+<tr><td bgcolor='gainsboro'><b>Post-condition</b></td><td>User entertains themselves inside</td></tr>
+<tr><td bgcolor='gainsboro'><b>Action sequence</b></td><td>
+  UC steps:<br>
+  1. User walks to door to go out<br>
+  2. Robot asks user what they are doing<br>
+  3. User responds that they are bored and want to go to the mall<br>
+  4. Robot suggests user entertains themselves with a puzzle instead<br>
+  5. User complies<br>
+</td></tr>
+</table>
+
+<table border='1px'>
+<colgroup><col style="width: 5%"><col><col><col style="width: 10%"></colgroup>
+<tr><td bgcolor='gainsboro'><b>UC step</b></td><td bgcolor='gainsboro'><b>Requirements</b></td><td bgcolor='gainsboro'><b>Claims</b></td><td bgcolor='gainsboro'><b>IDP</b></td></tr>
+<tr><td>1</td><td>RQ001: Detect movement towards door</td><td>-</td><td>-</td></tr>
+<tr><td>2</td><td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td><td>CL07: The user notices the system</td><td>IDP1</td></tr>
+<tr><td>3</td><td>RQ004: Process spoken natural language</td><td>-</td><td>IDP1</td></tr>
+<tr><td>4</td><td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td><td>-</td><td>IDP1</td></tr>
+<tr><td>5</td><td>-</td><td>CL06: The user entertains themselves inside, CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td><td>IDP1</td></tr>
+</table>
+
+<h2>UC002: User not prevented from going out</h2>
+<table border='1px'>
+<tr border='1px'><td bgcolor='gainsboro'><b>Objective</b></td><td width='70%'>OB01.4: Allow for quick intervention from the caretaker</td></tr>
+<tr><td bgcolor='gainsboro'><b>TDP</b></td><td>TDP: scene A, scene B</td></tr>
+<tr><td bgcolor='gainsboro'><b>Actors</b></td><td><ul><li>Person with dementia</li><li>Pepper</li><li>(Caretaker)</li></ul></td></tr>
+<tr><td bgcolor='gainsboro'><b>Pre-condition</b></td><td>User is restless and wants to take a walk</td></tr>
+<tr><td bgcolor='gainsboro'><b>Post-condition</b></td><td>Caretakers are alerted of the fact that the user has left</td></tr>
+<tr><td bgcolor='gainsboro'><b>Action sequence</b></td><td>
+  UC steps:<br>
+  1. User walks to door to go out<br>
+  2. Robot asks user what they are doing<br>
+  3. User ignores robot and walks outside<br>
+  4. Robot alerts caretakers that the user has done so<br>
+  5. Caretakers take actions necessary to protect and/or locate user<br>
+</td></tr>
+</table>
+
+<table border='1px'>
+<tr><td bgcolor='gainsboro'><b>UC step</b></td><td bgcolor='gainsboro'><b>Requirements</b></td><td bgcolor='gainsboro'><b>Claims</b></td><td bgcolor='gainsboro'><b>IDP</b></td></tr>
+<tr><td>1</td><td>RQ001: Detect movement towards door</td><td>-</td><td>-</td></tr>
+<tr><td>2</td><td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td><td>-</td><td>IDP3</td></tr>
+<tr><td>3</td><td>-</td><td>-</td><td>IDP3</td></tr>
+<tr><td>4</td><td>RQ007: Alert caretakers</td><td>CL14: Caretakers are alerted once a user leaves the home</td><td>IDP3</td></tr>
+<tr><td>5</td><td>-</td><td>-</td><td>-</td></tr>
+</table>
+
+<h2>UC003: User is reminded of their current situation</h2>
+<table border='1px'>
+<tr border='1px'><td bgcolor='gainsboro'><b>Objective</b></td><td width='70%'>OB01.2: Bring user back to reality</td></tr>
+<tr><td bgcolor='gainsboro'><b>TDP</b></td><td>TDP: scene A</td></tr>
+<tr><td bgcolor='gainsboro'><b>Actors</b></td><td><ul><li>Person with dementia</li><li>Pepper</li></ul></td></tr>
+<tr><td bgcolor='gainsboro'><b>Pre-condition</b></td><td>User thinks they need to get to work quickly</td></tr>
+<tr><td bgcolor='gainsboro'><b>Post-condition</b></td><td>User remembers they are retired and in a care home</td></tr>
+<tr><td bgcolor='gainsboro'><b>Action sequence</b></td><td>
+  Figure<br><br>
+  UC steps:<br>
+  1. User walks to door to go out<br>
+  2. Robot asks user what they are doing<br>
+  3. User argues they need to go to work and they are already late<br>
+  4. Robot tries to convince the user they are free today<br>
+  5. User feels agitated and insists on their idea<br>
+  6. Robot sings or plays a song associated with retirement<br>
+  7. User is reminded of their retirement and realises they do not have to go to work, deciding to stay in<br>
+</td></tr>
+</table>
+
+<table border='1px'>
+<tr><td bgcolor='gainsboro'><b>UC step</b></td><td bgcolor='gainsboro'><b>Requirements</b></td><td bgcolor='gainsboro'><b>Claims</b></td><td bgcolor='gainsboro'><b>IDP</b></td></tr>
+<tr><td>1</td><td>RQ001: Detect movement towards door</td><td>-</td><td>-</td></tr>
+<tr><td>2</td><td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td><td>CL07: The user notices the system</td><td>IDP2</td></tr>
+<tr><td>3</td><td>RQ004: Process spoken natural language</td><td>-</td><td>IDP2</td></tr>
+<tr><td>4</td><td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td><td>-</td><td>IDP2</td></tr>
+<tr><td>5</td><td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td><td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td><td>IDP2</td></tr>
+<tr><td>6</td><td>RQ006: Play music</td><td>CL02: The user's mood is improved, CL04: The music fits the situation or place, CL08: The user is subtly brought back to reality</td><td>IDP2</td></tr>
+<tr><td>7</td><td>-</td><td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td><td>IDP2</td></tr>
+</table>
+
+<h2>UC004: User is not convinced by robot</h2>
+<table border='1px'>
+<tr border='1px'><td bgcolor='gainsboro'><b>Objective</b></td><td width='70%'>OB01.4: Allow for quick intervention from the caretaker</td></tr>
+<tr><td bgcolor='gainsboro'><b>TDP</b></td><td>TDP: scene A, scene B</td></tr>
+<tr><td bgcolor='gainsboro'><b>Actors</b></td><td><ul><li>Person with dementia</li><li>Pepper</li><li>Caretaker</li></ul></td></tr>
+<tr><td bgcolor='gainsboro'><b>Pre-condition</b></td><td>User thinks they need to get to work quickly</td></tr>
+<tr><td bgcolor='gainsboro'><b>Post-condition</b></td><td>User is calmed down by caretaker</td></tr>
+<tr><td bgcolor='gainsboro'><b>Action sequence</b></td><td>
+  Figure<br><br>
+  UC steps:<br>
+  1. User walks to door to go out<br>
+  2. Robot asks user what they are doing<br>
+  3. User argues they need to go to work and they are already late<br>
+  4. Robot tries to convince the user they are free today<br>
+  5. User feels agitated and insists on their idea<br>
+  6. Robot sings or plays a song associated with retirement<br>
+  7. User does not respond to music and insists on going out<br>
+  8. Robot calls caretaker and requests user to wait for them<br>
+  9. Caretaker takes over and calms the user
+</td></tr>
+</table>
+
+<table border='1px'>
+<tr><td bgcolor='gainsboro'><b>UC step</b></td><td bgcolor='gainsboro'><b>Requirements</b></td><td bgcolor='gainsboro'><b>Claims</b></td><td bgcolor='gainsboro'><b>IDP</b></td></tr>
+<tr><td>1</td><td>RQ001: Detect movement towards door</td><td>-</td><td>-</td></tr>
+<tr><td>2</td><td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td><td>CL07: The user notices the system</td><td>IDP2</td></tr>
+<tr><td>3</td><td>RQ004: Process spoken natural language</td><td>-</td><td>IDP2</td></tr>
+<tr><td>4</td><td>RQ005: Associate certain concepts with related concepts, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td><td>-</td><td>IDP2</td></tr>
+<tr><td>5</td><td>RQ009: Recognize emotions, RQ004: Process spoken natural language</td><td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td><td>IDP2</td></tr>
+<tr><td>6</td><td>RQ006: Play music</td><td>CL04: The music fits the situation or place</td><td>IDP2</td></tr>
+<tr><td>7</td><td>-</td><td>CL09: The user feels they are losing their freedom, CL13: The user gets annoyed by the robot</td><td>IDP2</td></tr>
+<tr><td>8</td><td>RQ007: Alert caretakers</td><td>CL05: The user is willing to wait for the caretaker, CL14: Caretakers are alerted once a user leaves the home</td><td>IDP2</td></tr>
+<tr><td>9</td><td></td><td>CL01: The user is prevented from getting lost, (CL11, CL12, CL15)</td><td>-</td></tr>
+</table>
+
+<h2>UC005: User takes a supervised walk</h2>
+<table border='1px'>
+<tr border='1px'><td bgcolor='gainsboro'><b>Objective</b></td><td width='70%'>OB02: Allow user to take supervised walks</td></tr>
+<tr><td bgcolor='gainsboro'><b>TDP</b></td><td>TDP: scene A, scene C</td></tr>
+<tr><td bgcolor='gainsboro'><b>Actors</b></td><td><ul><li>Person with dementia</li><li>Pepper</li><li>Caretaker</li></ul></td></tr>
+<tr><td bgcolor='gainsboro'><b>Pre-condition</b></td><td>User is restless and wants to take a walk</td></tr>
+<tr><td bgcolor='gainsboro'><b>Post-condition</b></td><td>User is accompanied by caretaker</td></tr>
+<tr><td bgcolor='gainsboro'><b>Action sequence</b></td><td>
+  Figure<br><br>
+  UC steps:<br>
+  1. User walks to door to go out<br>
+  2. Robot asks user what they are doing<br>
+  3. User requests to take a walk<br>
+  4. Robot calls the caretaker and requests the user to wait<br>
+  5. Caretaker arrives and takes a walk with the user<br>
+</td></tr>
+</table>
+
+<table border='1px'>
+<tr><td bgcolor='gainsboro'><b>UC step</b></td><td bgcolor='gainsboro'><b>Requirements</b></td><td bgcolor='gainsboro'><b>Claims</b></td><td bgcolor='gainsboro'><b>IDP</b></td></tr>
+<tr><td>1</td><td>RQ001: Detect movement towards door</td><td>-</td><td>-</td></tr>
+<tr><td>2</td><td>RQ003: Speak in a human-like way, (RQ002: Recognize people in the care home), (RQ008: Use gestures for non-verbal communication)</td><td>CL07: The user notices the system</td><td>IDP4</td></tr>
+<tr><td>3</td><td>RQ004: Process spoken natural language</td><td>-</td><td>IDP4</td></tr>
+<tr><td>4</td><td>RQ007: Alert caretakers, RQ003: Speak in a human-like way, (RQ008: Use gestures for non-verbal communication)</td><td>CL05: The user is willing to wait for the caretaker, CL10: The user feels dependent on others, (CL12, CL15)</td><td>IDP4</td></tr>
+<tr><td>5</td><td></td><td>CL02: The user's mood is improved</td><td></td></tr>
+</table>
{{/html}}

XWiki.XWikiComments[0]

Author

@@ -1,0 +1,1 @@
+Anonymous

Comment

@@ -1,0 +1,2 @@
+This could be specified in one or two use cases (which is ok/enough). One use case can include alternative action sequences. I am not sure about the role of music in UC4: Robot sings or plays a song associated with retirement,
+and the user is reminded of his retirement and realises he does not have to go to work. Is there music that reminds each resident about his/her retirement, what is being remembered about the retirement? Probably changing the mood with music and just distracting the resident from his/her intention to pass the door are more practical? It would be good to formulate a claim on this topic.

Date

@@ -1,0 +1,1 @@
+2022-03-20 23:15:39.745