a. Prototype


= **Scope and purpose** =

Our prototype is designed to assess whether our claims [[CL001: Regular physical activity improves PwD physical well-being>>doc:2\. Specification.b\. Use Cases.Claims.CL002\: Increased calorie expenditure of PwD.WebHome]] and [[CL002: Dance session improves PwD mental well-being>>doc:2\. Specification.b\. Use Cases.Claims.CL003\: H-R Conversation improves mental well-being of the patient\..WebHome]] hold. As such, its functionality focuses primarily on [[UC02.1: Dancing Session Start>>doc:2\. Specification.b\. Use Cases.UC02\.0\: Dancing Session.UC02\.1\: Dancing Session Start.WebHome]], incorporating [[RQ01.2: Human-Robot conversation>>doc:2\. Specification.Requirements.RQ03\.0\: Reflect on game performance.RQ01\.2\: Human-Robot conversation.WebHome]] and [[RQ01.4: Hold dance session>>doc:2\. Specification.Requirements.RQ03\.0\: Reflect on game performance.RQ01\.4\: Hold dance session.WebHome]]. While [[UC02.2: Dancing Session Clean-Up>>doc:2\. Specification.b\. Use Cases.UC02\.0\: Dancing Session.UC02\.2\: Dancing Session Clean-Up.WebHome]] and [[UC03.1: Companion Mode>>doc:2\. Specification.b\. Use Cases.UC03\.0\: Companion.UC03\.1\: Companion Mode.WebHome]] are also incorporated, they are covered only at a superficial level, as the experiment ends at that point. Further details on our experiment procedure are given in [[Evaluation: Test>>doc:3\. Evaluation.b\. Test.WebHome]].

= **Procedure** =

Our prototype procedure is displayed in the following figure:

[[An overview of the prototype, showcasing the different functionalities used from start to finish.>>image:Prototype_expl.png]]

In terms of functionality, our prototype uses a **visual** version of the robot dog Miro, which is operated through an **LLM**; an actual robot was not used. During the interaction, the user's speech is transcribed through **speech-to-text** and Miro's replies are voiced through **text-to-speech**, allowing the two to communicate by voice. To play a song for the dance session, the LLM makes use of the **Spotify API**, starting, pausing, or stopping the song at will. This closely matches the AI and ICT technologies we described.
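
The following is a minimal sketch of one such voice-interaction turn, assuming Python with the speech_recognition, pyttsx3, and openai packages; the library choices, the model name, and the system prompt are illustrative assumptions rather than the exact implementation used in the prototype.

{{code language="python"}}
# Minimal sketch of one voice-interaction turn: speech-to-text -> LLM -> text-to-speech.
# The libraries (speech_recognition, pyttsx3, openai), the model name, and the system
# prompt are assumptions for illustration, not the prototype's actual code.
import speech_recognition as sr
import pyttsx3
from openai import OpenAI

llm = OpenAI()                # assumes OPENAI_API_KEY is set in the environment
tts = pyttsx3.init()          # local text-to-speech engine
recognizer = sr.Recognizer()

SYSTEM_PROMPT = ("You are Miro, a friendly robot dog who invites the user to a "
                 "dance session and keeps replies short and encouraging.")

def listen() -> str:
    """Record one utterance from the microphone and transcribe it (speech-to-text)."""
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)

def speak(text: str) -> None:
    """Voice Miro's reply out loud (text-to-speech)."""
    tts.say(text)
    tts.runAndWait()

def one_turn(history: list[dict]) -> list[dict]:
    """Transcribe the user's utterance, ask the LLM for a reply, and voice it."""
    history = history + [{"role": "user", "content": listen()}]
    reply = llm.chat.completions.create(model="gpt-4o-mini", messages=history)
    text = reply.choices[0].message.content
    speak(text)
    return history + [{"role": "assistant", "content": text}]

# Usage: start the dialogue with the system prompt and repeat one_turn() per exchange.
history = [{"role": "system", "content": SYSTEM_PROMPT}]
{{/code}}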

The exact procedure is as follows, based on [[UC02.1: Dancing Session Start>>doc:2\. Specification.b\. Use Cases.UC02\.0\: Dancing Session.UC02\.1\: Dancing Session Start.WebHome]]:

~1. Miro introduces itself and invites the user to a Dance Session using the communication method described above.

2. Once the user has accepted, a short discussion is held about the song or genre: the LLM asks the user whether they would be interested in certain genres, such as jazz, rock, pop, or classical. We note that the user is not forced to pick one of the given options.

3. Based on the user's response, a song is played using the Spotify API (a sketch of this playback control is given after this list).

4. The song plays until the user interrupts it by talking to the LLM, allowing them to modify the state of the activity. By communicating with the LLM, the user can, for example, change the song or choose to stop the activity entirely, which sends the LLM to its Companion Mode state.
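
As referenced in step 3, the following is a minimal sketch of how song playback could be driven through the Spotify API, assuming Python with the spotipy package, a Spotify Premium account, and an active playback device; the helper names and the search-based song selection are illustrative assumptions rather than the prototype's exact implementation.

{{code language="python"}}
# Minimal sketch of playback control through the Spotify Web API via spotipy.
# Assumes a Premium account and an active playback device; client credentials are
# read from the SPOTIPY_* environment variables. Helper names are illustrative.
import spotipy
from spotipy.oauth2 import SpotifyOAuth

sp = spotipy.Spotify(auth_manager=SpotifyOAuth(
    scope="user-modify-playback-state user-read-playback-state"))

def play_song_for(preference: str) -> str:
    """Search for the user's stated genre or song and start playback (step 3)."""
    result = sp.search(q=preference, type="track", limit=1)
    track = result["tracks"]["items"][0]
    sp.start_playback(uris=[track["uri"]])
    return track["name"]

def pause_song() -> None:
    """Pause playback when the user interrupts the session (step 4)."""
    sp.pause_playback()

def resume_song() -> None:
    """Resume playback on the active device if the user continues dancing."""
    sp.start_playback()
{{/code}}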

**Companion Mode.** If the user is not in the process of entering the 'Dancing Session' or currently in one, the LLM will be in 'Companion Mode' as a fallback. We note this state was not explored during the experiments.
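
To make this fallback behaviour explicit, the following is a minimal sketch of the session-state logic described above, assuming the dialogue manager tracks the mode explicitly; the state names and the update rule are illustrative assumptions.

{{code language="python"}}
# Minimal sketch of the session-state fallback: whenever no dance session is being
# set up or running, the system drops back to Companion Mode. The state names and
# the update rule are illustrative assumptions, not the prototype's actual code.
from enum import Enum, auto

class Mode(Enum):
    COMPANION = auto()    # default fallback: casual conversation with the user
    DANCE_SETUP = auto()  # the user is choosing a song or genre
    DANCING = auto()      # a song is playing via the Spotify API

def next_mode(user_wants_to_dance: bool, song_playing: bool) -> Mode:
    """Fall back to Companion Mode unless a dance session is being set up or running."""
    if song_playing:
        return Mode.DANCING
    if user_wants_to_dance:
        return Mode.DANCE_SETUP
    return Mode.COMPANION
{{/code}}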