Andrei Stefan
Week 1
Attended the lecture on Monday and learned about the concepts of Socio-Cognitive Engineering. Found a team to work on the project with, and we met on Tuesday to create accounts for the XWiki, brainstorm general ideas of what the robot needs to do and which robot would be most suitable for these tasks, and draft some rough use cases together. We also created accounts on the Interactive Robotics platform, since we were thinking of working with NAO.
Individually, I studied the lecture slides again in detail and read this week's papers, in order to start writing the Human Factors and the Robotic Partner sections.
Week 2
Attended the lecture on Monday and learned about music and cognition. It was interesting to see how big the differences regarding memory loss are between "regular" elders and people with dementia. We collectively started working on the Foundations tab of the Wiki by splitting into groups of two. Xinqi and I worked on the Human Factors section. As I had already started writing the Robotic Partner part, she suggested that I keep working on it while she focused on writing the Music and Cognition section.
Week 3
Missed the lecture on Monday due to a meeting for a different course. I managed to catch up during the Tuesday lab, when we started working on the Specification tab of the Wiki. This week, my partners were Xinqi and Lang, and we worked on the Use Cases and the Claims sections. We spent most of the lab brainstorming which use cases would fit our choice of robotic partner and the types of interactions between the patient and the robot that we envisioned (we decided to remove the tablet and only use the NAO robot). After some struggle figuring out how the patient would communicate with NAO without a tablet for text input, Rembrandt found out that the robot can use speech recognition and thus hold conversations, which is the means of communication we plan to use.
Week 4
Attended the lecture on Monday and learned about experiments and measurements. Together with the team, we divided the work for the week and started planning our experiments, and thanks to Rembrandt, who set up the timeslot, we had some time to play around with the NAO robot on Wednesday. While the navigation graph that Rembrandt had prepared didn't work as expected, we managed (after about three hours of struggle) to get the robot to understand speech. From there on, the only thing remaining was to implement the functionalities we wanted. During the week, we also prepared the presentation for the following Monday and decided that we would all present. Unfortunately, Rembrandt tested positive, so he could not attend, which meant we had to re-divide who would present what in a short meeting on Sunday.
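As a note for the implementation work ahead: below is a minimal sketch of the kind of speech-recognition setup involved, written against the standard NAOqi Python SDK rather than the Interactive Robotics platform we actually used (which wraps these calls in its own graphical editor). The robot's IP address, the vocabulary words, and the subscriber name "SCE_Demo" are placeholder assumptions.

    # Minimal NAOqi speech-recognition sketch (assumes the NAOqi Python SDK;
    # the IP address, vocabulary, and subscriber name are placeholders).
    import time
    from naoqi import ALProxy

    NAO_IP = "192.168.1.10"  # placeholder address of the robot

    asr = ALProxy("ALSpeechRecognition", NAO_IP, 9559)
    memory = ALProxy("ALMemory", NAO_IP, 9559)

    asr.setLanguage("English")
    # NAO recognizes words from a fixed vocabulary, not free-form speech.
    asr.setVocabulary(["yes", "no", "music", "stop"], False)

    asr.subscribe("SCE_Demo")  # start the speech-recognition engine
    try:
        time.sleep(5)  # give the user a moment to answer
        # ALMemory stores the last result as [word, confidence, ...]
        word, confidence = memory.getData("WordRecognized")[:2]
        print("Heard %r with confidence %.2f" % (word, confidence))
    finally:
        asr.unsubscribe("SCE_Demo")  # always release the engine

The key point the sketch illustrates is that the engine only matches against a predefined word list, which is why our conversation flow has to be built around expected answers rather than open dialogue.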
Week 5
Presented on Monday and got feedback on how we should proceed with the project. We continued developing the prototype, and on Friday we met up again to test it. Compared to last time, it went a bit better, as we didn't have to spend three hours figuring out how to get the robot to recognize speech, but we still had some issues with it refusing to run the program for no apparent reason. All in all, we managed to get a semi-final file to use for evaluation and filmed a short video showcasing the general flow of an interaction with the robot (with some bugs still present, so it will at most serve as a bloopers video).