Changes for page Diederik Reflection
Last modified by Diederik Heijbroek on 2024/04/07 20:15
From version 6.1
edited by Diederik Heijbroek
on 2024/04/07 20:15
Change comment:
There is no comment for this version
To version 4.1
edited by Diederik Heijbroek
on 2024/03/22 12:10
Change comment:
There is no comment for this version
Summary
- Page properties (1 modified, 0 added, 0 removed)

Details
- Page properties
- Content
== Personal Reflection ==

During our initial brainstorm, we quickly figured out which direction we wanted to go in. As it is easy to drift off, we held a second session to narrow all of our ideas down to one concept and made sure we stayed consistent throughout the rest of the project. Over the weeks we divided the tasks (mostly in pairs) and reviewed each other's work to maintain that coherence. I believe we divided the work equally, which meant we kept a solid pace, were able to run the full experiment and evaluation in week 7, and presented our work and progress properly during the two presentations. I'm proud of what we achieved in so little time, which we owe to good organisation, a clear division of tasks, and everyone's involvement :).

Throughout this project I learned about all the requirements for coming up with new ideas. Delft is very entrepreneurial, and I liked that in this course we could learn more in this field by doing the project. The potential of the three robots suggested in the lectures is also really promising, and it was nice that each group could come up with a creative idea to support people with early-stage dementia and test it with fellow students and friends.

I'd also like to give some feedback about the course. It sometimes felt very repetitive: in 2.b, 2.c, 2.d, and 2.e, for instance, we kept repeating the same pieces of information, which made some of the text a bit redundant. Moreover, in the subsections of the evaluation it was a bit unclear what was expected of us and where to discuss what. This was done really well in the foundation part of the project, so making the evaluation more similar to that would be an improvement.
Nevertheless, I really enjoyed the course material, the chance to work with the Nao robot and see our idea come to life, and the guest lecturers who were invited to talk about the topics they specialise in! Thanks to the TAs as well :)!

== Lecture Summaries ==

Because I cannot attend the lectures on Monday, I'll give a short description of each of the missed lectures to show that I am familiar with the lecture content and fully up-to-date with the course material. I have attended all of the Tuesday lectures.

...

First we need to determine usability for different groups and identify good and bad features for future design. We compare design choices when making decisions, observe their effects, and use this final step to evaluate what to do better. As people's interests change over time, there is a constant need to reiterate on the product. In SCE specifically, we focus on checking the usability of the interaction design at the task level.

**Know about different types of evaluations and how they relate to each other**

There are formative and summative evaluations. The two intertwine: we observe how people interact with the system and, based on that, evaluate other properties, such as whether they are able to perform the interaction we designed. By understanding how people interact with the system, we can make different design choices that let them perform the interaction based on the "logical" choices we observed.

**Understand basic quality aspects of evaluation methods**

Formative evaluation is driven by open questions about the design specification.
Summative evaluation is driven by closed questions that establish the quality of the design. Summative evaluation also uses independent variables, which we manipulate to measure their effects on dependent variables.
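To make the independent/dependent variable distinction concrete, here is a minimal sketch of a summative comparison. The condition names and all numbers are invented for illustration, not taken from the lecture or our project: we manipulate one independent variable (which robot condition a participant gets) and measure one dependent variable (task completion time).

```python
# Hypothetical summative evaluation sketch.
# Independent variable: robot condition ("music_robot" vs. "control").
# Dependent variable: task completion time in seconds.
# All data below is made up for illustration.
from statistics import mean, stdev

times = {
    "music_robot": [41.0, 38.5, 44.2, 39.9, 42.3],
    "control":     [55.1, 49.8, 52.4, 57.0, 50.6],
}

for condition, samples in times.items():
    print(f"{condition}: mean={mean(samples):.1f}s, sd={stdev(samples):.1f}s")

# A lower mean time under "music_robot" would suggest an effect of the
# manipulation; a statistical test (e.g. a t-test) would be the next step.
```

This only compares descriptive statistics; deciding whether the difference is meaningful requires an inferential test on the dependent variable.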
**Know about basic data type constraints**

We need to consider which tools to use (GDPR), where we can gather a lot of data (crowdsourcing), and how to ensure that our data is of good quality (attention checks, or multiple questions related to the same construct).

**Other**

Finally, we looked at the PAL example, the ethics approval form, and the ReJAM-related examples.

=== Lecture 11 (no lecture) ===