Changes for page Step 4: Claims
Last modified by Mark Neerincx on 2023/04/13 12:17
From version 3.1
edited by Michaël Grauwde
on 2023/03/27 03:47
Change comment:
There is no comment for this version
To version 3.3
edited by Michaël Grauwde
on 2023/03/27 03:50
Change comment:
There is no comment for this version
Summary
-
Page properties (1 modified, 0 added, 0 removed)
Details
- Page properties
-
- Content
-
... ... @@ -10,8 +10,10 @@
10 10
11 11 **__The impact of the negative effects__**
12 12
13 -**__1. __** A miscommunication between the conversational agent and the citizen could be analysed by the lagging time in which emergency services were dispatched to the citizen requiring help. Furthermore, this could also be seen in the lack of understanding, on the end of the conversational agent, of the text communicated by the citizen.
13 +**__1. __** We can analyse how the humans feel about themselves, and whether they feel put in a box, by examining how they engage in discussion and how open they are to new ideas.
14 14
15 +**2. **If the users lose trust in the system, this can cause a loss of trust in the police. This would be damaging to society, as it would make it more difficult for the police to do their jobs and to use technology in the future.
16 +
15 15 **__2. __**In disaster times, the authorities must communicate well with the public. A lack of communication could lead to distrust in the system as well as in other AI systems. This distrust could leave citizens unwilling to use the conversational agent in the future, which in turn could lead to lagging response times from the emergency services, as human respondents may be overwhelmed by the number of calls they receive. This could lead to an increase in deaths and injuries in crisis times.
16 16 )))
17 17 |(((