Changes for page 7. Persuasiveness of conversational agents
Last modified by Demi Breen on 2023/04/09 14:59
From version 1.1
edited by Liza Wensink
on 2023/03/18 22:57
Change comment:
There is no comment for this version
To version 2.1
edited by Liza Wensink
on 2023/03/19 21:31
Change comment:
There is no comment for this version
Summary
Page properties (1 modified, 0 added, 0 removed)
Details
- Page properties
- Content
When it comes to designing a persuasive conversational agent, there are several models that can be used. The general approach is to select the response and rule of replying that is most likely to lead to success.

**Goal-oriented conversational model.**

(Quotes from the article)
- The conversation model can be represented as a state transition tree where a statement is represented as a link to change a state from one to another.
- There are two different types of states: agent states and user states (the human).
- They are interleaved on a conversation path.
- A conversation path represents the flow of conversation between the agent and one or more users; it begins with the initial state and terminates with either success or failure.

This might not necessarily be a structure we need to implement in its entirety, but some information could definitely be taken from it.

**Updating conversation model.**

When the above conversation model needs to be updated, it goes according to the following:

- When input from the user does not match any statement on the stored conversation path, the conversation path is branched, and the success probability scores are updated depending on persuasion success/failure. (Once again, maybe not something we will be able to implement, but we can try to somehow mimic the idea of it.)

The article does, however, continue by mentioning that the Wizard approach, where the robot is simply controlled by a human in a wizard-like fashion, managed to persuade 25/60 users, while the conversational agent based on the model only managed to persuade 1 out of 10 users.
It is necessary to remember that designing a persuasive conversational agent consists of two important aspects: having the robot follow general human conversational rules, but also applying persuasiveness tactics. I will attempt to clarify these tactics a bit below.