Changes for page 7. Persuasiveness of conversational agents
Last modified by Demi Breen on 2023/04/09 14:59
From version 7.1
edited by Liza Wensink
on 2023/04/04 16:25
Change comment:
There is no comment for this version
To version 2.1
edited by Liza Wensink
on 2023/03/19 21:31
Change comment:
There is no comment for this version
Summary
Page properties (2 modified, 0 added, 0 removed)
Attachments (0 modified, 0 added, 1 removed)
Details
- Page properties
- Title
... ... @@ -1,1 +1,1 @@
- 7. Persuasiveness of conversational agents
+ Persuasiveness of conversational agents

- Content
... ... @@ -1,34 +1,28 @@
- Since the focus of our design is to motivate a PwD to follow along on a walk in the garden together with the robot, we will most likely need to take persuasiveness into account. Persuasiveness in human-human interactions consists of persuasion tactics and behaviors that might make a certain person more or less convincing. When it comes to human-robot interactions these aspects also come into play, with the added challenge of the agent not being able to employ all the tactics a human might be able to. Below we therefore dive into persuasiveness in conversational agents and what could be essential when designing a system with an objective like this.
+ **Article:** Persuasive Conversational Agent with Persuasion Tactics. [[https:~~/~~/link.springer.com/chapter/10.1007/978-3-642-13226-1_4>>https://link.springer.com/chapter/10.1007/978-3-642-13226-1_4]]

- Generally, for a conversational agent to be persuasive and influence a person's behavior, it needs to be able to adapt to the outcomes of the conversation and the interactions it has with the human, as would a human who wants to be persuasive, according to Narita and Kitamura [1]. When it comes to designing the agent itself, several models can be used. The general approach is to select the response and the rule of replying that is most likely to lead to the desired goal [1].
+ A number of studies have been done regarding the persuasiveness of conversational agents and how convincing an agent actually might be to a human. This paper highlights that for a conversational agent to be persuasive and influence a person's behavior, it needs to be able to adapt to the outcomes of the conversation and the interactions it has with the human, as would a human who wants to be persuasive and convincing.

- This could be done through a conversational model which can be represented as a state transition tree, using a goal-oriented approach. The different statements that can be given by the robot are then represented as links to change from one state to another [1]. Since these interactions imply a dialogue, there are two different types of states: human states and agent states, which are interconnected in conversation paths. These paths represent the flow of conversations, beginning with an initial state and ending with either success or failure [1]. When the input from the human links to an agent state, the agent chooses a statement that leads the agent's state to the human state with the greatest probability of success [1]. The model is updated when an input is provided that the agent is not familiar with. This causes the conversation path to branch, and the model updates the probability scores [1]. While it is not feasible to develop a full-scale conversational model for the sake of our design in this project, this clearly illustrates the general approach to persuading with the help of a conversational agent: a clear goal is set for the interaction, and the agent attempts to act accordingly, in steps that bring the conversation closer to the goal.
+ This article also contains useful information when it comes to designing a persuasive conversational agent using the Wizard of Oz method. I won't describe it in detail here, but if it is needed there is some information to be taken from the article.

- The persuasiveness objective in the given study centered around showing the participant two different cameras, A and B. The purpose of the persuasion was to make the user change their initial choice [1]. The process of persuasion was done according to Table 1 below [1], where the general tactic was to try to convince the user to choose the other camera by explaining why the concerns they raise might not be relevant, such as explaining that either the pixels or the stabilizer do not carry much weight [1]. The model already has set predictions of what a user might inquire about and has pre-written responses that might change the opinion of the user [1]. In our case, maybe we could also attempt to catch some reasons somebody might not want to go walking, for example, and then try to explain why those reasons are not relevant or important, to try to persuade the user to go out on the walk.
+ When it comes to designing a persuasive conversational agent, there are several models that can be used. The general approach is to select the response and rule of replying that is most likely to lead to success.

- [[image:attach:flowchart.PNG||height="702" width="416"]]
+ **Goal-oriented conversational model.**

- The study does, however, continue by mentioning that the Wizard of Oz approach, where the robot is simply controlled by a human in a wizard-like fashion, managed to persuade 25/60 users, while the conversational agent based on the model only managed 1 out of 10 users [1]. A necessary takeaway here is that designing a persuasive conversational agent consists of two important aspects, which will also be crucial in the design of our project. These are:
+ (Quotes from the article)
+ \\- The conversation model can be represented as a state transition tree where a statement is represented as a link to change a state from one to another.
+ - Two different types of states, agent states and user states (the human).
+ - They are interleaved on a conversation path.
+ - A conversation path represents the flow of conversation between the agent and one or more users and begins with the initial state and terminates with either success or failure.
+ - If the input matches a statement on a link to an agent state, the agent chooses a statement that links the agent state to a user state with the greatest success probability.

- * having the robot follow general human conversational rules
- * applying persuasiveness tactics [1].
+ This might not necessarily be a structure we need to implement in its entirety, but some information could definitely be taken from it.

- Another example of an attempted persuasion using a conversational agent was done by Massimo et al., where an agent attempted to persuade a person to follow a healthier diet [2]. Here, it is shown that there are three psychosocial antecedents of a behavior change, and they include:
+ **Updating conversation model.**

- * Self-Efficacy (the person's ability to do something, here: eat healthily)
- * Attitude (the person's evaluation of the pros and cons of the change)
- * Intention Change (the person's willingness to go through with sticking to this diet) [2]
+ When the above conversation model needs to be updated, it goes according to the following:

- The above-mentioned aspects cannot be measured directly and are instead captured as latent variables through questionnaires [2]. While this full setup might be too extensive for our project with the Pepper robot, our objective still ties into persuading PwD to participate in an activity. It is noteworthy that this also has to do with health, and that we can use health-related explanations and persuasions in our design as well, which might mean the three aspects above could be deemed relevant.
+ - When input from the user does not match any statement on the stored conversation path, the conversation path is branched and the success probability scores are updated depending on persuasion success/failure. (Once again, maybe not something we will be able to implement, but we can try to somehow mimic the idea of it.)

- In the subsequent test that was conducted in this study, participants were randomly assigned to one of four groups, each receiving a different type of persuasive message. The persuasive messages were focused on either gain (positive actions give positive outcomes), non-gain (negative actions prevent positive outcomes), loss (negative actions give negative outcomes), or non-loss (positive actions prevent negative outcomes) [2]. Again, while this is rather elaborate, it could be relevant to consider these different types of persuasive messages, which could perhaps be incorporated into the motivation we want to provide to PwD. Perhaps it would be possible to investigate what kind of persuasive messages might be effective. We do, however, want to stay on the side of positive messages for motivation and still anchor these in either a goal-oriented or emotion-based motivation approach. Further, the study goes on to use reinforcement learning and Bayesian networks to achieve these goals of persuasion using the conversational agent, which is not very relevant to our particular project, even if it is highly interesting to learn about.
+ The article does, however, continue by mentioning that the Wizard approach, where the robot is simply controlled by a human in a wizard-like fashion, managed to persuade 25/60 users and the conversation agent based on the model only managed 1 out of 10 users. It is necessary to remember that designing a persuasive conversational agent consists of two important aspects - having the robot follow general human conversational rules, but also applying persuasiveness tactics. I will attempt to clarify these tactics a bit below.

- ===== References =====

- [1] Narita, T., Kitamura, Y. (2010). Persuasive Conversational Agent with Persuasion Tactics. In: Ploug, T., Hasle, P., Oinas-Kukkonen, H. (eds) Persuasive Technology. PERSUASIVE 2010. Lecture Notes in Computer Science, vol 6137. Springer, Berlin, Heidelberg. DOI: [[https:~~/~~/doi.org/10.1007/978-3-642-13226-1_4>>https://doi.org/10.1007/978-3-642-13226-1_4]]

- [2] Massimo, D. F., Carfora, V., Catellani, P., Piastra, M. (2019). Applying Psychology of Persuasion to Conversational Agents through Reinforcement Learning: an Exploratory Study. //Italian Conference on Computational Linguistics//. Online, [[https:~~/~~/ceur-ws.org/Vol-2481/paper27.pdf>>https://ceur-ws.org/Vol-2481/paper27.pdf]]
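As a rough illustration of the goal-oriented state-transition model described in the version 7.1 text above (pick the statement with the greatest estimated success probability, branch the conversation path on unfamiliar input, and update the scores after success or failure), a minimal Python sketch could look like the following. This is only our own sketch and not the implementation from Narita and Kitamura [1]; all names (ConversationModel, choose_statement, record_outcome, the example states and statements) are placeholders made up for illustration.

{{code language="python"}}
# Minimal sketch of a goal-oriented conversation model as a state transition
# structure, loosely following the description in [1]. All names are placeholders.
from collections import defaultdict


class ConversationModel:
    def __init__(self):
        # transitions[agent_state][statement] -> [next_user_state, success_score]
        self.transitions = defaultdict(dict)

    def add_link(self, agent_state, statement, user_state, score=0.5):
        """Register a statement that moves the dialogue from an agent state to a user state."""
        self.transitions[agent_state][statement] = [user_state, score]

    def choose_statement(self, agent_state):
        """Pick the statement with the greatest estimated success probability."""
        options = self.transitions.get(agent_state)
        if not options:
            return None  # unfamiliar state: in [1] the conversation path would branch here
        return max(options, key=lambda s: options[s][1])

    def record_outcome(self, agent_state, statement, success, step=0.1):
        """Nudge the success score of a used statement up or down once the
        conversation ends in persuasion success or failure (a crude stand-in
        for the probability update described in [1])."""
        link = self.transitions[agent_state][statement]
        target = 1.0 if success else 0.0
        link[1] += step * (target - link[1])


# Toy usage for our scenario: responding to a reason for not wanting to walk.
model = ConversationModel()
model.add_link("user_says_tired",
               "A short, slow walk can actually make you feel less tired.",
               "user_reconsiders", score=0.6)
model.add_link("user_says_tired",
               "There are benches in the garden, so we can rest whenever you like.",
               "user_reconsiders", score=0.4)

statement = model.choose_statement("user_says_tired")
print(statement)
model.record_outcome("user_says_tired", statement, success=True)
{{/code}}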
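Similarly, the four persuasive message framings from Massimo et al. [2] (gain, non-gain, loss, non-loss) could be mocked up for the walking scenario roughly as below. The example sentences are our own placeholders for this sketch and are not taken from the paper.

{{code language="python"}}
# Illustrative mapping of the four message framings from [2] onto the walking
# scenario. The sentences are placeholders written for this sketch.
FRAMED_MESSAGES = {
    "gain":     "If you come for a walk, you will enjoy the fresh air and feel more energetic.",  # positive action -> positive outcome
    "non-gain": "If you stay inside, you will miss the sunshine and the flowers in the garden.",  # negative action prevents a positive outcome
    "loss":     "Sitting all afternoon can leave you feeling stiff and restless.",                # negative action -> negative outcome
    "non-loss": "A short walk now means you won't feel stiff later.",                             # positive action prevents a negative outcome
}


def pick_message(framing: str) -> str:
    """Return the persuasive message for the requested framing."""
    return FRAMED_MESSAGES[framing]


# For our design we would most likely stick to the positively framed variants.
print(pick_message("gain"))
print(pick_message("non-loss"))
{{/code}}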
- flowchart.PNG
- Author
... ... @@ -1,1 +1,0 @@
- XWiki.lwensink

- Size
... ... @@ -1,1 +1,0 @@
- 63.3 KB

- Content