jim.shamlin.com

Persuasive Actors

While no formal studies have demonstrated the methods by which computing products trigger social responses, there is ample observational and anecdotal evidence that many do respond to computers as if they were living beings. However, this is not unique to machinery: perfectly sane people treat animals and inanimate objects as if they possessed sentience.

Five Types of Social Cues

There has been research to suggest that people are most responsive to objects that exhibit five types of social cues:

  • Physical cues
  • Psychological cues
  • Language
  • Social dynamics (patterns of interaction)
  • Social roles

Many devices, and virtually all computer software, are designed to provide these social cues to the user.

Physical Cues

Devices such as dolls, dummies, and robots are clearly designed to mimic the physical appearance of people (or other sentient creatures), and computer software often uses animated characters to provide physical cues to the user.

The greater the mimesis of physical cues by a device, the greater the tendency for the user to respond to it in a social manner, and generally a positive manner. (EN: the exception being the band of the spectrum referred to as the "uncanny valley," which creates discomfort in the user).

Physical attractiveness is a significant factor, as much for devices as for real creatures: those which have a physically attractive interface or hardware have greater persuasive power. There is no clear agreement among psychologists as to the precise reason, but most agree that it is a highly influential quality.

To this end, researchers have undertaken considerable effort in making devices that closely mimic human physicality, including facial expressions, voices, and lip movements.

The author refers to a 1996 study by the Boston University School of Management, in which subjects interacted with computer characters in a social dilemma game that enabled the user to work cooperatively or competitively with the computer character. When the character was unattractive, the cooperation rate was 32%; when it was attractive, the cooperation rate was 92% (which is "statistically indistinguishable" from the cooperation rate when playing the game with a real human being).

Naturally, attractiveness is subjective and there are wide differences among generations, cultures, and individuals, with a few areas in which there seems to be universal agreement (such as symmetry). This has led designers to tailor the appearance of computer characters to their target audiences.

Other experiments at Stanford used interfaces with no "character" at all, merely text messages, and were still able to elicit social responses - so while physicality can be an important factor, it is not strictly essential to creating a social presence.

Psychological Cues

Psychological cues convey emotion. A few examples are error messages that convey empathy (an apology that something has gone awry) or a "smiley" that indicates success. Taken together, the emotional cues that a device expresses suggest that the device has intelligence and personality.

Psychological cues can be powerful, and often function at the subconscious level. Human beings are wired to react to the emotions of others, or even to the perception of emotion in an object they know to be non-sentient. This is evident in the "pet" relationship with animals, as well as in reactions to low-tech devices such as dolls or high-tech devices such as computers. Users express less frustration and greater affection when software conveys psychological cues.
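The difference between a purely factual message and one carrying a psychological cue can be sketched in a few lines. This is a minimal illustration, not from the source; the function names and wording are invented for the example.

```python
# Hypothetical sketch: the same failure reported with and without an
# emotional cue. All names and message text here are illustrative.

def neutral_error(filename):
    """A purely factual error message -- information, no emotion."""
    return f"Error: file '{filename}' could not be saved."

def empathetic_error(filename):
    """The same information, wrapped in an apology that suggests the
    software 'cares' about the user's situation."""
    return (f"Sorry -- something went wrong while saving '{filename}'. "
            "Your work is unchanged; please try again.")

print(neutral_error("report.txt"))
print(empathetic_error("report.txt"))
```

Both messages carry the same facts; only the second conveys a psychological cue.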

The author refers to a couple of studies he conducted at Stanford for specific examples:

  • In a personality study, it was found that subjects expressed greater trust for computers whose text cues were most similar to their own personality (in terms of dominance or submissiveness)
  • In an affiliation study, it was found that subjects were more trusting of computers that were identified (by use of color codes) as being on the same "team" as the subject's group (in a competitive scenario)
  • In a product development study, it was found that a device that gave error messages in a "warm and friendly" tone was rated higher than one that gave the same messages in a neutral, factual tone. This was borne out later in increased success in the market, and consumer surveys showed a marked improvement in the perception that the new model was more accurate and reliable (when the only difference from the previous model was the tone of the error messages). Worth noting: the product in question was an oscilloscope, the market for which is scientists and engineers (who tend to be a distinctly intelligent and unemotional group)

Taken together, these studies illustrate the potential impact of psychological cues in computing products, indicating that the presence and nature of these cues affects the attitude of the user toward the device. As these were experiments, they were one-dimensional by design. However, computers can convey a wide range of emotions and personalities to influence the user's perception and behavior.

Naturally, this raises both ethical and practical concerns: using emotion to affect the user's interaction is, in effect, to subvert the logical faculty, and it is debatable whether doing so is desirable or just. However, the author suggests that designers will invariably create products that convey emotion, and their only choice is whether the emotions conveyed are the result of a rational design decision or left to chance.

(EN: The author makes reference to the "five key dimensions" of personality, which piqued my curiosity. I did a quick Web search, meaning to summarize them here, but found that there are multiple lists and categorization schemas. And so, while it may be worthwhile and relevant, it's going to take more than a quick sidebar to explore the topic.)

Language

The use of language, through text messages presented on screen, is the primary mode of interaction for most computer software. To a lesser degree, computing devices also deliver recorded or simulated voice messages to the user (more common in entertainment than in productivity software).

In either form, language conveys emotion as well as information to the user - and as emotion and interaction are covered separately in this chapter, the author focuses on the use of language as a choice: to communicate or not to communicate when it is not strictly necessary to do so.

Specifically, the author focuses on the use of language to convey praise in situations where it seems unnecessary to have the device communicate anything at all. Studies on the effect of praise in various situations have clearly shown its positive impact and its ability to reinforce behavior and affect the attitude of the subject.

It was shown that users regarded software that provided praise intermittently and appropriately (rather than constantly) as more credible, and rated the accuracy of its evaluations higher, when messages of praise were combined with neutral or cautionary ones.

In addition to numeric ratings, subjects who received praise provided unstructured remarks indicating that they felt they had performed better, were more engaged by the interaction, were in a better mood, "liked" the computer more, and would be more inclined to use the computer again.

The author concedes that these are not direct measures of persuasion, but suggests that such positive reactions "open the door to influence" (EN: refer to the principle of kairos, that subjects are more inclined to be cooperative based on their mental state).
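The intermittent-praise pattern described above can be sketched as a simple feedback function. This is a speculative illustration of the principle, not the study's apparatus; the message lists, the `feedback` function, and the `praise_rate` parameter are all invented for the example.

```python
import random

# Hypothetical sketch of intermittent (rather than constant) praise,
# mixed with neutral messages. All names and values are illustrative.

PRAISE = ["Nice work!", "You're getting faster at this."]
NEUTRAL = ["Task complete.", "Results saved."]

def feedback(success, praise_rate=0.3, rng=None):
    """Return a feedback message for a completed task.

    Praise appears only occasionally on success, interleaved with
    neutral messages -- the pattern the studies above found more
    credible than constant flattery."""
    rng = rng or random.Random()
    if success and rng.random() < praise_rate:
        return rng.choice(PRAISE)
    return rng.choice(NEUTRAL)
```

A caller might invoke `feedback(True)` after each task; most completions draw a neutral message, so the occasional praise retains its weight.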

Interaction

In human interactions, most cultures define a set pattern for interactions in common situations (such as greeting someone, taking turns, forming a queue, and making a basic purchase transaction). These ritual interactions, called "social dynamics," are unwritten rules that others expect us to follow, and deviation from protocol causes confusion and hostility.

Human-computer interaction also follows certain protocols, whether these are adapted from human interaction or have become established conventions in interacting with computers.

In the field of e-commerce, considerable effort has been undertaken to utilize social dynamics to make the experience of shopping online more pleasant and user-friendly, adapting the protocols of transactions in a brick-and-mortar store.

The author also refers to the e-mail program Eudora, which prompts users to register their software by offering a choice of "register now" or "maybe later" (with no option to flatly refuse). This is a tactic commonly used in negotiation: it allows the subject to refuse gently while indicating a willingness to revisit the decision later, and eventually to consent.
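The "no flat refusal" pattern is easy to sketch. The option labels follow the Eudora example; the function and its return values are illustrative assumptions, not any real program's code.

```python
# Hypothetical sketch of the "register now / maybe later" prompt.
# Note the deliberate absence of a "never" option: a refusal is
# always a soft one that invites asking again in a later session.

def registration_prompt(choose):
    """Present two options via a caller-supplied chooser; declining
    merely postpones the question rather than closing it."""
    options = ["Register now", "Maybe later"]
    choice = choose(options)
    return "registered" if choice == "Register now" else "ask_again_later"

# Simulate a user who defers: the decision is postponed, not closed.
deferred = registration_prompt(lambda opts: opts[1])
```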

A key persuasive tool in human interaction is the theory of reciprocity, which anthropologists have observed in every human society: it's an unwritten rule that, when you receive a favor, you are obligated to return a favor.

The author refers to laboratory research using a simulation in which the subject could request "help": one group received useful information, the other only general information. (EN: The author doesn't mention a control group of users who had no ability to request help, which would seem relevant.) This was combined with a follow-up task in which the computer asked the subject for help in matching colors (ranking them from light to dark), ostensibly to "help" the computer better gauge human perception; subjects had the option of rating as few or as many palettes as they cared to. Participants who used the computer that was helpful in the first task tended to reciprocate by "helping" the computer rank a greater number of color palettes.

The author mentions that "retaliation" is closely related to reciprocity, as one could plausibly argue that the students who received "useless" information were reacting in a hostile manner by refusing to provide help on the second task. He concedes that the retaliation effect is often more provocative and pronounced than the reciprocity effect, but is generally not useful for designing computer products. (EN: My earlier comment, about having a control group who received no help at all, might have helped distinguish this.)

The author also suggests that software can leverage reciprocity protocols by providing a prompt such as "You have enjoyed playing this game ten times. Why not pay back the favor and register?" (EN: I am inclined to disagree with this suggestion. Reciprocity is an obligation accepted by the subject, not placed upon the subject by another party. The simple act of suggesting one is "owed" a favor in return for what you have done for someone else is a cue that most will recognize as manipulative, and their reaction is likely to be negative.)

Adopting Roles

Human beings are strongly motivated to cooperate with and obey individuals in roles of authority - teachers, referees, judges, and experts. They assume authorities are intelligent, powerful, objective, and well-intentioned, and readily submit to their authority. This behavior seems to transfer to human-computer interaction as well.

In the mid-1960s, Joseph Weizenbaum at MIT developed a program called ELIZA that mimicked the role of a psychotherapist by asking questions of subjects without specific reference to or consideration of the information they provided (e.g., "Would you elaborate on that?"). Test subjects remained engaged, disclosed a great deal of information, and believed the program had "helped" them with their problems; the success of this simple program (some 300 lines of code) was "disturbing" to the researchers.

The author notes that this is often used in branding software products: Norton's Disk Doctor and Broderbund's Typing Teacher are presumed to have been more successful than similar programs branded as "assistant" or "helper" (Broderbund's software even went so far as to create a fictional character, Mavis Beacon, as an instructor).

The author also suggests that it can be effective for software to assume a more subservient role - such as the "Ask Jeeves" search engine, which personified itself as a butler, likely intending to convey the impression that the service was helpful, respectful, and subservient.

It is also noted that the choice of persona is largely dependent on the personality of the subject. For example, teenagers are acutely resistant to taking orders from an authoritative adult. The author describes "aerobics trainer" software that allows the user to select a coach from a number of personas, who differ not only in their appearance and voice but also in the nature of the feedback they provide (one example is the Drill Sergeant persona, which is stern-looking and gruff, and provides primarily negative rather than positive feedback).

Social Cues: Handle with Care

While social cues can be effective in persuasion, they can also have a negative impact if used inappropriately. As an example, Microsoft's "Clippy", a character that was intended to make software help more user-friendly, was found to be annoying by many users. (EN: It's worth noting the degree of hostility: search-engine queries indicated that users were not merely looking to turn off the feature, but wanted to "kill Clippy.")

To complicate matters, the nature and level of social interaction that is helpful versus annoying varies among users, so there can be no specific guidance as to degree, though there is a general indication that the more you "turn up the volume" on the social element, the stronger the positive or negative reaction of the user.

Another general principle seems to be that users are more tolerant of social interaction in entertainment software, even to the point of expecting or welcoming it, while being less tolerant of social interaction in software used for "serious" tasks.

Another caution is that there are situations in which users seek out a technical solution in order to avoid social interaction. The success of ATMs, pay-at-the-pump gas stations, and vending machines can be attributed not only to the objective convenience of avoiding the walk to the cashier, but also to the customer's desire to avoid "dealing with" another person for a simple transaction.

And finally, it is noted that design is perceived to be more effective when interaction is intermittent rather than constant, and when messages are varied so that the interaction does not seem repetitious.
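The closing point about varied, non-repetitious messaging can be sketched with a simple rule: never show the same message twice in a row. The class and the rule are illustrative assumptions, not from the source.

```python
import random

# Hypothetical sketch of varied feedback messaging: pick at random,
# but exclude whichever message was shown last time.

class MessageRotator:
    def __init__(self, messages, rng=None):
        self.messages = list(messages)
        self.rng = rng or random.Random()
        self.last = None

    def next(self):
        """Pick a message at random, excluding the one shown last time
        (unless there is only one message to choose from)."""
        pool = [m for m in self.messages if m != self.last] or self.messages
        self.last = self.rng.choice(pool)
        return self.last

rotator = MessageRotator(["Saved.", "All set.", "Done."])
```

Combined with an intermittency rule (speaking only occasionally), this keeps the social element from wearing out its welcome.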