3: Commitment and Consistency
Another common weapon of social influence is consistency: we have a "nearly obsessive desire" to be consistent with our past behavior. Once we have made a decision or taken a stand, we will ignore evidence to the contrary and adjust our behavior as necessary in the expectation that the outcome we achieve will be consistent with our expectations.
Cialdini mentions studies of gamblers at a racetrack. Thirty seconds before placing a bet, they are tentative and uncertain about which horse will win, but thirty seconds after making a bet, they are confident and assured. This is not because anything factual has changed, but the mere act of placing a bet causes gamblers to become committed to and confident in their choice. In effect, they have convinced themselves that the choice to which they have committed is correct.
It is not merely gamblers who suffer from this tendency. The author mentions the psychology of women with abusive husbands. While particularly violent incidents might drive them away for a short period of time, they almost always go back when their spouses show contrition and promise to stop their abusive behavior, even though this is a cycle that perpetuates itself. Such women are motivated by commitment to a partner as they wish him to be, not as he is, and will give him a virtually unlimited supply of second chances because they will not accept that their decision to return was wrong.
An experiment is mentioned (Moriarty) in which a pair of actors would stage a theft at a beach: one would rise from his blanket and go for a swim while the other would then appear and steal some of his possessions. About 25% of the time, no one would challenge the thief at all. However, if the first actor asked someone nearby to "please watch my things" before leaving, then 95% of the subjects would take action: chase the thief, demand an explanation, call for help, or even physically restrain the thief.
The Mechanics of Consistency
Consistency is a powerful motivator because it is culturally valued. Consistency is associated with strength of character: a person who is consistent is intelligent (they make good choices), honest (their words express their true thoughts), and reliable (their deeds match their words).
Its opposite, inconsistency, is an undesirable trait: a person whose words do not match their beliefs or whose deeds don't match their words is at best a cowardly or indecisive person who is unreliable. Worse, they may be a two-faced scoundrel whose inconsistency is a method of deception. Or even worse, they may be mentally impaired. In any case their words mean nothing and they cannot be counted upon to keep their promises, both of which make them socially undesirable.
Aside from societal perceptions, consistency provides a person with a sense of security in an uncertain world. We expect the same actions to achieve the same results - our lives would be in constant chaos if they did not. We want this consistency so much that we ignore conditions in the environment that would undermine our plans until a failure has occurred. And when a failure occurs, we often seek a way to identify it as an exception so that we can remain consistent in our rules.
Consistency also serves as a mental shortcut: because an action has achieved an outcome in the past, we expect it will achieve the same outcome in future and do not have to expend mental energy thinking about how to proceed. Particularly in a complex environment, we seek to categorize and generalize things, and expecting consistency saves a great deal of mental effort.
Consistency is also a mental defense that provides a "safe hiding place" from troubling realities. If we expect things to be consistent, or assure ourselves that something has been consistent, it bolsters a sense of security: we avoid confronting the unknown by denying that it is unknown.
Belief Overpowers Reason
Cialdini gives an extended account of attending a recruiting session for a transcendental meditation program. Transcendental meditation is a mystical practice that promises a wide range of completely implausible benefits: creating inner peace, curing diseases, and psychic abilities such as clairvoyance and telepathy. It is, in a word, hokum.
He had brought with him a friend who was a skilled logician and who, after hearing the outlandish claims of the presenters, took two minutes to completely demolish their arguments, clearly explaining the ways in which their claims were contradictory, illogical and unsupportable. The presenters weakly replied that his arguments were "common criticisms" made by people with no experience in their mystic discipline.
The rest of the audience was also unmoved by the logician's case, and many rushed to pay the $75 down payment for admission to the training program.
After the incident, he spoke with some of the people who had signed up for the class - and each of them had come prepared with hopes that the program would help them with their problems. One was an actor who wanted to be better at his art, another was an insomniac who hoped to find a way to sleep better at night, and another was a struggling student hoping that meditation would help focus his mind so he could study more effectively.
He questioned them specifically about the logician's comments, and to his surprise he found that they entirely understood the argument he had made. One of their remarks was quite telling: the man said that he wasn't convinced by the presentation and found it all a bit specious - but once he heard the argument, he realized that he should sign up right away before he thought about it too much and decided not to sign up.
He then realized what was going on: these people had real problems and were desperate for a solution - and had convinced themselves that transcendental meditation was the answer. They heard and understood the logician's criticism - and panic set in. They wanted the program to help them solve their problems, and didn't want to think about the possibility that it would not.
This demonstrates how people want things to be as they expect them to be, and how readily they will ignore or dismiss any evidence to the contrary, allowing themselves to simply go with their emotions and first impressions in defiance of all logic.
(EN: This calls to mind a number of encounters I've had with people who were into mysticism, whether in eastern or western religions, who claim to have had supernatural experiences. My sense is that these are like so many other BS stories, meant to aggrandize themselves - but Cialdini's remarks, as well as those in a book about the psychology of self-deception, lead me to wonder if people have not deluded themselves, or are attempting to convince themselves that what they believe is actually true.)
The author then tells an anecdote that seems like a tall tale, and yet at the same time seems like it might be plausible:
The claim is that toy companies, which normally experience a sharp drop in demand in January and February because parents have spent a great deal on toys during the Christmas season, create artificial shortages of popular toys.
That is, they run ads for highly desirable items during the Christmas season without having the inventory to support sales - knowing that parents who promised their child a specific gift and are unable to supply it because it is sold-out during the Christmas rush will purchase the item when it becomes available a few months later - simply to keep their promises to their children.
In psychological terms, commitment is the dedication of a person to a statement that they have made in the past in spite of conditions in the present. A person who makes a promise feels obligated to keep his promise in spite of a change in conditions, and a person who has emphatically stated an idea will stubbornly defend that idea in spite of evidence to the contrary.
In casual conversation, many people will say things that they do not intend simply because they believe it is the right thing to say. For example, one charity asked people if they "would be willing to volunteer" to participate in a fundraising campaign. People indicated they would, because they had the perception that they would seem indifferent to the needy if they said they wouldn't be willing to volunteer. Then, they were immediately asked to sign up to work as a volunteer during a specific time. Asking "would you be willing" before asking for a commitment resulted in a 700% increase in volunteers.
A similar ploy is the seemingly innocuous question of "how are you feeling?" prior to making a solicitation for a donation for victims of disease or hunger. The trick to this is that if a person mentions they are doing well, which is the customary response, it is awkward for them to deny assistance to a person who is less fortunate than themselves. An experiment with this approach found that asking the question doubled the success rate of a telephone solicitation campaign, and those on whom it was used had an 89% compliance rate when later mailing in the donation. (EN: An interesting observation, but I do not see what this has to do with commitment - it's more of a guilt play.)
He then mentions the methods used by the Chinese during the Korean war. Whereas the North Koreans treated prisoners harshly and subjected them to beatings and torture to extract information, the Chinese used a different technique. Their technique was to start small and build. Over a period of interviews with prisoners, they would first ask them to agree with statements that were only mildly critical, such as "The United States is not perfect." Later, they would ask them to make a list of problems with the American way of life. Later, they would ask them to write an essay on some of the social problems and discuss them in greater detail. Then, the essay would be published in a pamphlet that was distributed in that camp and others.
Through this process, the prisoner would find that he had become a collaborator and the author of anti-American propaganda. And once he had accepted that about himself, the prisoner became more pliable, and the Chinese could elicit stronger propaganda, confessions, self-criticism, and even a genuine disdain for his own country.
The same tactic is often used by sales organizations in the United States. Knowing that they cannot approach a new customer with a substantial demand, they make a small sale - and may even take a loss on that transaction - but recognize that the customer has committed to the brand and will make additional purchases afterward. This is the "foot-in-the-door" technique.
The same is true of charitable causes: a few researchers (Freedman and Fraser) demonstrated this by asking people to erect a large and poorly-lettered sign that read "drive carefully" on their lawns. 83% flatly refused. They tried another group, who at first were asked to erect a much smaller sign, then after a few weeks were asked to erect the larger one. 76% complied. They even found a high rate of compliance when the first sign was to "keep California beautiful" - which was not clearly connected to the later request.
A similar tactic often used in politics is petitioning: a person can rather easily be convinced to sign his name to a petition in favor of a cause - but the point of the petition is not to present a list of signatures to an elected official (which is not effective as a tactic), but to create a sense of commitment that can influence the way that people later vote on a referendum.
Each of these tactics works the same way: getting a person to make a small commitment causes them to change what they believe in a subtle way, and the change is then supported by future actions that are consistent with the past.
The mark's commitment may pertain to his perception of himself, or of a brand, or of a topic, or of a political or religious belief. Once the perception is set, future behavior follows in the same pattern.
Deeds not Words
The best evidence of what people truly are and what they truly believe comes from their actions rather than their words. What they say can be a pose, front, or deception - but what they do reflects their true character. And for this reason, people can be careless with their words but attempt to be consistent in their actions.
For this reason, manipulators will attempt to get a person to take an action rather than merely make a statement. It was important to the Chinese interrogators to get American prisoners to write a list of grievances about their country rather than just speak them. It is likewise important for a political activist to get people to sign a petition rather than merely agreeing with a statement.
A second advantage of action is that it is something that others can witness. Because commitment plays on the fear that other people will see us as unreliable, it is of high value to the manipulator to get his victim to take an action that others are capable of seeing. Aside from having a physical artifact to use as a reminder of their previous decision, there is the sense that others will be aware of their action and hold them accountable if their future actions are inconsistent.
The artifact, evidence of a statement or action, also triggers others to be compliant. Back to the Chinese interrogators: publishing a list of grievances written by one prisoner not only made that person feel committed to taking other actions against his country, but indicated to other prisoners that it was acceptable to take the first step themselves.
Another tactic used by the Chinese was to get the prisoners to believe that their letters home to loved ones would be delivered if they contained pro-communist statements. The men, desperate to let loved ones know they were still alive, would plant these statements in their letters, and the Chinese would later excerpt them and use them as proof or suggestion that they truly believed in those statements.
For men who were reluctant to write essays denigrating America or supporting communism, the interrogators held contests that would offer the prisoners some convenience - a blanket, a piece of fruit, a pack of cigarettes, or whatnot - as prizes.
An experiment is mentioned in which Americans were asked to write a brief statement that said something positive about Fidel Castro - even if they disapproved of the man, there must be something he had done right. When these statements were shown to other people, they assumed that they came from Castro supporters - they did not know the conditions under which the original authors wrote their statements and assumed them to actually believe the things they had written.
This same tactic is used by door-to-door salesmen who often suggest to people that their neighbors have purchased a product - though this blends a couple of different techniques: evidence that others have taken an action, along with a need for people to feel consistent with peers in their social group. To refuse to purchase what their neighbors had purchased gives a person the sense that they don't "fit in" to a group they wish to be part of.
And of course, many people are goaded into doing things simply to demonstrate that they have positive qualities of character. Any child goaded into doing something for which they might be punished, or which might cause them harm, by being called a coward has been subject to this manipulation tactic. But likewise, many people can be goaded into donating to charity because they fear that if they refuse the request they will be perceived as being callous or unkind.
An arguably positive use of this manipulation tactic is in getting people to write down goals that they wish to achieve in order to get them to feel more committed to them. Amway corporation, for example, asked salesmen to set their own sales goals and to write them down on paper - and then used this act to pressure them to meet the goals that they had set for themselves. In this way, the victims are made to feel that they are working to keep their own word, rather than having an obligation placed upon them by someone else.
Cialdini mentions children's essay contests, often sponsored by commercial companies or political groups. He often wondered what the groups that sponsored such contests got out of the deal - but considering the tactics of the Chinese interrogators, it is fairly obvious: a student who writes an essay favoring a brand or a political cause will later become more likely to purchase the brand or support the cause. The cost of the contests is often quite low compared to the potential value of tens of thousands of consumers or voters who will be led by their initial action to support the brand.
Just like the prisoners who participated in similar contests to win a small favor, the writers are hooked into taking actions to be consistent with their beliefs, and the essays can be used as bait to get others to do the same.
Surprisingly, many people are more willing to do something they know to be unethical or illegal if they think that no one else is watching - which conversely means that people will be more willing to do something that reflects positively on their character if they have an audience.
Going back to the Chinese interrogators, getting prisoners to write down their grievances about American culture was a method of turning words into actions. Publishing them in a camp newsletter, or getting them to read their list to a group, was a method of creating witnesses - such that others would reinforce the notion that the person was pro-communist.
Further, it didn't really matter how the witnesses reacted. In fact, when other people disparage us for our beliefs, we are just as likely to cling to those beliefs as to change them. Particularly in American culture, where there is an emphasis on individualism and people defend their right to hold opinions with which others disagree, disparaging a person for believing something causes their beliefs to grow stronger. That is, it is more important for people to feel that they have integrity than to feel that other people approve of their beliefs.
Alternately, the desire to be consistent with others can be leveraged before a person makes a statement by showing them statements made by others. Consider the psychological experiment in which a subject is placed in a group of actors who have been instructed to choose an obviously wrong option - very often, the subject will go along with the answer that others chose even though he knows it to be wrong.
In another experiment, subjects were asked to make an assessment and then asked if they were sure of their assessment. Subjects who did this in isolation were far more likely to reconsider their choice than those who were placed in a room with other people - in this instance, they were defending the integrity of their original answer because others had witnessed it, and changing their answer would be a sign of indecisiveness.
A similar experiment arranged people into juries of six to twelve people and asked them to resolve a sample case. In situations where voting was done by a "show of hands" people were more likely to remain devoted to their initial assessment than if the voting were done by secret ballot - again, because others had witnessed their choice, they were less likely to change for a desire to appear consistent.
Consider the effectiveness of group therapy: whether a person is attempting to break a drug addiction or merely to lose weight, giving them a sense of belonging with a group of people in a similar situation, and compelling them to make commitments before this group, makes them more likely to comply with their goals - because to do otherwise would be breaking a promise to the group and being publicly known to be weak and unreliable, especially when the group includes members who have been successful.
The same tactic is used by people who tell others what they hope to accomplish: they are well aware that the people whom they tell will hold them to their commitment, and the pressure from a social group is in some instances more influential than the pressure they place upon themselves.
(EN: The psychology of group selling is similar. When a salesman deals with a pack of people in public, he can play on peer pressure to get someone to make a purchase - and once one person has purchased, others fall in line, even if the first buyer is a stooge.)
Commitment as Sunk Cost
Another psychological tendency that makes people vulnerable to taking an action is giving them the perception that they have already invested time and effort into achieving the outcome. Not only do they have the sense that their future actions must be consistent with their past ones, but they also have a fear of "wasting" the time and effort they have put into the pursuit of a goal.
Cialdini considers the "initiation rituals" of savage tribes - in particular, a three-month ordeal endured by youth in the Thonga tribe. He describes in detail an escalating pattern of abuse: a boy is first verbally hazed, then he is stripped and his hair is cut, then he endures deprivation and exposure to the elements, then ritual beatings, and finally circumcision. After undergoing this ordeal, he is made a member of the tribe. This is seen to create fierce loyalty afterward.
As bizarre as such rituals seem, they are carried forward to the present day: new members of a club or sports team are often similarly initiated, young men joining fraternities are hazed, gang members are "jumped in" by being beaten, and basic training for the military focuses on degrading individuals. All of this makes them feel stronger loyalty to the group for having undergone an ordeal to become a member of it.
Should the reader think that the modern initiation rituals are less severe, Cialdini lists a number of actual hazing incidents that are entirely similar: verbal abuse of pledges, making them do menial and degrading chores, subjecting them to exposure to cold and heat, imposing fasts and lock-ins, physical beatings, threats and punishments, tattoos or brands, and even threats of death are quite common elements in hazing ordeals.
He also spends some time examining the reason that ordeals like this have not gone away, in spite of many attempts to eliminate them. Most colleges impose severe penalties on fraternities for hazing, and it is also common for colonialists to forbid tribesmen from holding initiation rituals - but the groups inevitably ignore or find a way around the bans. The existing members of the group consider their own ordeal to have been an important part of gaining membership, and even new initiates want to undergo the ritual to win the respect of their peers.
A common principle of psychology, demonstrated in various experiments, is that a person tends to value something more highly if they put effort into attaining it, particularly when compared to those who were granted it without effort.
An ethnographic study in college fraternities found that the social hierarchy was sorted according to participation in ordeals. Those who had voluntarily suffered hazing had a higher status in the group, and looked down upon members who had not been through the same ordeal. And more interestingly, those who had not been through the ordeal felt less connection to the group than those who had, and were more likely to quit or defect.
(EN: This is not mentioned, but it seems obvious that shopping clubs and privileged patron rewards are likely to have the same effect. The "ordeal" may be frequency of purchases, and the "esteem" is given to the customer in special treatment from members of the staff - ultimately, it encourages them to have stronger loyalty to the brand.)
Cialdini considers an interesting aspect of the previous examples. The Chinese interrogators offered small prizes in their essay contests - a blanket or a bit of fruit. Why didn't they offer more significant ones? Fraternities often refuse to include civic service activities, however distasteful and onerous, in their hazing activities, even though this would help improve their public image. Why don't they take advantage of this?
The answer Cialdini proposes is that it would undermine the commitment value of the activities. That is, significant or ulterior rewards to the activities provide an escape from the commitment trap: a prisoner who wrote an essay for a significant reward could let himself believe the reward was his primary motivation, and a pledge who performed community service as part of a hazing ritual can find pride in having performed civic service.
These things distract from or overshadow the primary purpose of commitment activities, such that a person is less prone to take responsibility for their actions as a reflection of their character. That is, the person who undertakes the first action must not have the opportunity to think "this is unusual for me" or "I'm only doing this because [reason]" because that would undermine commitment.
As such, any major incentive of a commitment activity becomes a dodge for the mark - he must later regard his initial action as having been done for its own sake and of his own free will in order to feel committed to it, so that he may be persuaded to take additional actions that are consistent with his choice. The smaller the incentive, the stronger his sense that his integrity depends on defending and repeating his original choice.
There is a common principle that parents should never heavily bribe or threaten children into doing the things we want them to adopt as habits, for the very same reason. Pressure, positive or negative, will produce only temporary compliance with our wishes and possibly resentment: the child is doing something to please the parent, not because he thinks it is a good idea. To get him to form a habit, he must believe in the correctness of his own actions and have no other excuse for having taken an action.
He mentions a series of experiments (Freedman) performed on boys, ages six to nine, in which an experimenter tried various methods to discourage them from playing with an attractive toy (a robot).
- Threatening the boys with punishment caused them to avoid the targeted toy only when the person who threatened punishment was present
- Moreover, the threat of punishment seemed to make the targeted toy even more interesting to the boys, even up to six weeks later
- Simply telling them "it is wrong to play with the robot" without a threat was far more effective - few boys approached the targeted toy when the experimenter left the room
- Six weeks after the second experiment, a much lower number of boys (33% as opposed to 80%) showed any interest in the forbidden toy
Cialdini reckons that the reason the second method was more effective is that the boys were not given an external reason (punishment) to avoid playing with a desirable toy, but were merely told it was wrong, and so had to imagine or invent their own reasons to avoid it.
People are more likely to remain consistent to the commitments they make themselves rather than commitments placed upon them by others against their will. In the latter case, people comply only when the person who has bribed/threatened them is present, and seek to defy this authority figure to reinstate their sense of independence after complying with an order.
So contrary to popular belief, threats are not effective in creating long-term changes in behavior. They gain immediate compliance, followed by defiance and resentment. The more effective approach is to discourage something and let the other person invent their own reasons - as their integrity is preserved rather than undermined by cooperating.
Examples of Commitment Schemes
Compliance professionals love commitments that produce inner change, because they have longevity. Any salesman would prefer a customer who will buy again and again out of a feeling of commitment over one who will purchase just once to get a specific reward. Not only will the committed customer repeat a specific action on request, but he is also open to requests that are similar in nature, drawing on the same sense of commitment - and all without having to be provided with additional enticements.
The brand-loyal customer identifies with a brand as being an expression of his identity: he does not buy because of the benefit, but because the act of purchasing supports the image of himself he accepts and wishes to project to others. He will campaign for it and convince others to join him in an action he feels is right.
For example, consider the low-ball tactic - not the one in which a buyer offers a low price as a starting point in negotiation, but the one in which a seller advertises or offers a price that is low in order to get a customer's attention. In truth, the advertiser has no intention of making the deal he has offered, but is just attempting to get the customer to commit to the idea of buying his product. There will then be a lengthy discussion that builds the mark's interest, and eventually the salesman will get around to making a real offer that is higher than the lowball price.
It is important in using this technique to immediately shift the discussion away from price and onto product, so the customer's commitment is not to the price, but the product - otherwise they will be "stuck" on the price and unwilling to pay more. The price is just to get attention and the salesman will give the customer several other reasons to be committed to the purchase, diminishing the importance of the initial price.
Tactics for introducing a higher price include claiming there was an error in the calculations, addition of features that increase the price from that of the base model shown, having the offered item "out of stock" and selling the mark an upgraded version instead, or having a boss or manager refuse to honor the salesman's deal.
For car dealers, the trade-in is another method of lowballing: the salesman will mention an inflated offer for the vehicle that the customer plans to trade in, then will send the car to the service department to be inspected, after which they will come back with a lower offer. This typically happens after the customer has committed to a price on the new vehicle, having expected his trade-in to diminish the amount of money he would have to pay.
And in both cases, the change is positioned in a way that causes the mark to feel responsible or to wish to protect his "friend" the salesman. When the manager refuses to honor the offer or the service department comes back with a lower offer, the customer is given the impression that the salesman could be in trouble for having made the offer, and is inclined to help protect them from harm by accepting the revised offer.
He goes back to the concept of abused women as victims of this tactic: their abusers make promises they don't mean to keep, and then distract them with the benefits of remaining committed to the relationship (in spite of the harm they will suffer). In part, it is the deception of the offer that gets their victims' attention - but the real power of influence is in reminding the victim of their "primary" commitment (the marriage) and treating the undesirable conditions (the abuse) as a burden they must bear to keep their commitment.
Another observation comes from a conservation campaign in Iowa, which was looking for ways to get customers to use less fuel. Previous attempts had involved giving people energy-saving tips and advice to save fuel - and while all agreed to try, there was no significant difference. The campaign then used an incentive - that the top energy-savers would have their names published in local newspaper articles. The effect was immediate, in that families who had agreed to the terms with the promise of public praise reduced their consumption significantly.
Then the rug was pulled out: after a few months the organization sent a letter to the families it had contacted, stating in apologetic terms that it would not be possible to publicize their names after all. But rather than return to their old habits, the participants continued and even increased their conservation efforts through the rest of the winter.
On one hand, this demonstrates habituation: once the energy-saving habits were formed, people did not revert to their old behaviors. More importantly, the homeowners who had acted to conserve fuel recognized that the reward of publicity was not really that significant, and felt better and more public-spirited merely for conserving natural resources.
It's noted that the experiment was repeated in the summer, in a campaign to reduce electricity rather than natural gas consumption. A similar promise of publicity was offered, and it was also revoked. The campaign had similar success.
Defenses to Commitment Schemes
Cialdini mentions that Emerson's famous quote is often truncated: he wrote that "a foolish consistency is the hobgoblin of little minds," but many people drop the first two words, which need to be restored for a proper perspective: consistency is a mark of discipline, but only when there is a reason to be consistent.
This is the basis of the author's "only effective defense" against the commitment trap: to question whether being consistent is actually worthwhile. Sometimes consistency is sensible, providing an efficient way to make routine choices; other times it is needless and even harmful. The point is not to always be consistent, but to be consistent when there is value in consistency.
He suggests that a feeling "in your stomach" tells you the difference: you instinctively sense that you are being played but cannot figure out quite how, and rather than sort it out you take the easy path of consistency.
(EN: This seems like a bit of a cop-out insofar as detection is concerned. An observation from personal experience dealing with slimy people is that a person who is using the consistency trap often does so very overtly: they suggest that something you're saying now is inconsistent with what you said before - and when you question them they are very careful to include or omit certain details from their account. Another detection method is to notice a pattern or repetition - people ask questions that seem obvious or to which they already know the answer, but they are attempting to get you to set up a pattern so they can spring the trap with a later question - very often, the third.)
In particular, be aware of your own motivations: if you are buying something or agreeing to do something in order to be consistent with what you have said about it before, rather than because you really see its value, then you are likely responding to a consistency trap.
Once the consistency trap is detected, the most effective defense is to call it out: tell them exactly what they are doing. When they realize that you're on to their game, they recognize that it won't work, and make a hasty retreat. He testifies that in his experience, this is "the perfect counterattack."
An example he provides shows how he explains the situation. To someone who indicated they were taking a survey about restaurant dining and then attempted to sell him a club membership, he said: "I recognize what's going on here ... your story about doing a survey was just a pretext for getting people to tell you how often they go out ... I'm not interested in your club. ... I refuse to allow myself to be locked into a mechanical sequence of commitment when I know it's wrongheaded. It would be stupid of me to spend money on something I don't want."
(EN: All of this seems very pedantic and clumsy - but it contains the seeds of a less verbose retort: mention that the previous questions were "just a pretext," then state another principle or practice to which you wish to remain consistent, such as refusing to spend money on something you don't want. In essence, you show them [and yourself] that you are a consistent person, but that you are in a position where consistency with one idea would be inconsistency with a more important one.)
He goes back to the topic of abused wives, and suggests that they can combat their consistent choice to remain loyal to an abusive partner by isolating their previous decision from the present one. That is, at the time they got involved, their partner was not yet abusive - if they had known then what they know now, they would not have become involved. It is a matter of recognizing that with experience comes wisdom, and that this experience should not be disregarded when making a choice.
The same mental dialogue is helpful in recognizing that the conditions under which past choices were made are different from the conditions under which a similar choice is made now: they are not the same. People often fall into habits and fail to reconsider their choice. For example, a person continues to purchase gas from a station whose prices are higher than those of another station further down the street. Why? Because the initial purchase was made when he was unaware of the other station, and he fell into the consistency trap. Had he known there was a cheaper station a few blocks away, he would not have made the original choice, and he should not feel bound to keep going to the same place out of habit now that he knows better.
He also mentions that people often find or fabricate additional reasons to back their foolish consistency. They may tell themselves the gas station with the higher price is more convenient, or that they think their car runs better on the particular brand of gas. It's important to recognize that this is simply justifying the past choice: it's not that much more convenient, and they really can't tell whether their car runs better because they haven't compared. Recognize these justifications for what they are.
Ultimately, consistency works when you are consistent to things that are important, valuable, and true - and it works against you when you are consistent to things that are unimportant, costly, and false. Your best defense against yourself is to sort those things out - to know what really matters, and to recognize where being consistent in a way that someone else wants is contrary to being consistent to the things that really matter.