
Economics Letters

Volume 124, Issue 2, August 2014, Pages 195-198

Persuasion, binary choice, and the costs of dishonesty

https://doi.org/10.1016/j.econlet.2014.05.013

Highlights

  • This paper studies persuasion when lies are costly and decisions are binary.

  • It shows that the equilibrium probability that a biased sender gets his way is a non-monotone function of these costs.

  • The results suggest that if the sender can determine these costs ex ante, he will choose intermediate costs.

Abstract

We study the strategic interaction between a decision maker who needs to take a binary decision but is uncertain about relevant facts and an informed expert who can send a message to the decision maker but has a preference over the decision. We show that the probability that the expert can persuade the decision maker to take the expert’s preferred decision is a hump-shaped function of his costs of sending dishonest messages.

Introduction

Many decision problems are binary in nature and characterized by the uncertainty that the decision maker faces about crucial decision-relevant facts. Examples include a policy maker’s decision whether or not to realize a given infrastructure project, a Board’s decision whether or not to replace a company’s CEO, the voters’ decision whether or not to re-elect an incumbent government, or a judge’s decision whether or not to convict a defendant. To reduce uncertainty, decision makers often consult experts who are better informed about the underlying facts. However, experts may themselves have a preference over the decision, and this preference may not be well-aligned with the decision maker’s preference. Examples include industry experts who are interested in benefiting from public investment, CEOs and incumbent governments who want to remain in power and who know more about their performance and competence than Boards and voters, respectively, and (expert) witnesses in trials who have private information about the level of fault of the defendant, but may be biased towards a specific outcome of the trial.

The question of how likely an expert is to persuade the decision maker is of obvious interest and relevance for public policy and the economics of organization. In this letter, we focus on one key aspect that may affect persuasion: the expert’s costs of dishonesty, which can represent mental or moral costs of lying, reputational concerns, or expected punishment when misreporting is unlawful.

Our main result is that experts will not be able to persuade a critical decision maker if costs of dishonesty are very low, in which case the decision maker rarely follows the expert’s advice, or if costs of dishonesty are very high, in which case the expert rarely deviates from telling the truth. However, the expert frequently succeeds in persuading the decision maker to take the expert’s preferred binary decision when the costs of dishonesty are intermediate.

Persuasion started to become an influential concept in economics with McCloskey and Klamer (1995). Recent theoretical contributions include Mullainathan et al. (2008), who focus on how a sender can persuade receivers who are coarse thinkers (rather than Bayesian), and Kamenica and Gentzkow (2011) and Kolotilin (2013), who study Bayesian persuasion in a setting in which the sender can choose the signal, but cannot misreport its realization. In contrast, we focus on Bayesian persuasion through misreporting the truth.

The canonical model to study strategic interactions between a sender (or expert) and a receiver (or decision maker) is Crawford and Sobel (1982). We depart from their framework in two important ways. First, we assume that it is costly for the sender to misreport the state of the world. Second, the choice variable of the receiver is binary rather than continuous. This second difference implies that there is a conflict of interest between the sender and the receiver for some, but not all states. While in Crawford and Sobel’s model the sender and the receiver perpetually disagree about the optimal policy (even under complete information), their disagreement is only partial in our setup.

Banks (1990) introduced lying costs into the literature on strategic information transmission. Subsequent contributions include Callander and Wilkie (2007), Kartik et al. (2007) and Kartik (2009). Kartik et al. (2007) and Kartik (2009) add lying costs to the framework of Crawford and Sobel, but maintain the assumptions that the receiver’s choice variable is continuous and that the sender’s preferred action increases in the state of the world. Our analysis thus complements theirs with the key modification that the receiver’s choice is binary, which makes our model suitable to study the real-world problems discussed above.

Our framework is relevant for various applications. First, it complements contributions showing how CEOs influence continuous outcome variables, such as the market price of the firm (Fischer and Verrecchia, 2000), their compensation (Goldman and Slezak, 2006), or the range of possible projects (Adams and Ferreira, 2007). Second, when applied to incumbent government behavior, our model is related to the aforementioned works by Banks (1990) and Callander and Wilkie (2007). While they analyze how two symmetric candidates tell costly lies about future policies, our model applies to asymmetric elections in which an incumbent with an informational advantage about the state of the world runs for re-election. Hence, our model is also related to Rogoff and Sibert (1988) and Hodler et al. (2010), where an incumbent with private information about his competence or the state of the world may choose socially inefficient policies to improve his re-election prospects, and to Edmond (2013), where a dictator manipulates information to reduce the risk of an uprising.

This letter is organized as follows: Section 2 describes the setup, Section 3 provides the results, and Section 4 concludes.

Section snippets

The setup

There are two strategic players: sender (or expert) S and receiver (or decision maker) R. The state of the world σ is a random draw from the distribution F(σ) with density f(σ) > 0 and support [0,1], which is common knowledge. Timing and actions are as follows: first, S observes σ and sends message μ ∈ [0,1]. Second, R
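The snippet above cuts off before the receiver’s move and the cost specification. Purely as an illustration of the timing, and not of the authors’ specification, the following Python sketch simulates one round of the game under assumed ingredients: a uniform prior, a quadratic cost of dishonesty k(μ − σ)², a sender who prefers acceptance, and a naive (non-equilibrium) receiver who accepts whenever the message exceeds a threshold σ̂. All functional forms and the acceptance rule are assumptions made only for this sketch.

import numpy as np

# Illustrative sketch only: the uniform prior, the quadratic lying cost, and the
# naive threshold rule for R are assumptions, not the paper's specification.
rng = np.random.default_rng(0)

K = 0.5          # assumed cost-of-dishonesty parameter: cost = K * (mu - sigma)**2
SIGMA_HAT = 0.6  # assumed acceptance threshold used by the (naive) receiver

def sender_message(sigma, k=K):
    """S observes sigma and picks mu in [0,1] maximising an assumed payoff:
    1 if R accepts, minus the cost of misreporting."""
    grid = np.linspace(0.0, 1.0, 501)
    accepted = grid >= SIGMA_HAT
    payoff = accepted.astype(float) - k * (grid - sigma) ** 2
    return grid[np.argmax(payoff)]

def receiver_decision(mu):
    """Naive acceptance rule (not the equilibrium, posterior-based rule)."""
    return mu >= SIGMA_HAT

# Timing: nature draws sigma, S observes it and sends mu, R takes the binary decision.
sigma = rng.uniform(0.0, 1.0)
mu = sender_message(sigma)
print(f"state = {sigma:.2f}, message = {mu:.2f}, accepted = {receiver_decision(mu)}")

Even in this stripped-down sketch the trade-off behind the letter’s main result is visible: for very small K the sender inflates μ up to σ̂ in (almost) every state, whereas for very large K he reports truthfully whenever σ < σ̂.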

Equilibrium

Let σ̲ be the unique number such that ∫_{σ̲}^{1} u_R(σ) f(σ) dσ = 0. That is, if R’s posterior satisfies π(σ|μ) ∝ f(σ) for all σ ∈ [σ̲, 1] and π(σ|μ) = 0 otherwise, she is indifferent between accepting and rejecting the project. This is the case if some message μ is sent with equal probability in all states σ ∈ [σ̲, 1] and with zero probability otherwise. Notice that σ̲ < σ̂.
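As a quick check of this threshold condition, consider assumed functional forms that are not taken from the snippet: a uniform prior on [0,1] and u_R(σ) = σ − σ̂ with σ̂ > 1/2, where σ̂ is read, for this example only, as the state at which R’s payoff from accepting changes sign. The defining equation then pins down σ̲ in closed form:

% Worked example under assumed forms: F uniform on [0,1], u_R(\sigma)=\sigma-\hat{\sigma}, \hat{\sigma}>1/2.
\int_{\underline{\sigma}}^{1} (\sigma - \hat{\sigma})\, d\sigma
  = (1 - \underline{\sigma})\left(\frac{1 + \underline{\sigma}}{2} - \hat{\sigma}\right) = 0
  \quad\Longrightarrow\quad
  \underline{\sigma} = 2\hat{\sigma} - 1 < \hat{\sigma},

which is consistent with the observation that σ̲ < σ̂.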

The equilibrium behavior depends on whether S would be willing to play μ(σ)=1 if v(1)=1 but v(μ)=0 for any μ<1, i.e., on whether S would

Conclusions

We have shown that an expert is most persuasive if misreporting the truth is neither too cheap, nor too costly. Hence, one should expect experts to be most influential in circumstances in which their costs of dishonesty are intermediate.



The authors would like to thank an anonymous referee whose comments have helped us improve the paper. Financial support by the Faculty of Business and Economics at the University of Melbourne via a Visiting Scholar Grant is also gratefully acknowledged. This paper supersedes an earlier version titled “Biased Experts, Costly Lies, and Binary Decisions”.
