Transparency requirements are often put in place to ensure that decision makers are following the rules. After all, sunlight is the best disinfectant, right? That’s one reason many corporate boards keep minutes of their meetings, managers exhaustively document hiring processes, and traders are required to communicate over monitored platforms.
But a new study shows that these kinds of measures may not always be very effective. Nemanja Antic, an assistant professor of managerial economics and decision sciences at Kellogg, and his colleagues find that it is often possible for people to share enough information during their discussions to reach the outcome they want, while still maintaining plausible deniability should their communications ever be made public.
In other words, transparency can often be gamed.
In many cases, “you can get the same effect you would have without transparency, and that was very surprising to us,” says Antic.
Depending on your perspective, this ability to circumvent transparency is either very good news or very bad news.
Take people under surveillance, who of course may not be doing anything unethical or illegal. For them, the study offers a kind of road map for how to gather information and use it to make the best possible decision, while maintaining enough uncertainty about what they knew, and when, to keep themselves safe from potential punishment. This could be useful for activists who want to share information under the watch of an authoritarian government, or for business leaders who want to discuss a potential takeover without their words later being taken out of context by antitrust regulators (or indeed for anyone transmitting sensitive information over an online platform that could one day be hacked).
But the finding also has implications for governments, regulators, and others who design and rely on transparency measures. To ensure that their interests are protected, they may need to turn to other strategies, such as limiting how and when information can be shared.
Covert Communication
For insight into how monitoring can affect decision making, Antic and his colleagues, Archishman Chakraborty of Yeshiva University and Rick Harbaugh of Indiana University, turned to game theory. They constructed a mathematical model in which two parties with a common interest exchange information back and forth before making a final decision—all under the gaze of a watchful observer with slightly different interests.
To understand how their model works, and thus how transparency can be gamed, consider a scenario in which two managers work together to evaluate potential sites for a new mining venture. One manager has information about the economic benefit of each site—small, medium, or large—while the other is armed with information about its environmental cost, also small, medium, or large. To know whether to proceed with a given site, both must contribute their information to the decision.
Meanwhile, the general public also cares about whether a particular location is chosen, and it weighs the environmental costs more heavily than the company does.
Sometimes, the company and the public are on the same page. For “good” sites, where the economic benefit outweighs the environmental cost, both the company and the public agree that the project should go ahead. And for “bad” sites, where the environmental cost exceeds the economic benefit, everyone agrees that the project should not. But when the economic benefit and environmental cost are roughly comparable—for example, a project with a medium economic benefit and a medium environmental cost, or a large economic benefit and a large environmental cost—the parties are at odds. The company wants to go ahead with these “mediocre” sites, while the public would not approve.
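To make this setup concrete, here is a minimal sketch in Python. It is not from the paper: the numeric ranking of the levels and the rule that equal benefit and cost count as “mediocre” are illustrative assumptions.

```python
# Illustrative sketch of the site-evaluation scenario described above.
# The numeric ranking and the tie rule are assumptions, not the
# paper's exact specification.

LEVELS = {"small": 1, "medium": 2, "large": 3}

def site_type(benefit: str, cost: str) -> str:
    """Classify a site by comparing economic benefit to environmental cost."""
    b, c = LEVELS[benefit], LEVELS[cost]
    if b > c:
        return "good"      # benefit outweighs cost: everyone approves
    if b < c:
        return "bad"       # cost exceeds benefit: everyone rejects
    return "mediocre"      # roughly comparable: parties are at odds

def company_approves(benefit: str, cost: str) -> bool:
    # The company wants to proceed with good AND mediocre sites.
    return site_type(benefit, cost) in ("good", "mediocre")

def public_approves(benefit: str, cost: str) -> bool:
    # The public only approves of good sites.
    return site_type(benefit, cost) == "good"

if __name__ == "__main__":
    for b in LEVELS:
        for c in LEVELS:
            print(f"benefit={b:6} cost={c:6} -> {site_type(b, c):8} "
                  f"company={company_approves(b, c)} public={public_approves(b, c)}")
```

The conflict of interest lives entirely in the “mediocre” cells, where `company_approves` and `public_approves` disagree.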
Essentially, transparency means the public can see what information the managers share with each other in making their decision. This means the managers cannot knowingly act on that information in a way that is contrary to the public interest without facing punishment. As a manager, Antic says, you have to show that “given all the information available to you at the time, you made a decision that was palatable to the public.”
If the public cannot be sure whether the managers actually knew they were acting against the public’s wishes, however, the managers get the benefit of the doubt. This means that managers who push forward a mediocre site can avoid punishment if, given the information discussed, the site could also have been good.
The researchers find that, in a scenario such as this one, it is always possible for managers to make the same decision they would have wanted to make anyway, while maintaining plausible deniability. By carefully planning the order in which information is shared, and by stopping before all the cards are on the table, managers can work together to decide whether to move forward without ever publicly distinguishing between good sites, which the public would approve of, and mediocre ones, which it would not.
For example, a manager might ask, “What do you think?” as a way to signal that they have information about the site that they can share only after the other manager provides additional context. Or a manager might say that the cost is not “large,” leaving it unknown whether the cost is small or medium.
“These discussions are about providing information, but mostly about providing context for how to interpret any future comments,” Antic says. “You want to have enough information to make a decision, but not too much information.”
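Under the same illustrative setup as before, a short sketch shows how such coarse statements do the work. The transcript below rules out every “bad” site, so the managers can safely proceed, while remaining consistent with a “good” one, so proceeding is publicly defensible. The specific messages are assumptions for illustration, not the paper’s protocol.

```python
# Sketch of the pooling logic: coarse public statements that rule out
# "bad" sites without separating "good" from "mediocre" ones.
# The message contents are illustrative assumptions.

from itertools import product

LEVELS = ("small", "medium", "large")
RANK = {lvl: i for i, lvl in enumerate(LEVELS)}

def site_type(benefit: str, cost: str) -> str:
    if RANK[benefit] > RANK[cost]:
        return "good"
    if RANK[benefit] < RANK[cost]:
        return "bad"
    return "mediocre"

def consistent_sites(transcript):
    """All (benefit, cost) pairs consistent with every public statement."""
    return [(b, c) for b, c in product(LEVELS, LEVELS)
            if all(stmt(b, c) for stmt in transcript)]

# A public transcript: one manager reveals the benefit exactly,
# the other says only that the cost is not "large".
transcript = [
    lambda b, c: b == "medium",   # "The benefit is medium."
    lambda b, c: c != "large",    # "The cost is not large."
]

sites = consistent_sites(transcript)
types = {site_type(b, c) for b, c in sites}

print("consistent with transcript:", sites)  # (medium, small), (medium, medium)
print("possible types:", types)              # {'good', 'mediocre'}
# No "bad" site is consistent, so the managers can safely proceed;
# a "good" site is consistent, so proceeding is publicly defensible.
print("safe to proceed:", "bad" not in types)
print("plausibly good:", "good" in types)
```

Even if the true site is the mediocre (medium, medium), nothing in the public record proves the managers knowingly acted against the public’s wishes.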
He points out that there are even cases in which the parties may welcome public scrutiny. “Let’s say there’s a lot of mistrust” between a company and the general public, Antic says. In cases where the public is likely to object to a company’s decision, the company may actually demand transparency, because by revealing exactly what it knew when the decision was made, the company can show that the conflict of interest is not as great as the public imagines.
Preparation for Audit
The utility of plausible deniability is nothing new. Organizations that know they will be audited may deliberately provide their leaders with as little information as possible. Recall how, regarding the CIA’s “enhanced interrogation techniques,” the White House counsel told President George W. Bush, “Mr. President, I think for your own protection you don’t need to know the details of what’s going on here.” Similarly, a leader may deliberately avoid seeking information from a subordinate so as not to be held accountable for what they learn. These strategies, of course, can lead to terrible decisions.
“But the surprising lesson from this paper is that it’s often possible” to get the effect you want while maintaining this plausible deniability, Antic says.
For people sharing sensitive information under the threat of surveillance, the study suggests that there is little to be lost—and much to be gained—by communicating in a way that an outside observer could not object to, even if it makes those communications somewhat more roundabout. In the researchers’ model, this meant exchanging bits of information back and forth with the understanding that your interlocutor’s future statements would be interpreted in the context provided by your own. Lawyers representing companies in antitrust cases, for example, often caution against using shorthand like “Get rid of them” that could later be misinterpreted. They suggest pairing any sensitive details discussed with the precise context in which they should be understood. In other words, it’s generally wise to stick to language that would be acceptable if released, even if you think it won’t be released.
For those responsible for designing and implementing transparency requirements, the lessons are just as stark. Other measures, such as limiting how the parties can communicate with each other, may be necessary to keep transparency from being gamed.
In fact, Antic suggests, this may be one reason that sunshine laws and other transparency requirements don’t always have much of an impact. He points to policies intended to eliminate bias from hiring processes, such as blind hiring or monitoring a hiring committee’s communications. If a committee is able to design the hiring process itself, and in particular to control how information is shared, it may be able to maintain plausible deniability without changing its final hiring decisions, and so the policies may not be particularly effective.
“Maybe it shows why some of these policies don’t lead to real change,” Antic says.
About the Author
Jessica Love is editor in chief of Kellogg Insight.
About the Research
Antic, Nemanja, Archishman Chakraborty, and Rick Harbaugh. “Subversive Conversations.” Working paper.