
Suppose a given audience can, in a majority of some kind, be expected to misinterpret a piece of information when presented factually. Is it then justified to present counterfactual information calibrated to be misinterpreted so that a correct meaning will be received?

“Misinterpretation” here includes not only a failure to understand the information as intended, but also a failure to enact (or perform) it. The outcome is the same.

Take, for example, speed limits on public roads. Ideally, a speed limit would indicate the maximum safe speed at which a road can be traveled. Somewhere between 50 and 85 percent of drivers routinely speed—that is, a majority of drivers enact a “misinterpreted limit” 5 to 15 miles per hour higher than the literal limit. Politicians find easy favor with voters by raising limits year after year, despite the safety risks, but drivers adjust their expectations and continue speeding regardless. The signage is unambiguous; the behavior is a highly predictable misinterpretation.

The nature of the misinterpretation is unclear. Maybe people are accustomed to an information environment that values legal defensibility over accuracy, so they see all rules as motivated by excessive caution and therefore flexible according to their individual level of risk tolerance. Maybe they see speeding tickets as the primary risk of speeding, and these are rare enough to trivialize the risk-benefit assessment. Maybe the data is misleading and there’s some “natural limit,” as conventional engineering wisdom holds, at which most people would drive regardless of signage. But for the sake of this discussion, we’ll assume that everything is as it seems and people will reliably drive slightly above whatever limit is chosen.

Is it the responsibility of government to set speed limits corresponding to the maximum safe speed of a road? Or, knowing that this information will be misinterpreted, is it their responsibility to set speed limits somewhat below the maximum safe speed, thereby influencing a majority of people to drive safely? The latter choice could be called a lie, or it could simply be called a performative act (a term I’ve written about before). This is an extreme case, one where literally thousands of lives hang in the balance, so here more than usual we are obligated to decide: does one have a greater responsibility to truth or effect?

Rarely is the choice between truth and effect so clear-cut. We constantly see information in the shape of fact. It’s not usually clear whether it represents best-effort knowledge and communication, a factless attempt to influence, or something in between. To say something you think will have beneficial effects, with no regard for its truth or falsehood, is no way to make friends, but an argument has frequently been made for its social utility. Governments and researchers have in the past lied to us “for our own good,” and history hasn’t looked kindly on it. Are we incensed because we found out they were lying? Or only because the effects were less positive than advertised?

Most people would agree that state secrets need to be protected. This is a valuation of effect over truth; the lies necessary to keep privileged information under wraps range from denying knowledge of state operations to intentional misdirection and misinformation. But in practice, many of these lies have proved carcinogenic to the careers and legacies of ranking officials. Meanwhile, whistleblowers who refuse to carry the lie face widespread condemnation and severe legal consequences. Should we conceptualize state secrets as a no-win scenario? Or is there some situation where an elected leader could admit dishonesty and we would all nod approvingly?

Let’s lower the stakes a bit.

Boxes of Q-tips are printed with the warning “Do not insert swab into ear canal. Entering the ear canal could cause injury.” This cannot possibly be genuine on the part of Unilever PLC, which owns the trademark. Despite the high risk of ear damage, the main reason people buy Q-tips is to stick them in their ears. Marketing campaigns have tried to convince us that Q-tips are useful for makeup application, first aid, and home crafts, but to limited effect. If everyone heeded the warning on the box, Q-tip sales would evaporate.

As I mentioned earlier, we live in a world where legal defensibility is king; liability is one of only a few truly powerful forces in business. Sometimes it seems companies are so concerned with lawsuits, they forget there are other ways to go bankrupt. Unilever prints a warning so we can’t sue them for perforated eardrums and dislocated ear bones, but they continue selling Q-tips with full awareness that the warning is ignored. It seems their loyalty is not to truth but to plausible deniability, and any loyalty to effect has been bought for the price of 200 million dollars annually. They could redesign the Q-tip with a flared shaft, a 90-degree bend at the tip, or some other feature that makes it hard to insert in your ear—without affecting its other uses—but they have no illusions about where their sales are coming from.

For comparison, consider the humble cigarette. A cigarette is barely longer than a Q-tip and could also be used in arts and crafts or as a makeup applicator (stick to the filtered ones unless you’re desperate for a smoky eye). One gets the impression that if they thought they could get away with it, Marlboro would put a warning on the box to the effect of “Do not light on fire; do not smoke; do not breathe fumes.” But the tobacco industry has been a public enemy for several decades now, and a stunt like that probably wouldn’t win any PR or legal battles.

Most of us would say the tobacco industry is “evil” for making and promoting a product that causes death and suffering (on the effect side of the scale) and lying about it (on the truth side). Must we likewise condemn the Q-tip industry? A perforated eardrum is much less serious than heart disease or cancer, but the bottom line is the same: Unilever makes and promotes a product they know to be harmful. The small print on the box only tells us not to use their product in the way it was specifically invented and designed to be used. They’ve committed a small budget to telling the truth, but none at all to reducing the product’s harmful effects.

Suppose Q-tips couldn’t be redesigned without a loss of usefulness for, say, makeup enthusiasts. What, then, is Unilever’s responsibility? One option would be to change the warning on the box to say “Keep away from ears. Entering the ear canal may result in deafness, blindness, and incontinence.” This may be an outrageous exaggeration, but some data indicates it would prevent misuse more effectively than the bland and nonspecific “Entering the ear canal could cause injury.”

Then again, if we required every manufacturer to exaggerate the risks of their products in order to reduce harm, public perception would likely adapt. Before long, consumers would read every warning as an exaggeration (which they may already do) and discount it accordingly, possibly well beyond what we intend.

Maybe specificity could help. If the warning were “Keep out of ears. 12,500 children go to the ER for Q-tip-related injuries each year,” the impact would be clearer. But people are famously bad at understanding numbers larger than 5. It would be a difficult research task to find a way to express the actual, individual risk of a Q-tip’s conventional use in a way consumers would understand and assimilate.

There is, inconveniently, no universal formula of truth, effect, and social responsibility. However, I think it’s well established that all three are important. It isn’t enough to tell the truth and wash one’s hands of effect, nor is it necessarily right to lie for the public good. Sometimes harms can be mitigated on their own, but sometimes a change in messaging is required. Anyone whose job involves responsible communication must choose their approach with care.