
The Infovore’s Dilemma


The basic predicament for intelligent action in a crisis is that information is laborious to sort, measurements are costly to get and costly to vet, and analysis takes time. Peer review and consideration take longer. People act under uncertainty; when they are urged to act rapidly and drastically, they necessarily act — at best — on the best information they have at hand, and they are going to seek out and to produce lots more information — tentative results, estimates, conjectures and seemingly reasonable assumptions — that are very rough, because they also happen to be ready. The more rapidly, the more urgently, and the more drastically decision-makers start to act, or try to react, the sharper the predicament becomes: the supply of information available becomes necessarily more rushed and more tentative at the same time that the demand for certainty and unanimity becomes higher; the range of possible effects grows wider and deeper; the decisions and reactions themselves change the very regularities in other people’s choices, other people’s knowledge, other people’s circumstances, and the natural world that you are trying to observe or to assume. The more drastically and rapidly you act, the more they change, both in ways that you may be able to foresee and in ways that you cannot foresee, do not intend, and cannot control.

The result is that crises often produce a vast glut of information; but a lot of that information, and often the information most critical to making urgent decisions or taking drastic measures, is relatively low-quality information at best, information which has been produced rapidly, not vetted carefully, made on multiple simplifying assumptions, with huge error bars and wild, systematic skews that may be understandable in the pragmatic context of making a decision. (The extreme worst-case scenario might be highly salient even if it’s unlikely; data sets that aren’t entirely comparable may be the best you have for two things that you really need to compare; you might need to piously hope that some things go as planned even if you can’t be sure.) The more or less necessary predicament — you need time and effort to understand intelligently, but you need speed and freedom to take action — is often made worse by a number of extremely tempting, but extremely misleading, errors. A real need for bold conjectures and decisive action is often conflated with unrealistic demands for dogmatic certainty; the real benefits of coordinated action are often conflated with a punitive demand for unanimity in belief and deference or conformity to appointed authorities. The deep epistemic problem with understanding the situation intelligently is not only that high-quality information becomes so hard to find, but that low-quality information, or misinformation, crowds around all the watering holes in the cognitive ecosystem. Anecdotes are presented as data, toy models are presented as charts, tentative results are presented as What We Know Now, large scale syntheses of poorly comparable data from disparate sources are put forward as observed facts, third-hand sloganeering reports of experts’ tentative conclusions are put forward as conclusive arguments, simplifying assumptions are put forward as obvious and incontestable dogmatic principles. Actively seeking out information and absorbing it doesn’t necessarily serve to better inform or to improve your cognitive position; it often ends up being an exhausting means to skew your own judgment towards the prevailing trends and groupthink of the info-garbage that is most readily available to you.

None of this is any reason not to rely on imperfect information if you have to make a decision — what else are you going to rely on? It is a reason to act with the awareness that you’re taking a certain number of shots in the dark. It is a reason to prominently state simplifying assumptions used in arguments or models, and to acknowledge them as assumptions, not as oracular revelations, wherever possible. It is a reason to actively seek out, and publicize, the parts of what you’re saying that you’re least certain of, or that you know will be most contested by others, and to acknowledge what would follow or what might follow if those underlying premises turned out to be false. It’s a reason to be ready for and to do whatever you can to hedge against the risks of unintended consequences. It’s a reason to state numbers with error bars and to try to figure out lowball and highball constraints on what the real figure might be, if you’re wrong.

In circumstances that lead to a high risk of groupthink and overreach, it’s a reason to explicitly employ evidential markers when reporting claims; it’s a reason to cite and link to specific sources for specific claims rather than simply repeating them or presenting them as What Experts Are Saying, and it’s a reason for readers to spend some time following links and footnotes where they have been made available, or to significantly discount stories that don’t bother to provide them. It’s also a reason to actively seek out and cultivate second guesses, minority reports and dissenting opinions, rather than ignoring, scolding or punishing them.

In a high info-garbage environment, it is often worthwhile to deliberately limit, compartmentalize or substitute the consumption of certain kinds of low-quality or risky information. In particular, it is worthwhile to restrict your intake of information where the persuasive power of the presentation is especially likely to outrun its real evidential import. You may be better off glancing at boring charts a few times over a few days than you are looking at infographics in a newspaper article; you are almost certainly better off reading the abstract and a paragraph or two of one scientific paper than you are reading through an explainer article attempting to gloss the conclusion of that paper while weaving it together narratively with interviews or pronouncements from two or three other experts in the field. Commentary is prone to be less valuable than reporting, and reporting less valuable than sources or data. In a high info-garbage environment it’s also especially important to be sensitive to the likelihood of mistakes, to record claims in a testable and falsifiable form and to go back and check on them over time, to prepare for imperfect or piecemeal implementation of plans, and to actively try to gather information on potential or actual unintended consequences and perverse incentives.

The problem here is not that people will draw conclusions that are wrong, or make decisions that turn out to be mistakes. Of course they will. If that wasn’t a real danger, then it wouldn’t be a crisis in the first place. The problem here is that if you want to draw conclusions that are less wrong, more often — if you want to do less damage and realize more quickly when you make the wrong decision — if you want to lower the chance of being misled — then that may mean being more selective rather than more completist in the sources of information that you pursue. And the sources to be most selective about will often be the ones that seem the most appealing from the standpoint of your own social and ideological starting-points. Consume thoughtful discussion and information, not too much, mostly data.

2 replies to The Infovore’s Dilemma

  1. Discussed at radgeek.com

    Rad Geek People's Daily 2020-03-24 – A Number Walks Into An Error Bar, And They Say “We’re Closed Until April 6.”:

    […] There’s also a lot that we do not yet know about Covid-19 as a disease, and a lot that we do not yet know about the effects of the drastic responses that people and institutions have already made in response to the Covid-19 pandemic, or the capabilities and knock-on effects of continuing and escalating measures that are being proposed. There has been an immense, often frenetic attempt to gather more information about the disease, about its spread, and about social and institutional responses to it — not just among experts or hobbyists, but by nearly everyone through news media, scientific journals, situation reports, social media, and every other outlet under the sun. That produces lots of information; it also presents lots and lots of low-quality information and spurious or specious info-garbage. […]

  2. Discussed at radgeek.com

    Rad Geek People's Daily 2020-03-28 – Is epidemic Covid-19 much worse in New York and New Jersey than everywhere else? If so, why?:

    […] hard to know for sure, but such numbers as we have[1] seem to indicate (1) that Covid-19 has spread everywhere throughout […]


Anticopyright. This was written in 2020 by Rad Geek. Feel free to reprint if you like it. This machine kills intellectual monopolists.