Why is 'expert error' so prevalent?

jlh

When a whole advanced civilization is threatened with dramatic reversals of its plans, and the best experts have to change their advice from day to day, with confusing but compelling dissent all around, well, you wonder who to trust.  

My bottom line is that this is good evidence of our having exceeded our understanding of the environment we're in.   Humanity has profited greatly from tackling ever more complicated problems for centuries, but we seem to have gotten in over our heads, and the methods that worked before didn't tell us this was coming.   I'd like to say "trust me instead," but I know better, because the evidence is that almost none of us can tell who to trust.   The trap in that is our tendency to then fall back on what we used to trust, despite the clear evidence that that is unwise too.


1- 5/1/09 People have always needed to decide for themselves what to believe.   That kind of "hard fact," in a slippery world of conflicting issues, is useful.   In picking our way along, even conflicting evidence that refuses to go away is a good guide too.   Both give you a firm and unshakeable connection with realities beyond the swirl of hearsay.

That science has, curiously, described a world in which "individual events follow general laws" is one of those persistent questions for me: how??   It's a "hook" into the problem of why experts are so frequently wrong about any individual thing, yet still right on average about many unrelated things.   That is how the scientific method is designed: to ignore individual things and be right on average.   To improve our error rate with individual events, if we were being logical, we would localize the errors and try new things.
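A toy sketch of that distinction, with made-up numbers of my own (not from any study): a forecaster who predicts the average for every case can be almost exactly right on average while remaining badly wrong about each individual event.

    # Illustrative only: "right on average" vs. wrong about individuals.
    import random

    random.seed(1)
    events = [random.gauss(0, 10) for _ in range(1000)]   # 1000 individual outcomes
    forecast = sum(events) / len(events)                   # predict the average for every case

    mean_error = sum(e - forecast for e in events) / len(events)
    individual_error = sum(abs(e - forecast) for e in events) / len(events)

    print(f"error on average:         {mean_error:+.4f}")       # essentially zero
    print(f"typical individual error: {individual_error:.1f}")  # roughly 8 units

Localizing the errors would mean asking which individual cases the average misses, and why, rather than reporting only the aggregate.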

I'm more than usually concerned with another pattern that won't go away.   It's that our culture's many efforts at "problem solving" (business, government, science, public intellectuals, etc.) are hardening their attachments to old solutions and becoming uninterested in the "problem finding" that would lead to new solutions.   It seems we're all falling back on OLD hearsay, seeking the safety of familiar ideas in a confusing time.    It's evident in things as simple as the prevalent response to a down economy: trying to help people find work.   The actual problem is that there are not enough jobs, not that there are too few or insufficiently good resumes.    What I've noticed quite broadly is that people understand this kind of error but respond by ignoring it.    To respond to it, and promote the kind of connections that foster job creation, would mean departing from safe and familiar thinking.

I'm finding this broadly, and it concerns me most among career professionals: the facts pointing to new questions are recognized but quickly forgotten, and especially by the experts most capable of rigorous thinking about large complex issues.   They go back to their old ways of reasoning, the old "hearsay," in preference to seeking new ones.   One example among many is the habit of professionals who were measuring "sustainability" as an efficiency improvement, and were unable to change even after realizing it does not measure whether total impacts are increasing or decreasing.   The sustainable development experts all acknowledge the problem when you describe it, but go right back to how they were working, seeming not to know how even to mark the question "unanswered" in their minds.  For professionals with a real due diligence obligation this amounts to willful ignorance.
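A minimal arithmetic sketch of that measurement gap, using hypothetical numbers of my own: an efficiency figure can improve while total impacts still grow, because total impact is impact per unit multiplied by the number of units.

    # Hypothetical figures, purely illustrative of the efficiency-vs-total-impact gap.
    impact_per_unit_before, output_before = 1.00, 100
    impact_per_unit_after,  output_after  = 0.80, 150   # 20% more efficient, 50% more output

    total_impact_before = impact_per_unit_before * output_before   # 100
    total_impact_after  = impact_per_unit_after  * output_after    # 120

    print(f"efficiency improvement: {1 - impact_per_unit_after / impact_per_unit_before:.0%}")
    print(f"change in total impact: {total_impact_after / total_impact_before - 1:+.0%}")

Measured as efficiency the picture looks better; measured as total impact it is still getting worse.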

I have a sadly long list of these.  The higher the level of the expert or scientist, the less able they seem to be to question the shaky assumptions of their science, apparently because the world is changing in a way that exposes many errors in their old thinking, and their reaction is to go backwards rather than forwards.   Key among these is the inability of scientists to acknowledge that sets of equations can't reliably represent physical systems with many independent parts, so a way of tracking reality without a formula is needed.   In my experience they see it and drop it, over and over.   This broad response, to rigidify thinking rather than lubricate exploration when confronted with deep change, seems to closely fit Jared Diamond's and Arnold Toynbee's observations on the failure of complex societies:

Diamond agrees with Toynbee that "civilizations die from suicide, not by murder" when they fail to meet the challenges of their times. However, where Toynbee argues that the root cause of collapse is the decay of a society's "creative minority" into "a position of inherited privilege which it has ceased to merit," Diamond ascribes more weight to conscious minimization of environmental factors. In either case the impasse is described as a cognitive problem: societies which excel in problem solving have mental fixations that prevent their later problems from being recognized. (Wikipedia on J.D. & A.T.)

The implication is that things in our hugely misbehaving world "seem normal" partly because of the temptation to cling tightly to what we grew up believing was normal.  We may be living in a hallucination.    My first-hand evidence is that a broad range of professionals are acting willfully ignorant of the discrepancies between their old thinking and the new world into which we have been pushed.     Being more open to exploring that would be better for all.


