One of the problems with most scientific thinking as applied to political science, which this blog has traced to the incorporation of the political systems model during the 1950s, is its tendency to frame a point of interest as a definite set of variables. Usually, the question that scientific analysis asks is: what happens to factor Y when factor X varies?
If
enough cases are looked at and a pattern is detected, a relationship is proposed. For example, if people with more education
(factor X) vote (factor Y) more often than people with less education, then one
can establish a relationship between education and voting. The pattern even suggests a cause-and-effect relationship between the two.
Theoretically, one can say there is something about education that, at least, encourages people to go out and cast their votes on election day. This is never stated as a fact, but as a theoretical relationship. Other factors may be affecting this correlation. Perhaps more highly educated people have jobs that allow them the time to vote, and that is what is really causing the uptick in voting. That mystery will not be solved here, but it is cited to illustrate a point.
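To make the confounding worry concrete, here is a minimal, purely illustrative sketch in Python; the “job flexibility” factor and all of the probabilities in it are hypothetical assumptions invented for the example, not data. It simulates a population in which job flexibility, not education, drives turnout, yet the more educated group still appears to vote more often:

```python
import random

# Illustrative only: a toy population in which job flexibility (the
# hypothetical confounder) influences both education and turnout, while
# education itself has no direct effect on voting.
random.seed(0)

population = []
for _ in range(10_000):
    flexible_job = random.random() < 0.5          # the confounding factor
    # Flexible jobs are (in this made-up model) more common among the educated...
    educated = random.random() < (0.7 if flexible_job else 0.3)
    # ...and flexibility, not education, drives whether a person votes.
    voted = random.random() < (0.8 if flexible_job else 0.4)
    population.append((educated, voted))

def turnout(group):
    """Share of the group that voted."""
    return sum(1 for _, voted in group if voted) / len(group)

more_educated = [p for p in population if p[0]]
less_educated = [p for p in population if not p[0]]

print("Turnout, more educated:", round(turnout(more_educated), 2))   # roughly 0.68
print("Turnout, less educated:", round(turnout(less_educated), 2))   # roughly 0.52
# The educated group votes more often even though education plays no causal
# role in this sketch; the correlation comes entirely from job flexibility.
```

The point of the sketch is simply that a pattern between two measured variables can be produced entirely by a third factor the analysis never looked at.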
The point is that this sort of scientific investigation tends to narrow one’s gaze to a very limited set of factors or variables. As such, something is sacrificed. That is, by taking a narrow view one misses the holistic realities that inhabit the various landscapes and environments social scientists investigate. And this is not just a matter of social scientists being deprived of important information; in turn, it also affects all those professional managers who depend on the work of those scientists to make their managerial decisions.
This narrow perspective has a name: reductionism. Earlier in this blog, David Brooks was cited regarding this concern. To repeat, here is what Brooks writes:
This way of thinking
[reductionism] induces people to think they can understand a problem by
dissecting it into its various parts.
They can understand a person’s personality if they just tease out and
investigate his genetic or environmental traits. This deductive mode is the specialty of
conscious cognition – the sort of cognition that is linear and logical.
The
problem with this approach is that it has trouble explaining dynamic
complexity, the essential feature of a human being, a culture, or a
society. So recently there has been a
greater appreciation of the structure of emergent systems. Emergent systems exist when different
elements come together and produce something that is greater than the sum of
their parts. Or, to put it differently,
the pieces of a system interact, and out of their interaction something
entirely new emerges.[1]
Another expert, the social scientist Sidney Dekker, delves into this problem of narrow viewing and brings out some patterns and ironies that his observations reveal.
In an article, “Drifting into Failure: Theorising the Dynamics of Disaster Incubation,”[2] Sidney Dekker and Shawn Pruchnicki begin by citing a historical event that they feel makes their overall point. That event was the initial attacks, by an Arab coalition, that ignited the Yom Kippur War. Despite the very competent intelligence capabilities the Israelis enjoy(ed), they were caught off guard by the onset of the hostilities. One cannot say the needed information was not available or did not exist; it existed and it was available to their intelligence apparatus.
The intuitive bias is for people to believe such “screw-ups” happen to less competent people or organizations. They might, but they also happen to gifted or highly skilled entities. As a matter of course, those very skills and competencies can help bring about such unfortunate results. They can even lead to systems collapsing.
This blogger believes that success (the product of being skillful and competent) leads to complexity, in that success usually leads to expansion. In turn, success and expansion lead to systems thinking, which includes the development of protocols, chains of authority, established theoretical allegiances, modes of deference, norms, and other mental constructs that define how an entity (a company, a governmental agency, a government) “sees” things and, consequently, determines how it behaves or should behave.
And this includes factors such as organizational values, attitudes, dispositions, and shared knowledge (or what is taken to be knowledge). All this leads to a problem: within that complexity, certain problems exist that, because of the set of lenses shared by the organization’s personnel, remain hidden. This reminds one of what Donald Rumsfeld pointed out: “the things we don’t know we don’t know.” And since they are unknown, twice over, they just fester and grow, i.e., they become malignancies.
In short, they incubate. Somehow, the common thinking that says find the broken part and fix it just does not lead to discovering and addressing a malignancy that is growing, not just causing problems but undermining the very legitimacy of the system. There are various reasons for this “blindness.”
Here
is how Dekker and Pruchnicki conclude their analysis:
·         Larger, successful organizations usually operate in environments of pressure due to (1) scarcity of resources and competition against other entities, (2) a lack of transparency imposed by sprawling, complex structures, (3) information being pre-formatted in developed styles or language, and (4) the usual incremental pacing of decision-making, which becomes more incremental over time.
·         Accepted ways and beliefs that develop to protect the organization (e.g., risk assessment or risk management strategies and personnel) encourage false confidence in them and serve to obstruct seeing what “is not known.”
·         Structural elements that seek out the “unknown” face counterforces, i.e., the costs involved with uncertain technologies and with the undeveloped or underdeveloped knowledge and technologies that change promises to entail. These potential costs tend to be weighed against the merely incremental nature of incubating problems.
·         If needed, transformational change (calling for changes in beliefs, attitudes, and/or values) is judged against the pressures of scarcity and competition, making needed change appear impossible, even when it is not, or just too expensive.
·         And:
Organisations incubate accident not
because they are doing all kinds of things wrong, but because they are doing
most things right. And what they
measure, count, record, tabulate and learn, even inside of their own safety
management system, regulatory approval, auditing systems or loss prevention
systems, might suggest nothing to the contrary.[3]
These are the terms in which problems develop and go unnoticed for meaningful stretches of time, which is what Dekker and Pruchnicki describe as incubation. This results in disconnects between the organization’s goals and assessments and the decisions its staff makes. The temptation is to “kick it down the road” or simply ignore what might result in extraordinary, unforeseen events.
Incubation
occurs not because of incompetence, but because the organization is doing
things correctly by the “book” of success.
The problem is that the “book” does not address what is incubating under everyone’s nose. So, what does all that
have to do with polarization, the current concern of this blog? The answer is found in the next posting.
[1] David Brooks, The Social Animal: The Hidden Sources of Love, Character, and Achievement (New York, NY: Random House, 2011), 108-109.
[2] Sidney Dekker and Shawn Pruchnicki, “Drifting into Failure: Theorising the Dynamics of Disaster Incubation,” Theoretical Issues in Ergonomics Science, 2013, accessed 7/8/2020, https://safetydifferently.com/wp-content/uploads/2014/08/SDDriftPaper.pdf , 1-11.
[3] Ibid., 8 (Australian spelling).