Sociologia, Problemas e Práticas

Print version ISSN 0873-6529

Sociologia, Problemas e Práticas, no. 62, Oeiras, April 2010

 

Technology, complexity, and risk: a social systems perspective on the discourses and regulation of the hazards of socio-technical systems

 

Tom R. Burns* and Nora Machado**

* Woods Institute for the Environment, Stanford University, Stanford, CA; CIES, Lisbon University Institute, Lisbon, Portugal; Department of Sociology, University of Uppsala, Box 821, 75108 Uppsala, Sweden. E-mail: tomburns@stanford.edu.

**CIES, Lisbon University Institute, Lisbon, Portugal; Science, Technology, and Society Program, Stanford University, Stanford, CA; Department of Sociology, University of Gothenburg, Gothenburg, Sweden (on leave-of-absence). E-mail: nora.machado@iscte.pt.

 

Abstract

This is the second part of a two-part article. In Part I, a social systems theory was applied to the analysis of hazardous technology and socio-technical systems, their complex dynamics, their risky dimensions, and the likelihood of accidents. It identified many of the diverse human risk factors associated with complex technologies and socio-technical systems, thus contributing knowledge toward preventing, or minimizing the likelihood of, accidents or catastrophes. This second part of the article systematically addresses the broader issues of risk conceptions, analysis, and management in contemporary society, including policy and other practical aspects. The social systems perspective and its derivations are contrasted to such impressionistic conceptions as those of Ulrich Beck. Section 1 of the paper introduces the topic of risk as a discursive concept in contemporary society. Our point of departure is the social systems approach introduced in Part I, which is contrasted to that of Ulrich Beck, who eschews systematic theorizing at the same time that he denigrates empirical sociology. The section stresses that contemporary society is not so much threatened by high risks all around (as in Ulrich Beck's "risk society") as characterized by its developed risk discourses (a great deal owing to Beck himself), risk consciousness, risk theorizing, and risk management. What is truly characteristic of modern society are the discretionary powers to determine the dimensions, levels, and regulation of risk: choices can be made whether or not to develop a technology, whether or not to regulate it tightly (for instance, by limiting or banning its use), whether or not to allow its widespread application, and under what conditions. Section 2 provides a brief review of our social systems framework, actor-system-dialectics (ASD) theory. Section 3 treats risk and risk analysis in a systems perspective, emphasizing the limitations of risk assessment and the risk management of complex, hazardous systems. Section 4 considers several principles which may serve to guide policy-making and regulation with respect to the hazards and risks of complex technologies and socio-technical systems.

Keywords: actor-system dialectics, technology, socio-technical system, risk, risky system, accident, regulation, complexity.

 

Tecnologia, complexidade e risco: a teoria dos sistemas sociais na análise dos discursos e regulação de risco nos sistemas sociotécnicos

Resumo

O ponto de partida deste artigo é a teoria dos sistemas sociais apresentada na parte I, ou seja, a dinâmica actor-sistema, em contraposição a perspectivas como a de Ulrich Beck, que, em particular, rejeita o teorizar sistemático, ao mesmo tempo que denigre a sociologia empírica. Este artigo salienta que a sociedade contemporânea (definida por Beck como “sociedade de risco”) não é tão ameaçada por riscos elevados, sendo antes caracterizada pelos discursos generalizados de risco (em grande parte devido ao próprio Beck) — elaborações teóricas acerca do risco, formas de gestão do risco, consciência de risco. O que é verdadeiramente uma característica da sociedade moderna é a existência de poderes discricionários para determinar dimensões e níveis de risco, assim como medidas reguladoras. Por outras palavras, na sociedade moderna existe a opção de arquitetar, ou não, uma tecnologia, regulá-la fortemente, ou não, limitar ou banir o seu uso, permitir, ou não, a sua aplicação generalizada, bem como definir em que condições isso pode ocorrer.

Palavras-chave: dialética actor-sistema, tecnologia, sistema sociotécnico, risco, sistema de risco, acidente, regulação, complexidade.

 

Technologie, complexité et risque: la théorie des systèmes sociaux dans l’analyse des discours et de la régulation des systèmes sociotechniques

Résumé

Le point de départ de cet article est la théorie des systèmes sociaux présentée dans la partie I, c’est-à-dire la dynamique acteur-système, par opposition à des perspectives telles que celles d’Ulrich Beck, qui rejette en particulier la théorisation systématique, tout en dénigrant la sociologie empirique. Cet article souligne que la société contemporaine (définie par Beck comme “société de risque”) n’est pas si menacée par des risques importants, mais plutôt caractérisée par les discours généralisés du risque (en grande partie à cause de Beck lui-même) — élaborations théoriques sur le risque, modes de gestion du risque, conscience du risque. Ce qui caractérise vraiment la société moderne c’est l’existence de pouvoirs discrétionnaires pour déterminer les dimensions et les niveaux de risque. Autrement dit, on a le choix de développer ou non une technologie, de la réguler fortement ou non, de limiter ou de bannir son utilisation, de permettre ou non son application généralisée, ainsi que de définir dans quelles conditions cela doit être fait.

Mots-clés: dialectique acteur-système, technologie, système sociotechnique, risque, système de risque, accident, régulation, complexité.

 

Tecnologia, complexidad y riesgo: discursos y regulamiento de riesgo en systemas socio-técnicos desde la teoria de sistemas sociales

Resumen

Nuestro punto de partida es la perspectiva de sistemas sociales introducida en la Parte I — o sea la dinámica actor-sistema, en contraposición a perspectivas de tipo Ulrich Beck, que particularmente rechazan el teorizar sistemático a la vez que niegan valor a la sociología empírica. Este artículo enfatiza que la sociedad contemporánea (por Beck definida como “sociedad de riesgo”) es caracterizada tanto más por los extendidos discursos acerca del riesgo (curiosamente mucho de esto debido al mismo Beck) — elaboraciones teóricas acerca del riesgo, gestiones del riesgo, sensibilización a la noción de riesgo —, que por los riesgos a que efectivamente estaría expuesta. Lo que es verdaderamente característico de las sociedades modernas es el enorme poder discrecional que estas tienen para determinar las dimensiones, niveles y regulación del riesgo. Esto quiere decir que la sociedad moderna tiene la opción de desarrollar o no diversos tipos de tecnología y de decidir cuán estrictamente debe la tecnología ser regulada, estableciendo límites y condiciones de uso, o prohibiéndola en su totalidad.

Palabras clave: dialéctica actor-sistema, tecnología, sistema socio-técnico, riesgo, sistema azaroso, accidente, regulación, complejidad.

 

 

Risk as a discursive concept in contemporary democratic society

Four factors make risk a major discursive concept in contemporary society. [1] (1) With modern science and engineering continually providing innovations, powerful agents can introduce and construct technologies on an ongoing basis, many of which cause, or threaten to cause, substantial harm to people and to the social and physical environments. (2) The complexity and originality of the innovations exceed the immediate capacity of relevant agents to fully understand and regulate them and their impacts. In this way, human communities are confronted with systems of their own making that are not fully knowable or controllable in advance and, therefore, are likely to generate negative, unintended consequences (the “Frankenstein effect”). Serious, unexpected problems, “near misses”, and accidents indicate that human knowledge and capacity to control such human constructions and their consequences are bounded. (3) Those managing and operating these systems often learn to know them better — in part through experience with them — and may be able to construct or discover better models and methods with which to diagnose and correct malfunctioning and negative unintended consequences.[2] (4) Within modern democratic societies, there is increasing collective awareness and critical public discussion about the limited knowledge and control capacity with respect to technology and some of the substantial risks involved. Growing public awareness of the level of ignorance and the risks involved, in the context of democratic societies, contributes to the politicalization of technology and technological development and to growing skepticism about, and de-legitimation of, major technological initiatives.

Several of the arguments of this article (Parts I and II) relate to those of Ulrich Beck (1992; 1997; Beck, Giddens and Lash, 1994). Like him, we would argue that modern societies, as in earlier times, are confronted with many potential hazards or risks. Some of these have natural causes. Many have anthropogenic causes, arising in connection with the introduction and operation of modern technologies and socio-technical systems. In Beck’s perspective, Western modernization has led to a transition from an “industrial society” to a “risk society”. The latter is confronted with its own self-destructive tendencies and consequences, which cannot be overcome by the system of industrial society itself. At the same time that risks are reduced in many areas — contagious diseases, poverty, unemployment, traffic accidents, etc. — human societies are threatened with new risks, many of which can be accounted for in terms of human causality, distribution of responsibility and authority, and the available capabilities and controls (Blowers, 1997: 855), in a word, human agency. The risks which Beck refers to are particularly those arising from the production of greenhouse gases, nuclear energy and nuclear waste, chemicals, genetic engineering, air pollution and the reduction of the ozone layer, among others. These have in common their potential negative consequences and their anthropogenic character (Pellizzoni, 1999).

Ironically, it is successful technology development — not its failure — which prepares the stage for processes of reflectivity and criticism, the essence of “reflexive modernity”. Reflexive modernization implies self-awareness about the limits and contradictions of modernity, for instance the complexity and risky character of many of its technologies and technical systems (Beck, 1997; Kerr and Cunningham-Burley, 2000: 283).[3]

The limitations of Beck’s perspective have been argued by many others — and it is not our intention to address here the diverse problems of his approach. Suffice it to say that Beck offers only a general and in many ways vague critique of modern, developed society, but suggests no practical prescriptions or models of what an alternative might look like, or how a transformation might proceed or be achieved politically, socially, or technically (Blowers, 1997: 867). Beck offers a particular perspective and a critical and provocative discourse[4] at the same time that he exhibits a serious lack of the capacity to theorize systematically. There are no theoretical propositions, models, or explanations. This is understandable, in part, because Beck rejects empirical sociology as a “backwater of hypothesis-testing scholars” (Beck, 1997; Kerr and Cunningham-Burley, 2000: 284). In our view, a necessary condition for meaningful theory development is the identification and analysis of empirical patterns and processes. Given Beck’s theoretical and empirical limitations, it is not surprising that he conflates substantially different technologies — biological, chemical, industrial, nuclear, among others — failing to recognize or to explore their different regulatory regimes as well as varying public conceptions and responses to such diverse technologies (Kerr and Cunningham-Burley, 2000). As the solid empirical works of La Porte (1978; 1984), La Porte and Consolini (1991), and Perrow (1994; 1999 [1984]; 2004) show, there are major differences in the riskiness of different technologies and technological systems. It would be more accurate to speak of particular risky systems and practices (Machado, 1990; Perrow, 1999 [1984]) rather than of “the risk society”.

Beck’s sweeping account of the “risk society” neglects the complexity of modern society, its differentiation and divergent tendencies. People reveal a range of heterogeneous understandings and interpretations of the “reality” of risk (Irwin, Simmons and Walker, 1999; Wilkinson, 2001: 14). Beck also appears “to have little regard for the problem of conceptualizing the empirical reality of ‘everyday’ social processes of risk perception” (Wilkinson, 2001: 14).

In contrast to Beck, we would not characterize modern conditions as high “risk” but rather as entailing differentiated risk and the variation in time and space of risky systems. At the same time, there is increasingly high risk consciousness, risk theorizing, risk discourse, and risk management. Arguably, modern society is not more hazardous than earlier forms of society (as suggested by measures of, for instance, average life expectancy or incidence of accidents) — but it is much more conscious of risks and the sources of risk, and it regularly conducts public risk discourses and assessments as well as develops regulatory measures to deal with risks defined socio-politically and/or technically as serious and calling for such action. Beck stresses self-induced risks (as characteristic of reflexive modern societies): nuclear power plants, air transport systems, chemicals in food, plastics and other everyday materials, pollution of the oceans and atmosphere, ozone depletion, among others. Not all modern risks arise from intentional human intervention in nature; immigration, financial and money market instability (Burns and DeVille, 2003), and the global dynamics of capitalism (Burns, 2006a; Yearley, 2002) are driven by actions of millions of social agents acting on their own initiatives and understandings.

Our approach applies social systems theory to the analysis of the risks arising from new, complex technologies, exposing the cognitive and control limitations in relation to such constructions (Burns and others, 2002; Machado, 1990; 1998). We emphasize, therefore, the importance of investigating and theorizing the particular ways in which human groups and institutions conceptualize and try to deal with their technologies, their unintended consequences and risks. Most substantially (on the theoretical as well as policy level), we emphasize the role of democratic culture and institutions as a challenge to many technological systems and developments (Andersen and Burns, 1992). We argue that there is emerging organically in advanced democratic societies (Burns, 1999) a new social order encompassing the long-established technocratic-industrial-scientific complex as well as a participatory democratic complex of civil society associations, the mass media, and natural, medical and social scientific experts (see footnote 2). Thus, there also emerge challenges and countervailing forces against some of the projects, leadership and authority of the dominant complex with its many, diverse hazards.

 

Actor-system-dialectics theory in a nutshell

Introduction

Social systems approaches have played and continue to play an important scientific role within the social sciences and humanities (Burns, 2006a; 2006b). Above all, they contribute a common language, shared conceptualizations, and theoretical integration in the face of the extreme (and growing) fragmentation among the social sciences and humanities and between the social sciences and the natural sciences. The challenge which Talcott Parsons (1951) and others including Walter Buckley (1967) originally addressed still faces us: to overcome the fragmentation of the social sciences, the lack of synergies, and the failure to develop a cumulative science.

In spite of a promising start and some significant initial successes, “systems thinking” has been marginalized in the social sciences since the late 1960s (Burns 2006a; 2006b). The widespread rejection of the systems approach did not, however, stem the incorporation of a number of systems concepts into other social science theoretical traditions. Consequently, some of the language and conceptualization of modern systems theories has become part of everyday contemporary social science: e.g., open and closed systems, loosely and tightly coupled systems, information and communication flows, reflexivity, self-referential systems, positive and negative feedback loops, self-organization and self-regulation, reproduction, emergence, non-linear systems, and complexity, among others. Institutionalists and organizational theorists in particular have adopted a number of key system concepts without always pointing out their archaeology or their larger theoretical context (Burns, 2006a).

Earlier work (Burns, 2006b; 2008) has demonstrated that many key social science concepts have been readily incorporated and applied in social system description and analysis: institutional, cultural, and normative conceptualizations; concepts of human agents and social movements; diverse types of social relationships and roles; social systems in relation to one another and in relation to the natural environment and material systems; and processes of sustainability and transformation. The approach aims to provide a common language and an integrative theoretical framework to mediate, accumulate, and transmit knowledge among all branches and sub-branches of the social sciences and allied humanities (Sciulli and Gerstein, 1985).

Actor-system-dialectics (ASD)[5] emerged in the 1970s out of early social systems analysis (Baumgartner, Burns and DeVille, 1986; Buckley, 1967; Burns, 2006a; 2006b; Burns, Baumgartner and DeVille, 1985; Burns and others, 2002).[6] Social relations, groups, organizations, and societies were conceptualized as sets of inter-related parts with internal structures and processes. A key feature of the theory was its consideration of social systems as open to, and interacting with, their social and physical environments. Through interaction with their environment — as well as through internal processes — such systems acquire new properties and are transformed, resulting in evolutionary developments. Another major feature entailed bringing into model constructions human agents as creative (as well as destructive) transforming forces. In ASD, it has been axiomatic from the outset that human agents are creative as well as moral agents. They have intentionality, they are self-reflective and consciously self-organizing beings. They may choose to deviate, oppose, or act in innovative and even perverse ways relative to the norms, values, and social structures of the particular social systems within which they act and interact.[7]

A major aspect of “bringing human agents back into the analytic picture” has been the stress on the fact that agents are cultural beings. As such, they and their relationships are constituted and constrained by social rules and complexes of such rules (Burns and Flam, 1987). These are the basis on which they organize and regulate their interactions, interpret and predict their activities, and develop and articulate accounts and critical discourses of their affairs. Social rule systems are key constraining and enabling conditions for, as well as the products of, social interaction (the duality principle).

The construction of ASD has entailed a number of key innovations: (1) the conceptualization of human agents as creative (also destructive), self-reflective, and self-transforming beings; (2) cultural and institutional formations constituting the major environment of human behavior, an environment in part internalized in social groups and organizations in the form of shared rules and systems of rules; (3) interaction processes and games as embedded in cultural and institutional systems which constrain, facilitate, and, in general, influence action and interaction of human agents; (4) a conceptualization of human consciousness in terms of self-representation and self-reflectivity on collective and individual levels; (5) social systems as open to, and interacting with, their environment; through interaction with their environment and through internal processes, such systems acquire new properties, and are transformed, resulting in their evolution and development; (6) social systems as configurations of tensions and dissonance because of contradictions in institutional arrangements and cultural formations and related struggles among groups; and (7) the evolution of rule systems as a function of (a) human agency realized through interactions and games and (b) selective mechanisms which are, in part, constructed by social agents in forming and reforming institutions and also, in part, a function of physical and ecological environments.

 

Risk and risk analysis in a social systems theory perspective

Point of departure: discretionary conditions

This section emphasizes the importance of investigating and theorizing the particular ways in which human groups and institutions collectively conceptualize and deal with socio-technical systems and their consequences, stressing the cognitive-normative frameworks and models as well as strategies utilized in these dealings.[8] In particular, it focuses on the cognitive and control practices as well as their limitations with respect to complex systems and the hazards they entail (Burns and others, 2003; Machado, 1990; 1998). The argumentation goes as follows:

(1) Many hazards and risks are discretionary — they are the result of human decisions and constructions. For instance, “natural death” may be avoided to a certain extent, as the result of the application of life-support technologies and intensive care medicine. Thus, “natural death” is replaced, in a certain sense, by death as human deed (although not an arbitrary one) (Machado, 2005; 2009). In general, many natural hazards are replaced by discretionary and constructed hazards, often as unintended consequences of the development and application of new technologies. “Discretionary society” or “constructionist society” is a more accurate characterization of modern conditions than Beck’s notion of the “risk society”, in part because “risk” is far too narrow a concept to capture the complexity and diversity of social systems.[9]

Collective decisions determine the initiation, and the particular features, of developments such as industrialization, nuclear energy, or advanced weapons systems. In a certain sense, they are not only discretionary but “artificial” — including the quality of, and strength of commitment to, regulation and safety features encompassing a given technology. Risk configurations are discretionary, dependent on human judgments and decisions: the design and operation of the institutional arrangements constructed. Since these systems are based on collective decisions, most individuals cannot decide whether or not they want to take the risks — rather, the decisions appear as sources of potential, unintended negative consequences, “unavoidable” hazards and dangers.

Bertilsson (1990: 25) points out:

Risks have always accompanied human life. However, in earlier times the risks were exogenous to man and his actions. They occurred because of nature’s own eruptions and man’s ignorance. Today, the situation is very different: Risks are often endogenous to modern systems of production and living and are the result of man’s own scientific-technical-industrial ingenuity in taming forces of nature. As part and parcel of the mode of production, risks are systemically produced today.

Thus, one distinguishes between exogenous and endogenous risks. Risks exogenous to human actions are exemplified by natural catastrophes (for example, epidemics, volcanoes, earthquakes, hurricanes, and other natural disasters). Such catastrophes, or their threat, are beyond the control of human decisions (although human groups may still adapt in ways that minimize their risks — and also may try to utilize magical powers to deal with such threats). Endogenous risks are those inherent to human constructions, which result in part from the unintended consequences of man’s own technical and scientific ingenuity. They include technological hazards that threaten the entire biosphere, such as global warming; the release or misuse of hazardous substances such as toxic chemicals or nuclear waste; and failures of large-scale technological systems such as nuclear power plants or electricity networks. Adverse effects on the environment include threats to humans as well as non-human species, ecosystems, climate and the biosphere as a whole. For many individuals, these are equivalent to “natural catastrophes”. Still, there are numerous risks in modern society with respect to which individuals can influence their degree of exposure by changing their behavior (smoking, food selection, living area, type of job, etc.).

(2) Some technologies and technical systems are much better modeled and understood than others and, therefore, can be better risk managed, provided the resources and infrastructure are available. In the case of known systems, one can calculate risks on the basis of established scientific models and historical patterns of performance. In the case of radically new technological developments, one proceeds in partial or almost total darkness — that is, radical uncertainty — about many interactions and unintended consequences (the “Frankenstein effect”).[10] Some technological systems are complicated beyond our understanding — and beyond our capacity to make them fully safe. For instance, Perrow pointed out that complex and tightly coupled systems have risky characteristics. Even attempts to improve safety through more effective regulation introduce further complexity, intensifying non-linearity and increasing risks (although risks different from the initial ones) (Perrow, 1999 [1984]; Burns and Dietz, 1992b; Strydom, 2002; on complex money systems, see Burns and DeVille, 2003). At the same time, modern, advanced society may be producing “Frankensteins” faster than it can learn to deal with them (Rosa, McCright and Renn, 2001: 5). In a certain sense, discretionary powers are out of control.

There are always multiple consequences of an operating system, and some of these are unexpected. They may not be foreseen because of knowledge limitations, indeterminacies, or the actions of others who intentionally or unintentionally operate against intended or expected patterns (in a game-like manner). Moreover, some of the “knowledge” or beliefs that the actors hold may be misinformed or quite wrong with respect to the system and its consequences. So, previous knowledge may or may not be useful; in any case, new uncertainties and risks arise in connection with unanticipated and unintended consequences. For instance, major dam projects have not only obvious ecological consequences but bio-medical consequences (Le Guenno, 1995). The Aswan Dam, for example (launched in 1960 and completed in 1971), was intended to control the Nile flood, allow its water to be used more systematically for irrigation, and generate electricity. There were numerous unanticipated and unintended consequences. For instance, silt no longer came down the Nile; much of the electricity from the dam had to go into manufacturing fertiliser to make up for the loss of silt. Salinisation increased in the absence of the flushing provided by the annual flood. The Nile Delta shrank, depriving the Mediterranean of nutrients, which destroyed the sardine and shrimp fisheries.

Dams, in raising the water table, typically contribute to the multiplication of insects and bring humans and animals together in new population matrices. The irrigation canal system constructed in connection with the dam became a breeding ground for the snails that carry schistosomiasis, a disease of the liver, intestines and urinary tract that now affects the entire population in many rural areas around the dam. The increased water table and substantial bodies of irrigation water allowed mosquitoes to multiply rapidly, spreading diseases such as Rift Valley fever and bringing about major losses of cattle and epidemics in the human population.

As pointed out above, actors operate with incomplete models of their complex socio-technical systems, more so at certain stages than others. The models are used to identify hazards, determine their likelihoods, and make risk assessments. The attribution of “hazard” status to certain objects, procedures or human agents depends on prior judgment — otherwise, risk assessors would be faced with considering every element or combination of elements in any given environment or context. There are, of course, unidentified risks. As Fox (1998: 675) argues: “Inevitably, risk assessment must begin with some prior knowledge about the world, what is ‘probable’ and what ‘unlikely’, what is ‘serious’ and what is ‘trivial’ or seemingly ‘absurd’. Such judgments may derive from ‘scientific’ sources, or may depend on ‘commonsense’ or experiential resources. Either way, the perception of a hazard’s existence will depend on these judgments. How the judgment is made (that is, what is counted as evidence to support the assessment) is relative and culturally contingent… Both risks and hazards are cultural products” (our emphasis).

In general, in the case of less complex and dynamic technical conditions, agents (individuals as well as groups) may readily know and calculate risks, expected gains, and tradeoffs. In the modern world, however, environments tend to be unstable because of the dynamics of scientific and technical developments, the turbulence of the economy, diverse government interventions and regulations, and the substantial movement of peoples. There is a continual and growing need for new knowledge and new analyses. At the same time, contemporary knowledge of nature and of social systems has never been greater.

Science and technical knowledge provide a major basis for risk definition, and for defining and systematizing many of the solutions to risk problems, at the same time that scientific and technical development leads to the continuous production and elaboration of “risks”. Thus, through contributing to new technologies and socio-technical systems, science and technology play a crucial role in creating many of the problems but also in finding solutions to them. In this way, they are part and parcel of the reproduction and development of the “risk society” (Beck, 1992; Bertilsson, 1990; 1992).

But managerial and policymaking settings differ significantly in terms of conditions of reliability and certainty. The conditions are inherently contingent, offering myriad possibilities — even under conditions of a high degree of apparent control. Large-scale disorder constrains actions, turning many human plans to naught. A major source of disorder and uncertainty arises from social opposition and conflict. However, even in the absence of human conflict and destruction, there are fundamental problems in fully knowing and regulating many major socio-technical constructions and their impacts. Thus, it is useful to approach the problem of bounded knowledge and control of constructed systems, “discretionary systems”, drawing on cognitive, cultural, and institutional theories (Burns and Flam, 1987; Burns and others, 2003; Machado, 1998; Nowotny, 1973; Strydom, 1999).

In sum, science is essential to modern life, in defining, assessing, and regulating risks, among other things. Science is the most reliable way to produce empirical and related theoretical knowledge. But a new reflective stage is also needed, where science will be confronted with its own products, defects and limitations. What is needed is a “reflexive scientification” (Beck, 1992: 155). The current challenge is to push that reflexive theme further (Bertilsson, 1992: 27). But this also implies the risk of a profound politicalization of science and technology, as discussed later.

 

Risk and risk discourse

Increased public concern about and political attention to environmental and technological hazards have promoted a re-assessment of older technologies and a critical scrutiny of the potential negative impacts of many new technologies. It is characteristic of most contemporary societies that technologies, despite their countless benefits, are increasingly subject to challenge by professionals and lay persons alike. In these challenges — and related public debates and policy-processes — two separate but interrelated concepts play a central role:[11] risk and hazard (Dietz, Frey and Rosa, 1993; La Porte and Consolini, 1991). Hazard refers to dangers or threats which may cause adverse consequences — it is a potentiality. For instance, it may refer to the characteristics of a technology such that if it fails significantly, the damage to life, property, and the environment might be substantial. Risk is the likelihood of it doing so (Fox, 1998: 665; The British Medical Association, 1987). Risk then is a compound measure of the magnitude of some future harmful event or effect and the probability of its occurrence. Standard models of risk can be employed, for instance, where risk is conceptualized as (see also footnote 12):

Risk = (probability of a hazard, loss, or undesirable outcome) × (impact or assessed magnitude of the hazard, loss, or undesirable outcome)

But we must bear in mind that such an approach decontextualizes many key physical as well as social factors (and shares a good deal of the weaknesses of rational choice theory) (Burns and Roszkowska, 2008). Social contextualization implies the possibility of a variety of different risk assumptions, conceptions and models. The spectrum ranges from relatively qualitative models to quantitative ones.[12] Also, meta-judgment processes operate to determine not only the values (or ordering) of different hazards but also a “revision” of the “value” or weights given to likelihood estimates, depending, for instance, on how risk-prone or risk-averse one is (Burns and Roszkowska, 2008; 2010).[13]

Earlier we distinguished between exogenous and endogenous risks. Endogenous risks depend on collective decisions and the functioning of institutional arrangements, which are human constructions, and are, therefore, potentially discretionary. The exogenous risks are, of course, non-discretionary — they are beyond the capacities of those affected to change them. This is not strictly the case, however, since adaptation is a well-established individual and collective strategy for dealing with risks that cannot be controlled directly. For instance, buildings in an earthquake zone may be constructed in ways that reduce the likelihood of material and personal damage; infrastructures are moved back from water lines as an adaptive response to potential flooding.

Modern society exposes itself to risks through numerous innovations in production (for instance, industrialization of agriculture, nuclear energy, bio-technology developments) as well as consumption (use of hydro-carbon fuels, use of chlorofluorocarbons (CFCs), overeating, overdrinking, smoking). Decision-makers and practitioners can influence the degree to which people are subject to risks — for instance, by managing and regulating dangers more effectively. In this sense, they are discretionary threats and dangers. The safety policies and practices built into these systems are also based on collective decisions. The underlying premise is that, through choice, we can change or control risk: in other words, the dimensions, levels, and controls of risk to which we expose ourselves or others are often highly discretionary. One can choose not to develop, for instance, gene technology (or genetically modified foods), nuclear energy, or chlorofluorocarbons (CFCs). Or, one may choose to allow a modest, tightly regulated development of particular technologies. Or, one may pursue a laissez-faire policy toward the technologies and applications. It is in this sense that we stress that the dimensions, levels, and controls of most humanly generated risks are discretionary; moreover, the risks may be distributed in diverse ways in a complex social system — whether intentionally or unintentionally.

The new discursive ideas relating to technology and environment[14] not only entail an elaboration of risk concepts, risk accounting, discourses, and management techniques, etc. They also bring to collective awareness across space and time matters of “choice” and “discretion.” There are deliberations on alternative options, the positive as well as the negative consequences anticipated, their likelihoods, possible safety measures, and ways of reducing or minimizing particular risks. Risk assessment and risk judgment are additional concepts that have become part of public discourse.

Paralleling developments in natural science and public discourses, the social sciences and humanities have been paying increased attention to risk problems and their role in modern society (Beck, 1992; Bertilsson, 1990; 1992; 1993; Dietz, Frey and Rosa, 1993; Dietz and Rycroft, 1989; Giddens, 1991; Lidskog, 1994; Jaeger and others, 2001). Much of the risk research has been conducted in terms of, on the one hand, “objective risk research” (which deals with the quantification of risks) and, on the other hand, “subjective risk research” (i.e. more psychological, socio-psychological, and anthropological investigations of people’s risk perceptions and assessments).[15] One challenge for a social science of risk is to combine the objective point of view with respect to the functioning of large-scale socio-technical systems, on the one hand, with the subjective “life-world” awareness of cultural beings, on the other hand (Bertilsson, 1993).[16] Moreover, there are initiatives to raise the level of awareness about unknown or unspecified risks, or risks yet to be identified (see Part I).

Risk analysis and management

The method of risk analysis is an attempt to measure, and develop accounts about, the risks associated with a technology or socio-technical system in a given context. The methods entail identifying, estimating, and evaluating risks (Fox, 1998). Practitioners consider it a technical procedure whereby, for a given setting, all risks may be evaluated and suitably managed, in that they may be predicted and regulated. In this way, it is believed that risks and accidents can be minimized or prevented altogether (Fox, 1998: 666; Johnstone-Bryden, 1995).

Risk configurations are open to social definition and construction, and can be changed, magnified, dramatized or minimized within a particular perspective or framework. Also, there are different, even contradictory perspectives. Insurance experts may contradict engineers (Beck, Giddens and Lash, 1994: 11). While the latter diagnose “minimum risk,” the former decide a project uninsurable, because of “excessive risk”. Experts are undercut or deposed by opposing experts. Politicians encounter the resistance of citizens’ groups, and industrial management encounters morally and politically motivated consumer and NGO organized boycotts. Fox (1998: 669) argues: “What is considered as a risk, and how great that risk is, will be perceived differently depending upon the organization or grouping to which a person belongs or identifies, as will the disasters, accidents, or other negative occurrences which occur in a culture.” (See also Douglas and Wildavsky, 1992)

Risk assessment — including technology assessment — was intended as a tool for risk management. The basic idea of such assessments has been that an analyst investigates and reports on, among other things, the risk implications of a new technology or technological development. Such a study would help policymakers to decide about the appropriateness of the technology, possibly the need to redesign it, or to take a variety of necessary steps to deal with potential or anticipated negative consequences.[17]

Risk management — A socially and politically important class of socio-technical systems is defined by La Porte (1978; 1984) as benefit-rich but hazardous (see Part I). Such systems are constructed and operated precisely because of their great benefits. At the same time they may entail substantial risks: for example, nuclear power plants, nuclear waste storage systems, air traffic control systems, chemical plants, etc. A critical goal for such systems is to avoid operational failures altogether — hence the attention to constructing and maintaining highly reliable organizations with their regulatory frameworks. These systems, even if they entail substantial hazards, are designed to be low risk. When successful, they are managed effectively and provide expected levels and qualities of products and services with a minimum likelihood of significant failures that risk damage to life and property (La Porte, 1984). In this way, a potentially hazardous system is shaped and managed as a low-risk system through design and operational codes and standards.

Conventional technology assessment and risk analysis fail in the face of technology developments where many important consequences and further developments cannot be specified and modeled beforehand. This is, in part, a result of the limitations of the method. There are also problems of legitimacy and the growing awareness of the need to engage a variety of stakeholders in the assessments and practical decisions. Technical experts often disagree among themselves, as pointed out earlier. Stakeholders may or may not succeed in identifying the “significant” implications of a given innovation or system for them. Since their values and concerns are the point of departure, identifying such dimensions is essential. But often they have difficulty initially in identifying many of the relevant values involved, a failure that can have serious consequences (McCarthy, 2001: 292).

In sum, technology assessment and risk analysis for calculation and prudential judgment are very limited tools for dealing with innovations such as those outlined above. In the face of radical technological innovations where knowledge is incomplete and fuzzy, one is forced to adopt an “experimental” attitude; one monitors developments and re-iterates discovery, definition, and assessment processes. Continuing discussions, debates, and joint analyses are essential and should be institutionalized.

While technology assessment and risk analysis were initially seen as technical matters, critics as well as practitioners have come to emphasize the need for greater “participation” of non-experts and those affected or likely to be affected by the technology. One obvious reason for this is to bring into the process participants who could identify or articulate important values and consequences which would, otherwise, be missed by technical experts in their judgments. This provides for a more common point of departure for any risk and technology assessment. In short, the emphasis is on extensive participation that goes beyond the narrow limits of a technical or scientific engagement. But given the range of values and considerations activated in such processes, there is an exponential growth in complexity and possible contentiousness and a continuing need for organizing more multi-dimensional and integrated assessments, hence the emergence of “integrated assessment models” which entail bringing together, for instance, “focus groups” involving actors representing different perspectives and value orientations.

In the case of well-defined and largely knowable technologies and socio-technical systems, one can identify and analyze the major impacts, “calculate” benefits and costs as well as risks, and specify suitable regulatory measures. In such cases, technology assessment and risk analysis are useful tools. On the other hand, for many or most new technological developments, particularly radically new ones, information or knowledge about the likely outcomes is typically very incomplete. There is a profound uncertainty about many developments and outcomes.

In carrying out risk analysis, and in ultimately managing a technology system, one requires a model. It may be simple, a very rough approximation of a functioning system. Or it may be relatively well-specified and developed. Typically, not all subsystems and aspects of a complex, new socio-technical system are well understood and modeled. Relatively well-understood processes can be reasonably modeled. Often one ignores or greatly simplifies elements that are not well understood or unknown. Of course, a model, although inevitably incomplete and inaccurate, may still be sufficiently representative and accurate to be of great practical use.

In conclusion, bounded knowledge (Simon, 1979) implies some degree of ignorance or uncertainty but also limited control of technologies and socio-technical systems. Most complex, dynamic systems are particularly problematic in that there can never be complete knowledge. There will be unintended and only partly understood interactions and unanticipated consequences. Such complexity may lead to unexpected and hazardous behavior of the system, and may lead to situations in which key actors of the socio-technical system including operators, technical experts, and “managers” as well as “regulators” are unable to adequately “understand” (within the working model) the system and to effectively regulate or control its mis-performances and sources of hazards. This situation is potentially one of “danger” or even catastrophe.

 

Discussion and concluding remarks

The politics of science and technology and the scientification of politics and policymaking

Science and technology are increasingly intertwined with modern politics and policymaking.[18] There is an increased scientification of politics itself[19] at the same time that there is a growing politics to the question of applying new scientific and technical knowledge in technological innovation and development. The “politics of knowledge” concerns, among other things, the application of new scientific and technical knowledge in defining and articulating policies. Issues concern, for instance, whether or not such knowledge ought to be introduced and, if so, to what extent and in which ways, and by which social agents. Although regulative issues of this sort have been around for some time (e.g. pharmaceutical products, dangerous chemicals, nuclear substances, etc.), the scale and the contentious character of knowledge politics have increased considerably. The politicalization of technology and science is a result of the fact that the general public and political leaders have learned, and come to expect, that technology and science developments often have major, possibly negative, impacts on health, the social environment and the natural world. Historically this has been a problem, particularly in the course of industrialization. As Langdon Winner (1977) argues, major technological innovations are similar to legislative acts or political foundings that establish a framework for public choice and order that will endure over many generations. For that reason, the same careful attention one would give to the rules, roles, and relationships of politics must also be given to such things as the building of highway systems, the introduction of the New Genetics, or the development of information and communication technology (ICT). Today the developments are increasingly rapid, and the scale is global. Consider issues such as:

genetic testing and therapy. Many major developments in this area are highly contentious. What are the risks? Since there are many uncertainties (see earlier), how rapidly and extensively should one proceed, and in which areas should applications be allowed?

xenotransplantation. Xenotransplantation is the transplantation of organs and tissues from one species to another. There is, for instance, the risk of interspecies transmission of infectious agents via the xenograft, which has the potential to introduce unusual or new infectious agents into the wider human community. This is also a concern in connection with transgenic pigs (pigs engineered with human genes in order to reduce rejection by the patient’s immune system) and patients with compromised immunity (QJM Editorial, 2000).

genetically modified foods. Should the sale of such foods be allowed? If so, all such foods? If not all, what should be the criteria of selection? Who should determine the selections, and how?

cloning. To what extent should cloning be allowed? If permitted, who should be allowed to perform it, and under what conditions?

the world wide web. The web initially appeared to be a purely promising development but resulted in, among other things, the exploitation of its opportunities by pornographers, extremist political groups, pedophiles, etc. To what extent should the internet be regulated, by whom, and in what ways?

global warming. To what extent is it a genuine threat? If a threat, what are its causes and what can and should be done about it?

industrialized animal-food production. Increased outbreaks of infectious diseases are associated with animal herds (pigs, cattle, chickens). An important factor in these outbreaks is the increasing industrialization of animal-food production in confined spaces in many areas of the world, which has propelled the creation of large-scale animal farms keeping substantial numbers of, for instance, pigs or chickens. These conditions are commonly associated with a number of infectious outbreaks and diseases in the animal population, many of them a threat to human populations. Not surprisingly, this also explains the widespread use of antibiotics to avoid infections and to stimulate growth in these animal populations (increasing, however, the risk of antibiotic-resistant infections in humans) (QJM Editorial, 2000).

globalized food production. Today, an increased proportion of the fruits and vegetables consumed in highly developed countries is grown and processed in less technologically developed countries. The procedures to process food (e.g. pasteurization, cooking, canning) normally ensure safe products. However, these processing procedures may fail. With a global food supply, we encounter the risk that one defective product may contaminate a number of individuals spread in different countries. The existing nationally or regionally based health care infrastructures are not prepared to handle these problems. Earlier, people were infected by food and drink, locally produced and locally consumed.

creation of many large-scale, complex systems. We can model and understand only to a limited extent systems such as nuclear-power plants or global, industrial agriculture,[20] global money and financial systems, etc. As a result, there are likely to be many unexpected (and unintended) developments. What restructuring, if any, should be imposed on these developments? How? By whom?

Regulatory institutions are expected to assume responsibility for and to deal with these as well as a multitude of other developments. There is a sustained call for political action and regulation (as well as opposition to such control in the name of freedom or liberalism). This is the contemporary politics of science and technology development. At the same time, scientific and technical expertise plays a key role in providing policymakers with technical categories, descriptions, standards, assessments, etc. The scientification of politics and regulation is driven by many of the issues that become the stuff of contemporary political debate, conflict and action — expressed in political discourses that are generated or discovered in and through science and science-based knowledge production.[21] For instance, the issue of climatic change originated among natural scientists. A similar pattern is also observable in relation to the new genetic technologies — geneticists and physicians involved in applying and developing these technologies have raised a number of ethical, legal, and policy issues (Machado and Burns, 2001). At the same time, politicians depend on such technical and scientific expertise in defining problems and in analyzing the nature of the problem, what should and can be done, and how the consequences or impacts of potentially risky technologies — or developments arising from them — should be regulated.

As science and technology, industries, and other complex systems are developed, new “hazards” are produced which must be investigated, modeled, and controlled. At the same time, conceptions of risk, risk assessment, and risk deliberation evolve in democratic societies. These feed, in turn, into management and regulatory efforts to deal with (or prevent) hazards from occurring (or occurring all too frequently). One consequence of this is the development of “risk consciousness”, “risk public discourses”, and “risk management policies”. Such a situation calls forth public relations specialists, educational campaigns for the press and public, manipulation of the mass media, and the formation of advisory groups, ethics committees, and policy communities, which have become as important as research and its applications as well as regulatory measures. They provide, to a greater or lesser extent, some sense of certainty, normative order, and risk minimization.

The politicalization of science and technology development is characteristic of democratic society with an independent and robust mass media. Not only is expertise used to inform and legitimate collective decisions but science and technology innovations are increasingly questioned and challenged in democratic contexts. The “risk society” (Beck, 1992) as discourse and struggle is not characteristic of a dictatorship with censorship and suppression of truth and high levels of public ignorance. In any case, there are, of course, likely to be real and substantial risks to the social and physical environments in such societies as modernization forges ahead. The dictatorship to which the population is subjected generates a spectrum of risks that typically cannot be articulated and discussed publicly by those affected. At the same time, a semblance of social acceptance and order is maintained through coercion.

On the other hand, democratic societies run their own risks. In a free, open society with an independent, vigorous mass media, some technological developments and their negative impacts are identified, debated and opposed — quite rightly. Public controversies about scientific and technological developments follow from the general awareness and uncertainty with respect to the risks of technology and technological development. Such risk perceptions give rise to new kinds of social tensions and conflicts, and also imply a need for new forms of social knowledge and regulation.[22] The democratic politicalization of science and technology results in controversies that, in turn, lead to the mobilization of scientific and technical judgments in risk assessment and regulation (Bertilsson, 1993; Sundquist, 1991).

But democratic society also facilitates under some conditions the development of widespread skepticism about, and de-legitimation of, many modern systems: science, capitalism, corporate culture, markets and commercialism, and the varieties of technologies and socio-technical systems that may be brought under suspicion and opposed, ranging from nuclear power to high-power electrical lines, GMOs, nanotechnologies, biotechnologies, mobile telephones, and computer screens. At the same time, contemporary innovation processes are so rapid and so diverse that the assessment and regulation of major innovations lag behind their introduction and spread. Thus, there is a regulatory as well as, more generally, a “cultural” lag in relation to technological development.

Addressing the limits of knowledge and control

Since the 1960s there has been growing concern about the social and environmental impacts — as well as the rate of change — of technological innovation and development. Public perceptions and assessments have changed with respect to resource depletion, pollution, working conditions, and employment, as well as other areas of the social and physical environments. In a number of Western societies, green movements and green political parties have emerged, struggling for pollution control, protection of the environment, and changes in public policies as well as in social values and lifestyles. They refuse to accept unrestrained technological and socio-economic development. They attempt, usually with only partial success, to set forth new demands and constraints relating to the application and development of modern technologies and the global political economy. Consequently, there is, increasingly, a politics of and social conflict over technological change and development.

As a response to these movements, social learning about — and increased politicalization of — technological development are taking place. This is leading to greater recognition that:

(1)    Technological innovations and the development of socio-technical systems produce not only positive, intended effects but also negative, unintended consequences for the environment, for working conditions and employment, and for social life generally. Many of the impacts are unanticipated. As Camilleri (1976: 222) has argued:

Inventions and discoveries in such fields as medicine, communication and transport may have revolutionalized man’s relationship with the natural order but they have at the same time made him the victim of these new forms of power. What is in question is not the misuse of power, widespread though it is, but the disparity in power which enables a small minority of bureaucrats, planners, and engineers to establish their technocratic rule over millions of men, and one dominant age to achieve mastery over generations yet unborn. It is not that this new race of conditioners is composed of evil men but that they have undermined their own humanity and that of their subjects by their very decision to shape humanity. In this sense, both the conditioners and the conditioned have been reduced to artifacts. Far from subduing reality to the wishes of men, the technical process of conditioning risks producing “the abolition of man”.

(2)    The benefits as well as the costs or negative impacts of technological development may be experienced in different time frames. Immediate, obvious costs may appear quite small in comparison with the expected benefits. However, in the case of complex socio-technical systems, the process of learning about and assessing consequences may be a long and difficult undertaking. It is the unintended (and often unanticipated) consequences that frequently show up as costs, having failed to be considered from the outset. By the time they are recognized, the technology is well entrenched, with vested interests, an established social organization, and physical infrastructures; it appears unfeasible or far too costly to replace (the problem of “irreversibility”).

With advances in science, technology, and industry, many technology systems have become larger, more complex, and potentially much riskier, with the capacity to affect far greater numbers of people, ecological systems, and many of the things that people value and the landscapes they occupy and cherish (Rosa, McCright and Renn, 2001). Also, scientific and technological advances — especially those motivated by public concerns, social movements, and the failures of technology systems and their regulation — have led to increased capabilities to detect and assess undesirable side-effects or risks.

(3)    The benefits and “costs” of technologies and technological development are usually distributed unequally among groups and segments of society — as well as among generations, in the latter case leaving as a legacy, for instance, a polluted and depleted physical environment and shattered community structures.

(4)    Many individuals, groups, organizations, and social movements are increasingly alert to the possible negative impacts of modern technologies and technological developments (Andersen and Burns, 1992). This may be in response to distributional effects, to environmental damage, to the depletion of resources and pollution, to the loss of jobs or meaningful work, or to the declining quality of the work environment or everyday material and social life.

In the context of scientific and technological revolutions, new strategies and technologies of policy and regulation are required. Science and technology need to be harnessed to enable political leaders and parliamentary bodies to play a more prominent role in the development of institutional arrangements and policies to regulate the introduction and application of new socio-technical systems. Earlier we considered technology and risk assessment as potentially useful tools in making public decisions and regulating technologies. However, we concluded that while these tools are useful, there is no clear-cut institutional fix, no panacea, for dealing with the problems of regulating technologies and technological development. To a substantial degree, human agents are creating socio-technical systems that are difficult, if not impossible, to fully know and to fully control (Burns and Dietz, 1992b; Burns and others, 2003). Also, modern society is often confronted with experts who do not speak as a single authority or with a single voice.

Principles of epistemology, democracy, and policymaking in dealing with risk and risky systems

This article has stressed that many of the risk dimensions in modern society are discretionary. That is, they are dependent on human decisions: the design and operation of socio-technical constructions. In other words, collective decisions determine the initiation, and the particular features, of such developments as nuclear energy, advanced weapons, advances in chemicals and biotechnology, etc. In a certain sense, they are discretionary and “artificial” — including the quality of, and the strength of commitment to, the safety guarantees surrounding a given technology. Since these systems are based on collective decisions, most individuals cannot decide whether or not they want to take the risks — rather, the collective decisions are the sources of potential, unintended negative consequences and even unavoidable dangers. Thus, for many individuals, they resemble “natural catastrophes”. On the other hand, there are many risks in modern society with respect to which individuals can influence the degree to which they are exposed, by changing or adapting their behavior (smoking, food selection, living area, type of job, etc.).

Our analyses suggest several principles useful in orienting public discussion and policy:

(1) Principle of incompleteness — bounded knowledge and limited control

It is a truism of contemporary social science that all knowledge is socially constructed, but construction may take place in different ways and with varying consequences. The social and institutional context of knowledge production and application differs substantially in science, a religious community, the business world, or political settings.

We create social systems for our own purposes, whether socio-technical systems, research institutes, enterprises, administrative units, democratic bodies, or regulatory agencies. Some of our creations are dangerous, or very risky, constructions. At the same time, our knowledge of many of our creations is bounded, because the systems are too complex and dynamic to be fully modeled and understood. That is, the consequences of establishing and operating such constructions cannot be completely anticipated beforehand.

Our limited knowledge concerns not only the operation (and especially the interactions) of such systems but also their social, economic, political, and environmental impacts. Examples are abundant: socio-technical systems such as nuclear power plants, information and communication systems, and the New Genetics with its myriad potential applications; weapon systems such as nuclear and biological weapons; the nation-state as a system of modern governance and welfare; capitalism as a system of production with its complex of enterprises, markets, and global outreach; and “globalization” generally. They are all familiar to us, but they are not fully understood as operating systems even by experts; experts typically know only a part, in some cases only a small part, of the systems in question. Thus, our knowledge about many of our greatest creations and their consequences is inevitably bounded.[23]

In the face of potential dangers and risks, systematic attempts are directed at regulating these systems. Such efforts — even if politically feasible under some conditions — are rarely 100% successful. This is not only the result of a lack of sufficient resources, weaknesses in the regulatory machinery, or the impact of the “human factor” (see Part I). It is also often the result of an inability to mobilize the necessary knowledge and knowledge capacity. This is due, in large part, to the fragmentation of scientific and technical knowledge and to the knowledge gap between the sciences, on the one hand, and policymakers and the general public, on the other. Many recognize that something radical should be done in the face of increasingly hazardous and risky human constructions — not only the obvious technical creations but institutional ones such as those associated with global capitalism or its financial systems, or new conceptions of military intervention.

Bounded knowledge (Simon, 1979) implies that effective control of any complex system will be limited. In part, this is because models of the system are approximations of its actual behavior. Thus, the effects of any given control action may not be adequately predictable — linear, non-interactive models can adequately predict the behavior of complex, non-linear, interactive systems in only a very limited domain. Indeed, the accuracy of prediction within that domain is itself a source of danger and risk, since it leads to overconfidence in assessing the ability to control outside that domain, and to a lack of sufficient attention to deviations from the model and to external drivers.
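
The point can be illustrated with a minimal numerical sketch (ours, not the authors’; the non-linear “system”, the calibration range, and the test points are hypothetical): a linear model fitted over a narrow operating range predicts well inside that range and fails badly outside it.

```python
# Minimal sketch (hypothetical system): a linear model calibrated on a narrow
# operating range of a non-linear system predicts well there, but its error
# grows rapidly outside that range.
import numpy as np

def system_response(x):
    # stand-in for a non-linear "true" system behavior
    return np.tanh(x)

# calibrate a linear approximation on a narrow domain around x = 0
x_cal = np.linspace(-0.2, 0.2, 50)
slope, intercept = np.polyfit(x_cal, system_response(x_cal), 1)

for x in [0.1, 0.5, 1.0, 2.0, 3.0]:
    true_value = system_response(x)
    predicted = slope * x + intercept
    print(f"x = {x:3.1f}   true = {true_value:6.3f}   "
          f"linear model = {predicted:6.3f}   error = {abs(true_value - predicted):6.3f}")
```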

The limitations in knowledge and in our ability to control complex systems arise not only from human frailty but from the complex nature of the systems themselves. Models of the systems are simplified, abstract representations of the system and are of necessity incomplete. They provide very limited capability to deal with emergent processes and with unexpected changes in the environment. But the complex systems that are at the heart of modern applications of science and technology nearly always generate emergent processes, and their social and natural environments are always changing.

General knowledge about complex systems tends to be decontextualized. By decontextualized knowledge, we mean understandings of the system that are based on one perspective, typically an abstract one, that does not or cannot take adequate account of other views based on particular, local situations or “hands-on” principles. The manager or engineer designing the system may have limited understanding of the practical problems faced by the construction worker or the system operator in a given social and physical context. The senior manager may see a project as a source of profit or loss, while the worker on the line will see it as a job and/or as a source of professional accomplishment or as a context of solidarity or competition with fellow workers. Different actors approach the system with different perspectives and, in essence, are dealing with different socially constructed realities. Yet the system has both social and physical realities. Because of the decontextualization of knowledge, or more accurately, the lack of inter-contextual knowledge, physical and social realities are not fully understood anywhere in the system (this is, then, a consequence of “the tyranny of abstraction”). As a result, miscalculations, mistakes, and failures are virtually inevitable.

Designing socio-technical systems that take account of the real limits to our understanding and control is a formidable challenge, but it is one that allows applied natural science and engineering and the social sciences to inform one another, and to facilitate the growth of each. In our view, the integration of the theory of actor-system-dialectics with cybernetic and engineering theories dealing with the complexity and stability of technological systems is essential for a proper understanding of modern socio-technical systems and the associated problems of effective regulation of their risks. This should imply a further operationalization and specification of both social rules and the rules built into complex systems, including their technologies. It is certainly important to develop models in which questions about stability and complexity can be linked unambiguously to the character of the particular social rule regimes regulating the systems. This is an important theoretical task with high practical relevance.

(2) If “to err is human”, what are the implications for technology design?

Most modern education generates hubris. While many accomplishments are impressive, experience teaches us that our understanding of, and ability to control, complex technologies and socio-technical systems is limited (see earlier). These limitations come not only from human frailty but from the nature of the systems themselves. Models of the systems are simplified representations and are, therefore, inevitably incomplete. They provide limited capability to recognize and deal with emergent processes and with many potential changes in the environment. At the same time, complex socio-technical systems nearly inevitably generate emergent processes, and their social and natural environments are also always changing. Change is a constant.

All of this suggests that systems should be designed to be error-tolerant, to enhance the ability to learn from experience, including trial and error, and to enhance contextualized knowledge. When complex, tightly coupled, high-risk systems are necessary (or strongly motivated), the social context of those systems should be simple and consistent. The conflicts that result from mixed messages and incentives make error and failure likely (e.g., one is supposed to build and operate a fail-safe system but also to minimize costs and maximize profitability). The results can be catastrophic in some cases. Requiring that socio-technical systems take into account the real limits on our understanding and control is a formidable challenge to designers, responsible professionals, and political and administrative leaders.

Social science research on technical controversy provides considerable insight into the dynamics of such controversy and the strategies and tactics of interested parties but does not yet provide much useful guidance regarding technological choice or design. We hope that our discussion to this point has addressed the first problem. Here, in conclusion, we turn to the second problem, the implications of our framework for technological choice.

Perhaps the most important principle that follows from our analysis of technology is that of humility. There are many forces that push human analyses and control of complex socio-technical systems toward confusion, error, and catastrophe. The drivers that push toward accurate knowledge and effective control are real but are not as strong as is often presumed, for reasons pointed out earlier (see also Part I). Thus, we believe it is appropriate to be particularly humble in assessing what can and cannot be understood, and what can and cannot be accomplished through design and regulation.

We believe it is appropriate to be cautious in several regards. Forecasts of demand, technological performance or other key aspects of the future depend on models that in turn depend on what may be very limited or very de-contextualized experience and information. At a minimum, any use of such forecasts should include estimates of uncertainty based not only on standard statistical procedures and expert judgment, but also on the historical record of forecast accuracy across a variety of applications.

Humility about assumptions is warranted. Analyses of technology and socio-technical systems often ignore the encompassing societal system, or treat it only as a disturbing source of demand, obstruction, and reduced efficiency (see Part I). Yet the functioning and impact of the socio-technical system and the larger society constitute the environment for the technology. Ignoring them replaces systematic analysis with highly dubious assumptions; in many cases those assumptions are either naive or politically biased. Carried a bit further, increased humility about modeling, and even about conceptualizing whole systems in new ways, suggests that the critical problems in modeling and systems design are more political and ethical than technical. Efforts to develop approaches that allow for a more sophisticated understanding and elaboration of the political and ethical bases of technologies are crucial.

A final implication of humility comes in an understanding that the systems on which we depend, and that are so influential in all aspects of our lives, can never be perfectly designed from the outset, but must evolve, hopefully on the basis of greater knowledge and accumulated experience. Modern society should encourage structures that facilitate not only innovation but discussion and debate, learning, and the evolution of our innovations, rather than “locking in” systems to stubbornly defined choices that, while appropriate at one point in time by some criteria, may prove disastrous at other points in time and by other criteria. We need diversity in technology forms to allow selective forces to work, and active evaluations and impact analyses to guide those selective processes over which we have some discretionary control. Finally, we must develop new cultural orientations and rules reflecting bounded rationality and bounded control. As suggested above, even scientific communities and rational professions are capable of exaggerated self-confidence, self-deception, blindness, and commitment to illusions and half-truths.

(3) The development of risk consciousness and prudentiality in an open, democratic society

What characterizes modern society is not so much its high risks, natural as well as manufactured, but its risk discourses, its systems of risk assessment and management (in line with Weber’s principle of rationalization), and the politics of risk assessments and judgments. The discussions and deliberations of risk entail the encounter of different perspectives and value orientations — especially in the context of democratic norms and procedures.[24] Not surprisingly, there has emerged a politics of risk which engages proponents and opponents, the latter questioning and challenging proponents of, for instance, nuclear power, the new genetics, genetically modified foods, risk analysis itself, etc. Earlier, opponents such as the Luddites in the early 1800s were viewed as irrational and devoid of vision and knowledge. In the contemporary world, opponents learn to use science and other expert knowledge in their struggles and negotiations with proponents of new projects and systems.

The risk of models formulated by experts — for instance, embodying the value or values they consider important, or that they attribute to society as important — is that they ignore or leave out of consideration values that may be important to other groups or that may come to be defined as important later in time. One obtains greater simplicity, and there is more certainty, in that there are fewer dilemmas or conflicts to resolve. For instance, an agent may pursue wealth or power (or an abstract ideal) without concern about the strategies or means used, as in the case of agents with absolutist value orientations (this type of commitment is obviously a familiar one in our economic and political culture). John Hall (2000) has pointed out that the early Karl von Clausewitz (1780-1831), drawing on observations of the success of Napoleon, formulated the principle that the essence of state behavior is the pursuit of its ends without limit. After witnessing the collapse of Napoleon’s ambitions, he came to distrust the unlimited quest for power and proposed a bounded or prudential orientation, imposing constraints on the pursuit of ends and also on the construction of means, that is, constructing a more pluralistic and balanced value framework.

The perspective and analyses outlined in this paper suggest processes that will contribute to developing and normalizing prudentiality (a wider concept than precautionarity, a matter we shall take up in a later paper): (i) Particular attention should be given to the limits of models of technologies and socio-technical systems and to the inherent uncertainty and unpredictability of the systems we construct, manifested for example in the unanticipated consequences which are endemic to complex, dynamic systems (Burns and others, 2002); in other words, the hypothetical character of our knowledge, models, and beliefs must be emphasized. (ii) Stress should also be put on increasing public awareness of, and reflectivity about, the technology systems proposed or created — this means opening up and engaging people in governance processes. (iii) Also of great importance — especially in democratic societies — is to encourage and give value to multiple perspectives, a pluralist culture, collective or public discussions and deliberations, and institutional arrangements to generate, reflect on, and judge alternative proposals (Burns and Ueberhorst, 1988).

Of course, our own proposals entail risks of their own for elites and their advisors, namely the risk of disagreement, opposition, and loss of a “contest”. The fear of, and even lack of tolerance for, this risk, especially among elites, must be overcome through the reinforcement of norms and practices which are basically egalitarian and democratic. Miller (1996: 224) refers to this as a form of “anomie” — a normatively based pluralism which accepts free thought, disagreement, and uncertainty. But there would also have to be a community of agreed norms and procedures — which are understood and accepted even when people hold different philosophies, metaphysics, and world views (rather than a single coherent system). In our terms, Miller rejects a “community of beliefs” and advocates a “community of norms and procedures” which accepts free thought, disagreement, and uncertainty. Social integration and cohesion rest, then, on this diversity within the division of labor and on the profound sentiment — the essence of religion — of sociability and attachment to one another and to society. However, at the base, one still needs a community of belief or conviction in the ideal itself — which is one of the keystones of democratic culture (Burns, 1999).

Finally, we need to engage in imagining and designing new institutional arrangements, ones that minimize particular hazards and risks, and to identify or create arrangements that generate products and make use of production processes compatible with environmental protection, sustainability, etc. Those areas of development of hazardous technologies where innovativeness, experimentation, and exploitation are driven by “competitive systems” must be stringently constrained and regulated.

Social sciences and humanities should be encouraged to contribute to methodological and epistemological discussions which highlight the uncertainties and risks associated with technological and environmental developments; expert claims to infallibility and absolute neutrality need to be debunked; and emphasis needs to be put on continuous norm formation and the development of legal, ethical, and social (that is, multi-dimensional) regulation of complex hazardous systems (Kerr and Cunningham-Burley, 2000: 298).[25] Many of the basic questions addressed in this article are not purely technocratic or economic issues, but entail questions of institutional design and development. For such questions there are no obvious or true answers. One has to decide which collective agents should assume and exercise social responsibility and deal with major collective issues and problem situations. In the context of redefining the roles, rights, and obligations of participating actors, we emphasize the importance of establishing greater transparency and accountability for the far-reaching and diverse policy-making and “law-making” that goes on outside the normal corridors of parliament and central government. In our view, the clock cannot be turned back to simpler, more consistent arrangements for governance. Modern society is too complicated and far too dynamic to be overseen in any detail from a “center” (Andersen and Burns, 1992; Burns, 1999). At the same time, there has emerged a variety of highly flexible and adaptable forms of “self-governance” at all levels of modern society. The old forms of regulation (e.g., detailed legal regulation) are often less applicable and less effective, particularly in the numerous specialized, technically demanding sectors of modern society, which forge ahead of laws and administrative regulations. Ethics, as a form of regulation, becomes increasingly relevant and promising (Machado and Burns, 2001).

 

References

Andersen, S. S., and T. R. Burns (1992), Societal Decision-making. Democratic Challenges to State Technocracy, Aldershot, Hampshire, Dartmouth Publications.

Archer, M. S. (1995), Realist Social Theory. The Morphogenetic Approach, Cambridge, Cambridge University Press.

Baumgartner, T., T. R. Burns, and P. DeVille (1986), The Shaping of Socio-Economic Systems, London, Gordon and Breach.

Baumgartner, T., and T. R. Burns (1984), Transitions to Alternative Energy Systems. Entrepreneurs, Strategies, and Social Change, Boulder, CO, Westview Press.

Beck, U. (1992), Risk Society. Towards a New Modernity, London, Sage Publications.

Beck, U. (1997), The Reinvention of Politics. Rethinking Modernity in the Global Social Order, Cambridge, Polity Press.

Beck, U. (2000), What Is Globalization?, Cambridge, Polity Press.

Beck, U., A. Giddens, and S. Lash (eds.) (1994), Reflexive Modernization. Politics, Tradition and the Aesthetic in the Modern Social Order, Cambridge, Polity Press.

Bertilsson, M. (1990), “The role of science and knowledge in a risk society: comments and reflections on Beck”, Industrial Crisis Quarterly, 2, pp. 25-30.

Bertilsson, M. (1992), “Toward a social definition of risk: a challenge for sociology”, in F. Sejersted and I. Moser (eds.), Humanistic Perspectives on Technology, Development, and Environment, Oslo, Centre for Technology and Culture, University of Oslo.

Bertilsson, M. (1993), “Law, power, and risk: a subversive reflection”, in K. Tuori, Z. Bankowski and J. Uusitalo (eds.), Law and Power. Critical and Socio-Legal Essays, Liverpool, Deborah Charlies Publications.

Blowers, A. (1997), “Environmental policy: ecological modernisation or the risk society?”, Urban Studies, 34 (5-6), pp. 845-871.

Buckley, W. (1967), Sociology and Modern Systems Theory, Englewood Cliffs, NJ, Prentice-Hall.

Burns, T. R. (1999), “The evolution of parliaments and societies in Europe: challenges and prospects”, European Journal of Social Theory, 2 (2), pp. 167-194.

Burns, T. R. (2006a), “Dynamic systems theory”, in Clifton D. Bryant and D. L. Peck (eds.), The Handbook of 21st Century Sociology, Thousand Oaks, CA, Sage Publications.

Burns, T. R. (2006b), “The sociology of complex systems: an overview of actor-systems-dynamics”, World Futures, The Journal of General Evolution, 62, pp. 411-460.

Burns, T. R. (2008), “Rule system theory: an overview”, in Helena Flam and Marcus Carson (eds.), Rule Systems Theory. Applications and Explorations, Berlin, Peter Lang Publishers.

Burns, T. R., T. Baumgartner, and P. DeVille (1985), Man, Decision and Society, London, Gordon and Breach.

Burns, T. R., T. Baumgartner, T. Dietz, and N. Machado (2002), “The theory of actor-system dynamics: human agency, rule systems, and cultural evolution”, in Encyclopedia of Life Support Systems, Paris, UNESCO.

Burns, T. R., and P. DeVille (2003), “The three faces of the coin: a socio-economic approach to the institution of money”, European Journal of Economic and Social Systems, 16 (2), pp. 149-195.

Burns, T. R., and Tom Dietz (1992a), “Cultural evolution: social rule systems, selection, and human agency”, International Sociology, 7, pp. 259-283.

Burns, T. R., and Tom Dietz (1992b), “Technology, sociotechnical systems, technological development: an evolutionary perspective”, in M. Dierkes and U. Hoffman (eds.), New Technology at the Outset. Social Forces in the Shaping of Technological Innovations, Frankfurt am Main, Campus Verlag.

Burns, T. R., and H. Flam (1987), The Shaping of Social Organization. Social Rule System Theory and Its Applications, London, Sage Publications.

Burns, T. R., C. Jaeger, M. Kamali, A. Liberatore, Y. Meny, and P. Nanz (2000), “The future of parliamentary democracy: transition and challenge in European governance”. Green Paper prepared for the Association of European Union Speakers of Parliament. Available at: http://www.camera.it/_cppueg/ing/conferenza_odg_Conclusioni_gruppoesperti.asp

Burns, T. R., and E. Roszkowska (2007), “Multi-value decision-making and games: the perspective of generalized game theory on social and psychological complexity, contradiction, and equilibrium”, in Y. Shi (ed.), Advances in Multiple Criteria Decision Making and Human Systems Management, Amsterdam, IOS Press, pp. 75-107.

Burns, T. R., and E. Roszkowska (2008), “The social theory of choice: from Simon and Kahneman-Tversky to GGT modelling of socially contextualized decision situations”, Optimum-Studia Ekonomiczne, 3 (39), pp. 3-44.

Burns, T. R., and E. Roszkowska (in process), “The GGT social theory of complex risk judgments: modeling socially contextualized decision-making under conditions of multiple values and differing qualities of uncertainty”, Optimum-Studia Ekonomiczne.

Burns, T. R., and R. Ueberhorst (1988), Creative Democracy, New York, Praeger.

Camilleri, J. A. (1976), Civilization in Crisis. Human Prospects in a Changing World, Cambridge, Cambridge University Press.

Chapman, A. (2007), Democratizing Technology. Risk, Responsibility and Regulation of Chemicals, London, Earthscan.

Dietz, T., and R. W. Rycroft (1989), The Risk Professionals, New York, Russell Sage Foundation.

Dietz, T., R. S. Frey and E. A. Rosa (1993), “Risk, technology, and society”, in R. E. Dunlap and W. Michelson (eds.), Handbook of Environmental Sociology, Westport, CT, Greenwood Press.

Douglas, M., and A. Wildavsky (1992), Risk and Culture. An Essay on the Selection of Technological and Environmental Dangers, London, Routledge.

Fox, N. (1998), “’Risks’, ‘hazards’ and life choices: reflections on health at work”, Sociology, 32, pp. 665-687.

Geyer, F., and Johannes van der Zouwen (1978), Sociocybernetics. An Actor-Oriented Social Systems Approach, Leiden, Martinus Nijhoff.

Giddens, A. (1991), Modernity and Self-Identity. Self and Society in the Late Modern Age, Cambridge, Polity Press.

Hall, J. (2000), “A theory of war and peace”, presentation at the Swedish Collegium for Advanced Study in the Social Sciences, Uppsala, Sweden, February 3, 2000.

Hill, C. (1997), Technology Assessment. A Retrospective and Prospects for the Post-OTA World (manuscript).

Irwin, A., P. Simmons, and G. Walker (1999), “Faulty environments and risk reasoning: the local understanding of industrial hazards”, Environment and Planning, 31, pp. 311-326.

Jaeger, C., O. Renn, E. A. Rosa, and T. Webler (2001), Risk, Uncertainty, and Rational Action, London, Earthscan.

Johnstone-Bryden, I. M. (1995), Managing Risk, Aldershot, Avebury.

Kasperson, R. E., and J. X. Kasperson (1996), “The social amplification and attenuation of risk”, The Annals of the American Academy of Political and Social Science, 545, pp. 95-105.

Kerr, A., and S. Cunningham-Burley (2000), “On ambivalence and risk: reflexive modernity and the new human genetics”, Sociology, 34, pp. 283-304.

La Porte, T. R. (1978), “Nuclear wastes: increasing scale and sociopolitical impacts”, Science, 191, pp. 22-29.

La Porte, T. R. (1984), “Technology as social organization”, IGS Studies in Public Organization, Working Paper No. 84-1, Berkeley, CA, Institute of Government Studies.

La Porte, T. R., and P. M. Consolini (1991), “Working in practice but not in theory: theoretical challenges of ‘high reliability organizations’”, paper presented at the Annual Meeting of the American Political Science Association, September 1-4, 1988.

Le Guenno, B. (1995), “Emerging viruses”, Scientific American, October, pp. 30-36.

Lidskog, R. (1994), Radioactive and Hazardous Waste Management in Sweden. Movements, Politics and Science, Ph.D. dissertation, Uppsala University, Stockholm, Almqvist & Wiksell International.

Machado, N. (1990), “Risk and risk assessments in organizations: the case of an organ transplantation system”, paper presented at the XII World Congress of Sociology, July 1990, Madrid, Spain.

Machado, N. (1998), Using the Bodies of the Dead. Legal, Ethical and Organizational Dimensions of Organ Transplantation, Aldershot, Ashgate Publishers.

Machado, N. (2005), “Discretionary death: cognitive and normative problems resulting from advances in life-support technologies”, Death Studies, 29 (9), pp. 791-809.

Machado, N. (2009), “Discretionary death”, in C. Bryant and D. L. Peck (eds.), Encyclopedia of Death and the Human Experience, London/Beverley Hills, Sage Publications.

Machado, N., and T. R. Burns (2001), “The new genetics: a social science and humanities research agenda”, Canadian Journal of Sociology, 25 (4), pp. 495-506.

McCarthy, Michael (2001), “FDA wants more disclosure of gene therapy and xenotransplantation risks”, The Lancet, 357 (9252), p. 292.

Miller, W. W. (1996), Durkheim, Morals, and Modernity, London, UCL Press.

Nowotny, H. (1973), “On the feasibility of a cognitive approach to the study of science”, Zeitschrift für Soziologie, 2, pp. 282-296.

Osterholm, M. (ed.) (2000), “Emerging infections: another warning”, New England Journal of Medicine, 342 (17), p. 1280.

Parsons, Talcott (1951), The Social System, Glencoe, IL, The Free Press.

Pellizoni, Luigi (1999), “Reflexive modernisation and beyond: knowledge and value in the politics of environment and technology”, Theory, Culture and Society, 16 (4), pp. 99-125.

Perrow, C. (1994), “The limits of safety: the enhancement of a theory of accidents”, Journal of Contingencies and Crisis Management, 2 (4), pp. 212-220.

Perrow, C. (1999 [1984]), Normal Accidents. Living with High-Risk Technologies, 2nd ed., Princeton, NJ, Princeton University Press (1st ed., New York, Basic Books).

Perrow, C. (2004), “A personal note on normal accidents”, Organization & Environment, 17 (1), pp. 9-14.

QJM Editorial (2000), “Xenotransplantation: postponed by a millennium?”, QJM, Monthly Journal of the Association of Physicians, 93, pp. 63-66.

Rosa, E. A., A. M. McCright, and W. Renn (2001), “The risk society: theoretical frames and state management challenges”, paper presented at the American Sociological Association Annual Meeting, August, 2001, Anaheim, California.

Rosenberg, N. (1982), Inside the Black Box. Technology and Economics, Cambridge, Cambridge University Press.

Sciulli, D., and D. Gerstein (1985), “Social theory and Talcott Parsons in the 1980s”, Annual Review of Sociology, 11, pp. 369-387.

Simon, H. A. (1979), Models of Thought, New Haven, Yale University Press.

Strydom, P. (1999), “Hermeneutic culturalism and its double: a key problem in the reflexive modernization debate”, European Journal of Social Theory, 2, pp. 45-69.

Strydom, P. (2002), Risk, Environment, and Society, Buckingham, Open University Press.

Sundquist, G. (1991), Vetenskapen och miljöproblemen (“Science and Environmental Problems”), Ph.D. dissertation, University of Gothenburg, Gothenburg, Sweden.

The British Medical Association (1987), Living with Risk. The British Medical Association Guide, London, Wiley Medical Publications.

Turner, B. L., R. E. Kasperson, P. A. Matson, J. J. McCarthy, R. W. Corell, L. Christensen, N. Eckley, J. X. Kasperson, A. Luers, M. L. Martello, C. Polsky, A. Pulsipher, and A. Schiller (2003), “A framework for vulnerability analysis in sustainability science”, PNAS, 100 (14), pp. 8074-8079.

Tversky, A., and D. Kahneman (1981), “The framing of decisions and the psychology of choice”, Science, 211, pp. 453-458.

Vaughan, D. (1999), “The dark side of organizations: mistake, misconduct, and disaster”, Annual Review of Sociology, 25, pp. 271-305.

Wallerstein, I. (2004), World-Systems Analysis. An Introduction, Durham, NC, Duke University Press.

Wilkinson, I. (2001), “Social theories of risk perception: at once indispensable and insufficient”, Current Sociology, 49, p. 1ff.

Winner, L. (1977), Autonomous Technology, Cambridge, MA, The MIT Press.

Woodward, W. E., J. Ellig, and T. R. Burns (1994), Municipal Entrepreneurship and Energy Policy. A Five Nation Study of Politics, Innovation, and Social Change, New York, Gordon and Breach.

Yearley, S. (2002), “The social construction of environmental problems: a theoretical review and some not-very-Herculean labors”, in R. E. Dunlap, F. H. Buttel, P. Dickens and A. Gijswijt (eds.), Sociological Theory and the Environment. Classical Foundations, Contemporary Insights, Lanham, Boulder, New York and Oxford, Rowman & Littlefield, pp. 274-285.

 

Notas

[1] This is the second part of a two-part article (Part I appeared in Sociologia, Problemas e Práticas, no. 61, 2009). A version of the article was presented at the First ISA Forum on Sociological Research and Public Debate, “The Sociology of Risk and Uncertainty” (TG4), Barcelona, Spain, September 5-8, 2008. The paper was prepared and finalized while Burns was a Visiting Scholar at Stanford University (2007-2009). The work draws on an initial paper of the authors presented at the workshop on “Risk Management”, jointly sponsored by the European Science Foundation (Standing Committee for the Humanities) and the Italian Institute for Philosophical Studies, Naples, Italy, October 5-7, 2000. It was also presented at the European University Institute, Florence, Spring 2003. We are grateful to Joe Berger, Johannes Brinkmann, Mark Jacobs, Giandomenico Majone, Rui Pena Pires, Claudio Radaelli, and Jens Zinn, and to participants in the meetings in Naples, Florence, and Barcelona, for their comments and suggestions.

[2] As we emphasize later, some of the risks — and degrees of risk — of many new technologies and technical systems cannot be known in advance. The introduction and operation of these systems is an “experiment”. One learns or discovers as one goes along. In some cases, sophisticated methods and tools of analysis are required to identify risks. For instance, the use of the birth-control pill was found to entail an increased risk of blood clots among individual users. But the increase was so small that only massive use of the pill by millions of persons revealed this factor: 30 per million dying of blood clots among users of the pill versus 5 per million among those not using the pill.
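
A back-of-envelope calculation (ours, not taken from the original studies; the significance level and statistical power are assumed) indicates why only massive use could reveal such a small absolute difference:

```python
# Rough sketch (our assumptions): approximate number of persons per group
# needed to detect an increase from 5 to 30 blood-clot cases per million,
# using a standard two-proportion sample-size approximation
# (5% two-sided significance, 80% power).
from math import ceil
from statistics import NormalDist

p_pill, p_no_pill = 30e-6, 5e-6                 # assumed incidence rates
z_alpha = NormalDist().inv_cdf(1 - 0.05 / 2)    # ~1.96
z_beta = NormalDist().inv_cdf(0.80)             # ~0.84

variance_sum = p_pill * (1 - p_pill) + p_no_pill * (1 - p_no_pill)
n_per_group = ceil((z_alpha + z_beta) ** 2 * variance_sum
                   / (p_pill - p_no_pill) ** 2)
print(n_per_group)  # on the order of 440,000 persons per group
```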

[3] Beck’s related proposition that reflexive modernity leads principally to individualization is simply empirically false. The modern world is, rather, dominated by collective agents, organizational citizens, and major socio-political processes involving organizations (Burns, 1999; Burns and others, 2003).

[4] Beck (2000: 95) sees the confrontation with risk as contributing to norm formation and to community building and integration. He suggests, for instance, that the states around the North Sea may come to regard themselves as subject to common risks, and therefore as a “risk community”, in the face of continuing threats to water, humans, animals, tourism, business, capital, political confidence, etc., thereby establishing and gaining acceptance of shared definitions, assessments, and measures for dealing with these threats. Threats create a shared cognitive-normative space — a space for values, norms, responsibilities and strategies — that transcends national boundaries and divisions.

[5] ASD served for many years as an acronym for actor-system-dynamics. However, this labeling failed to convey the profound interdependence of actors, as socially defined entities, and systems. Nor did it sufficiently convey the mutually transformative character of the actor-system complex. Actor-system-dialectics better captures these second-order dynamics.

[6] Elsewhere (Burns, 2006a; 2006b), one of us has identified and compared several system theories emerging in sociology and the social sciences after the Second World War: Parsonian functionalism (1951), some variants of Marxist theory and World Systems Theory (Wallerstein, 2004), and the family of actor-oriented, transformative systems theories (ASD, the work of Buckley, 1967, and Archer, 1995, as well as Geyer and van der Zouwen, 1978).

[7] The formulation of ASD in such terms was particularly important in light of the fact that system theories in the social sciences, particularly in sociology, were heavily criticized for the excessive abstractness of their theoretical formulations, for their failure to recognize or adequately conceptualize conflict in social life, and for persistent tendencies to overlook the non-optimal, even destructive characteristics of some social systems. Also, many system theorists were taken to task for failing to recognize human agency, the fact that individuals and collectives are purposive beings, have intentions, make choices, and participate in the construction (and destruction) of social systems. The individual, the historic personality, as exemplified by Joseph Schumpeter’s entrepreneur or by Max Weber’s charismatic leader, enjoys a freedom — always a bounded freedom — to act within and upon social systems, and in this sense enjoys a certain autonomy from them. The results are often changed institutional and material conditions — the making of history — but not always in the ways the agents have intended or decided.

[8] Miller’s (1996) analysis of Durkheim and morality emphasizes the sociological perspective on moral and metaphysical risk, uncertainty about ideals that govern action and institutional arrangements (justice, democracy, welfare, socialism, etc.). Ideals may demand action involving not just risk but the virtual certainty of some sort of personal sacrifice, including life itself (as in the collective declaration of war). Or, societal risks can threaten the individual as with suicidogenic currents or currents of social pressure and conformity that sweep people off to their death, or to genocidal or other immoral actions — the risk of “killing oneself” or “killing others” for an ideal. Thus, actors may be caught up in intense, emotional convictions about an “ideal”, where a social cause (whatever it may be) is an essential expression or realization of it — whether socialism, racial equality, environmental protection, women’s liberation, or a radical form of participatory democracy.

[9] The concept focuses our attention on the discretionary powers of modern societies and their elites. It also suggests the discussions, deliberations, and judgments that go into determining which risks to take, how much risk to take, and which institutional arrangements and policies should deal with (or neglect) particular risks. In a hierarchical social system, the dominant actor(s) calculate from their own perspective(s) and impose an order. In a more open, egalitarian system, actors with different value orientations and risk judgments contend with one another, debate, and negotiate, that is, they produce a “negotiated order”, but one which in any case involves discretionary choices and different types of risks.

[10] Advanced societies are characterized by a “contradiction” between the forces of technological development (based on science and engineering) and the potentialities of existing institutional arrangements to provide for effective learning, knowledge production and regulation. The growing awareness and concern about this contradiction in advanced, democratic societies has resulted in questioning the authority, legitimacy, and level of risk associated with contemporary technological development. This politicalization challenges, and even threatens, the entire enterprise.

[11] Risk was once (before 1800) a neutral term, referring to probabilities of losses and gains. A gamble which was associated with high risk meant simply that there was great potential for significant loss or significant reward (Fox, 1998).

[12] Some of the many qualitatively and quantitatively different definitions of risk, which vary depending on specific situational contexts and applications, are the following (Chapman, 2007):

— risk = an unwanted event which may or may not occur;

— risk = the cause of an unwanted event which may or may not occur;

— risk = the probability of an unwanted event which may or may not occur;

— risk = the statistical expectation value of unwanted events which may or may not occur;

— risk = the fact that a decision is made under conditions of known probabilities (“decision under risk”?).

She suggests that all involve the idea of an unwanted event and/or that of probability. An unwanted event is a happening, an outcome of a situation perhaps, not a property of a thing. One problem for risk assessment then is the establishment of a causal connection between a technology in question and an event of a specified type that is regarded as constituting harm (Chapman, 2007: 82). Chapman quotes Hansson (2004) “…in non-technical contexts, the word 'risk' refers, often rather vaguely, to situations in which it is possible but not certain that some undesirable event will occur”.

People would call such situations risky. “I suggest that the riskiness of a situation is a measure of the possibility of harm occurring in that situation. The greater the magnitude of the possible harm, or the more possible it is (here the degree of probability comes into play), the more risky the situation. Riskiness differs from risk because it applies directly to a situation, rather than to an outcome or an event that results from the situation, and because it is primarily a matter of possibility rather than probability.” (Chapman, 2007: 84-85)

The idea of focusing on possibility gives greater weight to small probabilities, as Prospect Theory suggests that people do when making decisions (Tversky and Kahneman, 1981; Chapman, 2007: 86).
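
As a rough illustration of this point (the functional form and the parameter value below follow the prospect-theory literature and are our assumptions, not Chapman’s), a probability weighting function of the following kind gives small probabilities, mere “possibilities”, more weight than their numerical values alone would suggest:

```python
# Illustrative sketch (our assumptions, not Chapman's): a prospect-theory-style
# probability weighting function that over-weights small probabilities.
# The functional form and gamma = 0.61 follow the prospect-theory literature.

def decision_weight(p, gamma=0.61):
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in [0.001, 0.01, 0.1, 0.5, 0.9]:
    print(f"p = {p:5.3f}   decision weight = {decision_weight(p):5.3f}")
# rare harms (p = 0.001, 0.01) receive decision weights well above p itself,
# while large probabilities are under-weighted
```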

[13] The societal context of risk conceptualization and analysis is typically ignored. Risk, in Burns and Roszkowska’s generalized game theory (GGT) (2007, 2008, 2009), is a socially context-dependent composite judgment about the likelihood of damage (or loss) and the (negative) value of that loss. Risk judgment can be expressed abstractly as a composite function:

Risk Judgment [ f(v), g(l) ] = f(v) ⊗ g(l)

where:

l – denotes a hazard, potential loss, etc.;

v – denotes the impact or perception of the hazard or potential loss;

f(v) – socially based value judgment(s) relating to hazards and potential losses l;

g(l) – socially based judgment(s) about the likelihoods or probability estimates relating to hazards or losses l;

⊗ – the algorithm which relates hazard value judgments and likelihood judgments to one another.

A variety of empirically meaningful algorithms are used to relate hazard assessments and likelihood estimates (see Burns and Roszkowska, 2008; 2009). For instance, the most common model (see page 108) involves a combinatorial algorithm which simply “multiplies” a cost/impact measure by a likelihood (in some cases a probability) to obtain an expected loss. Some GGT models are formulated with a matrix encompassing multiple judgment values. Expected net benefit (benefits, losses, or costs) denotes judgments relative to the actor’s salient values and the likelihoods of potential gains and losses in connection with future actions or developments. Risk then denotes the likelihood of potential negative impacts of an action or event, in terms of some characteristic value associated with the action or future event. There are, however, other algorithms that are more complex and take into account the fact that valuation and likelihood estimates may not be integrable in such terms.
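
To make the simplest of these algorithms concrete, the following is a minimal sketch under our own assumptions; the function name, the multiplicative instantiation of ⊗, and the illustrative numbers are ours, not taken from Burns and Roszkowska:

```python
# Minimal sketch (our illustration): the common combinatorial algorithm in
# which the operator ⊗ is instantiated as multiplication, so that a risk
# judgment becomes an expected loss. Names and numbers are hypothetical.

def risk_judgment(value_judgment, likelihood_judgment):
    """Compose f(v) and g(l); here ⊗ is simple multiplication (expected loss)."""
    return value_judgment * likelihood_judgment

# hypothetical socially based judgments for two courses of action
f_v = {"continue operation": -1_000_000, "shut down plant": -50_000}  # valued losses
g_l = {"continue operation": 0.001, "shut down plant": 0.9}           # likelihoods

for action in f_v:
    print(action, risk_judgment(f_v[action], g_l[action]))
# more complex GGT algorithms would not reduce to a single product like this
```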

[14] Discourses are structured in terms of a few categories: (1) What are the problems, and what are their causes? Here we find causal narratives. (2) Who are the knowledgeable authorities? That is, who has the legitimacy to define a particular problem and possible solutions? (3) Who has problem-solving responsibility and authority? That is, which authority has the formal or informal responsibility for addressing and/or resolving the issue or problems? This is related to expertise, but is also grounded in social roles and norms for determining who should be empowered to pass judgment about problem-solving strategies or initiate necessary action on behalf of the community or polity.

[15] Bertilsson (1992) traces the increased interest in measuring how humans themselves perceive risks to the increasing difficulty of locating the proper sources of risks and dangers. It is obvious that objective risks (as calculated by technical/scientific experts) do not necessarily correspond to how people themselves perceive risks (see also Dietz, Frey and Rosa, 1993). Although, for instance, it is objectively safer to fly than to travel by car, most of us would probably perceive the risks differently (Bertilsson, 1992: 9).

[16] She states that the strength of Beck’s Risk Society (1992) is that it combines these points of view, moving simultaneously on the levels of social structure and social action, while also noting the ambivalence of their interrelationships (Bertilsson, 1992: 10; 1993: 5).

[17] A well-known institutional innovation in carrying out technological assessment for political leaders and the general public was the Office of Technology Assessment (OTA), designed to serve especially the U.S. Congress. OTA was established in the early 1970s and continued until the mid-1990s. It served Congress with more than 700 reports on important scientific issues. OTA testified, participated in press conferences with committee officials, gave briefings, held workshops, and conducted many other activities in support of congressional decision-making. OTA also served the U.S. public as a whole. Its studies were widely distributed and quoted by other analysts, by the professional and general press, by executive agencies, by interest groups, by individual companies, by consulting and management firms, and by individual citizens. Its reports provided authoritative foundations for academic research and teaching in such fields as engineering, public policy, environmental management, and international relations. Foreign countries used OTA reports to understand the USA better, as well as to provide a foundation for the decisions they had to make (the above is based on Hill, 1997). OTA functioned to provide highly technical information and assessments with a minimum of bias. (One of those with experience at OTA, Christopher T. Hill, points out that it operated with a minimum of policy bias because members of Congress would immediately react to such bias.) It was also effective in gaining access to diverse sources of information and perspectives because it could claim that it was “calling in the name of Congress”. One of the major limitations was that, while it produced results that were of broad national interest, they were only indirectly of immediate interest to Congress in helping it make decisions. Another drawback was that OTA was committed to a form of technology assessment which tended to treat technologies as well-defined systems. In many cases, technologies or technology systems are not well-defined or fixed but highly dynamic and evolving. This is the case with the current development of information technologies, the new genetics or nano-technologies, matters to which we shall return later.

[18] Science and technology may be distinguished in the following terms (Burns and Flam, 1987). Science is an institutional arrangement designed to produce certain types of empirical and theoretical knowledge, using particular methods, logics, etc. Technology is a set of physical artifacts and the rules employed by social actors to use those artifacts (see Part I). Thus, technology has both a material and a cultural aspect. These rules are part of the “technology”; they are the “instruction set” for the technology, the rules that guide its operation. These rules can be analytically distinguished from the cultural and institutional arrangements of the larger socio-technical system in which the technology is embedded. A socio-technical system includes rules specifying the purposes of the technology, its appropriate applications, the appropriate or legitimate owners or operators, how the results of applying the technology will be distributed, and so on. The distinction between the specific instruction set and the rules of the broader socio-technical system with its social relationships is not rigid, but it is useful for analytical purposes. The production, use, management, and regulation of technologies are socially organized: for example, a factory, a nuclear power plant, an electricity system, a transport system, the intensive care unit of a hospital, an organ transplantation system, or a telecommunication network. Such socio-technical systems consist, on the one hand, of complex technical and physical structures that are designed to produce or transform certain things (or to enable such production) and, on the other hand, of social institutions, legal orders, and organizing principles designed to structure and regulate the activities of those engaged in operating the technology. The knowledge of these different structures may be dispersed among different occupations and professions. Thus, a variety of groups, social networks, and organizations may be involved in the construction, operation, and maintenance of socio-technical systems. For any technology, a model and judgment system, even if only an elementary one, of the technology and its interaction with the physical, biological, and socio-cultural environment is essential for operation, management, and regulation. The scientific and technical knowledge incorporated into the model with respect to physical and biological dimensions and relationships is often developed to a greater or lesser extent. The model of the interaction of the technology with human beings, and of its impact on the larger society, is often left partially implicit and is rarely as consciously conceptualized or as carefully articulated (see Part I) as the elements of the model describing interaction with the physical and biological environments (but even here there is no complete knowledge).

[19] This is stressed in a personal note from Nico Stehr. This paragraph articulates part of his argument.

[20] We see here, in connection with technological developments, the differences between exogenous dangers and risks as opposed to endogenous dangers and risks.

[21] In this sense the scientification of political action connects with the question of knowledge politics (and policy).

[22] Beck suggests that class awareness is fading as one becomes more preoccupied with technological and environmental risks common to all. But, empirically, this is not the case, at least not in Europe. Social exclusion is still a major issue. While the European environmental movement has put the risk issues on the political agenda, this is not the only or even the primary issue. Economic growth, employment, welfare, social inclusion remain major issues. And, in some cases, they are linked. Often, weaker groups are more subject to environmental and technical risks than well-off groups. There are new questions of fair distribution and justice, differing from those considered earlier.

[23] One may recall the assessments of nuclear power as almost totally safe — with an extremely small, almost negligible, “probability” of a nuclear accident. Experience taught us otherwise. Because of the division of labor in science as well as in all technical fields, systematic knowledge is fragmented. So are the respective communities of knowledge producers. Their languages and cognitive frames (with technical concepts, models, particular methods, etc.) are deeply divided. Our capacities are severely constrained in the mobilization and development of integrated knowledge to better understand and manage complex systems. Ironically, the human species has never known so much about nature, human history, and social, political and economic systems. Nevertheless, we are unable to fully mobilize and integrate this knowledge in effective ways for understanding and dealing with many of the problems arising in connection with our greatest accomplishments. There is no lack of “information”. But knowledge requires a model to select what is relevant or important, and what is not, and ways to link and organize different pieces of information. Models are particular human constructions, which filter, organize, transform, and help interpret “information”. Furthermore, there is a major knowledge gap between scientific communities and the general public and their political leaders. This results in tensions, misunderstandings, and distortions in the interactions between scientific communities and policymakers, for instance in the process of applying expert knowledge to policy and regulative problems. (One may observe major contemporary efforts to overcome the gap, through the use of scientific advisors, offices of technology assessment serving executive and legislative bodies, science shops, focus groups, etc.). Knowledge fragmentation and knowledge gaps would simply be regrettable, a mere failing of modern universities and the communities of knowledge professionals, if there were not great dangers and risks connected with many of the systems we construct. Some of the dangers are obvious, as in the case of nuclear weapons (or nuclear energy), or the availability and utilization of dangerous chemicals or biological materials. Still, for many or most people, some of these dangers are not so apparent. They are identified and characterized by experts, as in the case of ozone depletion and global warming. Others may not be apparent at all: for example, the modern nation-state (closely associated with many welfare developments but also a major factor in systematic violation of human rights, population cleansing, and genocide); or modern complex money systems; or globally expanding capitalism with its risks of economic, social, and political destabilization. Scientific and technical communities play a substantial role in conceptualizing and providing data and knowledge about such systems — while, unfortunately, the social sciences and humanities remain highly limited in this respect.

[24] Earlier, elites and dictators could conceptualize and judge matters in their own terms without taking into account the perspectives, concerns or sufferings of dominated or marginal groups.

[25] One might also consider, as an important principle of policy, support for social science and humanities research on technological impacts. Here we have in mind taxing major (e.g., apparently revolutionary) developments for the purposes of systematic investigation and assessment. More specifically, major developments, such as information technology or the new genetics, should be “taxed” (for example, a percentage of R&D investments initially and eventually a percentage of capital investments) in order to support research on the social impact of these developments. In a certain sense, this is already being done in the case of the Genome program with E.L.S.I. (Ethical, Legal, and Social Implications), which is funded on the basis of a certain percentage of the Genome program: 2-5% of research budgets are devoted to consideration of social, legal, and ethical issues associated with genetics research and applications.
