There is a saying that goes something like *"To get along, one needs to go 
along,"* and I believe this old saying encapsulates the core issue of the 
'Cultural Theory of Risk'. The general issue loosely known as 'the moral 
hazard' is not an overly complicated scenario, and core guidance for 
understanding it may be found at the definitional level of metaethics 
(http://moralphilosophy.info/metaethics/).

I've tried to summarize the general issues involved in the following 
statement (from a working draft on marine biomass GE: 
https://docs.google.com/document/d/1m9VXozADC0IIE6mYx5NsnJLrUvF_fWJN_GyigCzDLn0/pub).

"Mapping out the moral paradox:

The primary opposing views of metaethics revolve around the issue of one’s 
perspective. To quote http://moralphilosophy.info/metaethics/:

“Perhaps the biggest controversy in metaethics is that which divides moral 
realists and antirealists.

Moral realists hold that moral facts are objective facts that are out there 
in the world. Things are good or bad independent of us, and then we come 
along and discover morality.

Antirealists hold that moral facts are not out there in the world until we 
put them there, that the facts about morality are determined by facts about 
us. On this view, morality is not something that we discover so much as 
something that we invent.”

In the context of Global Warming Mitigation (GWM), the highly complex 
matrix of socioeconomic, political, and environmental realities 
encompasses valid moral views of both the 'realist' and 'antirealist' 
kinds. This creates a co-realistic moral paradox.

Solving the moral paradox:

Solving a paradox requires identifying the point of fallacy in the paradox 
and then avoiding that point. The premise that fossil fuels (FFs) are 
currently irreplaceable at the global scale is the fallacy that needs 
avoiding: FFs are the core cause of global warming (GW), and FFs can be 
replaced with current technology.

The overall issue of large-scale mitigation of global warming offers up a 
blinding array of relative rights and wrongs, which can perhaps be reduced 
to one core question and a simply stated strategy.

Is the continued use of FFs, on a global scale, scientifically, morally, or 
ethically supportable? If not, ending the FF era should be the prime 
objective. Any large-scale mitigation strategy that can support the 
primary objective of replacing FFs should be given priority.

Until transformative improvements in energy storage and/or distribution 
occur, production of vast amounts of carbon-negative, renewable, low-cost, 
portable biofuels is needed to supplant FF use.

Under a global carbon-negative fuel scenario, *the failure to increase fuel 
production and use would be considered unethical due to the carbon dioxide 
removal (CDR) and carbon capture and storage (CCS) potential of bioenergy 
with carbon capture and storage (BECCS)*. Thus, production of 
carbon-negative biofuels seems to ethically negate the moral hazard of 
mitigating FF-induced global warming."

I believe the study of metaethics can provide important guidance to both 
GW mitigation technology strategies and the cultural understanding of the 
complex matrix of scientific issues related to GW and GE. From the 
technical side, ethically negating the moral hazard of global warming 
mitigation is possible using carbon-neutral/negative biofuels (BECCS) 
and/or space-based energy. From the cultural side, the 'global community' 
(whatever that may be) seems to demand both abundance and environmental 
balance. The use of carbon-neutral/negative energy or space-based energy 
seems capable of meeting this apparent core cultural demand of 'having 
one's cake and eating it too'.


In the final analysis, the cultural cognition of the need for global 
warming mitigation and the mitigation method(s) employed will need to be 
well synchronized if there is to be a globally inclusive effort with a high 
level of cultural acceptance. To date, at both the technical and cultural 
levels, only BECCS and space-based energy provide the ability to ethically 
mitigate the moral hazard of global warming mitigation. The other 
mitigation methods have their value and should be integrated with the 
primary mitigation means, when and where appropriate, as secondary 
strategies.

Best,


Michael

On Thursday, February 27, 2014 12:04:00 AM UTC-8, andrewjlockley wrote:
>
> Poster's note: This is just brilliant. At last, an explanation of why 
> believing nonsense is rational. Useful to reflect on how this paper applies 
> to the origin and persistence of other belief systems, as well as climate 
> change. Leaves me wondering what nonsense I believe. 
>
>
> http://www.culturalcognition.net/blog/2014/2/23/three-models-of-risk-perception-their-significance-for-self.html
>
> Three models of risk perception & their significance for self-government
>
> Dan Kahan Posted on Sunday, February 23, 2014 at 7:52AM
>
> From Geoengineering and Climate Change Polarization: Testing a Two-channel 
> Model of Science Communication, Ann. Am. Acad. Pol. & Soc. Sci. (in press).
>
> Theoretical background
>
> Three models of risk perception
>
> The scholarly literature on risk perception and communication is dominated 
> by two models. The first is the rational-weigher model, which posits that 
> members of the public, in aggregate and over time, can be expected to 
> process information about risk in a manner that promotes their expected 
> utility (Starr 1969). The second is the irrational-weigher model, which 
> asserts that ordinary members of the public lack the ability to reliably 
> advance their expected utility because their assessment of risk information 
> is constrained by cognitive biases and other manifestations of bounded 
> rationality (Kahneman 2003; Sunstein 2005; Marx et al. 2007; Weber 
> 2006).
>
> Neither of these models cogently explains public conflict over 
> climate change—or a host of other putative societal risks, such as nuclear 
> power, the vaccination of teenage girls for HPV, and the removal of 
> restrictions on carrying concealed handguns in public. Such disputes 
> conspicuously feature partisan divisions over facts that admit of 
> scientific investigation. Nothing in the rational-weigher model predicts 
> that people with different values or opposing political commitments will 
> draw radically different inferences from common information. Likewise, 
> nothing in the irrational-weigher model suggests that people who subscribe 
> to one set of values are any more or less bounded in their rationality than 
> those who subscribe to any other, or that cognitive biases will produce 
> systematic divisions of opinion among such groups.
>
> One explanation for such conflict is the cultural cognition thesis (CCT). 
> CCT says that cultural values are cognitively prior to facts in public risk 
> conflicts: as a result of a complex of interrelated psychological 
> mechanisms, groups of individuals will credit and dismiss evidence of risk 
> in patterns that reflect and reinforce their distinctive understandings of 
> how society should be organized (Kahan, Braman, Cohen, Gastil & Slovic 
> 2010; Jenkins-Smith & Herron 2009). Thus, persons 
> with individualistic values can be expected to be relatively dismissive of 
> environmental and technological risks, which if widely accepted would 
> justify restricting commerce and industry, activities that people with such 
> values hold in high regard. The same goes for individuals 
> with hierarchical values, who see assertions of environmental risk as 
> indictments of social elites. Individuals 
> with egalitarian and communitarian values, in contrast, see commerce and 
> industry as sources of unjust disparity and symbols of noxious 
> self-seeking, and thus readily credit assertions that these activities are 
> hazardous and therefore worthy of regulation (Douglas & Wildavsky 1982). 
> Observational and experimental studies have linked these and comparable 
> sets of outlooks to myriad risk controversies, including the one over 
> climate change (Kahan 2012).
>
> Individuals, on the CCT account, behave not as 
> expected-utility weighers—rational or irrational—but rather as cultural 
> evaluators of risk information (Kahan, Slovic, Braman & Gastil 2006). The 
> beliefs any individual forms on societal risks like climate change—whether 
> right or wrong—do not meaningfully affect his or her personal exposure to 
> those risks. However, precisely because positions on those issues are 
> commonly understood to cohere with allegiance to one or another cultural 
> style, taking a position at odds with the dominant view in his or her 
> cultural group is likely to compromise that individual’s relationship with 
> others on whom that individual depends for emotional and material support. 
> As individuals, citizens are thus likely to do better in their daily lives 
> when they adopt toward putative hazards the stances that express their 
> commitment to values that they share with others, irrespective of the fit 
> between those beliefs and the actuarial magnitudes and probabilities of 
> those risks.
>
> The cultural evaluator model takes issue with the 
> irrational-weigher assumption that popular conflict over risk stems from 
> overreliance on heuristic forms of information processing (Lodge & Taber 
> 2013; Sunstein 2006). Empirical evidence suggests that culturally diverse 
> citizens are indeed reliably guided toward opposing stances by unconscious 
> processing of cues, such as the emotional resonances of arguments and the 
> apparent values of risk communicators (Kahan, Jenkins-Smith & Braman 2011; 
> Jenkins-Smith & Herron 2009; Jenkins-Smith 2001).
>
> But contrary to the 
> picture painted by the irrational-weigher model, ordinary citizens who are 
> equipped and disposed to appraise information in a reflective, analytic 
> manner are not more likely to form beliefs consistent with the best 
> available evidence on risk. Instead they often become even more culturally 
> polarized because of the special capacity they have to search out and 
> interpret evidence in patterns that sustain the convergence between their 
> risk perceptions and their group identities (Kahan, Peters, Wittlin, 
> Slovic, Ouellette, Braman & Mandel 2012; Kahan 2013; Kahan, Peters, Dawson 
> & Slovic 2013).
>
> Two channels of science communication
>
> The rational- and irrational-weigher models of risk perception generate 
> competing prescriptions for science communication. The former posits that 
> individuals can be expected, eventually, to form empirically sound 
> positions so long as they are furnished with sufficient and sufficiently 
> accurate information (e.g., Viscusi 1983; Philipson & Posner 1993). The 
> latter asserts that the attempts to educate the public about risk are at 
> best futile, since the public lacks the knowledge and capacity to 
> comprehend; at worst such efforts are self-defeating, since ordinary 
> individuals are prone to overreact on the basis of fear and other affective 
> influences on judgment. The better strategy is to steer risk policymaking 
> away from democratically accountable actors to politically insulated 
> experts and to “change the subject” when risk issues arise in public debate 
> (Sunstein 2005, p. 125; see also Breyer 1993).
>
> The cultural-evaluator model associated with CCT offers a more nuanced 
> account. It recognizes that when empirical claims about societal risk 
> become suffused with antagonistic cultural meanings, intensified efforts to 
> disseminate sound information are unlikely to generate consensus and can 
> even stimulate conflict.
>
> But those instances are exceptional—indeed, pathological. There are vastly 
> more risk issues—from the hazards of power lines to the side-effects of 
> antibiotics to the tumor-stimulating consequences of cell phones—that avoid 
> becoming broadly entangled with antagonistic cultural meanings. Using the 
> same ability that they reliably employ to seek and follow expert medical 
> treatment when they are ill or expert auto-mechanic service when their car 
> breaks down, the vast majority of ordinary citizens can be counted on in 
> these “normal,” non-pathological cases to discern and conform their beliefs 
> to the best available scientific evidence (Keil 2010).
>
> The cultural-evaluator model therefore counsels a two-channel strategy of 
> science communication. Channel 1 is focused on information content and is 
> informed by the best available understandings of how to convey empirically 
> sound evidence, the basis and significance of which are readily accessible 
> to ordinary citizens (e.g., Gigerenzer 2000; Spiegelhalter, Pearson & Short 
> 2011). Channel 2 focuses on cultural meanings: the myriad cues—from group 
> affinities and antipathies to positive and negative affective resonances to 
> congenial or hostile narrative structures—that individuals unconsciously 
> rely on to determine whether a particular stance toward a putative risk is 
> consistent with their defining commitments. To be effective, science 
> communication must successfully negotiate both channels. That is, in 
> addition to furnishing individuals with valid and pertinent information 
> about how the world works, it must avail itself of the cues necessary to 
> assure individuals that assenting to that information will not estrange 
> them from their communities (Kahan, Slovic, Braman & Gastil 2006; Nisbet 
> 2009).
>
