basilisk (basilisk) wrote,

On the perception and recognition of pseudo-profound bullshit. Part 2.

On the perception and recognition of pseudo-profound bullshit. Part 1.

17 General discussion

The present study represents an initial investigation of the individual differences in receptivity to pseudo-profound bullshit. We gave people syntactically coherent sentences that consisted of random vague buzzwords and, across four studies, these statements were judged to be at least somewhat profound. This tendency was also evident when we presented participants with similar real-world examples of pseudo-profound bullshit. Most importantly, we have provided evidence that individuals vary in conceptually interpretable ways in their propensity to ascribe profundity to bullshit statements, a tendency we refer to as “bullshit receptivity”. Those more receptive to bullshit are less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), more prone to ontological confusions and conspiratorial ideation, more likely to hold religious and paranormal beliefs, and more likely to endorse complementary and alternative medicine. Finally, we introduced a measure of pseudo-profound bullshit sensitivity by computing a difference score between profundity ratings for pseudo-profound bullshit and legitimately meaningful motivational quotations. This measure was related to analytic cognitive style and paranormal skepticism. However, there was no association between bullshit sensitivity and either conspiratorial ideation or acceptance of complementary and alternative medicine (CAM). Nonetheless, our findings are consistent with the idea that the tendency to rate vague, meaningless statements as profound (i.e., pseudo-profound bullshit receptivity) is a legitimate psychological phenomenon that is consistently related to at least some variables of theoretical interest.

17.1 Response bias and sensitivity

We proposed two mechanisms that explain why people might rate bullshit as profound. The first is a type of response bias wherein some individuals are simply more prone to relatively high profundity ratings. Although this mechanism is not specific to bullshit, it may at least partly explain why our pseudo-profound bullshit measure was so consistently positively correlated with epistemically suspect beliefs. Some people may have an uncritically open mind. As the idiom goes: “It pays to keep an open mind, but not so open your brains fall out”. In Study 3, some people even rated entirely mundane statements (e.g., “Most people enjoy at least some sort of music”) as at least somewhat profound. Our results suggest that this tendency – which resembles a general gullibility factor – is a component of pseudo-profound bullshit receptivity. There is, of course, a great deal of research on this sort of mechanism. As a prominent example, consider the “Barnum effect”. In his classic demonstration of gullibility, Forer (1949) had introductory psychology students complete a personality measure (the “Diagnostic Interest Blank”, DIB). One week later, he gave each of the students an ostensibly personalized personality sketch that consisted of 13 statements and asked them to rate both the accuracy of the statements and the overall efficacy of the DIB. Unbeknownst to the students, Forer had actually given every student the same personality sketch, which consisted entirely of vague, generalized statements taken from a newsstand astrology book (e.g., “You have a great need for other people to like and admire you.”). Although some people were more skeptical than others, the lowest number of specific statements accepted was 8 (out of 13). Moreover, the students were quite convinced of the personality test’s efficacy – “All of the students accepted the DIB as a good or perfect instrument for personality measurement” (Forer, 1949, p. 121). Meehl (1956) first referred to this as the Barnum effect, after the notorious hoaxer (bullshitter) P. T. Barnum.2

As a secondary point, it is worthwhile to distinguish uncritical or reflexive open-mindedness from thoughtful or reflective open-mindedness. Whereas reflexive open-mindedness results from an intuitive mindset that accepts information without much processing, reflective open-mindedness (or active open-mindedness; e.g., Baron, Scott, Fincher & Metz, 2014) results from a mindset that searches for information as a means to facilitate critical analysis and reflection. Thus, the former should cause one to be more receptive to bullshit whereas the latter, much like analytic cognitive style, should guard against it.

The foregoing highlights what appears to be a strong general susceptibility to bullshit, but what cognitive mechanisms inoculate against bullshit? Drawing on recent dual-process theories that posit a key role for conflict detection in reasoning (De Neys, 2012; Pennycook et al., 2015), we proposed that people may vary in their ability to detect bullshit. Our results modestly support this claim. Namely, we created a bullshit “sensitivity” measure by subtracting profundity ratings for pseudo-profound bullshit from ratings for legitimate motivational quotations. Increased bullshit sensitivity was associated with better performance on measures of analytic thinking. This is consistent with Sagan’s (1996) famous claim that critical thinking facilitates “baloney detection”.
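The two measures described above can be expressed concretely. The following is a minimal illustrative sketch, not the authors' actual scoring code; the item ratings are invented for demonstration:

```python
from statistics import mean

# Hypothetical 5-point profundity ratings from one participant
# (1 = not at all profound, 5 = very profound); values are invented.
bullshit_ratings = [4, 3, 5, 4, 3, 4, 5, 3, 4, 4]   # pseudo-profound items
genuine_ratings  = [4, 5, 4, 5, 5, 4, 5, 4, 5, 4]   # motivational quotations

# Bullshit "receptivity": mean profundity rating of the meaningless items.
receptivity = mean(bullshit_ratings)

# Bullshit "sensitivity": the difference score described in the text --
# ratings for genuine quotations minus ratings for pseudo-profound items,
# so higher values reflect better discrimination.
sensitivity = mean(genuine_ratings) - mean(bullshit_ratings)

print(f"receptivity = {receptivity:.2f}, sensitivity = {sensitivity:.2f}")
```

Note the logic of the difference score: a participant who rates everything as profound scores high on receptivity but near zero on sensitivity, which is why the two measures can relate differently to other variables.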

Further, bullshit sensitivity was associated with lower paranormal belief, but not conspiratorial ideation or acceptance of complementary and alternative medicine. This was not predicted, as all three forms of belief are considered “epistemically suspect” (e.g., Pennycook et al., in press). One possible explanation for this divergence is that supernatural beliefs are a unique subclass because they entail a conflict between some immaterial claim and (presumably universal) intuitive folk concepts (Atran & Norenzayan, 2004). For example, the belief in ghosts conflicts with folk-mechanics – that is, the intuitive belief that objects cannot pass through solid objects (Boyer, 1994). Pennycook et al. (2014) found that degree of belief in supernatural religious claims (e.g., angels, demons) is negatively correlated with conflict detection effects in a reasoning paradigm. This result suggests that the particularly robust association between pseudo-profound bullshit receptivity and supernatural beliefs may arise because both response bias and conflict detection (sensitivity) contribute to both. Further research is needed to test this claim.

17.2 Future directions

The focus of this work was on investigating individual differences in the tendency to accept bullshit statements, and our initial evidence indicates that reflectiveness may be a key individual difference variable. At a very basic level, the willingness to stop and think analytically about the actual meanings of the presented words and their associations would seem an a priori defense against accepting bullshit at face value (i.e., to avoid an excessively open-minded response bias). Moreover, increased detection of bullshit may reinforce a critical attitude and potentially engender a more restrained attitude to profundity judgments. The present findings also provide evidence that an increased knowledge of word meaning (via verbal intelligence) may assist in critical analysis. An understanding of more precisely nuanced meanings of words may reveal inconsistencies, incongruities, and conflicts among terms in bullshit statements. Conflict detection is a key aspect of dual-process theories (e.g., De Neys, 2012; Pennycook, et al., 2015), though in this case it remains unclear precisely what features of bullshit statements might cue reflective thinking. What is it about a statement like “good health imparts reality to subtle creativity” that might cause someone to stop and consider the meaning of the sentence more deeply?

Although a reflective thinking style appears to militate against bullshit acceptance, other cognitive processes that underlie the propensity to find meaning in meaningless statements remain to be elucidated. It may be that people naturally assume that statements presented in a psychology study (vague or otherwise) are constructed with the goal of conveying some meaning. Indeed, the vagueness of the statements may imply that the intended meaning is so important or profound that it cannot be stated plainly (Sperber, 2010). In the current work, we presented the participants with meaningless statements without cueing them to the possibility that they are complete bullshit. Although this is likely how bullshit is often encountered in everyday life, it may be that some skepticism about the source of the statement is the key force that may guard against bullshit acceptance. For example, poems attributed to prestigious sources are evaluated more positively (Bar-Hillel, Maharshak, Moshinsky & Nofech, 2012). Interpretation is difficult and humans surely rely on simple heuristics (e.g., “do I trust the source?”) to help with the task.

In this vein, psychological research should aim to elucidate contextual factors that interact with individual differences in the reception and detection of bullshit. As noted by philosophers studying the topic, the bullshitter often has the intention of implying greater meaning than is literally contained in the message, though the nature of the intent can vary. For example, the literary critic Empson (1947) describes the use of ambiguity in literature, including a type of intentional ambiguity used by poets in which a passage “says nothing, by tautology, by contradiction, or by irrelevant statements; so that the reader is forced to invent statements of his own …” (p. 176). The employment and reception of such literary devices in the context of a broader meaningful work seems related to but dissociable from isolated statements such as those used here. By examining pseudo-profound bullshit in an empirical fashion, we set the stage for further refinement of this important conceptual variable as it converges with and diverges from other related uses of vagueness. We anticipate that many variations of vague, ambiguous, or otherwise unclear statements, each with unique psychological correlates in varied contexts, are amenable to study.

18 Limitations and caveats

Bullshit comes in many forms and we have focused on only one type. Frankfurt (2005) discusses the so-called bull session wherein “people try out various thoughts and attitudes in order to see how it feels to hear themselves saying such things and in order to discover how others respond, without it being assumed that they are committed to what they say: It is understood by everyone in a bull session that the statements people make do not necessarily reveal what they really believe or how they really feel” (p. 9). This qualifies as bullshit under Frankfurt’s broad definition because the content is being communicated absent a concern for the truth. Nonetheless, the character of conversational bullshit is likely quite different from pseudo-profound bullshit, and by extension the reception and detection of it may be determined by different psychological factors. It is important for researchers interested in the psychology of bullshit to be clear about the type of bullshit that they are investigating.

Our bullshit receptivity scale was quite successful overall, but future work is needed to refine and improve it. In particular, the bullshit sensitivity measure would be improved if there were a more direct mapping between the pseudo-profound bullshit and the genuinely meaningful control items. Naturally, more items would improve both scales. Finally, knowledge of Deepak Chopra may subtly confound experiments using our bullshit sensitivity scale (or, at least, slightly restrict the effect size).

Finally, we have focused on an individual differences approach given that our primary goal was to demonstrate that bullshit receptivity is a consequential thing that can be reliably measured. This preliminary work is required for experiments to be meaningful. Future work should focus on the dual goals of further refining our measure of bullshit receptivity and experimentally modulating profundity ratings for pseudo-profound bullshit.

19 Conclusion

Bullshit is a consequential aspect of the human condition. Indeed, with the rise of communication technology, people are likely encountering more bullshit in their everyday lives than ever before. Profundity ratings for statements containing a random collection of buzzwords were very strongly correlated with profundity ratings for a selective collection of actual “Tweets” from Deepak Chopra’s “Twitter” feed (r’s = .88–.89). At the time of this writing, Chopra has over 2.5 million followers on “Twitter” and has written more than twenty New York Times bestsellers. Bullshit is not only common; it is popular.3 Chopra is, of course, just one example among many. Using vagueness or ambiguity to mask a lack of meaningfulness is surely common in political rhetoric, marketing, and even academia (Sokal, 2008). Indeed, as intimated by Frankfurt (2005), bullshitting is something that we likely all engage in to some degree (p. 1): “One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share.” One benefit of gaining a better understanding of how we reject others’ bullshit is that it may teach us to be more cognizant of our own bullshit.

The construction of a reliable index of bullshit receptivity is an important first step toward gaining a better understanding of the underlying cognitive and social mechanisms that determine if and when bullshit is detected. Our bullshit receptivity scale was associated with a relatively wide range of important psychological factors, a valuable step toward understanding the psychology of bullshit. The development of interventions and strategies that help individuals guard against bullshit is an important additional goal that requires considerable attention from cognitive and social psychologists. That people vary in their receptivity toward bullshit is perhaps less surprising than the fact that psychological scientists have heretofore neglected this issue. Accordingly, although this manuscript may not be truly profound, it is indeed meaningful.

20 References

Arthur, W., & Day, D. (1994). Development of a short form for the Raven Advanced Progressive Matrices test. Educational and Psychological Measurement, 54, 395–403.

Atran, S., & Norenzayan, A. (2004). Religion’s evolutionary landscape: Counterintuition, commitment, compassion, communion. Behavioral and Brain Sciences, 27, 713–770.

Baron, J. (1985). Rationality and intelligence. New York: Cambridge University Press.

Baron, J., Scott, S., Fincher, K. S., & Metz, E. (2014). Why does the Cognitive Reflection Test (sometimes) predict utilitarian moral judgment (and other things)? Journal of Applied Research in Memory and Cognition, 4, 265–284.

Bar-Hillel, M., Maharshak, A., Moshinsky, A., & Nofech, R. (2012). A rose by any other name: A social-cognitive perspective on poets and poetry. Judgment and Decision Making, 7, 149–164.

Black, M. (1983). The prevalence of humbug and other essays. Ithaca, NY: Cornell University Press.

Boyer, P. (1994). The naturalness of religious ideas: A cognitive theory of religion. Berkeley, CA: University of California Press.

Brotherton, R., French, C. C., & Pickering, A. D. (2013). Measuring belief in conspiracy theories: The generic conspiracist beliefs scale. Frontiers in Psychology, 4, 279.

Browne, M., Thomson, P., Rockloff, M. J., & Pennycook, G. (2015). Going against the herd: Psychological and cultural factors underlying the “vaccination confidence gap”. PLoS ONE 10(9), e0132562.

Buekens, F., & Boudry, M. (2015). The dark side of the loon: Explaining the temptations of obscurantism. Theoria, 81, 126–142.

Campitelli, G. & Gerrans, P. (2014). Does the cognitive reflection test measure cognitive reflection? A mathematical modeling approach. Memory & Cognition, 42, 434–447.

Chiesi, F., Ciancaleoni, M., Galli, S., Morsanyi, K., & Primi, C. (2012). Item response theory analysis and differential item functioning across age, gender, and country of a short form of the Advanced Progressive Matrices. Learning and Individual Differences, 22, 390–396.

Chopra, D. (1989). Quantum Healing. New York: Bantam Books.

Chopra, D. (2008). The Soul of Leadership. New York: Harmony Books.

De Neys, W. (2012). Bias and conflict: A case for logical intuitions. Perspectives on Psychological Science, 7, 28–38.

De Neys, W. (2014). Conflict detection, dual processes, and logical intuitions: Some clarifications. Thinking & Reasoning, 20, 167–187.

Empson, W. (1947). Seven types of ambiguity. London: Chatto & Windus.

Evans, J. St. B. T., & Stanovich, K. E. (2013). Dual-process theories of higher cognition: Advancing the debate. Perspectives on Psychological Science, 8, 223–241.

Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal and Social Psychology, 44, 118–123.

Frankfurt, H. G. (2005). On bullshit. Princeton, NJ: Princeton University Press.

Frederick, S. (2005). Cognitive reflection and decision making. The Journal of Economic Perspectives, 19, 25–42.

Furnham, A., & Schofield, S. (1987). Accepting personality test feedback: A review of the Barnum effect. Current Psychological Research and Reviews, 6, 162–178.

Gervais, W. M., & Norenzayan, A. (2012). Analytic thinking promotes religious disbelief. Science, 336, 493–496.

Gilbert, D. T. (1991). How mental systems believe. American Psychologist, 46, 107–119.

Gilbert, D. T., Tafarodi, R. W., & Malone, P. S. (1993). You can’t not believe everything you read. Journal of Personality and Social Psychology, 65, 221–233.

Gosling, S. D., Rentfrow, P. J., & Swann, W. B. (2003). A very brief measure of the Big-Five personality domains. Journal of Research in Personality, 37, 504–528.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Lindeman, M. (2011). Biases in intuitive reasoning and belief in complementary and alternative medicine. Psychology & Health, 26, 371–82.

Lindeman, M., & Aarnio, K. (2007). Superstitious, magical, and paranormal beliefs: An integrative model. Journal of Research in Personality, 41, 731–744.

Lindeman, M., Cederström, S., Simola, P., Simula, A., Ollikainen, S., & Riekki, T. (2008). Sentences with core knowledge violations increase the size of n400 among paranormal believers. Cortex, 44, 1307–1315.

Lindeman, M., Svedholm-Hakkinen, A. M., & Lipsanen, J. (2015). Ontological confusions but not mentalizing abilities predict religious belief, paranormal beliefs, and belief in supernatural purpose. Cognition, 134, 63–76.

Lipkus, I. M., Samsa, G., & Rimer, B. K. (2001). General performance on a numeracy scale among highly educated samples. Medical Decision Making, 21, 37–44.

Lobato, E., Mendoza, J., Sims, V., & Chin, M. (2014). Examining the relationship between conspiracy theories, paranormal beliefs, and pseudoscience acceptance among a university population. Applied Cognitive Psychology, 28, 617–625.

Malhotra, N., Krosnick, J. A., & Haertel, E. (2007). The psychometric properties of the GSS Wordsum vocabulary test. GSS Methodology Report No. 111. Chicago: NORC.

Meehl, P. E. (1956). Wanted—a good cookbook. American Psychologist, 11, 262–272.

Pacini, R., & Epstein, S. (1999). The relation of rational and experiential information processing styles to personality, basic beliefs, and the ratio-bias phenomenon. Journal of Personality and Social Psychology, 76, 972–987.

Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J. & Fugelsang, J. A. (2014). Cognitive style and religiosity: The role of conflict detection. Memory & Cognition, 42, 1–10.

Pennycook, G., Cheyne, J. A., Seli, P., Koehler, D. J. & Fugelsang, J. A. (2012). Analytic cognitive style predicts religious and paranormal belief. Cognition, 123, 335–346.

Pennycook, G., Fugelsang, J. A., & Koehler, D. J. (2015). What makes us think? A three-stage dual-process model of analytic engagement. Cognitive Psychology, 80, 34–72.

Pennycook, G., Fugelsang, J. A., & Koehler, D. J. (in press). Everyday consequences of analytic thinking. Current Directions in Psychological Science.

Perry, T. (1997). “So Rich, So Restless”. Los Angeles Times. 7 September.

Sagan, C. (1996). The fine art of baloney detection. In The demon-haunted world: Science as a candle in the dark (pp. 201–218). New York: Random House.

Schwartz, L. M., Woloshin, S., Black, W. C., & Welch, H. G. (1997). The role of numeracy in understanding the benefit of screening mammography. Annals of Internal Medicine, 127, 966–972.

Shenhav, A., Rand, D. G., & Greene, J. D. (2012). Divine intuition: Cognitive style influences belief in god. Journal of Experimental Psychology: General, 141, 423–428.

Shermer, M. (2010). Deepakese: The woo-woo master Deepak Chopra speaks.

Sokal, A. (2008). Beyond the hoax: Science, philosophy and culture. New York: Oxford University Press.

Sperber, D. (2010). The guru effect. Review of Philosophy and Psychology, 1, 583–592.

Stanovich, K. E. (2011). Rationality and the reflective mind. New York, NY: Oxford University Press.

Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23, 645–726.

Svedholm, A. M., & Lindeman, M. (2013). The separate roles of the reflective mind and involuntary inhibitory control in gatekeeping paranormal beliefs and the underlying intuitive confusions. British Journal of Psychology, 104, 303–319.

Swami, V., Voracek, M., Stieger, S., Tran, U. S., & Furnham, A. (2014). Analytic thinking reduces belief in conspiracy theories. Cognition, 133, 572–585.

Tobacyk, J. (2004). A revised paranormal belief scale. International Journal of Transpersonal Studies, 23, 94–98.

Toplak, M. V., West, R. F., & Stanovich, K. E. (2011). The Cognitive Reflection Test as a predictor of performance on heuristics-and-biases tasks. Memory & Cognition, 39, 1275–1289.

Toplak, M. V., West, R. F., & Stanovich, K. E. (2014). Assessing miserly information processing: An expansion of the Cognitive Reflection Test. Thinking & Reasoning, 20, 147–168.

Department of Psychology, University of Waterloo, 200 University Avenue West, Waterloo ON, Canada, N2L 3G1. Email:
Department of Psychology, University of Waterloo.
The School of Humanities and Creativity, Sheridan College.
Funding for this study was provided by the Natural Sciences and Engineering Research Council of Canada.

Copyright: © 2015. The authors license this article under the terms of the Creative Commons Attribution 3.0 License.
This example came from the source described in the Method section of Study 1; see there for further details.
In an amusing irony, P. T. Barnum is often erroneously attributed the phrase “There’s a sucker born every minute.” This is true even in at least one review of research on the Barnum effect (Furnham & Schofield, 1987).
And profitable. Deepak Chopra is one of the wealthiest holistic-health “gurus” (Perry, 1997). This is not to say that everything Deepak Chopra has written is bullshit. Nonetheless, some of it seems to meet our definition of pseudo-profound bullshit. Our goal here is to simply raise the possibility that Chopra’s tendency to bullshit (as claimed by others, Shermer, 2010) may have played an important role in his popularity.