Science AMA Series: We're George Loewenstein, Russell Golman, and David Hagmann, three behavioral economists studying why people sometimes avoid information they know to be useful. AUA!


Hi! Thanks for the AMA, fascinating topic!

Simply out of curiosity, when it comes to day-to-day (non-political/religion focused) decision making, is there any comparative data for prevalence/evenness of this behavior between:

1) Different political groups?

2) Different faiths?

3) Various political/religious world leaders through history and their willingness to exploit this human flaw to their advantage.

HerbziKal

(David) Not across political groups, as far as we can tell. I'm not aware of any studies comparing information avoidance across different faiths, and I'd be reluctant to draw a causal association there: faith is correlated with a lot of other cultural values, and I suspect a measurement would rather pick up the latter than the former. That's especially problematic if the questions used to measure avoidance aren't designed to hold across cultures. Just think about financial information avoidance: hugely important if you are in the US and need a 401(k) to retire, less so if you live in a country that has a well-funded pension system.

I can't speak to specific political or religious leaders (it would be fascinating to see historians pick this up!), but I suspect wishful thinking, which is closely related, is fairly prevalent. Think about President Obama's campaign and the subsequent disappointment of some of his supporters: they didn't expect he'd face as strong an opposition as he did. Or President Trump's promises of bringing back certain jobs or building a wall: I suspect many of his supporters underestimate the difficulty of doing this. But imagine a campaign that was realistic about the powers of the presidency... it wouldn't inspire much turnout.


Is the reason people avoid this information because the cost of changing their life is higher than the marginal benefit they could gain from the change?

Kozyre

(David here): I know it's early, but such a deep question and great follow-ups below... I couldn't resist!

The term "rational," as it's used in economics, is different from its everyday meaning. It merely means that people adhere to a set of axioms. For example, if you prefer drinking water to drinking Coke and you prefer drinking Coke to drinking Pepsi, then you should prefer drinking water to drinking Pepsi. The assumptions underlying rationality actually all seem intuitive, even though there are experiments showing that people violate them.

We might think of information avoidance as irrational, because making a decision with more (accurate) information can't possibly be worse than making it with less information. But the moment we allow for people to get utility (some kind of benefit) from what they believe, it may be quite rational to avoid information. Now I have to balance the gains from making a better decision with more information against the potential losses from having my beliefs challenged.
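
One rough way to write that balance down (illustrative notation only, not a formal model from our paper): seek out the information only if its value for making better decisions at least offsets the expected emotional cost of what it might reveal.

\[
\underbrace{\mathbb{E}\big[u(\text{best choice with the info})\big] \;-\; \mathbb{E}\big[u(\text{best choice without it})\big]}_{\text{instrumental value of the information}}
\;\ge\;
\underbrace{\mathbb{E}\big[\text{drop in belief-based utility from what the info might reveal}\big]}_{\text{expected emotional cost}}
\]

When the right-hand side is larger, avoiding the information is "rational" in the economist's sense, even though it can be costly down the road.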

And I think your question gets at the root of that: does the emotional cost of the information outweigh the benefit that could come from making better decisions? Especially because learning the information may be painful now, while the benefits might be far off in the future? I think that may be true in many cases, and a good parallel here is exercising: unpleasant initially, but good (and even enjoyable) in the long run. Do we think of people who don't exercise as irrational?

I'd be hesitant to say people are making mistakes, but I think the consequences can often be costly -- and maybe costlier than many people would have thought when they decided to avoid the information. If thinking about finances creates anxiety, for example, you might leave your money in a money-market fund rather than in investments suited to a long-term horizon -- and that could impact your ability to retire. Maybe there, some momentary anxiety is a worthwhile trade-off.


Have you noticed a greater prevalence of this behavior in specific populations such as professions, age ranges, or education levels? This topic has been on my mind since I first heard about it.

Follow up question, has there been any method found that effectively discourages or subverts information negation?

MyLittleLamprey

(George) Well, Dan Kahan (who has done the most extensive, and certainly some of the best, research on information avoidance) finds (1) that scientific expertise is not an antidote; in fact, his research suggests that people with greater expertise may be more biased, perhaps because they use their expertise to bolster the positions they are motivated to take. On the other hand, he has also found (2) that people who are characterized by greater scientific curiosity (as opposed to expertise) ARE somewhat less biased.

In my research, with Linda Babcock, on the self-serving bias, we find that getting people to think about weaknesses in their own arguments is somewhat helpful. It may be helpful because it's something that we aren't naturally prone to do.


How does your work link up to communications and psych research about filter bubbles and echo chambers?

firedrops

(Russell) In related research with George, Karl Ove Moene, and Luca Zarri, we've discussed filter bubbles and echo chambers and proposed that people are hostile to attitudes and opinions that conflict with their own as a result of a kind of sunk-cost effect for investing in your own beliefs. So if I've spent my life believing that America is great and that freedom and liberty are values that are worth fighting and dying for, and if I've served in the military and made personal sacrifices in service of these beliefs, then anti-American opinions are an attack on my own life's work.


Any particular advice on how to deal with such people when you need to work with them in a team? Try to make them feel that they found out the information on their own or something along those lines?

Swarlsonegger

(George) Well, your question implies that there are some "such people" who avoid information, and hence implies that there are some people who don't. Inevitably, there are individual differences; some avoid information more than others, but I think most people avoid some information at least some of the time. Among the difficult traits of people you might find yourself on a team with, I'm not convinced that information avoidance is one of the most dire. But, picking up on the second part of your question, I think that one very successful approach to working on a team is to give people ownership of ideas, whether deserved or not; people will work much harder to promote ideas/projects/investments they think of as their own than to advance those that other people claim ownership of.


Does your research show a relation between the complexity of the information and its chances to be avoided?

Or is it only a matter of confirming one's decisions/opinions, and unrelated to complexity?

monsieur_h

(George) Well, again referring to my old (and some newer) research, my co-researchers and I have found that the self-serving fairness bias (the tendency to believe that what favors your own interests is what is fair) is much greater in more complex situations, where lots of factors enter into a judgment of what is fair and can be selectively invoked. In recent research with Linda Dezso (University of Vienna), we find that history, which is of course almost infinitely complex, provides fertile ground for self-serving biases. If two parties interacted in the past, and one got the better of the other, the party that got the raw end of the stick in the last interaction believes that it's 'their turn' to get the lion's share in the current round. The party that did better in the past tends to have the attitude: "let bygones be bygones"; it's a new regime; what happened in the past is irrelevant.


How can I apply your research to encourage people (including myself) not to avoid useful information?

Akura-

(David) There are a couple of steps to consider. First and most straightforward: people have to look at the information. But once it's obtained, they also have to evaluate it neutrally -- and remember it. Our work finds that all three are problems: even if information is obtained, favorable information is given more weight, while unfavorable information is dismissed; and when information is unfavorable, it's also more likely to be forgotten.

So consciously looking for disconfirming evidence (things that challenge what you believe) is an important first step. If you are part of a team, ask specifically for reasons not to do something. If you can't come up with any, you're probably not thinking hard enough: it's rare that something has no risks or downsides. Make sure to write down not just your conclusions, but also your reasons -- and re-evaluate them down the road. And make sure to consider evidence not merely as a means of furthering your argument: that has the risk of making you too gullible when something supports your views and too dismissive when it does not.


How can I apply your research to encourage people (including myself) not to avoid useful information?

Akura-

(Russell): Not necessarily very persuasive, but I like to remind myself that I'll get over bad news faster than I think. It's easy to focus on how bad it's going to be to find out something bad and to forget that I'll get used to it and move on, and I'll be better able to move on once I get the information I need.


Thanks for the AMA. What does the research show to be the best way to encourage people to absorb information they disagree with?

If you scanned all the different political subreddits here, how do you see your research playing out in the real world?

straydog1980

(David) I haven't looked systematically at the political subreddits, but the existence of political subreddits is interesting on its own. Here we have a group of smart people who choose to segregate themselves according to the candidates they support.

There is a paper that looks at r/changemyview, titled "Winning Arguments: Interaction Dynamics and Persuasion Strategies in Good-faith Online Discussions." The authors rely on properties of the text to look at what persuades the OP. It's great that a community like that subreddit exists, and it'd be interesting to see what motivates people to open themselves up to persuasion there.


What individual differences most influence people's tendency to ignore/reinterpret information counter to their expectations?

Congratulations Dr. Golman!

sparzson

(George) We don't know what the underlying causes are of individual differences in information avoidance, but in research with David and Emily Ho, we have been developing an information avoidance scale to measure such individual differences, and we are finding that there is considerable heterogeneity across individuals. On our scale, I personally am at the absolute extreme -- on the information avoidance side. I would always prefer to avoid information if it threatens to be unpleasant. I suppose that relates to the rationality question above; my view is that life is too short to make oneself miserable by getting mired in negative stuff that it's possible to avoid, and I don't think that is necessarily irrational, even if it can have negative consequences for the information-avoider, and for society.


Do businesses appear to engage in this to the same degree, or does competitive pressure apply enough incentive to overcome this tendency?

Xiipre

(David) Competitive pressure can make things worse. As mentioned in another reply, the market for news is a great example. Rather than predicting that an unbiased, objective news source wins out, the theory predicts exactly what we observe in the real world: many different news sources splitting up the market and catering to the ideological preconceptions of their viewers and readers. So a policy prescription for better news might not be "more market" (whatever that means), but a news source not subject to competitive pressures -- maybe something like the BBC.


Do you find any difference in how information is interpreted based on the source of that information? As a crude example, certain information (such as calorie counts on restaurant menus) is being mandated in order to influence "better eating habits." Is this sort of information more likely to be ignored by individuals than information that may simply be provided without any motivation to influence behavior -- such as, for example, stuff learned in school or stuff publicized in the news?

(My wife is a registered dietitian, and one interesting thing she notes is that calorie counts sometimes have the opposite effect: people choose meals with more calories because they feel hungry and want a larger portion.)

w3woody

(David) The source of information definitely matters! An illustration is a paper by David Tannenbaum, in which he asks people about their attitudes toward "Nudges" (interventions based on behavioral economics). He uses a real policy example (I believe it was enrolling employees by default into their company's 401k plan) and says either that it was supported by Democrats or by Republicans in Congress. In truth, the proposal had bipartisan support. He makes clear that he isn't asking about support for this policy in particular, but about the approach in general. People supported nudges when their own party was said to favor them, but opposed the same nudges when they were attributed to the other party. David had access to participants who held elected office (e.g. mayors) and found that they were just as biased as the rest of us.

I think disclosing the purpose also matters. There's a study (I'm blanking on the authors) showing that when you show the cost savings of CFL bulbs, both conservatives and liberals like them. But if you also add a message about their being the "green" choice, conservatives are actually less likely to choose them. It stops being a financial choice and turns into a political/ideological decision, and that can backfire.


First, congrats to Russell on the birth of his second kiddo!

Second, for all the time I think about thinking, I had no idea most of these niche areas of study existed in the way they do. My nerd brain is excited to read what comes out of this AMA.

Can you tell us a little bit about what drives curiosity, but also what causes us to squash curiosity both at an individual and at a societal level?

Do you think of avoidance as something separate from repressing curiosity?

4SAO

(David): George, I, and Emily Ho are developing a scale to measure information avoidance across domains. We find a fairly low (negative) correlation with curiosity, so I don't think avoidance is simply the absence of curiosity. You might have someone who is very curious about the world but doesn't want to get information about their health.

George is the expert on curiosity, so I'll leave the rest of the question to him. :)


Scholars of political psychology have been wrangling with these exact issues for quite some time, yet we often remain relegated to a niche of the broader political science field, much of which still utilizes rational actor models that ignore these cognitive biases. I'm curious if it is the same for economists. Do you see yourselves as part of a niche, and do you expect your perspective to ever become the mainstream in economics?

A second question, if you have time: given the fact that all humans carry these biases, do you think that institutions created and directed by humans reflect the same biases? In other words, do you find that governments, businesses, groups, and markets also display cognitive biases when acting with the collective "wisdom" of their members?

Thanks!

Thors_lil_Cuz

(David): I've been fortunate to "grow up" in a time when behavioral economics is becoming mainstream. George's first papers got rejected on grounds that "this is not economics" -- so it certainly hasn't always been this way. I think the line between behavioral and standard economics is also getting blurrier. For example, economists for a long time assumed people to be selfish and it was surprising that participants in experiments cared about the earnings of others. Interestingly, there's nothing in economic theory that implies people have to be selfish -- if I care about someone, it's perfectly consistent with rationality that I consider their payoffs when making decisions. Indeed, what has become known as "other-regarding preferences" is part of mainstream thinking today. I'm hoping that the same will be true for our thinking on information and beliefs.

Our paper was published in a highly regarded mainstream journal, so I think we are reaching the right audience. I think we have been very careful in how we interpret the evidence, but I'm sure some very smart people will disagree with us. Which is great for both sides: it will make us better aware of our assumptions and force us to evaluate them.

For your second question: I think institutions may cater toward our biases. Consider the market for news, where both the Wall Street Journal and the New York Times enjoy great reputations while catering to different audiences. In a fantastic (theoretical) paper, Sendhil Mullainathan and Andrei Shleifer, two economists at Harvard, show that markets actually lead to this kind of segmentation when readers seek to validate their beliefs as well as obtain news. It's not surprising that we'd also see media targeting the ideological fringes.

There's quite a bit of research on confirmation bias in teams, which make up organizations. So I don't think they're immune at all. Organizations (and governments) are often in a good place to devise practices that help overcome the biases, however.

Consider mandatory calorie labeling at chain restaurants. That's one way to overcome avoidance: you can't help but read the calories on the menu. But is it actually a good policy? Well, people may still order a piece of cake, but now feel guilty for eating so many calories, rather than enjoying the dessert. Targeting chain restaurants also means we label the burgers low-income consumers buy at McDonald's, but not the foie gras (or cocktails) at pricier restaurants.


Scholars of political psychology have been wrangling with these exact issues for quite some time, yet we often remain relegated to a niche of the broader political science field, much of which still utilizes rational actor models that ignore these cognitive biases. I'm curious if it is the same for economists. Do you see yourselves as part of a niche, and do you expect your perspective to ever become the mainstream in economics?

A second question, if you have time: given the fact that all humans carry these biases, do you think that institutions created and directed by humans reflect the same biases? In other words, do you find that governments, businesses, groups, and markets also display cognitive biases when acting with the collective "wisdom" of their members?

Thanks!

Thors_lil_Cuz

(George) Academics tend to be fairly parochial. They tend to know what's going on in their own field and to be unaware of, and hence fail to credit (or insufficiently credit even when they are aware of), parallel developments in other fields. Working in both economics and psychology, I can see countless instances of what you're talking about, and I'm sure it extends to political science as well (probably in both directions?).

Now I see that I haven't really answered your first question at all, but have instead gone on about a pet peeve of my own. Turning to your question, I do think that, perhaps not to a person, but generally, economists have become pretty open to ideas from psychology and other disciplines, including new ideas about how people process (and in some cases avoid (fail to process)) information.

Turning to your second question, I think that most of the evidence suggests that collectives of people tend to be even more biased than the individuals who compose them. Irving Janis did some path-breaking research on this -- which he called "groupthink" -- in the 1960s, and there has been a huge amount of research since showing that group processes can exacerbate bias. Part of the explanation is that, for a wide range of reasons, groups often hold relatively homogeneous beliefs, so the individual opinions of the people in the group don't tend to carry much new information. But people don't take this redundancy into account, nor their own influence on the group. So, much of what one learns from people in one's own group is uninformative, or even just echoes of one's own beliefs. Apologies if that wasn't very coherent; lots of research on this topic!


How does this, avoiding useful information, play into information overload? In current terms, things like avoiding certain discussions of politics not because a person doesn't care, but there's just SO MUCH going on right now and so many sources to get it from. So perhaps then, avoiding useful information, information overload, and information fatigue? Do we sometimes do this to just make life simpler/easier?

gfpumpkins

(David) There's definitely something called "rational inattention." We can't possibly process all the information that's out there. So you have to filter what you take in -- and, ideally, you ignore the least useful information and focus on what's most useful. It gets interesting when people don't ignore the information that is (or that they expect to be) least useful, but the information that they don't want to be true.

A study by Emily Oster, for example, looks at people who have one parent with Huntington's disease -- a genetic condition that greatly reduces life expectancy. Those with one parent who suffers from the condition have a 50% chance of carrying the mutation themselves -- and there's a perfectly diagnostic test available. Yet, fewer than 10% of people in that situation end up getting the test. Those who don't get tested live their lives the same way as those who got tested and found they did not have the condition.

Information overload may be a convenient rationalization for avoidance: "I can't possibly read the New York Times and the Wall Street Journal -- so let me just read X" (where X happens to be the one that aligns with our ideological views).


How does this, avoiding useful information, play into information overload? In current terms, things like avoiding certain discussions of politics not because a person doesn't care, but there's just SO MUCH going on right now and so many sources to get it from. So perhaps then, avoiding useful information, information overload, and information fatigue? Do we sometimes do this to just make life simpler/easier?

gfpumpkins

People definitely avoid information that's too difficult to make sense of. We think it's even more interesting when people avoid information that's easily accessible just because they don't want to know.


What are the most commonly neglected pieces of information people choose to ignore, in regards to finances?

TheRedLayer

(Russell): Their own account balances. (I don't know if it's actually the most commonly neglected, but it's well documented that it is commonly avoided.)


What are the most commonly neglected pieces of information people choose to ignore, in regards to finances?

TheRedLayer

(George) I think the biggest piece of information that people ignore is EXPENSES -- e.g., on mutual funds. In fact, to the extent that people pay attention to expenses, research by Terry Odean, Brigitte Madrian, and others shows that they sometimes respond in exactly the opposite way from how they should: they assume that higher expenses are an indication of quality. This is true even of index funds, which are pretty much all the same (except in their expenses). That's probably the most neglected piece of information. The piece of information that is totally misleading but that people pay a lot of attention to is the fund's return since inception. That is mainly a function of when the fund happened to go into operation. A fund that opened before the mortgage crisis is going to have lower returns since inception than an otherwise identical fund that opened right after the market collapsed.
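
To see why expenses swamp "return since inception," here is a rough back-of-the-envelope sketch (hypothetical numbers, not from any particular study): two funds tracking the same index and earning the same 7% gross return, one charging a 0.05% expense ratio and the other 1.00%.

# Hypothetical illustration: two identical index funds with the same 7% gross return,
# differing only in their expense ratio. Fees compound just as returns do.

def final_value(initial, gross_return, expense_ratio, years):
    """Grow an initial investment, deducting the expense ratio each year."""
    value = initial
    for _ in range(years):
        value *= (1 + gross_return) * (1 - expense_ratio)
    return value

cheap = final_value(10_000, 0.07, 0.0005, 30)   # 0.05% expense ratio
costly = final_value(10_000, 0.07, 0.0100, 30)  # 1.00% expense ratio
print(round(cheap), round(costly))              # roughly 75,000 vs. 56,000

Under these made-up but realistic numbers, the low-cost fund ends up around $75,000 after 30 years and the high-cost fund around $56,000 -- a gap of nearly $19,000 on a $10,000 investment, with no difference in the underlying portfolio.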


Did you find a difference between genders? Did they avoid the same information? To the same degree? Were there obvious variations on strategy?

Edit: spelling

UhmBah

(David) George, I, and Emily Ho are working on a scale to measure information avoidance in a range of settings. We haven't found any systematic differences across gender or political orientation that I can recall -- and I don't think there were any major gender differences in the papers we reviewed.


What are 3 things that this research will imply that won't be strictly about consumption?

Kusokuso69

(David) What do you mean by consumption? To some extent, beliefs are primarily interesting when they lead to decisions. Maybe political polarization and voting behavior would be an example? People might not even evaluate a candidate for local office because he or she is affiliated with the opposing party -- even when they might find themselves agreeing more with that candidate than with the one from their own party.


An ideal rational (i.e. Bayesian) agent will have a tendency to interpret uncertain evidence as reliable or unreliable according to whether it confirms or conflicts with its posterior judgement -- until the point at which the weight of contradicting evidence causes its judgement (and its guesses as to which evidence is reliable) to flip.

I wonder if you can distinguish between this rational partial disregard of conflicting evidence and your hypothesis of an irrational human characteristic of avoiding information he or she does not like.

astrolabe

(George) What you say is very true, but it can't explain the many situations in which people even avoid getting the information (as opposed to getting it but updating less).

Returning to an answer to an earlier question (about expertise), when deciding how much to update his/her beliefs, a Bayesian should take account of the credibility of the information source. However, people often don't seem to operate that way. People often seem to not care much about the credibility/expertise of an information source (and may even feel most threatened by and resistant to expert sources), but seem to care much more about whether the information supports or goes against their own cherished beliefs. For example, Sunstein and colleagues find that the beliefs of people who believe in climate change are much more responsive to information that the problem is worse than expected (and people who don't believe in it are much more responsive to information that the problem is not as bad as expected). Eil and Rao similarly find that people avoid information that could lead them to believe that they are less smart or attractive than they think; and that, when they do get information, they are much more likely to adjust their beliefs upward in response to information suggesting that they are smarter or better looking than they had thought, while their beliefs do not budge in the face of adverse information.
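
To make the Bayesian benchmark concrete, here is a minimal toy model (an illustration for this answer, not a model from any of the papers mentioned): a source reports either that the evidence supports a hypothesis or that it contradicts it; with probability r (the source's credibility) the report tracks the truth, and with probability 1 - r it is pure noise. A Bayesian scales how much to update by r; the findings above suggest people instead scale the update by whether the report flatters what they already believe.

# Toy model (illustrative only): a source reports "supports H" or "contradicts H".
# With probability r (credibility) the report matches the truth; otherwise it is
# uninformative noise (a coin flip).

def bayesian_posterior(prior, report_supports_h, credibility):
    """Posterior P(H | report) for a Bayesian who weights the report by source credibility."""
    r = credibility
    p_support_given_h = r + (1 - r) * 0.5      # truth-tracking report, or lucky noise
    p_support_given_not_h = (1 - r) * 0.5      # only noise can produce a false "support"
    if report_supports_h:
        like_h, like_not_h = p_support_given_h, p_support_given_not_h
    else:
        like_h, like_not_h = 1 - p_support_given_h, 1 - p_support_given_not_h
    return prior * like_h / (prior * like_h + (1 - prior) * like_not_h)

# Adverse report from a highly credible source: belief should drop from 0.70 to about 0.11.
print(bayesian_posterior(0.70, report_supports_h=False, credibility=0.9))
# Adverse report from a barely credible source: belief should barely move (about 0.66).
print(bayesian_posterior(0.70, report_supports_h=False, credibility=0.1))

The contrast with the experimental findings is the point: normatively, the size of the update should depend on the source's credibility, not on whether the news is welcome.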


Thank you for taking time out for this AMA! My question is, what steps do you propose, particularly in education (colleges and schools) in changing mindset of people to prevent them from avoiding useful information?

kafkaesque9

(David) Our research doesn't directly address this, but I'll take a stab anyway. We spend a lot of time teaching kids what to think and not much time teaching them how that knowledge was derived. Textbooks don't contain contradictions, unless someone made a mistake... and I think that gives a false sense of the world. This is especially true for science, which isn't a set of laws divined out of nowhere, but the results of a messy iterative process with many missteps along the way. I think that gets lost in schools, where we tend to gloss over the messy part.

The world is inherently uncertain and so rather than searching for some "true" view of the world and getting completely invested in it, it's important to re-evaluate and challenge what we believe. That's a messy process -- and accepting uncertainty and messiness is an acquired taste.


How does one's own self-perception affect this tendency? Is there any evidence of influence from varying levels of self-esteem at different times in an individual's life? And what about stress? Just from self-observation, it seems to me that the more confident and secure I feel, the more I am able to consider conflicting evidence/advice/opinions.

dony007

(George) Aggression is often an indication of insecurity (e.g., Seneca's "All cruelty springs from weakness"); if you are physically very secure, you probably won't feel the need to demonstrate it. By the same token, much information avoidance stems from insecurity. If you are 100% confident in your views, you won't mind being exposed to information that challenges them, and likewise, if you hold your beliefs only loosely, you are likely to be open to new information that could help you to refine them. Information avoidance is probably at an extreme for people who hold cherished beliefs (beliefs they have an emotional attachment to) but are insecure about them -- i.e., at some level, they realize that the evidentiary basis of their beliefs is weak. In this situation, people will be powerfully motivated to avoid information that might challenge the beliefs they are attached to.


How does one's own self-perception affect this tendency? Is there any evidence of influence from varying levels of self-esteem at different times in an individual's life? And what about stress? Just from self-observation, it seems to me that the more confident and secure I feel, the more I am able to consider conflicting evidence/advice/opinions.

dony007

(David) In some early pilot work on an information avoidance scale, we find (maybe not surprisingly) that people are more open to information when they think they can do something about it. Our university health clinic advertises that "All STDs are treatable" -- which rightly emphasizes that you can do something even if it can't be cured. So I wouldn't be surprised if people who felt more empowered to process information were also less physically avoidant.

However, obtaining information is only one part of the process. Work by Gino and Ariely, for example, finds that creative people are more likely to cheat, because they can come up with creative excuses afterward.

So we might imagine that someone with high self-esteem is more likely to obtain information about their performance relative to others, but less likely to update their beliefs after.


What do you think causes this denial when information that would benefit them in the long run is overriden in place of instant gratification?

BobDeBac

You are correct that information avoidance often involves a tradeoff between avoiding immediate pain and achieving long-term gain. For example, watching a video of oneself speaking in public is likely to be very painful (in 30+ years in academia, I have yet to muster the courage), but I'm sure it would be hugely beneficial to do so. Similarly, for those of us who are in long-term relationships, it might make sense to ask one's partner: "Is there anything I do that drives you crazy?" But how many of us are brave enough to ask, especially when you've been together for years and risk discovering that the thing that drives your partner crazy is something you've been doing forever? (George; probably neither of my fellow panelists wants this comment to be attributed to them) (Over and out; thanks for the great questions!)


Hi! Thanks for doin this. I'm a student in Las Vegas, I spent my first 3 years of university doing business-finance and then had a change of heart and decided to pursue philosophy (graduating in fall) and then return for an MBA (to hopefully apply the two to a career).

Would you say that economics and philosophy need to be more intertwined? (Doing a thesis on it I hope-) I find that raw data without critical thought can be abused when it comes to justifying irrational/unethical situations (usually talked about as scrutiny on capitalism). The raw data only shows what people do and how people act, but not always why.

What is a career path that you could recommend for my schooling? I'm not sure what types of positions to be looking for. I'd like to stay in the business industry but I can't imagine that there are a ton of bankers that want to listen to ethical mumbojumbo but more so, just want to see + return. What you guys do is amazing, I really appreciate your work. Can't wait to dig into it.

I've also noticed a similar tune to which people deny evidence of the political systems as with religion, care to touch on it? I don't like giving prescriptive direction when talking on the subject(s), but more so allowing a natural understanding to take place, yet sometimes people are willfully ignorant even if their own premises for belief are hypocritical or contradicting. Similar to the political system. I've always heard to not speak about either at work, which I agree with especially after this last year, but shouldn't it be more rational and such a rule be unnecessary?

If y'all could also throw a couple books out there that you recommend!

Thanks again!

sidewalkgum

(David) Hi there! Economics started off as philosophy (Adam Smith wrote the Theory of Moral Sentiments before The Wealth of Nations) and I think it's always good and important to reflect on broader arguments. To some extent, that's what our review paper does as well: it relies on experiments, data from the real world, and mathematical models -- but it's really making a philosophical point as well. There's a very cool paper titled "Adam Smith, Behavioral Economist" (George is a co-author) that links some of Smith's writing to current developments in the field. That may be a good inspiration!

I agree that there are some topics that lead to emotional arguments -- politics and religion among them. We don't seem to have a good way to discuss them without making people feel threatened and attacked. "Disagreeing without being disagreeable" is a learned skill -- and I think it's one companies would do well to embrace. Not necessarily because we need to debate politics at work (any more than we would discuss our sex lives), but because the same reluctance to disagree might spill over to pertinent topics, like bringing up reasons not to make certain investments.

I think anything by Cass Sunstein is worth reading -- and he's a prolific author, so there's a lot!


Russell, George and David, thank you for taking the time to talk with us about your work. This really is a fascinating topic.

How can someone be aware if they're succumbing to this type of bias and what can be done about it?

How do we make sense of information that is directly relevant, but potentially biased or only half-true? Like if I suspect an advertisement may be factually correct, but still misleading?

It seems like unavoidably, all information is viewed through a certain perspective, even pure, raw data. The types of questions you ask and try to answer are subject to the interpretation of whoever is trying to answer them. The very metrics used to assess whether one option is 'better' than another are, in a way, a type of bias.

What does it mean to have unbiased data? Is it achievable?

PapaNachos

(Russell) In principle, the idea is to recognize the source's potentially biased motives and to extract whatever signal is present. In practice, this is really hard. People tend to allow their own motives to color their interpretations.


Hi all, thank you for taking the time out to do a Reddit AMA.

Question for George Loewenstein and David Hagmann: I am a huge fan of your paper 'Warning: You are about to be nudged', and am wondering what your thoughts are on information avoidance when it comes to advance directives and doctor/patient conversations. Three questions (please feel free to answer whichever question is most interesting to you): 1. How do you think we can overcome the avoidance of information from both doctor and patient perspectives when it comes to end-of-life care? 2. What would you say to people who argue that it is unethical to nudge people towards advance decisions? 3. What (in your opinions) is the future of advance decisions and creating public policy surrounding this important issue?

shibsy

(George) I do think that information avoidance plays an important role in end of life decisions. In research with Rebecca Ferrer and others, we are obtaining evidence that people with cancer seem to avoid adverse information provided by oncologists so as to maintain optimism. This may be perfectly sensible, in line with the earlier discussions of the rationality of information avoidance. However, such information avoidance risks the potentially severe cost of failing to obtain palliative care when it is warranted and would greatly improve the quality of death.

Despite doing research on the role of defaults in advance directives, I share your qualms about nudging people in this way, particularly when we are nudging people toward taking actions (e.g., heroic measures) that many if not most doctors would not personally advocate. In research on advance directives we try to minimize our influence on actual decisions by informing research subjects at the end of the study that they were 'nudged'. However, the research you allude to should reduce our confidence that such disclaimers will have the effect of undoing our (random, and hence not normatively justifiable) intervention.

Based on my reading of existing research, however (e.g., the SUPPORT trial), I don't believe that advance directives make much of a difference or, hence, are the solution to the problem (if you believe it is one, as I do) of insufficient use of hospice and palliative care. Better levers into that problem might involve changing the culture and institutions at hospitals -- e.g., when it comes to referring patients to intensive care.



License

This article and its reviews are distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and redistribution in any medium, provided that the original author and source are credited.