The nirvana fallacy: An imperfect solution is often better than no solution

In this post, I want to briefly explain and discuss a logical blunder commonly known as the “nirvana fallacy.” This fallacy occurs when you argue that a solution should not be used because it is imperfect, or because it fails to address some underlying issue, but you fail to provide a plausible alternative. That may seem a bit confusing at first, so I will use several examples that are commonly used by opponents of science.

Let’s start with one of the most basic and most obvious examples. Anti-vaccers often like to claim that we should not vaccinate because vaccines aren’t 100% effective. This is an extremely clear-cut instance of the nirvana fallacy, because the fact that something isn’t 100% effective does not mean that we should not use it. Partial effectiveness is still better than no effectiveness. Indeed, almost nothing is 100% effective. For example, seat belts, helmets, condoms, parachutes, etc. are all less than 100% effective, but they are still very useful. Likewise, a life-saving medical marvel like vaccines doesn’t need to be 100% effective to be useful, because every life that is saved is important.
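To see why partial effectiveness still matters, here is the back-of-the-envelope arithmetic. All of the numbers below (population size, attack rate, effectiveness) are hypothetical, chosen purely to illustrate the logic:

```python
# Quick arithmetic behind "partial effectiveness is still better than none."
# All numbers are hypothetical and chosen purely for illustration.

population = 1_000_000
attack_rate = 0.10            # fraction who would get the disease with no vaccine
vaccine_effectiveness = 0.90  # an "imperfect" vaccine that prevents 90% of cases

cases_without_vaccine = population * attack_rate
cases_with_vaccine = cases_without_vaccine * (1 - vaccine_effectiveness)

print(round(cases_without_vaccine))  # 100000 cases with no vaccine
print(round(cases_with_vaccine))     # 10000 cases: imperfect, but 90,000 fewer
```

Even an imperfect vaccine prevents the overwhelming majority of cases in this toy scenario, which is exactly why "it isn't 100% effective" is not an argument against using it.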

Now let’s look at a slightly more complex example: the peer-review system. This is the system that scientific papers have to pass before being published, and it is admittedly imperfect. Indeed, I have devoted numerous posts on this blog to problems with it (for example here, here, here, and here). This has led some to suggest that it is worthless and should be abandoned entirely. In reality, however, an imperfect quality control system is still better than no quality control system at all. For example, we can all agree that a quality control system at a candy bar facility that limits cockroach legs to one per every ten chocolate bars is better than no quality control system at all (note: those aren’t actual statistics). So yes, the peer-review system is imperfect, and bad papers do sometimes get through, but an awful lot of bad papers never make it.

This brings me to an important point about nirvana fallacies: if you are going to argue that something should be abandoned because it is imperfect, then you must simultaneously propose a more effective alternative. To go back to my chocolate bar example, there would be nothing wrong with saying, “we should stop using the current method that limits cockroach legs to 1 per 10 bars, and switch to method B, which limits legs to 1 per 20 bars.” It is, however, invalid to simply say, “the current quality control system doesn’t stop every cockroach leg (i.e., it isn’t 100% effective), therefore we should remove the quality control altogether.” So if you can propose a better alternative to the current peer-review system, then by all means do so, but it is ridiculous to argue that we should either let any “study” pass as valid science or just abandon science altogether.

A similar example often occurs in climate change debates. I frequently encounter people who admit that we are probably causing the climate to change, but they argue that we will never be able to curtail our greenhouse gas emissions in time to truly stop climate change, therefore we shouldn’t bother to do anything. Once again, however, an imperfect solution is better than no solution. Yes, we probably won’t be able to fully prevent climate change, but we can prevent the worst consequences of it (Schleussner et al. 2016), and that makes it worth taking action.

Another version of this fallacy also occurs during climate change debates, and it can be summarized as, “but not everyone else will do it.” For example, I often talk to Americans who argue that America should not try to limit its fossil fuel use because even if America switched to renewable energy sources, many other countries wouldn’t, so there is no point. Once again, the problem is that even if America was the only country to take climate change seriously (which is currently almost backwards of reality), that would still have an impact on the amount of warming that occurs. Further, the actions of others have no bearing on your own responsibility. In other words, the fact that everyone else is doing something unethical does not mean that it is ok for you to do it (yes, I know that was a philosophical argument not a scientific one, but I think that it is relevant here).


An example of the nirvana fallacy

A final variant of the nirvana fallacy occurs when you argue that a solution should not be used because it does not address some underlying issue. A good example of this comes from GMOs. Despite all of the anti-GMO propaganda, not all GMOs are about money. Some, such as golden rice and GMO bananas, are being developed solely as humanitarian endeavors. These GMOs are rich in vitamin A, which is currently lacking in the diets of some developing countries. Anti-GMO activists often respond to this by saying that the vitamin deficiency in developing countries is actually just a symptom of poverty, and those deficiencies would go away if we took care of economic inequality and food distribution. Really think about that response for a minute. They are actually arguing that all that we have to do to fix the problem is solve world hunger and poverty. Sure, fixing those things would solve the vitamin problem, and we should be trying to fix those problems, but we clearly aren’t going to find the solution any time soon. In contrast, we could be using vitamin rich GMOs within a few years or even months. Asking people who are suffering and even dying from vitamin deficiencies to wait for us to fix world hunger rather than using a GMO is absurd. In other words, it is true that the GMOs don’t address the underlying issue, but the underlying issue is a nearly impossible problem to solve. Therefore, we should use the solution that is available to us, even though it’s not perfect.

In short, an imperfect solution is generally better than not having any solution at all. Therefore, it is generally not valid to argue that a solution should not be used simply because it is imperfect or incomplete, unless you can provide a feasible and superior alternative.

Literature cited
Schleussner et al. 2016. Differential climate impacts for policy-relevant limits to global warming: the case of 1.5 °C and 2 °C. Earth Syst. Dynam. 7:327–351.

Posted in Global Warming, GMO, Rules of Logic, Vaccines/Alternative Medicine

Understanding grants in science: doing research without selling your soul

Last week was a good week for me, because I received a several-thousand-dollar grant for my research, so I wanted to take a few minutes to talk about exactly what that means. Many people seem to be under the impression that I can now go buy a new car, or that I have been bought off by some company and am now their pawn. In reality, all that it means is that I can do a research project that I wouldn’t have been able to do otherwise. So in this post, I want to clear up some myths and discuss how grants work in science.

Note: Throughout this post I am going to refer to anyone who provides a grant as a “grant agency” regardless of whether they are a company, society, government, charity, etc. Also, please note that I am speaking in very general terms. There are tons of different types of grants out there, so it is always possible to find unusual ones or exceptions to the comments that I am going to make, but you should not make the mistake of focusing on the outliers and ignoring the general trends. In other words, I am not saying that there are no biased or unethical sources of funding, but most grants don’t work the way that many people seem to think they do.

You don’t get to keep the money
Addendum (14-June-16): Based on the comments, it seems that in some countries, such as the US, it is more common for grants to cover salaries than my personal experience with grants (which is largely not in the US) had led me to believe. Please see the comments.

The first misconception that I want to clear up is the idea that scientists get to keep their grant money for their own expenditures. In a great many cases, a scientist’s salary is covered by the university, institution, or company that they work for. In those cases, all of the money goes into your research and you don’t get to keep a single cent (many grants explicitly state that they won’t cover salaries). Indeed, when you apply for the grant, you have to provide a detailed budget showing how you are going to spend the money, and at the end of the project, you are generally required to provide reports showing what you did, including providing evidence that you met the research objectives that you proposed in your application.

To be clear, there are some situations where a scientist’s salary is not covered by their university, and in those cases, you do apply for grants that include your salary, but even then, the salary can only be a very modest part of your budget. Trying to give yourself a high salary is pretty much guaranteed to result in your application getting tossed into the “do not fund” pile.

Now, you may be thinking, “well can’t you just lie and spend the money on personal stuff without telling anyone?” No, not really. First, grant money is usually locked up in institutional accounts. It doesn’t generally go into your personal account. This means that to use the money, you have to fill out purchase orders that go through your university. So there is accountability, and if you start trying to buy a bunch of clearly personal stuff, you’re going to get called on it. Second, you have to report back to the grant agency and show that you produced results. So if you spend the money on a vacation to Hawaii, you won’t have enough to do the project, which means you won’t produce results, which means that you’ll have a bad track record and won’t be able to get grants in the future. One of the things that grant committees want to see is evidence that you have spent previous grants wisely and produce high quality results. So blowing the grant on personal purchases is a really bad idea.

Note: When I say “produce results” I simply mean that you carried your project through to its conclusion and produced a report, paper, etc. I do not mean, “produced a specific result that the grant agency wanted.”

What’s the incentive?
At this point, you may be wondering what the incentive is for getting grants. If scientists don’t get to keep the money, then what’s the point? Indeed, by getting my recent grant, I have just committed myself to a project that will consume several months of my time and increase my pay by exactly 0 dollars. So why am I excited by it?

There are several answers to that question. First, a scientist’s status is determined largely by both the number and quality of papers that he/she produces. So getting a grant is a good thing because it lets you do research, which lets you publish, which lets you build your reputation as a scientist. For well-established scientists, however, an extra paper doesn’t make that big of a difference. To be clear, it’s still important. Even if you have tenure, you are usually expected to publish at least occasionally. Further, most scientists are workaholics, and working 60+ hours a week is extremely common. So why would we kill ourselves over these projects that produce little in the way of tangible personal benefits?

The answer is the underlying reason why many of us got into science in the first place: we love asking and answering questions. Becoming a scientist is an extremely long, difficult, and expensive process, so unless you really love doing research, you probably aren’t going to stick with it all the way through the PhD. As a result, those who do make it through tend to be extremely passionate about their work. So scientists get excited when we get grants because grants let us do the really cool research projects that we want to do. As a scientist, you get to pick what you want to study. So you write grants for projects that you are interested in and care about. In my case, not only do I get to answer an interesting question, but it is a question with strong implications for the conservation of many species of amphibians, and as a herpetologist/conservation biologist, that is something that I care greatly about. So I am excited by this grant because it will let me do research that I think is interesting and important.

To be clear, I’m not suggesting that all scientists are driven entirely by their love of research. We’re human. We have the same faults as everyone else, and there are always exceptions to the norm, but most of the people who stick with science do so because they love it (it’s certainly not for the money, because we make diddlysquat). Go to a conference sometime and talk to the scientists about their work. I think that you might be surprised by how deeply they care about what they are doing.

Grant agencies don’t control your results
Perhaps the most common myth that I encounter about grants is the idea that granting agencies control the results of the projects that they fund. I constantly hear non-scientists assert that grants “come with strings attached,” and people often seem to think that when you get a grant, you agree to produce a particular result. In reality, in the majority of cases, grant agencies have absolutely no control whatsoever over your results. You tell them what you are going to test, not what result you are going to find. Also, at the end of your project, when you write your paper, you don’t have to submit it to them for approval before sending it to a journal. You do have to give them a written report of what you did, but they usually have no control over what you do with those data. In fact, in most cases, you can submit your paper for review before even informing them of your results. Just to give one comical example of this, an anti-vaccine group recently became very upset when scientists that they had funded published a paper showing a lack of evidence that vaccines cause autism.

Now, you may be thinking, “fine, they can’t stop you from publishing, but if you publish a result that they don’t like, you won’t get funding from them again.” First, that argument would only apply to the subset of grants that come from corporations or biased special interest groups (more on that in the next section). Second, within that subset, yes, using a Merck grant to publish a paper showing that one of Merck’s products is dangerous might prevent you from getting a grant from them again, but what is your incentive for getting another grant from them? Remember, in many situations, you don’t get to keep any of the funding. You don’t benefit financially from it. Your only benefits are personal satisfaction and the increase in your scientific reputation, but spending months working 60 hours a week on a project, then scrapping all of that work and falsifying the results is the exact opposite of personal satisfaction. Similarly, writing a crappy paper with false results is a good way to ruin your scientific reputation. So risking your career by falsifying results just so that you can get another grant for a project that you may also have to falsify is downright stupid.

To be 100% clear, within the medical literature, there are very real and serious biases that need to be dealt with. It is true that research that is sponsored by pharmaceutical companies is more likely to favor those companies, but there are several things to note there. First, pharmaceutical companies tend to either fund their own research or use contract research organizations rather than giving out grants to academics. In those situations, the scientists often do get their salaries from the companies, and the companies do have a very large say in writing the papers. So those situations are fundamentally different from what I am talking about in this post. Also, I do not mean to suggest that scientists are in no way biased. Of course we have biases, and receiving a large grant certainly can give you a subconscious bias towards the company that gave you the grant. That is, however, not at all the same as the type of deliberate collusion that many people assume takes place.  

Grants aren’t given out by politicians
This is a somewhat bizarre myth, but I run into it frequently. Many people seem to be under the impression that politicians control the entire granting process and they get to decide who does and does not get the funding. In reality, there are tons of different grants out there, many of which have no connections to governments. Further, even when the grants are from governments, the politicians generally have very little to do with who gets them. They may set general limits (like X amount of money is for human health research), but they don’t decide the actual people/projects that get the money. That decision is usually made by a panel of scientists (often independent scientists).

You can do research that disagrees with big companies
The final myth that I want to address is the notion that you will never get funding for a topic that opposes large corporations. Again, there are lots of grants that aren’t associated with companies, and corporations do not control the entire grant process. So this notion is quite ridiculous.

Other people modify this claim to make it more general and argue that you have to study whatever the grant agency wants you to, but that is backwards of how things usually work. You decide what interests you, then you write a proposal for that topic and send it to a grant agency that you think might be interested in it. If they like it and don’t have any better proposals, you’ll get the money. If they don’t like it, you won’t. They don’t tell you what to study, rather you tell them what you are going to study.

To be clear, getting grants is not easy, and you usually have to write lots of applications before you actually get one (my project received several rejections before a grant agency finally accepted it). Also, you often have to try to sell your project differently depending on where you are applying, but at the end of the day, the project is yours. For example, my project is a genomics project on the conservation of rainforest frogs. Some of the grants that I applied for were very conservation focused, so for those applications, I stressed the conservation implications of my work. Others were really interested in genomics research, so for those I stressed that aspect. Still others were concerned about rainforests, so for those I emphasized my study subjects’ role as members of the rainforest community. So each application that I wrote was tailored to the interests of the people that I was applying to, but it was the same project on each application, I just presented it differently. Also, it is worth mentioning that grant agencies do sometimes recommend that you change your methodologies if they think that what you proposed was inadequate, but those recommendations usually come from the scientists who reviewed the grant proposals, not from the actual source of the grant.

So yes, getting grants is difficult, and if you want to get one, your project is going to need to have a solid scientific basis. If you’re writing proposals for flat-earth research, you’re not going to get any money. Similarly, if you are working on a very boring topic with no real-world applications, you’re going to have trouble getting money, but if you have an interesting question with good scientific support behind it, you can get money somewhere. This is especially true if your research has implications for human health. If you actually had a compelling reason to think that a particular vaccine was dangerous, for example, you could absolutely get funding for it. If you don’t believe me, just look at the hundreds of papers on vaccine safety, many of which were not funded by pharmaceutical companies. The dozens of studies on vaccines and autism, for example, exist because scientists took the concern over vaccines causing autism seriously and invested heavily in studying it. The fact that there was a potential that the research would be negative for pharmaceutical companies did not prevent the research from going forward (again, just to be clear, many of those studies were not funded by “big pharma” so you can’t accuse them all of being corrupt without committing an ad hoc fallacy).

Finally, just in case you aren’t convinced that corporations don’t control the process, think about the thousands of papers that have been published on humans causing climate change. If large companies were actually capable of preventing important research from moving forward, then surely multi-billion dollar oil companies could have prevented those climatologists from getting money.

via PhD Comics

Biases
I don’t want to paint an overly rosy picture of grants. There are plenty of real problems with the system, and one of the biggest (in my opinion) is the biased nature of grants. Some topics are much easier to get money for than others. For example, within conservation biology, the vast majority of money goes to “cute and cuddly” animals. If you want to do conservation research on pandas, tigers, etc. there’s plenty of money for that, but if you want to study rattlesnakes, snails, lizards, etc., good luck and may the force be with you. For example, a recent review of the literature on Australian mammals found that 73% of the papers were on marsupials (kangaroos, wallabies, koalas, etc.) despite the fact that Australia is home to a great many bats and rodents. Getting funding for those “ugly” animals is much more difficult than getting funding for “cute” charismatic animals.

There are similar biases in the medical literature. Some diseases get way more attention than others even though the neglected diseases are often more important. For example, there is a mismatch between the burden created by different types of cancer and the amount that we spend studying those cancers.

Another huge and regrettable bias is the bias against replication studies. Grant agencies like to see novel research that is tackling new questions, but replication is (or at least should be) the cornerstone of science. Nevertheless, if you want to do exactly what someone else has already done, you are going to have a really hard time getting funding.

Summary
In short, there are very real problems with the way that grants work. There are enormous biases in their distribution, and some topics are very hard to get funding for. Also, conflicts of interest do sometimes exist and should be taken seriously; however, the way that many opponents of science characterize grants is completely false. In many situations, scientists do not get to keep any of the grant money for themselves, and grant agencies usually have no control over the final results of the project. Similarly, grant agencies do not tell you what to study, rather you tell them what you want to study. So this notion that you can’t get funding for a project that might negatively impact a large company is absurd. There are lots of grants out there and many of them are not affiliated with companies. Finally, many of the biggest grants are from governments, but those grants are usually distributed by panels of expert scientists, not politicians.

Posted in Nature of Science

Global warming isn’t natural, and here’s how we know

The cornerstone argument of climate change deniers is that our current warming is just a natural cycle, and this claim is usually accompanied by the statement, “the planet has warmed naturally before.” This line of reasoning is, however, seriously flawed both logically and factually. Therefore, I want to examine both the logic and the evidence to explain why this argument is faulty and why we are actually quite certain that we are the primary cause of our planet’s current warming.


The fact that natural climate change occurred in the past does not mean that the current warming is natural
I cannot overstate the importance of this point. Many people say, “but the planet has warmed naturally before” as if that automatically means that our current warming is natural, but nothing could be further from the truth. In technical terms, this argument commits a logical fallacy known as non sequitur (this is the fallacy that occurs whenever the conclusion of a deductive argument does not follow necessarily from the premises). The fact that natural warming has occurred before only tells us that it is possible for natural warming to occur. It does not indicate that the current warming is natural, especially given the evidence that it is anthropogenic (man-made).

To put this another way, when you claim that virtually all of the world’s climatologists are wrong and the earth is actually warming naturally, you have just placed the burden of proof on you to provide evidence for that claim. In other words, simply citing previous warming events does not prove that the current warming is natural. You have to actually provide evidence for a natural cause of the current warming, but (as I’ll explain shortly) no such mechanism exists.


Natural causes of climate change

Now, let’s actually take a look at the natural causes of climate change to see if any of them can account for our current warming trend (spoiler alert, they can’t).

Sun
The sun is an obvious suspect for the cause of climate change. The sun is clearly an important player in our planet’s climate, and it has been responsible for some warming episodes in the past. So if, for some reason, it was burning hotter now than in the past, that would certainly cause our climate to warm. There is, however, one big problem: it’s not substantially hotter now than it was in the recent past. Multiple studies have looked at whether or not the output from the sun has increased and whether or not the sun is responsible for our current warming, and the answer is a resounding “no” (Meehl et al. 2004; Wild et al. 2007; Lockwood and Frohlich 2007, 2008; Lean and Rind 2008; Imbers et al. 2014). It likely caused some warming in the first half of the 20th century, but since then, the output from the sun does not match the rise in temperatures (in fact it has decreased slightly; Lockwood and Frohlich 2007, 2008). Indeed, Foster and Rahmstorf (2011) found that after correcting for solar output, volcanoes, and El Niños, the warming trend was even more clear, which is the exact opposite of what we would expect if the sun was driving climate change (i.e., if the sun was the cause, then removing the effect of the sun should have produced a flat line, not a strong increase).

Finally, the most compelling evidence against the sun hypothesis and for anthropogenic warming is (in my opinion) the satellite data. Since the 70s, we have been using satellites to measure the energy leaving the earth (specifically, the wavelengths of energy that are trapped by CO2). Thus, if global warming is actually caused by greenhouse gasses trapping additional heat, we should see a fairly constant amount of energy entering the earth, but less energy leaving it. In contrast, if the sun is driving climate change, we should see that both the energy entering and leaving the earth have increased. Do you want to guess which prediction came true? That’s right, there has been very little change in the energy from the sun, but there has been a significant decrease in the amount of energy leaving the earth (Harries et al. 2001; Griggs and Harries. 2007). That is about as close to “proof” as you can get in science, and if you are going to continue to insist that climate change is natural, then I have one simple question for you: where is the energy going? We know that the earth is trapping more heat now than it did in the past. So if it isn’t greenhouse gasses that are trapping the heat, then what is it?


Milankovitch cycles
Other important drivers of the earth’s climate are long-term cycles called Milankovitch cycles, which involve shifts in the earth’s orbit, tilt, and axial wobble (or eccentricity, obliquity, and precession, if you prefer). In fact, these appear to be one of the biggest initial causes of prominent natural climate changes (like the ice ages). So it is understandable that people would suspect that they are driving the current climate change, but there are several reasons why we know that isn’t the case.

First, Milankovitch cycles are very slow, long-term cycles. Depending on which of the three cycles we are talking about, they take tens of thousands of years, or even roughly 100,000 years, to complete. So changes from them occur very slowly. In contrast, our current change is very rapid (happening over a few decades as opposed to a few millennia). So the rate of our current change is a clear indication that it is not being caused by Milankovitch cycles.
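For a rough sense of scale, here is a back-of-the-envelope comparison of the two rates. The temperature changes and timespans below are round illustrative figures (e.g., a deglaciation of roughly 5 °C over roughly 10,000 years), not precise reconstructions:

```python
# Back-of-the-envelope comparison of warming rates: a Milankovitch-driven
# deglaciation vs. the modern trend. All numbers are rough round figures
# used purely for illustration.

def warming_rate(delta_temp_c, years):
    """Return a warming rate in degrees C per century."""
    return delta_temp_c / years * 100

# Roughly ~5 C of warming spread over ~10,000 years at the end of an ice age:
milankovitch_rate = warming_rate(5.0, 10_000)  # ~0.05 C per century

# Roughly ~1 C of warming over the ~60 years since the mid-20th century:
modern_rate = warming_rate(1.0, 60)            # ~1.7 C per century

print(f"Milankovitch-scale rate: {milankovitch_rate:.2f} C/century")
print(f"Modern rate: {modern_rate:.2f} C/century")
print(f"Modern warming is roughly {modern_rate / milankovitch_rate:.0f}x faster")
```

Even with these crude numbers, the modern rate is more than an order of magnitude faster than the orbital-cycle pace, which is the point of the rate argument.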

Second, you need to understand how Milankovitch cycles affect the temperature. The eccentricity cycle could, in concept, directly cause global warming by changing the earth’s position relative to the sun; however, that would cause the climate to warm or cool by affecting how much energy from the sun hits the earth. In other words, we are back to the argument that climate change is caused by increased energy from the sun, which we know isn’t happening (see the section above).

The other cycles (precession and obliquity) affect the part of the earth that is warmed and the season during which the warming takes place, rather than affecting the total amount of energy entering the earth. Thus, they initially just cause regional warming. However, that regional warming leads to global warming by altering the oceans’ currents and warming the oceans, which results in the oceans releasing stored CO2 (Martin et al. 2005; Toggweiler et al. 2006; Schmittner and Galbraith 2008; Skinner et al. 2010). That CO2 is actually the major driver of past climate changes (Shakun et al. 2012). In other words, when we study past climate changes, what we find is that CO2 levels are a critically important factor, and, as I’ll explain later, we know that the current increase in CO2 is from us. Thus, when you understand the natural cycles, they actually support anthropogenic global warming rather than refuting it.


Volcanoes

At this point, people generally resort to claiming that volcanoes are actually the thing that is emitting the greenhouse gasses. That argument sounds appealing, but in reality, volcanoes usually emit less than 1% of the CO2 that we emit each year (Gerlach 2011). Also, several studies have directly examined volcanic emissions to see if they can explain our current warming, and they can’t (Meehl, et al. 2004; Imbers et al. 2014).


Carbon dioxide (CO2)

A final major driver of climate change is, in fact, CO2. Let’s get a couple of things straight right at the start. First, we know that CO2 traps heat, and we know that increasing the amount of CO2 in an environment will result in the temperature increasing (you can find a nice list of papers on the heat-trapping abilities of CO2 here). Additionally, everyone (even climate “skeptics”) agrees that CO2 plays a vital role in maintaining the earth’s temperature. From those facts, it is intuitively obvious that increasing the CO2 in the atmosphere will result in the temperature increasing. Further, CO2 appears to be responsible for a very large portion of the warming during past climate changes (Lorius et al. 1990; Shakun et al. 2012). Note: For past climate changes, the CO2 does lag behind the temperature initially, but as I explained above, the initial warming triggers an increase in CO2, and the CO2 drives the majority of the climate change.

At this point, you may be thinking, “fine, it’s CO2, but the CO2 isn’t from us, nature produces way more than we do.” It is true that nature emits more CO2 than us, but prior to the industrial revolution, nature was in balance, with the same amount of CO2 being removed as was emitted. Thus, there was no net gain. We altered that equation by emitting additional CO2. Further, the increase that we have caused is no little thing. The concentration has increased by over 40% since the start of the industrial revolution, and the current concentration of CO2 in the atmosphere is higher than it has been at any point in the past 800,000 years. So, yes, we only emit a small fraction of the total CO2 each year, but we are emitting more CO2 than nature can remove, and a little bit each year adds up to a lot over several decades.
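The "small net imbalance adds up" argument can be sketched in a few lines of arithmetic. All of the flux numbers below are made up purely to illustrate the logic of the carbon balance, not real measurements:

```python
# Illustrative sketch of the carbon balance argument: nature emits and
# absorbs huge, roughly equal amounts of CO2 each year, so the *net* change
# comes from the comparatively small human contribution. All numbers are
# invented for illustration (treated here as ppm of atmospheric CO2).

natural_emissions = 100.0   # nature emits a lot...
natural_uptake    = -102.0  # ...and sinks absorb slightly more (including
                            # part of what we emit)
human_emissions   = 4.0     # small next to nature's gross fluxes

net_change_per_year = natural_emissions + natural_uptake + human_emissions
print(net_change_per_year)  # a small net gain each year

# A small net gain compounds over decades:
concentration = 280.0       # rough pre-industrial level, ppm
years = 75
concentration += net_change_per_year * years
print(concentration)                 # ends up well above the baseline
print(concentration / 280.0 - 1)     # i.e., more than a 40% increase
```

The key point the sketch captures: the size of our emissions relative to nature's gross fluxes is irrelevant; what matters is that the system was roughly balanced before, and our addition tips it into a persistent net gain.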


These data come from Wei et al. 2009, but the legend of this figure was modified for readability by skepticalscience.com (the data themselves were in no way manipulated as you can see in Figure 4 of Wei et al. 2009)

Additionally, we know that the current massive increase in CO2 is from us because of the C13 levels. Carbon has two stable isotopes (C12 and C13), but C13 is heavier than C12. Thus, when plants take carbon from the air and use it to make carbohydrates, they take a disproportionate amount of C12. As a result, the C13/C12 ratios in plants, animals (which get their carbon from eating plants), and fossil fuels (which are formed from plants and animals) are lower (i.e., relatively enriched in C12) than the C13/C12 ratio of the atmosphere. Therefore, if burning fossil fuels is responsible for the current increase in CO2, we should see the ratio of C13/C12 in the atmosphere shift to be closer to that of fossil fuels (i.e., contain relatively more C12), and, guess what, that is exactly what we see (Bohm et al. 2002; Ghosh and Brand 2003; Wei et al. 2009). This is unequivocal evidence that we are the cause of the current increase in CO2.
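The direction of that isotopic shift can be illustrated with a simple two-source mixing calculation. The delta-13C values and CO2 amounts below are rough illustrative figures of my choosing, and this sketch ignores ocean and biosphere exchange, which damps the real-world shift; the point is only that mixing in carbon that is depleted in C13 must pull the atmospheric signature downward:

```python
# Two-source mixing sketch: adding isotopically "light" fossil-fuel carbon
# pulls the atmosphere's C13/C12 signature toward that of fossil fuels.
# All numbers are rough illustrative figures (delta-13C in per mil); ocean
# and biosphere exchange, which damps the real shift, is ignored here.

old_ppm = 280              # assumed pre-industrial CO2
added_ppm = 120            # assumed fossil-derived addition
delta_atmosphere = -6.5    # assumed pre-industrial atmospheric delta-13C
delta_fossil = -28.0       # assumed delta-13C of fossil carbon (plant-derived, C12-rich)

# Concentration-weighted average of the two sources:
mixed_delta = (old_ppm * delta_atmosphere + added_ppm * delta_fossil) / (old_ppm + added_ppm)
print(f"Mixed atmospheric delta-13C: {mixed_delta:.2f} per mil")
```

The mixed value is necessarily more negative than the starting atmospheric value, which is the qualitative shift reported in the isotope studies cited above.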

Finally, we can assemble all of this information into a deductive logical argument (as illustrated on the left). If CO2 traps heat, and we have increased the CO2 in the atmosphere, then more heat will be trapped. To illustrate how truly inescapable that conclusion is, here is an analogous argument:
1). Insulation traps heat
2). You doubled the insulation of your house
3). Therefore, your house will trap more heat

You cannot accept one of those arguments and reject the other (doing so is logically inconsistent).

Note: Yes, I know that the situation is much more complex than simply CO2 trapping heat, and there are various feedback mechanisms at play, but that does not negate the core argument.

 

This figure from Hansen et al. 2005 shows the effect of both the natural and anthropogenic drivers of climate change. Notice how only anthropogenic sources show a large warming trend. Also, see figure 2 of Meehl et al. 2004.


Putting the pieces together
So far, I have been talking about all of the drivers of climate change independently, which is clearly an oversimplification because, in all likelihood, several mechanisms are acting together. Therefore, the best way to test whether or not the current warming is natural is actually to construct statistical models that include both natural and man-made factors. We can then use those models to see which factors are causing climate change. We have constructed many such models, and they consistently show that natural factors alone cannot explain the current warming (Stott et al. 2001; Meehl et al. 2004; Allen et al. 2006; Lean and Rind 2008; Imbers et al. 2014). In other words, including human greenhouse gas emissions in the models is the only way to get the models to match the observed warming. This is extremely clear evidence that the current warming is not entirely natural. To be clear, natural factors do play a role and are contributing, but human factors are extremely important, and the models show that they account for the majority of the warming.
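To make the fitting logic concrete, here is a toy sketch with entirely invented series: a solar-cycle-like wiggle, a steadily rising "anthropogenic" forcing, and noisy "observations" built from both. Real attribution studies (such as those cited above) use measured forcings and observations, but the core idea is the same: a natural-only model leaves a large unexplained residual, while adding the anthropogenic term closes the gap.

```python
import math
import random

# Toy attribution sketch. All series are invented for illustration; real
# studies fit measured forcings against the observed temperature record.
random.seed(0)
years = list(range(1900, 2020))

natural = [0.1 * math.sin(2 * math.pi * (y - 1900) / 11) for y in years]  # solar-cycle-like wiggle
anthro = [0.008 * (y - 1900) for y in years]                              # steadily rising forcing
observed = [n + a + random.gauss(0, 0.05) for n, a in zip(natural, anthro)]

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

# Natural-only model: best least-squares scaling of the natural series.
scale = sum(n * o for n, o in zip(natural, observed)) / sum(n * n for n in natural)
rmse_natural_only = rmse([scale * n for n in natural], observed)

# Natural + anthropogenic model: solve the 2x2 least-squares normal equations.
sxx = sum(n * n for n in natural)
syy = sum(a * a for a in anthro)
sxy = sum(n * a for n, a in zip(natural, anthro))
sxo = sum(n * o for n, o in zip(natural, observed))
syo = sum(a * o for a, o in zip(anthro, observed))
det = sxx * syy - sxy * sxy
c1 = (sxo * syy - syo * sxy) / det
c2 = (syo * sxx - sxo * sxy) / det
rmse_both = rmse([c1 * n + c2 * a for n, a in zip(natural, anthro)], observed)

print(f"natural-only RMSE: {rmse_natural_only:.3f}")
print(f"natural + anthropogenic RMSE: {rmse_both:.3f}")
```

With the synthetic data above, the natural-only fit misses the warming trend entirely, so its error is many times larger than that of the combined model, which mirrors the published attribution results.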


Correlation vs. causation
It is usually about now that opponents of climate science start to argue that scientists are actually committing a correlation fallacy and that simply showing a correlation between temperature and the CO2 that we produce does not mean that the CO2 is causing the temperature increase. There are, however, several problems with that argument.

First, correlation can indicate causation under certain circumstances; namely, situations in which you have controlled for all confounding factors. In other words, if you can show that Y is the only thing that is changing significantly with X, then you can reach a causal conclusion (even placebo-controlled drug trials are really just showing correlations between taking the drug and recovery, but because they used the control, they can use that correlation to reach a causal conclusion). In the case of climate change, of course, we have examined the confounding factors. As I explained in the previous section, we have constructed statistical models with the various drivers of climate change, and anthropogenic greenhouse gasses are necessary to account for the current warming. In other words, we have controlled for the other causes of climate change; therefore, we can reach a causal conclusion.


Smoking and lung/bronchial cancer rates (data via the CDC).

Second, and perhaps more importantly, there is nothing wrong with using correlation to show a particular instance of causation if a causal relationship between X and Y has already been established. Let me give an example. The figure to the right shows the smoking rates and lung/bronchial cancer rates in the US. There is an obvious correlation between the two (P < 0.0001), and I don’t think that anyone is going to disagree with the notion that the decrease in smoking is largely responsible for the decrease in lung cancers. Indeed, there is nothing wrong with reaching that conclusion, and it does not commit a correlation fallacy. This is the case because a causal relationship between smoking and cancer has already been established. In other words, we know that smoking causes cancer because of other studies. Therefore, when you see that the two are correlated over time, there is nothing wrong with inferring that smoking is driving the cancer rates. Similarly, we know from laboratory tests and past climate data that CO2 traps heat and that increasing it results in more heat being trapped. In other words, a causal relationship between CO2 and temperature has already been established. Therefore, there is nothing fallacious about looking at a correlation between CO2 and temperature over time and concluding that the CO2 is causing the temperature change.
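For readers who want to see what "correlated over time" means numerically, here is a minimal Pearson correlation calculation. The two series below are invented illustrative numbers (not the CDC data in the figure), chosen so that both decline together the way smoking and lung-cancer rates have:

```python
import math

# Invented illustrative series (NOT the CDC data from the figure): a
# smoking rate (%) and a lung-cancer incidence that both decline over time.
smoking = [42, 40, 37, 35, 33, 30, 28, 25, 23, 21]
cancer  = [78, 77, 74, 72, 70, 66, 64, 61, 59, 56]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(smoking, cancer)
print(f"r = {r:.3f}")  # strongly positive: the two series fall together
```

A correlation this strong, combined with the independently established causal link, is what licenses the inference in the paragraph above; the correlation alone would not.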


Ad hoc fallacies and the burden of proof
At this point, I often find that people are prone to proposing that some unknown mechanism exists that scientists haven’t found yet. This is, however, a logical fallacy known as an ad hoc fallacy. You can’t just make up an unknown mechanism whenever it suits you. If that were valid, then you could always reject any scientific result that you wanted, because it is always possible to propose some unknown mechanism. Similarly, you can’t use the fact that scientists have been wrong before as evidence, nor can you argue that, “there are still things that we don’t understand about the climate, so I don’t have to accept anthropogenic climate change” (that’s an argument from ignorance fallacy). Yes, there are things that we don’t understand, but we understand enough to be very confident that we are causing climate change, and, once again, you can’t just assume that all of our current research is wrong.

The key problem here is the burden of proof. By claiming that there is some other natural mechanism out there, you have just placed the burden of proof squarely on your shoulders. In other words, you must provide actual evidence of such a mechanism. If you cannot do that, then your argument is logically invalid and must be rejected.


Summary/Conclusion
Let’s review, shall we?

  • We know that it’s not the sun
  • We know that it’s not Milankovitch cycles
  • We know that it’s not volcanoes
  • We know that even when combined, natural causes cannot explain the current warming
  • We know that CO2 traps heat
  • We know that increasing CO2 causes more heat to be trapped
  • We know that CO2 was largely responsible for past climate changes
  • We know that we have greatly increased the CO2 in the atmosphere
  • We know that the earth is trapping more heat now than it used to
  • We know that including anthropogenic greenhouse gasses in the models is the only way to explain the current warming trend

When you look at that list of things that we have tested, the conclusion that we are causing the planet to warm is utterly inescapable. For some baffling reason, people often act as if scientists have never bothered to look for natural causes of climate change, but the exact opposite is true. We have carefully studied past climate changes and looked at the natural causes of climate change, but none of them can explain the current warming. The only way to account for our current warming is to include our greenhouse gasses in the models. This is extremely clear evidence that we are causing the climate to warm, and if you want to continue to insist that the current warming is natural, then you must provide actual evidence for the existence of a mechanism that scientists have missed, and you must provide evidence that it is a better explanation for the current warming than CO2. Additionally, you are still going to have to refute the deductive argument that I presented earlier (i.e., show that a premise is false or that I committed a logical fallacy), because finding a previously unknown mechanism of climate change would not discredit the importance of CO2 or the fact that we have greatly increased it. Finally, you also need to explain why the earth is trapping more heat than it used to. If you can do all of that, then we’ll talk, but if you can’t, then you must accept the conclusion that we are causing the planet to warm.

 Literature cited

  • Allen et al. 2006. Quantifying anthropogenic influence on recent near-surface temperature change. Surveys in Geophysics 27:491–544.
  • Bohm et al. 2002. Evidence for preindustrial variations in the marine surface water carbonate system from coralline sponges. Geochemistry, Geophysics, Geosystems 3:1–13.
  • Foster and Rahmstorf. 2011. Global temperature evolution 1979–2010. Environmental Research Letters 7:011002.
  • Gerlach 2011. Volcanic versus anthropogenic carbon dioxide. EOS 92:201–202.
  • Ghosh and Brand. 2003. Stable isotope ratio mass spectrometry in global climate change research. International Journal of Mass Spectrometry 228:1–33.
  • Griggs and Harries. 2007. Comparison of spectrally resolved outgoing longwave radiation over the tropical Pacific between 1970 and 2003 Using IRIS, IMG, and AIRS. Journal of Climate 20:3982-4001.
  • Hansen et al. 2005. Earth’s energy imbalance: confirmation and implications. Science 308:1431–1435.
  • Harries et al. 2001. Increases in greenhouse forcing inferred from the outgoing longwave radiation spectra of the Earth in 1970 and 1997. Nature 410:355–357.
  • Imbers et al. 2014. Sensitivity of climate change detection and attribution to the characterization of internal climate variability. Journal of Climate 27:3477–3491.
  • Lean and Rind. 2008. How natural and anthropogenic influences alter global and regional surface temperatures: 1889 to 2006. Geophysical Research Letters 35:L18701.
  • Lockwood and Frohlich. 2007. Recently oppositely directed trends in solar climate forcings and the global mean surface air temperature. Proceedings of the Royal Society A 463:2447–2460.
  • Lockwood and Frohlich. 2008. Recently oppositely directed trends in solar climate forcings and the global mean surface air temperature. II. Different reconstructions of the total solar irradiance variation and dependence on response time scale. Proceedings of the Royal Society A 464:1367–1385.
  • Lorius et al. 1990. The ice-core record: climate sensitivity and future greenhouse warming. Nature 347:139–145.
  • Martin et al. 2005. Role of deep sea temperature in the carbon cycle during the last glacial. Paleoceanography 20:PA2015.
  • Meehl, et al. 2004. Combinations of natural and anthropogenic forcings in the twentieth-century climate. Journal of Climate 17:3721–3727.
  • Schmittner and Galbraith 2008. Glacial greenhouse-gas fluctuations controlled by ocean circulation changes. Nature 456:373–376.
  • Shakun et al. 2012. Global warming preceded by increasing carbon dioxide concentrations during the last deglaciation. Nature 484:49–54.
  • Skinner et al. 2010. Ventilation of the deep Southern Ocean and deglacial CO2 rise. Science 328:1147-1151.
  • Stott et al. 2001. Attribution of twentieth century temperature change to natural and anthropogenic causes. Climate Dynamics 17:1–21.
  • Toggweiler et al. 2006. Mid-latitude westerlies, atmospheric CO2, and climate change during the ice ages. Paleoceanography 21:PA2005.
  • Wei et al. 2009. Evidence for ocean acidification in the Great Barrier Reef of Australia. Geochimica et Cosmochimica Acta 73:2332–2346.
  • Wild et al. 2007. Impact of global dimming and brightening on global warming. Geophysical Research Letters

 

 

 

 


Even if medical errors were the 3rd leading cause of death, that wouldn’t be as bad as it sounds

There has recently been a lot of hype over a “new” study claiming that medical errors are the third leading cause of death in the US (this was really just a rehash of previous studies). Dr. Gorski has already done a fantastic job of explaining why that estimate is likely inaccurate, so I am going to focus on a slightly different issue that he only talked about briefly. Namely, even if the estimate is accurate (which it’s not), it would not support the conclusion that we should abandon modern medicine and switch to “alternative” medicines. In other words, many people try to use statistics like this to claim that modern medicine is actually dangerous and should be avoided, and that claim is extremely flawed for several reasons.

UPDATE 2019: New research has provided more evidence that the actual rate of deaths by medical errors is substantially lower than reported in the study I talked about here. I have updated the title of this post to better reflect that (it was originally “Medical errors may be the 3rd leading cause of death, but that’s not as bad as it sounds”). Dr. Gorski discussed the new research over at Science-Based Medicine.

Image via the CDC


First, as I recently explained in a different post, mortality rates in the US have been steadily decreasing with time, and that decrease is largely due to modern medicine. Yes, sanitation played a role back when we first switched to indoor plumbing and stopped letting raw sewage flow down our streets, but sanitation standards in the US haven’t changed appreciably in decades, yet the mortality rates continue to drop. So you clearly cannot give sanitation the credit for the continued decreases. To put this another way, modern medicine is obviously a good thing because of all the lives that it saves. If you stop and think about this for half a second, it should make perfect sense. Think of all the diagnostic tools, machines, procedures, etc. that we have today that we didn’t have a few decades ago.

Just to give one nice example, the first heart transplant was done in 1967, yet as of 2008, there were close to 2,000 heart transplants per year in the US, and that’s just one procedure for one organ. There are almost innumerable conditions that we can treat or prevent today that would have been fatal just a few decades ago, and just in case you aren’t convinced, think about this: if you were in a serious car accident, would you want to be treated in a modern hospital, using modern medical techniques, or would you want to be treated in a hospital from the 1950s that only used the techniques and medicines available then? I’m betting you’d pick the former (I’m also betting that you would want an actual hospital and actual doctors, not an uncertified clinic and a naturopath).

This brings me to my second major point. Yes, there are risks associated with seeking professional medical help, but there are also enormous risks to not getting help, and there are enormous benefits from getting help. To put this another way, you can’t make an argument based on a claim like, “medical errors are the third leading cause of death in the US” without also considering the number of people who are alive because of modern medicine. You have to consider the risks and the benefits, rather than just the risks.

Third, you also have to consider the fact that many of the people who are listed as “died from a medical error” also would have died if doctors had not tried to intervene. Let’s say, for example, that someone needs a heart transplant to live, and, during the surgery, a doctor screws up and does something that kills the patient. That counts as death by medical error, but the patient still would have died even if the doctor hadn’t done anything! In other words, this example (and many actual cases like it) is not a situation where a healthy person walked into a hospital and was killed. Rather, it is a situation where the person was at death’s door, and the doctors failed to save them. To be clear, I am not justifying medical mistakes, nor am I saying that we shouldn’t be trying to prevent them, but you need to understand that many of them are not adding to the total death toll.

To give a hypothetical example that will illustrate the point, imagine that there is a fatal condition that kills 1,000,000 people annually, and someone comes up with a treatment that cures 50% of patients, but kills the other 50%. It clearly would not make sense to argue that you should avoid the treatment because it kills 50% of patients. Yes, it has a high death rate, but that death rate is still 50% lower than the death rate without the treatment. Even so, yes, many people do die while in hospitals, but the total death toll is still way lower than it would be without hospitals (Note: for my example, assume that this is a condition that kills you quite quickly, rather than a condition that you can live with for weeks or even years).
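The arithmetic of that hypothetical is worth spelling out, because it is exactly the calculation that "medical errors kill X people" arguments skip:

```python
# Arithmetic for the hypothetical above: a condition kills 1,000,000 people
# per year; a treatment cures 50% of patients and kills the other 50%.

deaths_without_treatment = 1_000_000
deaths_with_treatment = int(1_000_000 * 0.5)  # the 50% the treatment fails to save

lives_saved = deaths_without_treatment - deaths_with_treatment
print(f"Deaths without treatment: {deaths_without_treatment:,}")
print(f"Deaths with treatment (all counted as 'death by treatment'): {deaths_with_treatment:,}")
print(f"Net lives saved: {lives_saved:,}")
```

Every one of those 500,000 deaths would be recorded as "killed by the treatment," yet the treatment still cuts the death toll in half; that is why the raw error count, by itself, tells you almost nothing.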

Finally, you may be thinking, “well, why not just switch to alternative medicines, I mean, what’s the harm?” First, most alternative medicines don’t work (or at least haven’t been tested and shown to work). So using something that either has been shown not to work or has never been tested rather than using something that has been shown to work is foolhardy. For example, by now everyone is probably familiar with the case of Ezekiel Stephan, the Canadian child who died when his parents tried to treat his meningitis with natural remedies rather than science-based medicine. This is only one among many such cases, and it clearly illustrates the dangers of avoiding medical treatments. Further, even for things like giving birth, the neonatal mortality rates are significantly lower for births at hospitals when compared to home births. So there are very real consequences to avoiding science-based medicine.

In short, science is not perfect, but it is still far better than anything else that we have. Although there are many deaths from medical errors, there are also countless millions of lives that are saved by medical successes. Further, the death rate from medical errors appears artificially high because the estimates include many deaths that would have occurred even if the patient hadn’t sought medical help. So yes, there are many deaths from medical errors, and yes, we should be striving to correct the problems with our medical system, but it is absurd to suggest that we should throw out modern medicine altogether and return to the type of woo and snake oil that was prevalent in the dark ages. To use an old adage, you shouldn’t throw out the baby with the bath water.


When is it reasonable to demand more studies?

I recently wrote a post in which I reviewed the scientific literature on vaccines and autism, and the responses from the anti-vaccine crowd were predictable. The most common of these responses followed the basic format of, “but it could be X, and scientists haven’t thoroughly studied X yet.” This line of reasoning is very common in anti-science circles. It shows up often in debates on GMOs, climate change, natural remedies, etc., and it is honestly somewhat of a confusing topic because, unlike many arguments used against science, this one is not automatically fallacious. There are times in which it is a completely legitimate argument, but there are other times when it is only somewhat legitimate, and other times when it is totally invalid. Thus, these arguments fall along a spectrum and there is no clear and universal set of rules for determining whether or not a particular argument is valid. Nevertheless, I want to talk about some clues and tools that you can use to help judge whether or not this argument is being used appropriately.

Science can never test everything
First, I want to explain the background of why this line of reasoning is potentially problematic. Technically speaking, science never proves anything. It just shows what is most likely to be true given the current evidence. Part of the reason for that is simply that it is impossible to test every possibility, which is why the argument is often fallacious. No matter how thoroughly a topic has been tested, there will always be some aspect of it that hasn’t been tested, which means that you can always make the argument that, “X hasn’t been tested, so it could be X.”

In the case of vaccines and autism, I frequently encounter arguments like, “well we haven’t tested this specific combination of vaccines” or “what if the vaccines interact with this particular genetic condition.” The problem is that we can make an infinite number of these. For example, “what if the vaccine interacts with this specific food that is often eaten within 24 hours of being vaccinated” or “what if playing on a playground shortly after receiving a vaccine is what causes it” or “what if watching TV after receiving a vaccine causes it.” Hopefully you get the point. We can always make these hypotheses, so you need to justify the hypothesis with something other than simply, “well scientists haven’t tested it.”

Logical fallacies
If you are either knowledgeable about logical fallacies or you read this blog often, you can probably spot the core fallacy that is behind this argument. It is, of course, the argument from ignorance fallacy. In its simplest terms, this fallacy occurs whenever you say, “we don’t know, therefore it’s X.” In other words, it uses a gap in our knowledge as an excuse to insert your own view. Creationists are probably the most famous for this and frequently make “God of the gaps” arguments where they say, “we don’t know how X works/is formed, therefore God did it.” The problem should be obvious: when we don’t know something, all that you can say is, “we don’t know.” You can’t try to fill that knowledge gap with your pet hypothesis. This means that even when it is valid to bring up the fact that scientists haven’t tested X, you can never jump to the conclusion that X is true. All that you can say in that situation is, “we don’t know, and we need to do more testing.” You can’t assume that X is actually correct, yet that is often what people do. They jump from, “scientists haven’t tested possibility X” all the way to, “possibility X is true.”

Additionally, these arguments often lead to a logical blunder known as shifting the goal posts. It happens when you claim that X is true, then after X is debunked, you claim that it’s actually Y, then after Y is debunked, you claim that it is Z, etc. In other words, you keep changing your hypothesis without addressing your underlying view. A great example of this comes from thimerosal and the MMR vaccine. For years, anti-vaccers swore that it was thimerosal and the MMR vaccine that were causing autism, but that hypothesis has now been thoroughly debunked, and thimerosal is no longer in childhood vaccines. As a result, many anti-vaccers have shifted to claiming that it is an effect of multiple vaccines, or that it is aluminum, or an interaction with a genetic mechanism, etc. If you look through their arguments, you find that this happens over and over again. They claim that, “vaccines cause X via mechanism Y,” then after mechanism Y is thoroughly tested, they shift to, “well vaccines still cause X, but it is actually mechanism Z.” Hopefully you can spot the problem here. It is always possible to keep shifting your hypothesis, but doing so usually involves an ad hoc fallacy and is, therefore, not valid.

Proposing impossible studies
Now that the background is out of the way, let’s look at some of these arguments and why they often don’t work. One of the biggest clues that someone is not using this line of reasoning properly occurs when they demand impossible studies. For example, I have encountered GMO opponents and anti-vaccers who have said that they won’t be convinced until the technologies have been used for over 300 years. That is, however, clearly absurd. It is an arbitrary demand, and it is one that simply cannot be met in their lifetime. It’s the same thing as a creationist saying that he won’t accept evolution unless he can use a time machine to go back in time and see it for himself. These arguments are cop-outs that are no different from openly stating that nothing will ever convince you that you are wrong.

Another common argument by anti-vaccers is to demand additional randomized controlled trials. However, no ethics committee anywhere in the world is going to approve that study (except for new vaccines*) because the benefits of vaccines are so clear and well established. So once again, this is a demand for a study that simply isn’t going to happen.

*Note: Vaccines do actually undergo randomized trials when they are first being tested for approval. Anti-vaccers ignore this.

Similarly, anti-vaccers often demand a study that compares completely unvaccinated children to fully vaccinated children. At a quick glance, that may seem reasonable, but it is actually very problematic for a number of reasons. For one thing, the number of fully unvaccinated children in developed countries is quite low, and only a small fraction of them will have reliable and detailed medical records that you are able to access in a consistent manner. Further, you are going to have to limit your sample to a cohort of only a few years, otherwise you introduce numerous confounders (e.g., the vaccine schedule has changed over time, the genetic makeup of the population changes over time, routine medications change, etc.). Additionally, you are going to have to find a way to control for socioeconomic factors, geographic area, etc. By the time that you account for all of those, you are going to have a fairly small sample size.

Further, even if you could obtain a large sample size, you have a huge confounding factor that is nearly impossible to deal with. Namely, a family that refuses all vaccines very likely has other medical and lifestyle choices that differ greatly from the average person. Many people who fully refuse vaccines also use all manner of “alternative medicines,” shun science-based medicines, have bizarre diets, etc. This means that you are almost certainly going to have very large and important differences between your two groups other than just vaccination status, and that completely robs you of the ability to assign causation. So even though it sounds good in concept, it is just not a realistic study design.

Plausibility
Another important consideration is the plausibility of the hypothesis in question. You can generally use our current knowledge to figure out whether or not a hypothesis is likely to be true, and demanding research into that hypothesis is usually only reasonable when there is a high probability that the hypothesis will be true (more details here and here).

For example, even though there have been lots of studies of homeopathy, there are still many people who insist that we need more, large studies before we can conclude that it doesn’t work. In reality, however, homeopathy makes absolutely no sense. It breaks multiple fundamental concepts in science, and makes absurd claims like, “chemicals become more potent when they are diluted.” As such, it is exceptionally unlikely that it could work, and studying it is, quite frankly, a complete waste of time and money. You might as well study whether or not pigs can fly or if the moon is made of cheese.

Many (perhaps most) “cures” for cancer also fall into this category. For example, there are many people who insist that drinking Miracle Mineral Solution (which is just an industrial bleach) will cure all types of cancer (as well as “95% of all diseases”). You will not, however, find many large properly controlled studies on using it as a treatment for any of these conditions. Why? Quite simply, because it doesn’t make even the slightest bit of sense. We understand an awful lot about how cancer, bacteria, viruses, etc. work, and because of that knowledge, we know that it is extraordinarily unlikely that Miracle Mineral Solution could live up to any of its claims; therefore, we should be investing time and money into plausible solutions, rather than wasting it on utter nonsense (I explained the absurdity of Miracle Mineral Solution in detail here).

I could give countless additional examples of implausible hypotheses, but rather than continuing to provide examples, I want to make one final point. Namely, the fact that someone proposed a scientific sounding hypothetical pathway does not automatically mean that the pathway is plausible. The human body is amazingly complex, and hypothetical biochemical pathways rarely work in reality. Further, even if you can demonstrate that, in isolation, A causes B, B causes C, and C causes D, that does not automatically mean (or even suggest) that A will lead to D in humans.

For example, there will nearly always be a dose response. So you often end up with a situation where a large dose of A causes B, a large amount of B causes C, etc., but in humans A may only produce a small amount of B, in which case the chain will not proceed to C. There are also lots of other things to consider such as interactions with other chemicals and organs.

To give one actual example, a common pathway claimed by anti-vaccers is that vaccines sometimes cause fevers, and in some cases fevers can cause oxidative stress, and some types of oxidative stress are correlated with autism, therefore vaccines can cause autism. There are numerous problems here. First, yes, vaccines sometimes cause fevers, but those fevers are far less severe than the fevers caused by the diseases that vaccines prevent, so if this pathway were true, vaccines would likely help to prevent autism. Second, correlation clearly does not suggest causation, so just showing that oxidative stress is correlated with autism absolutely does not mean (or even suggest) that oxidative stress actually causes autism. Further, even if oxidative stress actually did cause autism, there would clearly be a certain level of oxidative stress that is required to cause autism, so you would need to establish that vaccines actually cause that level of sustained oxidative stress. In short, there are multiple steps that are questionable or likely don’t work. Now, obviously that doesn’t mean that the pathway definitely doesn’t work, but there is also no good reason to think that it does work, especially given the extent of the studies on vaccines and autism, which leads me into my next major topic.

Thoroughness of current testing
You should carefully consider the strength of the current testing. For example, many people continue to insist that we need more testing before eating GMOs, but in reality we have over 1,700 studies on their safety and environmental effects. Is it possible that there is a dangerous effect that all of those studies missed? Sure, but is it likely? Absolutely not! As such, continuing to insist that we need more studies before we should use them is clearly nonsensical. The ship of reasonable doubt has sailed, and we are well past the point of adequate testing.

Vaccines are the same story. Is it technically possible that vaccines cause oxidative stress which leads to autism, and that a large study comparing fully vaccinated and fully unvaccinated children would reveal that effect? Yes, but it is incredibly unlikely given the massive studies that we currently have (many of which include over 100,000 children, and the largest of which includes over 1.2 million children, details here). To put this another way, if vaccines caused autism, there should be a dose response, and the current studies have the power to detect even very tiny effect sizes. So even if, as anti-vaccers often claim, the current studies compared children who received every vaccine but one (usually MMR) with children who received every vaccine, there would still be a dose response, and you would still expect it to show up in the large studies. So the hypothetical pathways proposed by anti-vaccers are highly unlikely, because if they were true, then the large studies should have found that children who received more vaccines had a higher rate of autism.
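To illustrate what “power to detect even very tiny effect sizes” means, here is a rough sketch of a power calculation using the standard normal approximation for a two-sample difference in proportions. The baseline prevalence and group sizes are illustrative assumptions chosen for this sketch, not figures taken from the studies cited above.

```python
# Rough power sketch: how small an increase in prevalence could a study with
# ~600,000 children per group detect? Baseline prevalence and sample sizes
# are illustrative assumptions, not figures from any particular study.
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power(p0, p1, n):
    """Approximate power of a two-sample test for a difference in proportions."""
    z_alpha = 1.96  # critical value for a two-sided test at alpha = 0.05
    se = sqrt(p0 * (1 - p0) / n + p1 * (1 - p1) / n)
    z = abs(p1 - p0) / se
    return norm_cdf(z - z_alpha)

p0 = 0.015     # assumed baseline autism prevalence (~1.5%)
n = 600_000    # assumed children per group (1.2 million total)

# Find the smallest relative risk detectable with at least 80% power.
rr = 1.0
while power(p0, p0 * rr, n) < 0.80:
    rr += 0.001
print(f"Detectable relative risk: ~{rr:.3f}")
```

Under these assumptions, the minimal detectable relative risk comes out only a few percent above 1.0, which is the sense in which studies of this size can rule out all but vanishingly small effects.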

Likelihood that the risk will outweigh the benefits
Finally, when considering whether or not the demand for more research is reasonable, we also have to consider the known benefits and risks as well as the likelihood that the proposed hypothesis is true. In other words, every decision has risks associated with it, so you have to weigh those risks against the benefits. In the case of vaccines, the odds that they cause autism are quite low, whereas the known benefits are extremely high. To put this another way, if you demand more studies before we use vaccines, then what you are saying is, “we should not use a medicine that we know has enormous benefits and saves millions of lives annually because there is a slight chance that it causes a disability.” Hopefully you see why that is a problem. The known benefits far outweigh the hypothetical risks. Similarly, with GMOs, those of us in the developed world sit around demanding yet another study just in case it finds a problem that the previous 1,700 missed, while people in developing countries suffer from an extremely serious lack of food and proper nutrition that we know GMOs could help to ameliorate.

Conversely, you can have situations where it is reasonable to take action because the risk is enormous and far outweighs the cost of taking action. For example, many people insist that we need more studies before we take action on climate change, and I often hear people say things like, “if the planet is still warming in 100 years, then I’ll believe it.” In reality, of course, we have lots of very solid scientific evidence and logical reasons for concluding that we are causing the climate to change. Therefore, the hypothesis that climate change is just natural is extremely unlikely. More to the point, the consequences of climate change are so dire that waiting for additional evidence before taking action is foolhardy. It’s like standing on a sinking ship after the order to abandon ship has been given and saying, “I want more evidence before I get off this boat. If it is still sinking in a few hours, then I’ll believe it.”

Conclusion
In short, there are many situations in which it is unreasonable to demand additional studies before we take action, and you need to consider things like whether the proposed study is even possible, whether the hypothesis is plausible, the strength of the current evidence, and the known benefits of the thing in question. More often than not, when people demand additional studies, they are asking for impossible studies of implausible hypotheses, despite a mountain of contrary evidence and extraordinary known benefits. In other words, although the demand for more studies may seem reasonable at first, it is often just a facade used to make a position seem reasonable when, in fact, no amount of evidence will ever change the mind of the person making the demand.
