Ever since I was a child, I have been fascinated by science. Back then, science was reading about astronomy, physics, chemistry, and biology, and the wonderful theories proposed to explain the facts of the world. The method used by scientists in this journey is the scientific method. Although the exact nature of the scientific method has been examined by philosophers for hundreds of years, its core of falsifiable hypotheses and experimentation holds a great beauty of thought.
However, science is carried out by individuals and societies, and in turn that knowledge affects those individuals and societies. Thus, the actual discoveries and processes of science in the short term can be far more tumultuous than the overarching successes presented in history. As intelligent beings, we need to be aware of the dangers in science so that we can more quickly discover the truth. We need to be wary of the misuse of science for ends other than discovering the truth through genuine curiosity about the world.
I still believe that over the span of centuries, our knowledge will converge to the truth. It is the speed of this convergence, and the possible derailment of this process, that we must carefully watch. For as we are becoming increasingly aware of the world through our use of knowledge and technology, that ability can increasingly be used for harm as well as good.
Let us now examine some ways in which science can be misused.
To support science out of complacency
Research out of complacency is when a direction of research is continued not because it is interesting, but because it supports an institution. This may happen when a group of senior researchers keep a field alive through their influence while attracting few new people to it. Since academia is typically driven by peer review, both for publications and for grants, this is easy to do.
This is a contentious topic because it is difficult to determine what will turn out to be interesting and who might suddenly become interested. For that reason, I will not name any field I think may fall under such a scheme. And although it is undoubtedly true that, through sufficiently elaborate argument, any research field could be given a strong case for being important, I think that is misdirection and beside the point.
The problem arises when a researcher knowingly maintains a stronghold around a certain field, aware that the resources spent on that field would in fact be better spent elsewhere. Here, I am not talking about inhibiting genuine curiosity. If someone has a genuine curiosity to explore a research area, they should not be stopped from doing so, if that curiosity is an honest reflection of the basic mental attitude of self-discovery. However, having that genuine curiosity while at the same time fostering a field which is toxic, in the sense of being exclusionary to the self-discovery of others who might be interested in it, is contradictory.
This happens when researchers get the urge to explore certain areas but actively prevent themselves from doing so, because that desire is not aligned with the trends of their research circle. Nor am I criticizing those who must do this in order to keep their jobs. We must consider all aspects of the individual and not just the science-producing aspect of them. A person who publishes papers in order to obtain a secure career is doing something true to their heart and out of necessity. However, a researcher who promotes areas only for continued self-recognition or a feeling of importance, while already having a secure career, is being dishonest with regard to the capability they have been granted as a gatekeeper of scientific inquiry.
Why is this phenomenon detrimental? Isn't all knowledge we discover good? Indeed, there is nothing wrong with that by itself. However, when the acts that lead to this discovery make it more difficult for younger researchers or students to actively explore their own interests, then it becomes harmful. It is harmful when these trends lead to hyperspecialization as the only way forward in research, while new areas that may actually be interesting are neglected simply because there is no support for young researchers to discover these new fields.
Science to support ideologies and beliefs
The natural and physical sciences have a long history. More recently, the title of science has been bestowed upon the study of social matters such as psychology and sociology, collectively titled the social sciences. In many subareas of these fields, some form of the scientific method is used for knowledge discovery. Again, by itself, this is actually a good thing and a far cry from the initial efforts of individuals like Sigmund Freud, who postulated a large body of notions, most of which were neither well-defined enough to make sense nor even falsifiable.
However, our modern-day standards of subsuming these disciplines into the realm of science can have dire consequences on two fronts: the propagation of incorrect statements due to the fervor of ideologies, and the misinterpretation of results by the public and other researchers to further their own ideological agendas.
Let us start with the propagation of falsehoods. How could this happen under the scientific method? One trick is called p-hacking (Head et al., 2015). This occurs when researchers collect data and massage the statistical methodology until they obtain a significant result. This can include changing the explanatory variables after the experiment, trying different models or tests, or even redefining new variables from old ones in order to obtain that magical p-value less than 0.05. A good summary of some of this is Stephen Lindsay's paper Replication in Psychological Science.
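To make this concrete, here is a toy simulation entirely of my own construction (the numbers and setup are not taken from any of the papers mentioned). Every outcome below is pure noise, with no real effect anywhere, yet by testing twenty outcome variables per study and keeping only the smallest p-value, a "significant" finding appears in well over half of the studies.

```python
import math
import random

random.seed(42)

def p_value_two_sample(a, b):
    """Approximate two-sided p-value for a difference in means
    (z-test; adequate for groups of 30 or more)."""
    n, m = len(a), len(b)
    mean_a, mean_b = sum(a) / n, sum(b) / m
    var_a = sum((x - mean_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (m - 1)
    se = math.sqrt(var_a / n + var_b / m)
    z = abs(mean_a - mean_b) / se
    # P(|Z| > z) for a standard normal Z
    return math.erfc(z / math.sqrt(2))

def hacked_study(n_outcomes=20, n_per_group=50, alpha=0.05):
    """Test many outcome variables (all pure noise) and report only
    the smallest p-value, as a p-hacker would."""
    p_values = []
    for _ in range(n_outcomes):
        control = [random.gauss(0, 1) for _ in range(n_per_group)]
        treated = [random.gauss(0, 1) for _ in range(n_per_group)]  # no real effect
        p_values.append(p_value_two_sample(control, treated))
    return min(p_values) < alpha

# Fraction of purely null studies that still yield a "significant" finding:
hacked = sum(hacked_study() for _ in range(500)) / 500
print(f"False-positive rate with 20 tries per study: {hacked:.2f}")  # far above 0.05
```

The arithmetic behind it is simple: with twenty independent tests at a 0.05 threshold, the chance of at least one false positive is 1 - 0.95^20, roughly 0.64, not the 0.05 a single honest test would give.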
Where this becomes a problem is when the results support someone's ideology or strong beliefs. An example is the belief that violent video games cause violent behaviour, or the countless topics on gender. These highly charged topics, in which many people have a vested interest in protecting one viewpoint or another, create a huge incentive for the publication of studies that support one particular viewpoint. This is a societal version of p-hacking, called the file drawer effect, in which a group of people, including scientists but also grant agencies and political bodies, select for studies that support their world view.
This is especially bad because an individual study may be quite rigorous and sound, and yet, because it is subject to this sort of publication bias, over time a glut of such studies begins to create seemingly strong scientific support for an ideology. This is coupled with the second phenomenon I mentioned: the misinterpretation of results, in which oversimplification and distortion occur in order to further strongly held beliefs.
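The file drawer effect can be sketched in the same spirit (again a toy simulation of my own, with made-up numbers). Every individual study below is honest and unbiased, measuring an effect that is truly zero; but if only the significant results reach publication, the published record shows a sizeable average effect where none exists.

```python
import math
import random

random.seed(1)

def one_study(n=50):
    """One honest study of a truly null effect: returns (effect_estimate, p_value)."""
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(0, 1) for _ in range(n)]  # true effect is zero
    diff = sum(treated) / n - sum(control) / n
    se = math.sqrt(2.0 / n)  # standard error, using the known unit variance
    p = math.erfc(abs(diff) / se / math.sqrt(2))  # two-sided z-test
    return diff, p

studies = [one_study() for _ in range(2000)]
published = [d for d, p in studies if p < 0.05]  # the file drawer keeps the rest

mean_abs_all = sum(abs(d) for d, _ in studies) / len(studies)
mean_abs_pub = sum(abs(d) for d in published) / len(published)
print(f"Mean |effect| across all studies:      {mean_abs_all:.2f}")
print(f"Mean |effect| in the published record: {mean_abs_pub:.2f}")
```

No single researcher here did anything wrong; the bias lives entirely in the filter between the studies that were run and the studies that are seen.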
Because ideologies and strong beliefs are often born out of fear and panic, distorting the scientific method to accommodate and support them is especially dangerous. Someone who has had a loved one murdered by a person who happened to play violent video games is understandably driven by fear and the urge to do something, to find some simple explanation. But that visceral reaction should not prevent us from trying to discover the truth, in either direction. Moreover, we cannot reduce such issues to yes-or-no questions; we should remain calm and remember the nuance that exists because we are complex creatures. Lending the cold hand of science to an emotional belief through disingenuous methods such as p-hacking and publication bias is dangerous because people trust science, and the shortcomings of this path of knowledge discovery will be lost in translation.
Science that harms living organisms
The first type of misuse of science was essentially the misuse of resources against the spirit of science. The second type was the use of shoddy methods and bias to support fear and emotion in society. The third type is the creation of knowledge that might actually harm us or the planet we live on.
Perhaps people might first think of the atomic bomb or polluting industries, and these are good examples. However, I am more concerned about ostensibly good discoveries that we do not yet have the societal wisdom to recognize as harmful. I believe at least some efforts are being taken to curtail the first two types of misuse I discussed, although perhaps fewer for the support of emotion and ideology. But the third type, rigorous and well-done science that is actually harmful, is so difficult to discern at the time it is done that it is easy to overlook.
One example is the development of technologies such as the smartphone. The existence of such technologies is not necessarily harmful, but the intensity and level of their integration into our lives could be. For example, smartphones can be addictive and have been shown to be related to perceived stress (Samaha and Hawi, 2016). Science is the ultimate source of new developments, and technology companies are the drivers, pushing these developments into the hands of individuals. The natural and healthy inclination in most of us to connect with others is used as a tool for profit, and that cycle is perpetuated throughout modern society with the incremental improvements, every year, of new and more integrative devices.
There is actually a debate in the literature about the severity of smartphone addiction, such as in Panova and Carbonell (2018). I think this is a healthy investigation in a sense, but we should also be aware that it could reflect a strongly held belief that smartphones pose problems only on occasion and most of the time are not so bad. Indeed, I will admit that I brought up this example because I dislike smartphones myself, which is partly emotional, and I will not argue in this post that they are inherently bad. Rather, I want to say that it is examples such as these that should be examined intimately, not just from a short-term psychological point of view, but with a long-term view towards their impact on society and the environment. It is precisely the intricate and seemingly innocuous appearance of new technology that warrants caution.
Other examples that come to mind are extending the human lifespan (Glannon, 2002) and virtual reality, both of which are intensely studied in the literature.
On responsibility in science
I have tried to introduce different ways in which science may be misused. By misused, I mean some use that brings unnecessary harm to society, to individuals in the short term, or to the environment. In many cases, we see that there is some oversight and intellectual debate examining these misuses. In other cases, such as highly emotionally charged issues, the academic literature can be quite one-sided.
But rather than just publishing papers about publishing papers, as individual scientists we also have an ethical obligation to watch for these phenomena. If specific research fields in psychology or metascience are created at universities to study them, then that is a good thing. If we can at least shed light on the good or evil that may come out of science, then we will be better for it.
Anyone who is trained in science has the knowledge and ability to recognize many of these issues. Such knowledge entails a responsibility to make decisions based on our understanding. These decisions, both personal and professional, do not have to be predicated on what is right, since we may not know what that is. But they do have to be made with the desire to understand and discover the truth even if that truth is uncomfortable, and connect with others in order to share our limited understanding in the hopes of creating a better environment for all living organisms on this planet.
Head, Megan L., et al. "The extent and consequences of p-hacking in science." PLoS Biology 13.3 (2015): e1002106.
Samaha and Hawi. "Relationships among smartphone addiction, stress, academic performance, and satisfaction with life." Computers in Human Behavior 57 (2016): 321-325.
Panova, Tayana, and Xavier Carbonell. "Is smartphone addiction really an addiction?" Journal of Behavioral Addictions 7.2 (2018): 252-259.
Glannon, Walter. "Extending the human life span." The Journal of Medicine and Philosophy 27.3 (2002).