Ethics, or morals, are often the guiding principles that help us differentiate between right and wrong. When considering ethics, many think of rules like the Golden Rule (treat others as you would like to be treated), professional codes of conduct such as the Hippocratic Oath ("first, do no harm"), religious doctrines such as the Ten Commandments ("Thou shalt not kill"), or the wise aphorisms of figures like Confucius. In its most common definition, ethics refers to the norms of conduct that distinguish between acceptable and unacceptable behavior.
These ethical norms are usually learned from a young age, instilled at home, in school, within religious communities, and through various social interactions. While the foundational sense of right and wrong is developed in childhood, moral growth is a lifelong process, with individuals progressing through different stages of moral maturity. Ethical norms are so pervasive in society that they might seem like mere common sense. However, if morality were simply common sense, why do so many ethical disagreements and complex issues exist in society today?
One compelling reason for these ethical disputes is that while most people acknowledge shared ethical norms, they interpret, apply, and prioritize them differently based on their personal values and life experiences. For instance, while two individuals might agree that murder is inherently wrong, they could disagree on the ethics of abortion due to differing beliefs about what constitutes human life.
Societies often establish legal frameworks to govern behavior, but ethical norms are typically broader and more informal than laws. While laws often reinforce widely accepted moral standards, and both ethical and legal rules use similar concepts, ethics and law are not interchangeable. An action can be legal but ethically questionable, or illegal yet morally justifiable. Ethical principles can be used to critique, evaluate, propose, or interpret laws. Throughout history, numerous social reformers have ethically challenged citizens to resist laws deemed immoral or unjust. Peaceful civil disobedience is recognized as an ethical method of protesting laws or expressing political viewpoints.
Another perspective on defining ‘ethics’ focuses on the academic disciplines that study standards of conduct. Fields like philosophy, theology, law, psychology, and sociology all contribute to understanding ethics. For example, a “medical ethicist” specializes in studying ethical standards within medicine. Ethics can also be defined as a method, a procedure, or a perspective for making decisions about actions and analyzing complex dilemmas. When considering complex issues like global warming, one might adopt economic, ecological, political, or ethical viewpoints. While an economist might analyze the financial costs and benefits of different climate policies, an environmental ethicist would explore the underlying ethical values and principles at stake.
Many disciplines, institutions, and professions have developed ethical standards tailored to their specific goals and objectives. These standards facilitate coordination among members and foster public trust. For instance, ethical standards are crucial in fields like medicine, law, engineering, and business. Ethical norms also guide the aims of research and apply to individuals engaged in scientific research and other scholarly or creative endeavors. Research ethics has emerged as a specialized field dedicated to studying these specific norms. Refer to the Glossary of Commonly Used Terms in Research Ethics and the Research Ethics Timeline for more information.
Adhering to ethical norms in research is paramount for several critical reasons. Firstly, ethical norms bolster the core aims of research, including the pursuit of knowledge, truth, and the minimization of error. For example, prohibitions against fabricating, falsifying, or misrepresenting research data are essential for upholding truth and reducing inaccuracies.
Secondly, given that research often necessitates extensive collaboration and coordination across various individuals, disciplines, and institutions, ethical standards reinforce values crucial for collaborative work. These include trust, accountability, mutual respect, and fairness. Many ethical norms in research, such as guidelines for authorship, copyright and patenting policies, data sharing protocols, and confidentiality in peer review, are designed to protect intellectual property while promoting collaboration. Researchers rightfully expect credit for their contributions and seek protection against idea theft or premature disclosure.
Thirdly, numerous ethical norms ensure that researchers are accountable to the public. Federal policies concerning research misconduct, conflicts of interest, the protection of human subjects, and animal care and use are vital for ensuring public accountability for researchers funded by public resources.
Fourthly, ethical norms in research are instrumental in building public support for scientific endeavors. Public funding for research is more likely when the public trusts the quality and integrity of the research process.
Finally, the norms of research champion a range of other crucial moral and social values, including social responsibility, human rights, animal welfare, legal compliance, and public health and safety. Ethical failures in research can have severe consequences for human and animal subjects, students, and the wider public. For instance, a researcher who fabricates data in a clinical trial could harm or even endanger patients, and failing to adhere to safety regulations regarding radiation or biological materials can put the researcher, staff, and students at risk.
Codes and Policies for Research Ethics
Given the vital role of ethics in research, it’s unsurprising that numerous professional organizations, government agencies, and universities have established specific codes, rules, and policies concerning research ethics. Many government agencies have mandated ethics rules for researchers they fund.
Ethical Principles
The following provides a general overview of some ethical principles commonly addressed in various codes*:
Honesty
Strive for honesty in all scientific communications. Report data, results, methods, and publication status truthfully. Do not fabricate, falsify, or misrepresent data. Avoid deceiving colleagues, research sponsors, or the public.
Objectivity
Aim to minimize bias in experimental design, data analysis, interpretation, peer review, personnel decisions, grant writing, expert testimony, and all other aspects of research where objectivity is expected. Disclose personal or financial interests that could influence research.
Integrity
Maintain promises and agreements; act with sincerity; strive for consistency in thought and action.
Carefulness
Avoid careless errors and negligence. Critically examine your own work and the work of peers. Maintain thorough records of research activities, including data collection, research design, and communications with agencies or journals.
Openness
Share data, results, ideas, tools, and resources. Be receptive to criticism and new ideas.
Transparency
Disclose methods, materials, assumptions, analyses, and other information necessary to evaluate your research.
Accountability
Take responsibility for your role in research and be prepared to provide an account (explanation or justification) of your actions in a research project and the reasoning behind them.
Intellectual Property
Respect patents, copyrights, and other forms of intellectual property. Do not use unpublished data, methods, or results without permission. Provide proper acknowledgment or credit for all contributions to research. Never plagiarize.
Confidentiality
Protect confidential communications, such as papers or grants under review, personnel records, trade or military secrets, and patient records.
Responsible Publication
Publish to advance research and scholarship, not solely to advance your own career. Avoid wasteful and redundant publications.
Responsible Mentoring
Educate, mentor, and advise students effectively. Promote their welfare and support their independent decision-making.
Respect for Colleagues
Treat colleagues with respect and fairness.
Social Responsibility
Strive to promote social good and prevent or mitigate social harms through research, public education, and advocacy.
Non-Discrimination
Avoid discrimination against colleagues or students based on sex, race, ethnicity, or other factors unrelated to scientific competence and integrity.
Competence
Maintain and enhance your professional competence and expertise through continuous learning and education. Promote competence in science as a whole.
Legality
Be aware of and comply with relevant laws and institutional and governmental policies.
Animal Care
Show proper respect and care for animals used in research. Avoid unnecessary or poorly designed animal experiments.
Human Subjects Protection
When conducting research on human subjects, minimize risks and harms while maximizing benefits. Respect human dignity, privacy, and autonomy. Take extra precautions with vulnerable populations and strive for fair distribution of research benefits and burdens.
* Adapted from Shamoo AE and Resnik DB. Responsible Conduct of Research, 3rd ed. New York: Oxford University Press, 2022.
Ethical Decision Making in Research
While codes, policies, and principles are invaluable, they, like any set of rules, cannot cover every situation, may sometimes conflict, and often require interpretation. Therefore, it is crucial for researchers to develop skills in interpreting, evaluating, and applying research rules and in making ethical decisions across various scenarios. Most decisions involve the straightforward application of ethical rules. Consider this example:
Case 01
A research protocol for a study of a hypertension drug requires administering the drug at different doses to 50 lab mice, followed by chemical and behavioral tests to assess toxic effects. Tom is nearly finished with the experiment for Dr. Q; only 5 mice remain to be tested. He wants to leave tonight for a spring break trip to Florida with his friends. He has injected the drug into all 50 mice but has not completed all of the tests, so he decides to extrapolate the remaining 5 results from the 45 he has already completed.
Numerous research ethics policies would deem Tom’s actions unethical because they amount to data fabrication. If the study were federally funded, his actions would also constitute research misconduct, which the government defines as “fabrication, falsification, or plagiarism” (FFP), a category covering actions that nearly all researchers regard as unethical. It is essential to note, however, that misconduct requires an intent to deceive: honest errors arising from sloppiness, poor record-keeping, miscalculation, bias, self-deception, or even negligence do not qualify as misconduct. Furthermore, reasonable disagreements over research methods, procedures, and interpretations are also not considered research misconduct. Consider another case:
Case 02
After a paper has been accepted for journal publication, Dr. T discovers a mathematical error in it. The error does not alter the research’s overall results, but it is potentially misleading. The journal has already gone to press, so it is too late to correct the error before it appears in print. To avoid embarrassment, Dr. T decides to ignore the error.
Dr. T’s error is not misconduct, nor is initially choosing inaction. However, most researchers and ethical codes would argue Dr. T should inform the journal (and co-authors) about the error and consider publishing a correction or erratum. Not publishing a correction would be unethical as it violates norms of honesty and objectivity in research.
Many actions, while not government-defined “misconduct,” are still widely considered unethical by researchers. These are sometimes referred to as “other deviations” from accepted research practices, including:
- Publishing the same paper in multiple journals without editor notification.
- Submitting the same paper to multiple journals concurrently without editor notification.
- Failing to inform a collaborator of patent filing intentions to ensure sole inventor status.
- Including a colleague as a paper author in exchange for a favor, despite no significant contribution.
- Discussing confidential data from a paper under peer review with colleagues.
- Using data, ideas, or methods learned during grant or paper review without permission.
- Removing outliers from a dataset without justification in the paper.
- Using inappropriate statistical techniques to inflate research significance.
- Bypassing the peer review process by announcing results through a press conference without giving peers adequate information to evaluate the work.
- Conducting literature reviews that fail to acknowledge the contributions of others in the field or relevant prior work.
- Exaggerating claims on grant applications to convince reviewers of a project’s significance.
- Exaggerating credentials on job applications or curriculum vitae.
- Assigning the same research project to two graduate students to foster competition for speed.
- Overworking, neglecting, or exploiting graduate or post-doctoral students.
- Failing to maintain adequate research records.
- Failing to retain research data for a reasonable period.
- Making derogatory or personal attacks in peer reviews.
- Promising better grades to students in exchange for sexual favors.
- Using racist language in the laboratory.
- Significantly deviating from approved animal care or human subject research protocols without committee/board notification.
- Not reporting adverse events in human research experiments.
- Wasting animals in research.
- Exposing students and staff to biological risks in violation of institutional biosafety rules.
- Sabotaging another person’s work.
- Stealing supplies, books, or data.
- Rigging experiments to predetermine outcomes.
- Making unauthorized copies of data, papers, or software.
- Holding over $10,000 in stock in a research-sponsoring company without disclosing the financial interest.
- Deliberately overstating a new drug’s clinical significance for economic gain.
These actions are broadly seen as unethical by scientists, and some may even be illegal. Most also violate professional ethics codes or institutional policies but fall outside the narrow governmental definition of research misconduct. The definition of “research misconduct” has been widely debated, with many researchers and policymakers dissatisfied with the government’s FFP-focused definition. Given the extensive list of potential “serious deviations” and the practical challenges in defining and policing them, the government’s narrow focus is understandable.
Finally, research often presents situations where disagreement arises about the correct action, and no consensus exists. In these cases, valid arguments exist on multiple sides, and ethical principles may clash. These situations create challenging ethical or moral dilemmas. Consider this scenario:
Case 03
Dr. Wexford leads a large epidemiological study on the health of 10,000 agricultural workers, possessing an extensive dataset on demographics, environmental exposures, diet, genetics, and disease outcomes like cancer, Parkinson’s disease (PD), and ALS. After publishing a paper on pesticide exposure and PD in a prestigious journal and planning further publications, she receives a data access request from another research team interested in pesticide exposure and skin cancer, a topic Dr. Wexford also intended to study.
Dr. Wexford faces a dilemma. Ethical norms of openness encourage data sharing, and funding agency rules might mandate it. However, sharing data could allow the other team to publish results she had planned to pursue, potentially diminishing her (and her team’s) recognition and priority. Dr. Wexford needs to weigh both sides carefully before deciding. Options include sharing the data under a data use agreement that defines permissible uses, publication plans, and authorship, or offering to collaborate with the other team.
Researchers like Dr. Wexford can use these steps to navigate ethical dilemmas:
Define the Problem or Issue
Clearly articulate the ethical problem. In this case, it’s whether to share data with another research team.
Gather Relevant Information
Poor decisions often result from insufficient information. Dr. Wexford needs more details about university, funding agency, or journal policies, intellectual property considerations, potential for agreements, the other team’s willingness to share data, and the potential impact of publications.
Identify Different Options
Limited imagination, bias, ignorance, or fear can restrict options considered. Beyond ‘share’ or ‘don’t share,’ alternatives like ‘negotiate an agreement’ or ‘offer collaboration’ exist.
Apply Ethical Codes, Policies, and Legal Rules
University or funding agency data management policies might apply. Broader ethical rules like openness and respect for intellectual property are relevant, as are intellectual property laws.
Seek Ethical Advice
Consulting colleagues, senior researchers, department chairs, ethics or compliance officers, or trusted advisors can be helpful. Dr. Wexford might consult her supervisor and research team before deciding.
After working through these steps, further inquiry, information gathering, option exploration, or ethical rule consideration might still be needed. Eventually, a decision and action are required. Ideally, a decision-maker facing an ethical dilemma should be able to justify their choice to themselves, colleagues, administrators, and affected parties. They should be able to articulate the reasons for their actions and consider the following questions when explaining their decision-making process:
- Which choice will likely yield the best overall outcomes for science and society?
- Which choice can withstand public scrutiny?
- Which choice aligns with your conscience?
- What would a wise, trusted individual advise in this situation?
- Which choice is most just, fair, and responsible?
Even after this process, decision-making can be challenging. In such cases, relying on gut feelings or intuition, seeking guidance through prayer or meditation, or even leaving the choice to chance (like flipping a coin) may be considered. This doesn’t imply that ethical decisions are irrational, but rather acknowledges that human reasoning has limits when it comes to resolving every ethical dilemma within the time available.
Promoting Ethical Conduct in Science
Most US academic institutions mandate responsible conduct of research (RCR) education for undergraduate, graduate, and postgraduate students. The NIH and NSF require research ethics training for students and trainees. Many non-US institutions have also developed research ethics curricula.
Students in research ethics courses may question the necessity of such education, believing themselves ethical and capable of distinguishing right from wrong, and unlikely to fabricate data or plagiarize. Indeed, most colleagues are likely highly ethical, and research misconduct is relatively rare despite publicized cases; misconduct rates have been estimated at anywhere from 0.01% to 1% of researchers per year, depending on how misconduct is measured and reported (see Shamoo and Resnik 2022, cited above). However, though infrequent, misconduct significantly impacts science and society, compromising research integrity, eroding public trust, and wasting resources.
Whether research ethics education reduces misconduct is still undetermined, partly depending on the perceived causes of misconduct. Two main theories exist: the “bad apple” theory and the “stressful environment” theory. The “bad apple” theory posits that most scientists are ethical, and misconduct is limited to morally corrupt, desperate, or disturbed individuals. This theory suggests ethics courses have minimal impact on such individuals, as peer review and self-correction mechanisms will eventually expose them.
Conversely, the “stressful environment” theory attributes misconduct to institutional pressures, incentives, and constraints, such as pressure to publish, competition for grants, career ambitions, profit motives, and inadequate supervision and oversight (Shamoo and Resnik 2022). Proponents note the imperfections of peer review and the relative ease with which the system can be manipulated, since flawed research can persist undetected for years. Misconduct likely stems from both environmental and individual factors: morally weak, ignorant, or insensitive individuals working in stressful environments.

Ethics education can still be valuable even if it does not prevent deliberate misconduct, because it can help reduce other deviations from accepted norms. It can enhance understanding of ethical standards, policies, and issues, and improve ethical judgment and decision-making. Many deviations may occur because researchers are unaware of, or have never critically reflected on, ethical norms. For instance, unethical authorship practices might stem from unquestioned traditions, like automatically including lab directors as authors regardless of contribution. Similarly, conflicts of interest, such as accepting stock or consulting fees from research sponsors, might be treated as normal practice without recognition of their ethical implications.
If deviations result from ignorance or unexamined traditions, research ethics education can reduce serious deviations by improving ethical understanding and issue sensitivity.
Finally, research ethics education should equip researchers to handle ethical dilemmas by introducing key concepts, tools, principles, and methods for resolving them. Scientists face controversial topics such as human embryonic stem cell research, genetic engineering, the use of artificial intelligence in various fields, biological experiments that could create pathogens with pandemic potential, and research involving human or animal subjects, all of which require ethical reflection and deliberation.
David B. Resnik, J.D., Ph.D., Bioethicist
Tel 984-287-4208