For many Black Americans, the mention of vaccines or medical research conjures the Tuskegee Syphilis Study, a 40-year betrayal in which researchers deliberately withheld both treatment and the truth from Black men in Alabama, watching them suffer and die for the sake of observation. This wasn’t just a historical misstep; it created deep wounds that persist to this day. Though all of the men in the original study have died, their relatives and communities still carry the consequences, including children born with congenital syphilis, stigma surrounding the disease, and persistent wariness of public health interventions like COVID-19 vaccinations.
I think of Tuskegee often these days when “science skepticism” is invoked as a cultural battlefield. Yes, disinformation and false narratives fuel some of the doubt, but in many communities, mistrust is not unfounded. It’s memory.
Behavioral science teaches us that vivid experiences—especially negative ones—can profoundly shape our choices and actions. A single betrayal can outweigh a hundred good-faith efforts. If we want to use science as a tool to improve lives, we must first confront this truth: trust is not given. It must be earned—again and again—through transparency, respect, and humility.
That starts with asking not “Why won’t they trust us?” but “How can we design for trust?”
5 Principles for Ethical Applied Research Today
There’s a sharp difference in potential risk between biomedical research that tests medicines and the kind of social research that ideas42 does. Even so, many of the same principles, such as avoiding harm and ensuring informed consent, remain applicable. Here are some of the ways my team approaches research with an ethic of respect for individuals’ agency and a commitment to minimizing harm.
1. Design With, Not For. Behavioral science makes one thing clear: Experts reliably overestimate their understanding of what drives other people’s behavior. In fact, there is evidence that expertise may increase the certainty of faulty assumptions. Participatory design helps correct for this bias. We invest in co-design, community advisory boards, and deep listening—because interventions rooted in lived experience, not outsider assumptions, are far more likely to succeed.
2. Give People Real Agency. Informed consent goes beyond getting signatures on forms. It means providing complete, honest information about what we’re studying, why it matters, and what the potential consequences might be—even when those conversations take time. People deserve to understand not only the immediate risks and benefits, but also how their participation contributes to larger research questions and what its implications may be beyond the study itself.
3. Be Thoughtful About Power Dynamics. Behavioral science reminds us that the way choices are presented to people, so-called “choice architecture,” can have a profound impact on their behavior. Designers need to be mindful of how they are shaping the choice architecture that research participants face. For instance, it’s a common (and ethical) practice to compensate participants for their time and input. But offering exorbitant payments can make people feel like they cannot afford not to participate. Compensating people for their time is about finding the sweet spot of honoring their input without forcing their hand. This happy medium can change from context to context, so we often ask local researchers to help establish compensation that is fair and culturally appropriate.
4. Handle Data Like It Matters. Many practices help make data handling more trustworthy. They include collecting only what we need (sometimes, we’ll ask partners explicitly not to share all the information they could), storing it on encrypted servers, limiting who can see it, and ensuring everyone knows exactly how it’ll be used.
5. Own Our Mistakes. We must be willing to acknowledge when research has caused harm or when findings don’t support our initial hypotheses. The scientific community’s credibility depends on our collective willingness to say, “We were wrong, and here’s how we’re going to do better.” Far from weakening science, that admission strengthens it. Conducting low-quality, less rigorous work contributes to noise and disinformation. Engaging in good research practices isn’t just methodologically sound; it’s a moral responsibility.
A Call for Shared Responsibility
Science is not a silver bullet or a singular answer to the world’s challenges. Still, it can be one cornerstone of a society in which people are healthier, have economic security, and feel more fulfilled. Unfortunately, a narrative has been building for years that science is unreliable, even harmful. We cannot tap into the benefits of research by ignoring or dismissing that skepticism—we build better designs through transparency, accountability, and genuine partnership.
By embracing rigorous ethics, honest communication, and authentic community engagement, we can transform research from something done to communities into something done with them. Only then can we ensure that research truly serves the public good.
The strength of science lies not in its infallibility but in its willingness to confront its failures and emerge more trustworthy. That’s the foundation upon which we must build our future.
Thank you to the experts on my team who are constantly teaching me about behavioral design. This article wouldn’t have been possible without them, especially Maheen Shermohammed, Sara Flanagan, and Thomas Tasche.