Checks, balances, and vibes: Research democratization in the age of AI
The vibes are weird. Robots and AI are everywhere. We have self-driving cars. Homes are now “smart.” And some companies are proposing to “...[empower] any member of your team to launch a research study in 10 minutes, interview hundreds of customers in hours, and ‘view actionable insights instantly’ through the use of AI,” thereby ostensibly replacing researchers with AI.
And democracy is…in a strange place. I’m not just talking about politics — seemingly everywhere it’s democratize this, democratize that. Technology has brought democratization (and with it modernization) to all areas of life: photography, transportation, advertising, publishing, design, research; and with the onset of vibe coding, “democratization” has even come to software development. Software development (of which coding is a large part) was once revered as a discipline that required skill, dedication, logic, and precision. But now, according to Andrej Karpathy, a cofounder of OpenAI (Edwards, 2025), vibe coding allows devs to just “see stuff, say stuff, run stuff, and copy paste stuff, and it mostly works,” with the help of Large Language Models (LLMs) and tools like ChatGPT.
Whether you’re for or against research democratization, or for or against the use of AI in the workplace, one thing is becoming increasingly clear. If we, as research and Research Ops professionals, don’t begin to bring in, upskill, and empower our designers, product managers, and engineering partners to do (and access) the research they need, we’re liable to put ourselves in a position where our partners are relying on AI as their research allies — instead of human researchers.
Political science as context
In the world of tech (and research and Research Ops), we lack a definition that describes a system of checks and balances — a governance structure — and I can’t help but think about this systemic change in terms of democratic evolution. Before starting a career in UX (after a brief stint as a data scientist), I earned a PhD in political science. For nearly two decades, I thought, wrote, researched, and taught about democratic systems and how people navigate them. In the last two years I’ve become increasingly convinced that we’re entering a new Great Democratization Cycle, our own Third Wave of (tech) Democracy (à la Huntington, 1991), and just like the wave of democratization that swept the world in the 1980s and 1990s, research and tech democratization will happen with or without us. So it’s best to be prepared. With the right guidance and preparation, the very people and tools we worry will make research positions obsolete, may, if we play our cards right, be active force multipliers for the power of research.
So, if your imagination is as active as mine, or you’re as indoctrinated into thinking about democratic systems as I have been, then the following should sound applicable to the current situation in UX Research.
Reading the plethora of articles and think pieces on democratization (mine included) is a bit reminiscent of The Federalist Papers and Anti-Federalist Papers, in which the United States’ founding folk, in the lead-up to the ratification of the Constitution of the United States, argued about key components that would come to define American-style democracy:1 strong central government, a system of checks and balances, governmental structure, and the importance of federalism. All of these are aspects for us to consider as we endeavor to establish and maintain a thriving research democracy.2 In general, Federalists are concerned with the effectiveness of democratization through a system of checks and balances structured to guard against tyranny and factions; Anti-Federalists are focused on making sure the system of democracy doesn’t encroach on an individual’s (in this case, a researcher’s) rights (in this case, expertise).
But we’re facing something that the US Founding Folks wouldn’t have ever imagined in their wildest dreams: AI, robots, vibe coding...The closest thing to artificial intelligence or robots they likely ever encountered was the Mechanical Turk, a chess-playing machine created by Wolfgang von Kempelen to win over Empress Maria Theresa of Austria in 1770.
For researchers, the idea of democratization and its necessary checks and balances gets more precarious when we think about the era we’re facing — one in which developers can build an app from end to end in less than a day.
The question driving research survival need not be “How do we get people to include research in the process?” Instead, it should be: “How do we promote and support responsible research democratization when technology is moving at such speed?” We would benefit from thinking of democratization as an opportunity to showcase our unique strengths rather than viewing it as a threat. That said, democratization can also be fraught, and even dangerous.
Research democratization defined (and refined)
“In simplest terms, user research democratization means making it possible and acceptable for anyone in an organization, regardless of role, to conduct user research.” — Jose Gallegos, Respondent
Vibes aside, a pivotal yet unanswered question remains at the center of all this discussion and thought work: What is research democratization?
When we attempt to define research democracy, explanations like Jose Gallegos’ might easily rile us. But opposite ends of the research democratization spectrum should exasperate us more: on one end is a research anarchy in which anyone and everyone in the company can talk to users, call it research, and make big design and product decisions based on the resultant “data” and “insights.” On the other end is a research autocracy in which only “true” researchers can conduct research, albeit amidst a backlog so long that research is seen as a blocker, and is eventually eliminated in order to promote efficiency in the product life cycle. To be fair, what resembles a research anarchy already exists in many organizations, and does indeed exasperate and rile many of us. When it comes to democracy, maybe Winston Churchill was right after all.
“Many forms of Government have been tried, and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time.” — Winston Churchill, 11 November 1947
Ideally, research democratization would mean a better-distributed workload. And instead of the outright annihilation of research roles or expertise, the legitimacy of research and the integration of data and insights would deepen across an organization and along the product development life cycle.
If we are truly dedicated to building products that address the needs of our users and make their lives better, we are obligated to promote a research democracy that not only enables everyone in the product development life cycle to conduct legitimate research, but also empowers them to be data literate.
The ugly truth is: AI — and all the good and the bad it brings — isn’t going anywhere. But what’s also true is that the rise of AI does not mean the end of research as we know it. In fact, using AI mindfully (and with governance measures) may help to offset some of the more repeatable and outsourceable work, thereby allowing researchers the capacity and space to cultivate deeper knowledge or expand their creativity. No matter how we look at it, if we don’t bend, we’ll break.
The status quo offers little (if any) opportunity for us to focus on integrating new skills, adapting, and expanding creative autonomy. But by softening the rigid construct that research must be conducted only by experts, we grant ourselves space to shift and build skills that allow us to adapt to new and emerging conditions. Research democratization requires researchers to embrace their inner teacher and program manager, becoming the stewards and propagators who bring clarity, strength, and empowerment to their partners. On the other hand, if we bend too much and expand too broadly, we risk turning over the keys to the research kingdom completely and becoming irrelevant. (Not to mention encouraging whatever chaos AI might inflict on our profession.)
So, how do we solve this? Well, it depends. I believe that, like country-level democracies, research democracies need to be built with their constituencies in mind. Democratized research isn't merely about broadening access to research; it's about empowering practitioners. And an essential step toward that empowerment is encouraging deeper critical thinking and teaching our partners how to engage with AI. Without structured foundations in place (i.e., rules or guidance on the use of AI), the democratization process risks becoming chaotic and ineffective. Just as political systems rely on governance structures and rules to function productively, research requires governance and clear frameworks to guide decision-making and ensure accountability.
Two sides of a double-edged sword
What keeps me up at night when we talk about democratization — whether in research or in everyday life — is a side effect that often goes unmentioned: the losers of modernization. The losers of modernization are the folks whom modernization and development have left behind. Without taking a detour into political development theory, think of the factory worker who has been replaced by automation or the cashier who has been replaced by self-checkout. Without the requisite skills to find employment in an advancing society, the factory worker and the cashier are forgotten by modernization and technology.
And it plays on the individual psyche as well. Those most adversely affected by modernization may eventually radicalize, shifting their worldviews toward more extreme positions that propose to undo the changes associated with modernization. In sociological circles, this is often coupled with conversations about social breakdown and relative deprivation. All of which is to say: as we continue through periods of transformation, people will be left behind.
As a people leader in research operations, I worry about the effects of research democratization on the mental well-being of the teams and individuals I support. I worry that my teams will be left behind in democratization — that they, as researchers, are no longer understood to be experts in their field and are left out of conversations where they could help product and design navigate risky decisions. I worry that by teaching and encouraging our partners to do low-risk research, our partners might make camp on top of Dunning-Kruger’s Mt. Stupid (“I know everything about users”), while researchers will wallow in an extended and ever-deepening valley of despair.

In a world with AI-driven research, “solutions” abound. Synthetic Users claims to support user research “without the headache” 🙄, designers are skipping traditional usability testing by asking ChatGPT to validate designs, and interviews are being conducted by robots. In this world, I sincerely worry that we’ll soon enter a time of vibe research where designers, PMs, and engineers conduct “research” just by seeing stuff, saying stuff, running stuff, and copying and pasting stuff, because… they think it mostly works.
There are ways to leverage the power of LLMs and machine learning (ML) in order to democratize research while also promoting empirical thinking and responsible research. But vibe research isn’t the way.
At the center of all this lie some critical questions:
- Can we — and will we — build a system that does not replace individual contributors (ICs), but rather promotes them and allows them to work on more strategic and meaningful projects?
- Can we — and will we — empower non-researchers to make data-informed decisions and avoid creating roadblocks in this brave new vibe-coded world?
- How do we bring research expertise into the mix so that the discipline of research avoids becoming the new loser of tech modernization?
Responsible, ethical, and human-centered development means democratizing our practice in a way that ensures ICs and research teams are not left behind as the tech world modernizes. Failure to do so will breed increasing discontent, and even revolt, as ICs and research teams watch others gain access to the resources and opportunities they deserve.
With this in mind, we need to take a very careful look at how to build a research democracy while ensuring researchers are not left behind and replaced by robots or vibes.
A roadmap for democratization
“New democracies are, in effect, in a catch-22 situation: lacking legitimacy they cannot become effective; lacking effectiveness they cannot develop legitimacy.” (Huntington, 1991: 258)
The key to democratizing research is empowering practitioners to shape research agendas and methodologies proactively. Democratization, at its core, signifies a shift towards inclusivity and openness in decision-making processes. For a democracy to be effective, we need order and structure. There has to be someone in charge. To quote UX Design Researcher David Tang, “...it is important to understand the nuance that a democratic process does not mean that decision-making power in all stages of research is shared.” The question, borrowed from political scientist Robert Dahl, is then: “Who governs?”3 Who has the ultimate say in what research focuses on, and how research is used?
Importantly, what does any of this look like given that corporations are not democracies? Employees rarely, if ever, vote on policy or decisions. Employees are not citizens in this equation. Stretching the word, employees are denizens,4 beholden to decisions made by governing bodies (in this case, executives). Outside of work, we’re seeing a retreat from democracy and democratic ideals; yet, simultaneously, it looks as though democratization may be the antidote within our workplaces to the onslaught of automation and artificial intelligence.
So, how do we “democratize” within an inherently non-democratic framework? The answer is responsibly and deliberately. Research democratization doesn’t mean simply replacing the expertise and skills of researchers with the efficiency of machines. To democratize research means to empower others — namely, designers, product managers, and engineers — to think like researchers, interrogate the data and findings presented to them, test hypotheses, and verify (not validate) information. By empowering others to think like researchers, we can guard against overreliance on LLMs, ML, and AI (along with whatever other alphabet soup combinations you want to conjure).
Even with the best intentions to empower and educate, a tension exists between acknowledging and leveraging researchers’ expertise and promoting a culture of inclusion in which non-researchers are welcome to participate. Just as political systems grapple with questions of representation and accountability, research must strike a balance between leveraging specialized knowledge and incorporating diverse perspectives. How you democratize defines how your research organization operates. But operations is not just about tools, administration, and recruitment. The challenge lies in building and supporting operations that empower stakeholders, such as educational programs and the upskilling of people who do research (PWDRs)5, while maintaining the integrity and rigor of research methodologies. Moreover, research cannot come at this with a “one size fits all” approach, and it must facilitate collaboration among stakeholders, ensuring that diverse perspectives are incorporated into the research process.
Bringing this back to the political science framework, the scalability of democratized research hinges on the establishment of robust rules and guidelines. Just as political systems rely on laws and regulations to maintain order and stability, research requires rules to govern data collection, analysis, and decision-making. These rules provide a framework for practitioners to operate within, ensuring consistency and reliability in research outcomes. To strike this balance between research rigor and flexibility for facilitation, we need frameworks for deliberative decision-making so that we, as UX practitioners, can harness the collective wisdom of stakeholders to drive innovation and meet user needs more effectively. Without clear rules in place, the democratization process risks becoming fragmented and inefficient. Even more so, without a clear research democratization plan, we risk losing some of our strongest advocates to the simplicity of machines.
Trust in craft, trust in the future
Neither robots, nor AI, nor vibe coding can replace the craft of research. Artificial intelligence is a very powerful tool — one that researchers and PWDRs should embrace when possible (and reasonable). But we shouldn’t do so at the expense of quality, depth, and intentionality. The implications and consequences of poorly or carelessly executed research are too profound to risk. And with AI hallucinations and false information growing more rampant (Metz & Weise, 2025), we must employ our critical thinking skills and share our knowledge to promote ethical and responsible use of AI. So it is paramount that people who do research — researchers and non-researchers alike — understand the fundamentals of research, how to engage with the craft of research, and when to harness AI. If we, as researchers, shy away from democratization and isolate our craft, we risk pushing our most valuable partners into the robots’ arms, thereby alienating ourselves.
Note
The views expressed in this article do not reflect those of the author’s employer (past, present, or future). The only use of artificial intelligence in this article was to develop a catchy title.
References
Dahl, Robert. 1972. Polyarchy: Participation and Opposition. Yale University Press.
Dahl, Robert. 2005. “What Political Institutions Does Large-Scale Democracy Require?” Political Science Quarterly 120 (2): 187–197. https://www.jstor.org/stable/20202514.
Dahl, Robert. 2005. Who Governs? Democracy and Power in an American City. 2nd ed. Yale University Press.
Dahl, Robert. 2020. On Democracy. Yale University Press.
“Democracy Index 2023.” Economist Intelligence Unit. Accessed April 18, 2025. https://www.eiu.com/n/campaigns/democracy-index-2023/.
Edwards, Benj. 2025. “Will the Future of Software Development Run on Vibes?” Ars Technica, March 5, 2025. https://arstechnica.com/ai/2025/03/is-vibe-coding-with-ai-gnarly-or-reckless-maybe-some-of-both/.
Gallegos, Jose. 2023. “What Is Democratizing Research and How To Make It Work For You.” Respondent, January 13, 2023. https://blog.respondent.io/what-is-democratizing-research-and-how-to-make-it-work-for-you.
Huntington, Samuel. 1991. The Third Wave: Democratization in the Late Twentieth Century. University of Oklahoma Press.
Kapstein, Ethan, and Nathan Converse. 2008. The Fate of Young Democracies. Cambridge University Press.
Levitsky, Steven, and Daniel Ziblatt. 2018. How Democracies Die. Penguin Random House.
Levitsky, Steven, and Daniel Ziblatt. 2024. Tyranny of the Minority: Why American Democracy Reached the Breaking Point. Penguin Random House.
Lijphart, Arend. 2012. Patterns of Democracy: Government Forms and Performance in Thirty-Six Countries. 2nd ed. Yale University Press.
Metz, Cade, and Karen Weise. 2025. “A.I. Is Getting More Powerful, but Its Hallucinations Are Getting Worse.” The New York Times, May 6, 2025. https://www.nytimes.com/2025/05/05/technology/ai-hallucinations-chatgpt-google.html.
Tang, David. 2025. “Democratization Won’t Save Research.” UX Collective (Medium), March 21, 2025. https://uxdesign.cc/democratization-wont-save-research-d438b7b590b8.
“United States.” Freedom House. Accessed April 18, 2025. https://freedomhouse.org/country/united-states.
1 It would be irresponsible to ignore that the American-style democracy we knew is changing before our eyes. Over the last two decades, American democratic institutions have experienced noticeable erosion and measurable backsliding of democratic freedoms and civil rights (Freedom House, 2025; Economist Intelligence Unit, 2025). For more on this, see Levitsky & Ziblatt (2018) and Levitsky & Ziblatt (2023).
2 Lest you think I’m being hyperbolic, take Federalist No. 10, in which future president and founding father James Madison explored ideas to protect the country from rule by a majority faction (thus supporting a republic), while Anti-Federalists argued for explicit checks and balances to protect citizens from governmental abuse of power and petitioned for the enumeration of rights (leading to the Bill of Rights). (Note: James Madison was an early American statesman who advised George Washington and would become the fourth American president. In the Federalist Papers, Madison warns of the dangers posed by majority factions, proposing that republics, by virtue of their distributed nature, mitigate large factions.)
3 This is a shout-out and head nod to the political scientists in the audience. Robert Dahl (1915–2014) was known as a father of political science and published some of the foundational works on democracy from an empirical lens. His work is considered among the most important scholarship associated with the pluralist approach to understanding power structures. Recommended readings include Polyarchy (1972), Who Governs? Democracy and Power in an American City (2005), “What Political Institutions Does Large-Scale Democracy Require?” (2005), and On Democracy (2020).
4 A denizen is an inhabitant; a person admitted to residence in a foreign country (Merriam-Webster).
5 A term coined by Kate Towsey in 2019.
Edited by Kate Towsey and Katel LeDu.
👉 The ResearchOps Review is the publication arm of the Cha Cha Club – a members' club for Research Ops professionals. Subscribe for smart thinking and sharp writing, all laser-focused on Research Ops.