Announcing the UXR AI Disclosure Drafting Tool: Building Trust Through Transparency
In the world of UX research, trust is our most valuable asset. As we increasingly integrate powerful AI tools into our workflows—from transcribing interviews to synthesizing data—maintaining that trust requires a new level of transparency. The question is no longer whether we should disclose our use of AI, but how we can do so in a way that is clear, compliant, and reinforces the integrity of our work.
Today, we're excited to launch the UXR AI Disclosure Drafting Tool, a new resource designed to help research teams navigate this complex landscape with confidence.
The Challenge: A Complex Web of Rules and Expectations
Researchers are facing a rapidly evolving set of expectations. Regulations like the EU AI Act, along with new state laws in California and Colorado, are establishing clear legal mandates for AI transparency. These laws require organizations to inform users when they are interacting with AI and to be exceptionally clear about how high-stakes decisions are made.
Beyond compliance, our participants and stakeholders expect honesty. As our own research has shown, simply revealing that AI was used can sometimes backfire, creating a perception that the work was low-effort or inauthentic. This phenomenon, often called the "transparency paradox," presents a unique challenge for researchers: creating disclosures that are not only legally sound but also psychologically effective—building confidence rather than eroding it.
The Solution: A Guided Path to Clear Disclosure
The UXR AI Disclosure Drafting Tool is a conversational, step-by-step guide that helps you generate clear and objective disclosure statements tailored to your specific needs. It translates the complex legal requirements and ethical best practices from our comprehensive research framework into a simple, actionable workflow.
By answering a few questions about your research, the tool helps you draft statements for three key audiences:
Research Participants: Generate clear, plain-language text for your consent forms that explains how AI will be used to analyze data or interact with participants, and how their data will be handled.
Internal Stakeholders: Create concise methodology sections for your research reports and presentations that detail the AI tools used, the specific tasks they performed, and the level of human oversight involved.
External Stakeholders: Draft formal disclosure statements for public reports or academic papers, ensuring your methodology stands up to external scrutiny.
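To make the workflow above concrete, here is a minimal sketch of how a question-driven drafting step might assemble an audience-specific statement from a researcher's answers. All names, templates, and phrasing below are illustrative assumptions for this post, not the tool's actual implementation.

```python
# Hypothetical sketch: map a researcher's answers (audience, tool, task,
# oversight level) to a plain-language disclosure statement.
# Template wording and keys are assumptions, not the tool's real output.

AUDIENCE_TEMPLATES = {
    "participants": (
        "In this study, we will use an AI tool ({tool}) to {task}. "
        "All AI output is {oversight} by a human researcher, and your "
        "data will be handled as described in the consent form."
    ),
    "internal": (
        "Methodology note: {tool} was used to {task}. "
        "Its output was {oversight} by the research team."
    ),
    "external": (
        "Disclosure: portions of this analysis were produced with the "
        "assistance of {tool}, which was used to {task}. All AI-assisted "
        "output was {oversight} by a human researcher."
    ),
}

# Phrasing that emphasizes the level of human oversight involved.
OVERSIGHT_PHRASES = {
    "full": "reviewed and validated in full",
    "sample": "spot-checked on a representative sample",
}

def draft_disclosure(audience: str, tool: str, task: str, oversight: str) -> str:
    """Fill the audience-appropriate template with the researcher's answers."""
    template = AUDIENCE_TEMPLATES[audience]
    return template.format(tool=tool, task=task,
                           oversight=OVERSIGHT_PHRASES[oversight])

# Example: a participant-facing statement for AI-assisted transcription.
print(draft_disclosure("participants", "an automated transcription service",
                       "transcribe interview recordings", "full"))
```

The point of a template-per-audience design is that the same underlying answers yield a plain-language consent paragraph for participants and a terser methodology note for stakeholders, without the researcher rewriting the disclosure from scratch each time.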
How the Tool Aligns with Regulatory Standards
This tool was designed to directly address the core principles emerging from global AI regulations.
Clarity and Specificity: The generated text is designed to be clear, concise, and objective, meeting the high standard for accessible language set by regulations such as the EU AI Act.
Context-Appropriate Disclosure: The tool helps you tailor your disclosure to the specific use case—from data analysis to participant interaction—ensuring the information is always relevant.
Emphasis on Human Oversight: By prompting you to define the level of human validation, the tool helps you craft statements that emphasize the critical role of the human researcher, a key factor in building trust and demonstrating accountability.
Building a Future of Trustworthy Research
Ethical, transparent research is the foundation of excellent product design. By being proactive and clear about how we use new technologies, we not only comply with regulations but also strengthen the trust we share with our participants and stakeholders.
We invite you to use the UXR AI Disclosure Drafting Tool in your next research study. Let's continue to build a future where innovation and trust go hand in hand.