Developer Insights into Designing AI-Based Computer Perception Tools

Explainable & Ethical AI
arXiv: 2508.21733v1
Authors

Maya Guhan, Meghan E. Hurley, Eric A. Storch, John Herrington, Casey Zampella, Julia Parish-Morris, Gabriel Lázaro-Muñoz, Kristin Kostick-Quenet

Abstract

Artificial intelligence (AI)-based computer perception (CP) technologies use mobile sensors to collect behavioral and physiological data for clinical decision-making. These tools can reshape how clinical knowledge is generated and interpreted. However, effective integration of these tools into clinical workflows depends on how developers balance clinical utility with user acceptability and trustworthiness. Our study presents findings from 20 in-depth interviews with developers of AI-based CP tools. Interviews were transcribed, and an inductive thematic analysis identified four key design priorities: 1) account for context and ensure explainability for both patients and clinicians; 2) align tools with existing clinical workflows; 3) customize appropriately for relevant stakeholders to support usability and acceptability; and 4) push the boundaries of innovation while aligning with established paradigms. Our findings highlight that developers view themselves not merely as technical architects but also as ethical stewards, designing tools that are both acceptable to users and epistemically responsible (prioritizing objectivity and pushing clinical knowledge forward). We offer the following suggestions to help achieve this balance: documenting how design choices around customization are made, defining limits for customization choices, transparently conveying information about outputs, and investing in user training. Achieving these goals will require interdisciplinary collaboration between developers, clinicians, and ethicists.

Paper Summary

Problem
Integrating artificial intelligence (AI)-based computer perception (CP) technologies into clinical decision-making is a significant challenge. These tools have the potential to transform healthcare, but their effective adoption into clinical workflows depends on how developers balance clinical utility with user acceptability and trustworthiness. The core problem is that developers must navigate complex and competing demands: innovating while ensuring usability, challenging clinical paradigms while aligning with them, and customizing while preserving objectivity.
Key Innovation
This study presents findings from in-depth interviews with developers of AI-based CP tools, highlighting four key design priorities: (1) accounting for context and ensuring explainability, (2) aligning tools with existing clinical workflows, (3) customizing appropriately for relevant stakeholders to support usability and acceptability, and (4) pushing the boundaries of innovation while aligning with established paradigms. The study also emphasizes developers' dual role as technical architects and ethical stewards, designing tools that are both acceptable to users and epistemically responsible.
Practical Impact
This research has significant practical implications for the development and implementation of AI-based CP tools in healthcare. By understanding the design priorities and challenges that developers face, clinicians, patients, and ethicists can work with developers to create tools that are both clinically actionable and epistemically responsible. This collaboration can lead to CP systems that support informed, context-sensitive decisions without becoming rigid confirmation engines or indecipherable black boxes. Ultimately, this can improve the quality of care and patient outcomes.
Analogy / Intuitive Explanation
Imagine you're building a new house, and you want to incorporate cutting-edge technology, such as smart home devices, to make the house more comfortable and efficient. However, you also want to ensure that the technology is user-friendly and doesn't compromise the aesthetic appeal of the house. This is similar to the challenge faced by developers of AI-based CP tools, who must balance innovation with usability, clinical utility with user acceptability, and objectivity with customization. The goal is to create a system that is both clinically effective and trustworthy, like a well-designed house that seamlessly integrates technology with human needs.
Paper Information
Categories: cs.HC cs.AI cs.CY
arXiv ID:
