Auditing for Human Expertise

By Rohan Alur and 5 other authors

Abstract: High-stakes prediction tasks (e.g., patient diagnosis) are often handled by trained human experts. A common source of concern about automation in these settings is that experts may exercise intuition that is difficult to model and/or have access to information (e.g., conversations with a patient) that is simply unavailable to a would-be algorithm. This raises a natural question: do human experts add value that could not be captured by an algorithmic predictor? We develop a statistical framework under which this question can be posed as a natural hypothesis test. Indeed, as our framework highlights, detecting human expertise is more subtle than simply comparing the accuracy of expert predictions to those made by a particular learning algorithm. Instead, we propose a simple procedure that tests whether expert predictions are statistically independent of the outcomes of interest after conditioning on the available inputs ('features'). A rejection of our test thus suggests that human experts may add value to any algorithm trained on the available data, and has direct implications for whether human-AI 'complementarity' is achievable in a given prediction task. We demonstrate the utility of our procedure using admissions data collected from the emergency department of a large academic hospital system, where we show that physicians' admit/discharge decisions for patients with acute gastrointestinal bleeding (AGIB) appear to incorporate information that is not available to a standard algorithmic screening tool. This is despite the fact that the screening tool is arguably more accurate than physicians' discretionary decisions, highlighting that accuracy alone, even absent normative concerns about accountability or interpretability, is insufficient to justify algorithmic automation.
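To make the proposed test concrete, here is a minimal sketch in Python of one way such a conditional-independence test could be implemented. This is an illustration, not the authors' exact procedure: the function name expertise_test is hypothetical, the gradient-boosted classifier is an arbitrary stand-in for "any algorithm trained on the available data," and binning observations on an out-of-fold risk score is a simplifying proxy for conditioning on the full feature vector.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_predict

def expertise_test(X, expert_preds, y, n_bins=10, n_permutations=2000, seed=0):
    """Permutation test of the null that expert predictions are
    statistically independent of the outcome y given the features X.

    Conditioning on X is approximated by binning observations on an
    algorithmic risk score fit to X alone; expert predictions are then
    permuted *within* bins, which preserves the expert-feature
    relationship while breaking any expert-outcome link beyond X.
    """
    rng = np.random.default_rng(seed)
    expert_preds = np.asarray(expert_preds, dtype=float)
    y = np.asarray(y)

    # Out-of-fold risk scores from a feature-only predictor.
    model = GradientBoostingClassifier()
    scores = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]

    # Coarsen the conditioning set: bin observations by algorithmic score.
    edges = np.quantile(scores, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(scores, edges)

    def statistic(preds):
        # Association between expert predictions and realized outcomes.
        return abs(np.corrcoef(preds, y.astype(float))[0, 1])

    observed = statistic(expert_preds)

    # Build the null distribution by within-bin permutation.
    null_stats = np.empty(n_permutations)
    for i in range(n_permutations):
        permuted = expert_preds.copy()
        for b in np.unique(bins):
            idx = np.flatnonzero(bins == b)
            permuted[idx] = rng.permutation(permuted[idx])
        null_stats[i] = statistic(permuted)

    p_value = (1 + np.sum(null_stats >= observed)) / (1 + n_permutations)
    return observed, p_value
```

A small p-value indicates that the experts' predictions track outcomes even among patients the feature-only model scores similarly, which is exactly the signature of expertise the abstract describes: information in the experts' hands that no algorithm trained on the available features alone could reproduce.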

Submission history

From: Rohan Alur
[v1] Fri, 2 Jun 2023 16:15:24 UTC (155 KB)
[v2] Fri, 27 Oct 2023 19:00:05 UTC (250 KB)
[v3] Mon, 25 Nov 2024 13:59:11 UTC (250 KB)


