CORE-BEHRT: A Carefully Optimized and Rigorously Evaluated BEHRT



Authors: Mikkel Odgaard and 5 other authors

Abstract: The widespread adoption of Electronic Health Records (EHR) has significantly increased the amount of available healthcare data. This has allowed models inspired by Natural Language Processing (NLP) and Computer Vision, which scale exceptionally well, to be used in EHR research. In particular, BERT-based models have surged in popularity following the release of BEHRT and Med-BERT. Subsequent models have largely built on these foundations, even though the fundamental design choices of these pioneering models remain underexplored. Through incremental optimization, we study BERT-based EHR modeling and isolate the sources of improvement for key design choices, yielding insights into the effect of data representation, individual technical components, and the training procedure. Evaluating this across a set of generic tasks (death, pain treatment, and general infection), we show that improving data representation can increase the average downstream performance from 0.785 to 0.797 AUROC ($p<10^{-7}$), primarily when including medication and timestamps. Improving the architecture and training protocol on top of this increases average downstream performance to 0.801 AUROC ($p<10^{-7}$). We then demonstrate the consistency of our optimization through a rigorous evaluation across 25 diverse clinical prediction tasks, observing statistically significant performance increases in 17 of the 25 tasks and improvements in 24, highlighting the generalizability of our results. Our findings provide a strong foundation for future work and aim to increase the trustworthiness of BERT-based EHR models.
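To make the data-representation idea more concrete, the sketch below illustrates one common way a BEHRT-style input can be built: a patient's visits are flattened into a token sequence of diagnosis and medication codes, with timestamp information carried as an age channel and a per-visit segment index. This is a minimal, hypothetical illustration; the class and function names (Visit, build_inputs) and the exact encoding are assumptions and do not come from the CORE-BEHRT paper or its codebase.

```python
# Illustrative sketch of a BEHRT-style EHR input representation.
# Not the CORE-BEHRT implementation; names and encoding choices are hypothetical.
from dataclasses import dataclass
from datetime import date


@dataclass
class Visit:
    when: date                 # visit timestamp
    diagnoses: list[str]       # e.g. ICD codes
    medications: list[str]     # e.g. ATC codes


def build_inputs(visits: list[Visit], birth: date):
    """Flatten a patient's visits into parallel token / age / segment lists."""
    tokens, ages, segments = ["[CLS]"], [0], [0]
    for seg, visit in enumerate(sorted(visits, key=lambda v: v.when), start=1):
        age_years = (visit.when - birth).days // 365
        for code in visit.diagnoses + visit.medications:
            tokens.append(code)
            ages.append(age_years)   # timestamp information as an age channel
            segments.append(seg)     # which visit each token came from
        tokens.append("[SEP]")       # visit boundary marker
        ages.append(age_years)
        segments.append(seg)
    return tokens, ages, segments


if __name__ == "__main__":
    visits = [
        Visit(date(2015, 3, 1), ["I10"], ["C09AA02"]),
        Visit(date(2018, 7, 9), ["E11"], ["A10BA02"]),
    ]
    print(build_inputs(visits, birth=date(1960, 1, 1)))
```

In this kind of setup, the token, age, and segment sequences would each feed a separate embedding table whose outputs are summed before the transformer layers; including medication codes and timestamps enriches exactly this input representation, which the abstract identifies as the main source of the 0.785 to 0.797 AUROC improvement.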

Submission history

From: Mikkel Odgaard
[v1] Tue, 23 Apr 2024 16:35:59 UTC (1,249 KB)
[v2] Wed, 24 Apr 2024 08:02:49 UTC (1,249 KB)
[v3] Wed, 22 May 2024 12:45:42 UTC (1,248 KB)
[v4] Fri, 11 Oct 2024 05:26:01 UTC (1,885 KB)


