
AI for mental health screening may carry biases based on gender, race


Some artificial intelligence tools for health care may get confused by the ways people of different genders and races talk, according to a new study led by CU Boulder computer scientist Theodora Chaspari. The study hinges on a perhaps unspoken reality of human society: not everyone talks the same. Women, for example, tend to speak at a higher pitch than men, and similar differences can appear between, say, white and Black speakers. The researchers found that these natural variations can confound algorithms that screen people for mental health concerns such as anxiety or depression. The results add to a…