
PBS NewsHour: Racial bias found in a medical algorithm

Published: 2020-05-14

Hari Sreenivasan: A recent study published in Science magazine found significant racial bias in an algorithm used by hospitals across the nation to determine who needs follow-up care and who does not. Megan Thompson recently spoke with STAT's Shraddha Chakradhar, who explained what the researchers found.

Megan Thompson: Where exactly was this bias coming from?

Shraddha Chakradhar: There are two ways that we can identify how sick a person is. One, is how many dollars are spent on that person. You know, the assumption being the more health care they come in for, the more treatment that they get, the more dollars they spend and presumably the sicker they are if they're getting all that treatment. And the other way is that, you know, we can measure actual biophysical things, you know, from lab tests, what kind of conditions or diseases they might have. So it seems like this algorithm was relying on the cost prediction definition. In other words, the more dollars a patient was projected to spend on the part of an insurance company or a hospital, then that was a sign of how sick they were going to be. And that seems to be where the bias emerged.

Megan Thompson: I understand that the researchers then re-ran the algorithm using a different type of data. Can you just tell us a little bit more about that? What did they use?

Shraddha Chakradhar: Yeah. So instead of relying on just costs to predict which patients are going to need follow-up care, they actually used biometric data, physical biophysical data, physiological data, and they saw a dramatic difference from the previous model. The algorithm had missed some 48,000 extra chronic conditions that African-American patients had. But when they rejiggered the algorithm to look more at actual biological data, they brought that down to about 7,700. So it was about an 84 percent reduction in bias.
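The "84 percent reduction" follows directly from the two counts quoted above. A quick sketch of the arithmetic (the counts are the approximate figures given in the interview, not exact study values):

```python
# Approximate figures quoted in the interview (not exact study values).
missed_before = 48_000  # chronic conditions missed under the cost-based proxy label
missed_after = 7_700    # missed after rebuilding the model on direct health measures

reduction = (missed_before - missed_after) / missed_before
print(f"Bias reduction: {reduction:.0%}")  # prints "Bias reduction: 84%"
```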

Megan Thompson: Do we know anything about how the use of this biased algorithm actually affected patient care?

Shraddha Chakradhar: We don't actually know that. But as I mentioned, the algorithm is used by hospitals to help them flag patients who might need extra care in the coming year, whether it's, you know, an at-home nurse or making sure that they come in for regularly scheduled doctor's appointments. So we can only presume that if black patients, sicker black patients weren't being flagged accurately, that they also missed out on this follow-up care.

Megan Thompson: Are there any consequences for the company, Optum, that was behind this algorithm?

Shraddha Chakradhar: Yes. So the day after the study came out, actually, New York regulators, the Department of Financial Services and the Department of Health, sent a letter to the company saying they were investigating this algorithm and that the company had to show that the way the algorithm worked wasn't in violation of anti-discrimination laws in New York. So that investigation is pending. One encouraging thing is that when the researchers did the study, they actually reached out to Optum and let them know about the discrepancy in the data. And the company was glad to be told about it. And I'm told that they're working on a fix. And the other encouraging thing is that the researchers have actually now launched an initiative to help other companies who may be behind similar algorithms to help them fix any biases in their programs. So they've launched a program based out of the University of Chicago's Booth School to do this work on a pro bono basis so that they can sort of catch these things in other algorithms that might be used across the country.

Megan Thompson: All right, Shraddha Chakradhar of STAT, thank you so much for being with us.

Shraddha Chakradhar: Thank you for having me.

