This is Weekend Edition from NPR News. I am Scott Simon.
Imagine yourself in the future for a moment riding in a driverless car. You see 10 pedestrians stroll into the street just a few yards ahead of you. The car's going too fast to brake and miss them, so would you steer your car to try to miss them and possibly injure yourself? But if it's a driverless car, would you even get to make that choice? We're going to talk now to somebody who studies some of the ethical questions that are raised by autonomous vehicles. Patrick Lin, associate professor of philosophy at Cal Poly in San Luis Obispo, Calif., thanks very much for being with us.
PATRICK LIN: You're welcome. Glad to be here, Scott.
SIMON: There was a survey put up by MIT that asked questions along these lines, right?
LIN: Right, right.
SIMON: What did you notice in the survey when you looked at it?
LIN: Well, you know, so that's not the first survey done on this topic. There have been other surveys. And they had similar results, which is that people are split on the idea of how a driverless or autonomous car should behave. I think the one thing that stood out to me is that there's going to be a lot more work needed in this field here. One problem with surveys is that what people say in surveys isn't necessarily how they would actually choose in real life.
SIMON: Yeah.
LIN: They might not always know what it is they want.
SIMON: Yeah, I mean, 'cause it does seem to me just anecdotally that probably not a month goes by we don't read about some traffic accident where somebody said, you know, I just couldn't stop. They pulled into the lane, they walked across the street. And I must say, as a rule, society doesn't blame them for making an unethical choice to save their own life, even if the crash results in killing others.
LIN: Right. If it's a human-driven car, what you have there is just an accident. It's a reflex. Maybe you have bad reflexes. But we understand that that's just a reflex, it's not premeditated. But when you're talking about how we ought to program a robot car, now you're talking about pre-scripting the accident, right? So this is a difference between an accidental accident and a deliberate accident. So there's a big difference there legally and ethically.
SIMON: Would somebody get into a driverless car if they thought the algorithms of that car would essentially say, I'm not going to let you run into that school bus and kill people, you're going to die instead?
LIN: I think they would. So, for instance, any time you get in a car driven by someone else, you're at risk. Studies have shown that if you're a human driver and you're about to be in a crash, you're going to reflexively turn away from the crash. This usually means that you expose your passengers to that accident. But that doesn't paralyze us when we step into a car.
SIMON: At the same time, though, Professor, I mean, I think it's going to be hard for people to think of an algorithm making that decision for us.
LIN: That's right. I mean, it's a weird thing to think about. But that's exactly what we're doing when we're creating robots and artificial intelligence. They're taking over human roles, from being our chauffeur to our stock market trader to our airline pilot to whatever. We've got to do some soul-searching. And then we have to ask, well, should robots and AI mimic humans - do what we do - or should they even do something differently? So robot ethics and human ethics could be two different things. But when we talk about programming cars or making any kind of robots, it's a good exercise in how humans behave and how we ought to behave.
SIMON: Patrick Lin is an associate professor of philosophy at Cal Poly in San Luis Obispo, Calif. Thanks so much for being with us.
LIN: You're welcome. Thanks for having me.