ARI SHAPIRO, HOST:
Social media can be an echo chamber. Facebook and Twitter feeds often show us posts by people who think pretty much the same way we do - red feed for conservatives, blue feed for liberals. On this week's All Tech Considered, my co-host Kelly McEvers talked with someone who's thought a lot about echo chambers and whether something could be done to change them.
(SOUNDBITE OF MUSIC)
KELLY MCEVERS, BYLINE: Harvard law professor Cass Sunstein has written a lot about why societies need dissent. That was the title of one of his earlier books. His new book is "#Republic: Divided Democracy In The Age Of Social Media." He says these echo chambers we live in actually hurt democracy. Cass Sunstein, welcome to the show.
CASS SUNSTEIN: Thank you so much.
MCEVERS: Since the election, I mean, we've done a lot of reporting on this - right? - how the news is being filtered, how people are only reading stories that, you know, match their ideas. But how does this hurt democracy, in your opinion?
SUNSTEIN: Well, there are a couple of things that happened then. None of them is good. One is that if you're listening to people who just agree with you or reading news sources that fit with your own preconceptions, it's not as if you just stay where you are. You tend to end up more extreme, which makes us get kind of blocked as a society, which isn't good for democracy and which makes it possible for people to see people who disagree with them not as fellow citizens, but as enemies who are crazy people or dupes. And that can make problem solving very, very challenging.
MCEVERS: Just after the election, Facebook CEO Mark Zuckerberg said it was, quote, "pretty crazy" to think that fake news on Facebook had any influence on the election. He said, quote, "voters make decisions based on their lived experience." What do you think about that?
SUNSTEIN: Well, with respect to fake news, the data is that it has not had a massive effect on elections. So I think the data is supportive of his conclusion there. Insofar as Facebook is disparaging the concern that it's contributing to a system of division and polarization and extremism and echo chambering, that's not great. We ought to see Facebook thinking - how can we be part of a solution rather than what we now are with respect to polarization? That is part of the problem.
MCEVERS: You know, the question is - what can be done to combat this? What can be done to sort of stop that extreme - that extremism?
SUNSTEIN: Yes. Well, there are lots of things that can be done in our individual lives and also on the part of those who provide information. So there's nothing inevitable in an algorithm that says, you know, this is your type of content. Here you go. If you have an algorithm that exposes you to opposing viewpoints or topics you never would have put there, that can change your life in a good way.
MCEVERS: I'm imagining, like, a serendipity bar where you sort of move it - you know, like the brightness control on your phone - like, you move it right or left. You're like - I want more serendipity in my algorithm or less.
SUNSTEIN: Completely love that - serendipity button or an opposing viewpoint button.
MCEVERS: Or a slider, yeah.
SUNSTEIN: A slider - definitely. What we have, in a way, now on social media is a serendipity bar which is dialed down to black, black, black. And it is easy to dial it up so that it's actually used - providers can completely do that.
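[For readers curious how such a "serendipity slider" might work mechanically, here is a minimal sketch in Python. It is purely illustrative - the interview describes no real platform API, and every name and parameter below is hypothetical - but it shows how a single dial could blend a user's usual feed with out-of-bubble posts.]

import random

def rank_feed(same_view_posts, opposing_posts, serendipity=0.0, seed=None):
    # All names here are hypothetical. 'serendipity' plays the role of
    # the slider in the conversation: 0.0 reproduces a pure
    # echo-chamber feed, 1.0 tries to fill every slot from opposing
    # or unexpected sources.
    same = list(same_view_posts)        # copy so the caller's lists survive
    opposing = list(opposing_posts)
    rng = random.Random(seed)
    feed = []
    for _ in range(len(same) + len(opposing)):
        pool = opposing if rng.random() < serendipity else same
        if not pool:                    # one pool ran dry; use the other
            pool = same or opposing
        feed.append(pool.pop(0))
    return feed

# A slider set to 0.3 means roughly 3 in 10 slots come from outside
# the user's usual viewpoint.
red_feed = ["red post 1", "red post 2", "red post 3", "red post 4"]
blue_feed = ["blue post 1", "blue post 2"]
print(rank_feed(red_feed, blue_feed, serendipity=0.3, seed=42))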
MCEVERS: But it's not in their interest to do that, right? I mean, they want - I mean, I was reading in your book, like, you know, some - a memo - a Facebook memo - where they were talking about the desire to keep people, you know, reading an article for, you know, the maximum amount of time. And to do that, you give them something they want.
SUNSTEIN: Well, we're early days, really, still, for Facebook and social media. And so my expectation is that Facebook and Twitter will do some experimenting on this count. It is true that kind of a quick reaction is to provide people with content that they will look at. And that might be the information cocoon effect. But lots of Americans have not just a desire to see, you know, what they already think, but a desire to see some stuff that'll be challenging or eye-opening.
MCEVERS: What should we, as social media users, do? I mean, is the, you know, sort of immediate suggestion that, like, liberals should follow more conservatives and vice versa?
SUNSTEIN: I think - no question - that if you're left of center, have a little plan in the next two weeks to follow some smart people who are right of center. And if you're right of center, and you tend to ridicule or feel contempt for people on the left, follow some liberals. Find some who have at least a little bit of credibility for you. Or make a determined judgment, whether you're left or right, to see what you can get from the other side. And this is, you know, individual lives, but as the framers of the Constitution knew, a republic is built up of innumerable individual decisions. And whether we get a well-functioning system or not depends on, you know, countless individual acts.
MCEVERS: Cass Sunstein's new book is "#Republic: Divided Democracy In The Age Of Social Media." Thank you.
SUNSTEIN: Thank you so much.