How to fight invisible discrimination

如何抗击隐形的歧视

Six months ago, tech entrepreneur Rohan Gilkes tried to rent a cabin in Idaho over the July 4 weekend, using the website Airbnb.

半年前,科技创业者罗恩.吉尔克斯(Rohan Gilkes)尝试通过Airbnb网站预订爱达荷州的一间小屋,在美国独立日长周末使用。

All seemed well, until the host told him her plans had changed: she needed to use the cabin herself.

一切似乎都很顺利,直到房主告诉他,她的计划有变:她自己需要使用那间小屋。

Then a friend of Rohan’s tried to book the same cabin on the same weekend, and his booking was immediately accepted.

之后,罗恩的一个朋友试着在同一个周末预订同一间小屋,他的预订立即被接受了。

Rohan’s friend is white; Rohan is black.

罗恩的朋友是白人;罗恩是黑人。

This is not a one-off.

这并非孤立的个案。

Late last year, three researchers from Harvard Business School — Benjamin Edelman, Michael Luca and Dan Svirsky — published a working paper with experimental evidence of discrimination.

哈佛商学院(Harvard Business School)的3名研究人员——本杰明.埃德尔曼(Benjamin Edelman)、迈克尔.卢卡(Michael Luca)和丹.斯维尔斯基(Dan Svirsky)——去年年末发布了一份工作论文,其中的实验证据证明了歧视的存在。

Using fake profiles to request accommodation, the researchers found that applicants with distinctively African-American names were 16 per cent less likely to have their bookings accepted.

研究人员使用虚假的个人资料来申请订房,他们发现,姓名明显像是非裔美国人的申请者,其预订被接受的可能性要低16%。

Edelman and Luca have also published evidence that black hosts receive lower incomes than whites while letting out very similar properties on Airbnb.

埃德尔曼和卢卡还发布了一些证据,表明在Airbnb上出租类似房源时,黑人房主的租房所得会比白人房主低。

The hashtag #AirbnbWhileBlack has started to circulate.

#AirbnbWhileBlack(Airbnb上的黑人)的话题标签开始传播。

Can anything be done to prevent such discrimination? It’s not a straightforward problem.

可以做些什么来防止这种歧视吗?这个问题解决起来并不简单。

Airbnb condemns racial discrimination but, by making names and photographs such a prominent feature of its website, it makes discrimination, conscious or unconscious, very easy.

Airbnb谴责种族歧视,但Airbnb网站的一个突出特征就是显示姓名和照片,这让有意或者无意的歧视变得非常容易。

It’s a cheap way to build trust, says researcher Michael Luca.

这是一种成本低廉的建立信任的方式,研究员迈克尔.卢卡说。

But, he adds, it invites discrimination.

但他补充道,这也容易招致歧视。

Of course there’s plenty of discrimination to be found elsewhere.

当然,其他地方也可以发现很多歧视现象。

Other studies have used photographs of goods such as iPods and baseball cards being held in a person’s hand.

另一些研究使用了卖家手持商品(如iPod或者棒球卡)拍下的商品照片。

On Craigslist and eBay, such goods sell for less if held in a black hand than a white one.

在Craigslist和eBay上,黑人手持的商品卖价会比白人手持商品的卖价低。

An unpleasant finding — although in such cases it’s easy to use a photograph with no hand visible at all.

这个发现令人不舒服——尽管在这种情况下,卖家想避免受到歧视很容易,只需使用完全不露出手的商品照片就可以了。

The Harvard Business School team have produced a browser plug-in called Debias Yourself.

哈佛商学院的团队制作了一个叫做Debias Yourself的防偏见浏览器插件。

People who install the plug-in and then surf Airbnb will find that names and photographs have been hidden.

安装这个插件的人在浏览Airbnb的时候会发现姓名和照片被隐藏了。
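
As a rough illustration, the sketch below shows how a plug-in of this kind might work as a browser content script (written here in TypeScript); the CSS selectors are invented placeholders, not the markup actually used by Airbnb or by Debias Yourself.

下面的示例粗略展示了这类插件可能的工作方式:一个浏览器内容脚本(此处用TypeScript编写)把房主的姓名和照片隐藏起来;其中的CSS选择器只是假设的占位符,并非Airbnb或Debias Yourself实际使用的标记。

// Hypothetical content-script sketch: hide name and portrait elements so
// the person browsing sees the property rather than the host.
// The selectors below are illustrative assumptions, not Airbnb's real markup.
const IDENTITY_SELECTORS = [
  'img[data-testid="host-avatar"]', // assumed selector for the host portrait
  '[data-testid="host-name"]',      // assumed selector for the host name
];

function hideIdentityCues(root: ParentNode): void {
  for (const selector of IDENTITY_SELECTORS) {
    root.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      el.style.visibility = "hidden"; // keep the page layout, drop the cue
    });
  }
}

// Run once on load, then again whenever the single-page app re-renders.
hideIdentityCues(document);
new MutationObserver(() => hideIdentityCues(document))
  .observe(document.body, { childList: true, subtree: true });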

It’s a nice idea, although one suspects that it will not be used by those who need it most.

这是个好主意,不过恐怕那些最需要它的人并不会使用它。

Airbnb could impose the system anyway but that is unlikely to prove tempting.

Airbnb固然可以强制推行这个系统,但这种做法对它来说不太可能有吸引力。

However, says Luca, there are more subtle ways in which the platform could discourage discrimination.

然而,卢卡表示,平台还可以使用一些更含蓄的方式来阻止歧视。

For example, it could make profile portraits less prominent, delaying the appearance of a portrait until further along in the process of making a booking.

比如,平台可以让资料中的个人照片变得不那么突出,等预订流程进行到较后的阶段再显示照片。

And it could nudge hosts into using an instant book system that accelerates and depersonalises the booking process.

平台还可以引导房主使用即时预订系统,这种系统既能加快预订过程,又能去除预订过程中的个人因素。

(The company recently released a report describing efforts to deal with the problem.)

(该公司最近发布了一份报告,描述了为处理这一问题做出的努力。)

But if the Airbnb situation has shone a spotlight on unconscious (and conscious) bias, there are even more important manifestations elsewhere in the economy.

如果说Airbnb的情况让人们关注到无意识(和有意识)的偏见,那么在经济的其他领域,这种偏见还有更重要的表现。

A classic study by economists Marianne Bertrand and Sendhil Mullainathan used fake CVs to apply for jobs.

经济学家玛丽安娜.贝特朗(Marianne Bertrand)和森德希尔.穆莱纳坦(Sendhil Mullainathan)所做的一项经典研究使用了假简历来申请工作。

Some CVs, which used distinctively African-American names, were significantly less likely to lead to an interview than identical applications with names that could be perceived as white.

使用明显是非裔美国人姓名的简历获得面试的几率,要明显低于内容完全相同、但姓名可能被认为是白人的简历。

Perhaps the grimmest feature of the Bertrand/Mullainathan study was the discovery that well-qualified black applicants were treated no better than poorly qualified ones.

或许贝特朗和穆莱纳坦这项研究中最令人沮丧的发现是,资历出色的黑人申请者得到的待遇,并不比资历平平的黑人申请者好。

As a young black student, then, one might ask: why bother studying when nobody will look past your skin colour? And so racism can create a self-reinforcing loop.

那么,一个年轻的黑人学生或许会问:如果没人在乎你肤色以外的东西,为何还要费力学习呢?因此,种族主义可能会导致一个自我加强的循环。

What to do?

该怎么办?

One approach, as with Debias Yourself, is to remove irrelevant information: if a person’s skin colour or gender is irrelevant, then why reveal it to recruiters? The basic idea behind Debias Yourself was proven in a study by economists Cecilia Rouse and Claudia Goldin.

有一种策略,就像Debias Yourself插件一样,是去除无关信息:如果一个人的肤色或性别与工作无关,那何必把这些信息透露给招聘人员呢?经济学家塞西莉亚.劳斯(Cecilia Rouse)和克劳迪娅.戈尔丁(Claudia Goldin)的一项研究证明了Debias Yourself所依据的基本理念是行得通的。

Using a careful statistical design, Rouse and Goldin showed that when leading professional orchestras began to audition musicians behind a screen, the recruitment of women surged.

通过细心的统计设计,劳斯和戈尔丁表明,当一流的专业管弦乐团开始隔着屏风让音乐家试奏时,女性被录用的几率激增。

Importantly, blind auditions weren’t introduced to fight discrimination against women — orchestras didn’t think such discrimination was a pressing concern.

重要的是,引入盲选并不是为了抗击对女性的歧视——管弦乐团并不认为这种歧视是一个紧迫的问题。

Instead, they were a way of preventing recruiters from favouring the pupils of influential teachers.

相反,盲选是为了防止招聘者偏袒有影响力的老师的学生。

Yet a process designed to fight nepotism and favouritism ended up fighting sexism too.

然而,这种旨在打击裙带关系和徇私行为的程序最终也打击了性别歧视。

. . .

A new start-up, Applied, is taking these insights into the broader job market.

新创立的企业Applied正把这些洞见应用到更广泛的就业市场中。

Applied is a spin-off from the UK Cabinet Office, the Behavioural Insights Team and Nesta, a charity that supports innovation; the idea is to use some simple technological fixes to combat a variety of biases.

Applied是从英国内阁办公室(Cabinet Office)、行为洞察小组(Behavioural Insights Team)和支持创新的慈善机构英国国家科技艺术基金会(Nesta)分拆出来的公司,其理念是通过一些简单的技术性修正来对抗各种偏见。

A straightforward job application form is a breeding ground for discrimination and cognitive error.

一份普通的工作申请表就是歧视和认知错误的温床。

It starts with a name — giving clues to nationality, ethnicity and gender — and then presents a sequence of answers that are likely to be read as one big stew of facts.

这种表格把暴露申请者国籍、族裔和性别的姓名放在最开头,它接下来提供的一系列答案可能被看做各种事实的大杂烩。

A single answer, good or bad, colours our perception of everything else, a tendency called the halo effect.

一个答案无论好坏,都会影响我们对其余一切内容的看法,这种倾向被称为光环效应(halo effect)。

A recruiter using Applied will see chunked and anonymised details — answers to the application questions from different applicants, presented in a randomised order and without indications of race or gender.

一个使用Applied服务的招聘人员将会看到区块化和匿名化的细节——将不同申请者对申请表问题的答案用随机顺序列出来,不体现种族或者性别。

Meanwhile, other recruiters will see the same answers, but shuffled differently.

同时,其他招聘人员将看到同样的答案,但以不同顺序列出。

As a result, says Kate Glazebrook of Applied, various biases simply won’t have a chance to emerge.

Applied的凯特.格莱兹布鲁克(Kate Glazebrook)表示,这样一来,各种偏见根本没有机会产生。
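
A minimal sketch of that idea, using assumed field names rather than Applied's actual data model, might look like the TypeScript below: strip out identifying details, group answers by question, and give each recruiter an independently shuffled reading order.

下面用TypeScript给出这一思路的最小化示例,其中的字段名只是假设,并非Applied实际的数据模型:去掉身份信息,按问题对答案分组,再为每位招聘人员生成各自独立的随机阅读顺序。

// Minimal sketch of chunked, anonymised, per-reviewer shuffled review.
// Field names and the seeded shuffle are illustrative assumptions.

interface Application {
  name: string;        // identifying field, never shown to reviewers
  answers: string[];   // one free-text answer per question
}

interface AnonymisedAnswer {
  applicantId: number; // opaque id that replaces the name
  text: string;
}

// Group answers by question, replacing names with opaque ids.
function chunkByQuestion(apps: Application[]): AnonymisedAnswer[][] {
  const questionCount = apps[0]?.answers.length ?? 0;
  return Array.from({ length: questionCount }, (_, q) =>
    apps.map((app, i) => ({ applicantId: i, text: app.answers[q] }))
  );
}

// Fisher-Yates shuffle driven by a small seeded generator, so each reviewer
// reads the same answers in a different but reproducible order.
function shuffleForReviewer<T>(items: T[], seed: number): T[] {
  let state = seed >>> 0;
  const next = () => {
    state = (state * 1664525 + 1013904223) >>> 0;
    return state / 2 ** 32;
  };
  const out = items.slice();
  for (let i = out.length - 1; i > 0; i--) {
    const j = Math.floor(next() * (i + 1));
    [out[i], out[j]] = [out[j], out[i]];
  }
  return out;
}

// Example: two reviewers read the answers to question 1 in different orders.
const applications: Application[] = [
  { name: "Applicant A", answers: ["answer A1", "answer A2"] },
  { name: "Applicant B", answers: ["answer B1", "answer B2"] },
  { name: "Applicant C", answers: ["answer C1", "answer C2"] },
];
const byQuestion = chunkByQuestion(applications);
console.log(shuffleForReviewer(byQuestion[0], 1)); // reviewer 1's reading order
console.log(shuffleForReviewer(byQuestion[0], 2)); // reviewer 2's reading order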

When the Behavioural Insights Team ran its last recruitment round, applicants were rated using the new process and a more traditional CV-based approach.

行为洞察小组在上一轮招聘中,同时用新程序和更传统的基于简历的方式对申请人进行了评分。

The best of the shuffled, anonymised applications were more diverse, and much better predictors of a candidate who impressed on the assessment day.

在被打乱顺序、匿名化的申请表中表现最佳的申请人背景更加多元,也更能预示哪些候选人会在评估日给人留下深刻印象。

Too early to declare victory — but a promising start.

宣布胜利还为时过早——但这是一个充满希望的开端。