One of the most popular tools on Apple's new iPhone X is its facial recognition system.
This latest iPhone gives users the power to open the device just by looking at it. The smartphone has performed well in tests set up to trick it into opening for an unapproved user.
The same kind of facial recognition system is also used for other purposes. One area that will depend heavily on the technology in the future is lie detection.
Traditional lie detection machines, called polygraphs, use sensors connected to the body to measure physical changes. These include blood pressure, breathing and heart rates, and the amount of perspiration on the skin.
The person is asked a series of questions during the test. Any physical changes are studied to see which questions caused the reactions. The operator of the test then makes a decision about whether the results suggest the individual was lying or not.
In the future, experts say, lie detector tests will be carried out by systems that use video and machines. Researchers have a term for the intelligence shown by computers and other mechanical devices: artificial intelligence.
One company developing this kind of system is SilverLogic Labs in Seattle, Washington.
SilverLogic Labs calls its high-tech lie detector a “passive polygraph.” The most important piece of equipment is a video camera, which captures all a person’s visual and spoken reactions.
Jerimiah Hamon is the company's chief executive officer. He says the video images gather data that is then processed to make a decision about truthfulness.
“We’re breaking it (footage) down into data and then using quant methods, or quantitative math, and deep learning from the videos to determine how a person’s emotional responses are tied to some stimulus. So in this instance, it was questions.”
This technology is not new. It was first developed to measure the reactions people had while watching movies and television shows. Earlier studies showed this method was more exact in learning the true feelings of individuals who might try to suppress their honest opinions when answering questions.
SilverLogic Labs uses an algorithm to study the video of facial reactions of test subjects. It seeks to identify emotions, including anger, happiness and fear.
Rabia Piacentini is the company’s operations manager. She says the camera can gather detailed information to measure a number of emotional reactions. The data is processed by the machine, which then decides how truthful the subject is being.
In a demonstration of the system, the machine detects sadness.
“Have you ever hurt someone intentionally?”
“Yes.”
The tester says this suggests the subject is lying, since many people show visible signs of emotion when not telling the truth.
The idea of a video camera recording emotional responses and testing truthfulness, possibly even secretly, raises questions about privacy.
But SilverLogic Labs rejects those concerns. It says privacy only becomes a problem when the technology is misused. The company’s Jerimiah Hamon said he also believes a machine lie detector can be a better judge than a human tester.
“I think this is actually a super enhancement to civil rights, because cameras and computers aren’t biased to race, ethnicity, age, gender, any of that.”
In another use of the technology, the United States and Canada have studied how facial recognition systems might be used to detect lying at the border.
The system is called the Automated Virtual Agent for Truth Assessments in Real Time, or AVATAR. It uses a computer-generated face that asks travelers a series of questions. A camera inside the machine studies how the person reacts. It can recognize changes in the eyes and voice, as well as any body movements.
If the system identifies a traveler as possibly lying, that person could be stopped and asked to complete additional security testing.
Developers of this technology say it could be used for other purposes as well. They believe it could help police detect untruthfulness and might be used by companies interviewing people for employment.
I’m Bryan Lynn.
Words in This Story
sensor – n. a device that detects or senses heat, light, sound or motion, and then reacts to it in a particular way
perspiration – n. clear liquid that forms on skin when a person is hot or nervous
passive – adj. something that happens without specific action being taken
visual – adj. related to seeing
quantitative – adj. relating to how much there is of something
stimulus – n. something that causes something else to happen
algorithm – n. set of steps that are followed in order to solve a mathematical problem or to complete a computer process
enhancement – n. an improvement in something
detection – n. the act of identifying the presence of something
data – n. facts, numbers and other information