STANDARDS

NGSS: Core Idea: ETS1.B    

CCSS: Writing Standards: 1    

TEKS: 6.2E, 7.2E, 8.2E, E.8A

Are You Being Watched?

Some schools, airports, and cities are using technology that can recognize people’s faces. Is it being used to keep an eye on you?

JEFF MANGIAT (PHOTO ILLUSTRATION); TROY AOSSEY/GETTY IMAGES (CLASSROOM); ISTOCK/GETTY IMAGES (WHITEBOARD); SHUTTERSTOCK.COM (STATIC LINES, SECURITY CAMERAS)

AS YOU READ, THINK ABOUT the benefits and risks posed by technology that can recognize people’s faces.

You’re sitting in class listening to today’s lesson when your mind starts to wander. At first, your teacher doesn’t notice you’ve stopped paying attention—but the cameras watching you do. They scan the faces of you and your classmates every second to determine how engaged you are. The devices alert your teacher, who tells you to quit daydreaming. You sit up straight and glance warily at the artificial eyes watching you.

This scenario may sound like science fiction, but facial recognition technology like this is actually being used in a school in China. It can detect when students are listening, answering questions, writing, interacting with one another, or asleep. A similar system tested in another Chinese school even analyzes students’ expressions to track their emotions: whether they’re angry, happy, disappointed, or sad. The data collected by both systems is used to evaluate students’ class performance. Each person receives a score, which gets displayed on a screen at school and can be viewed on an app by parents, pupils, and teachers.

GILLES SABRIE/THE NEW YORK TIMES/REDUX (STREET); SHUTTERSTOCK.COM (CAMERA)

KEEPING TABS: The Chinese government is investing billions of dollars in facial ID tech to monitor its citizens.

Right now, schools in the U.S. aren’t using this kind of technology to keep tabs on students’ behavior. And it may never be used for that purpose here. But some American schools are considering using facial recognition for other things, like scanning school grounds to spot people who may pose a danger to students. In fact, facial recognition is currently used mainly for security purposes. It can help police identify suspects from video footage. Stores have tested the tech to catch repeat shoplifters. Some makers of electronic devices, like doorbells with built-in cameras that identify visitors, tout facial recognition as an added safety feature.

Some people are embracing this high-tech trend. But it’s raising red flags for others. They believe facial recognition, which is often used without individuals’ knowledge, could violate people’s privacy. The technology has also come under fire for being inaccurate, particularly when identifying people of color. Despite these objections, the use of facial recognition is becoming more widespread, showing up in places from schools and airports to concert venues.

HELPFUL TOOL?

Facial recognition is just one of many types of technology today that utilize biometrics to identify people. Biometrics are measurements of physical characteristics unique to each individual. If you’ve ever used your fingerprint to unlock your cell phone, then you’ve been recognized with the help of biometrics (see Fingerprints and Beyond).

Anil Jain is a computer scientist and biometrics expert at Michigan State University. He investigates how facial recognition technology can benefit society. For example, Jain thinks the technology could aid in solving crimes by allowing detectives to track down suspects far more quickly than traditional methods. If a detective has a picture of a suspect, say from a surveillance camera, a facial recognition program could compare it with a database of known faces in hopes of finding a possible match (see How Facial Recognition Works). “It can do a comparison a million times a second,” says Jain. Some experts think the technology could be used in a similar way to help search for missing persons.
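
To picture how such a database comparison might work, here is a minimal sketch in Python. It assumes each face has already been turned into a short list of numbers (an “embedding”) by separate software, so that similar faces produce similar numbers. The names, numbers, and threshold below are made up for illustration; this is not the program Jain describes.

```python
# Illustrative sketch only; not the software described in the article.
# Assumes each face has already been converted into an "embedding":
# a short list of numbers where similar faces get similar values.
import math

def similarity(face_a, face_b):
    """Cosine similarity: close to 1.0 means very similar, close to 0.0 means unrelated."""
    dot = sum(a * b for a, b in zip(face_a, face_b))
    norm_a = math.sqrt(sum(a * a for a in face_a))
    norm_b = math.sqrt(sum(b * b for b in face_b))
    return dot / (norm_a * norm_b)

def best_match(query_face, known_faces, threshold=0.8):
    """Compare one face against every face in the database and keep the closest match."""
    best_name, best_score = None, threshold
    for name, stored_face in known_faces.items():
        score = similarity(query_face, stored_face)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means nothing scored above the threshold

# Hypothetical database of known faces (tiny made-up embeddings)
database = {
    "person_A": [0.9, 0.1, 0.3],
    "person_B": [0.2, 0.8, 0.5],
}
print(best_match([0.88, 0.12, 0.31], database))  # prints "person_A"
```

A real system can make on the order of a million comparisons per second, as Jain notes, but the basic idea is the same: measure how close two face descriptions are and keep the best match above a cutoff.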

Another advantage of the technology, some say, is that it can simplify security screenings in high-traffic places. For instance, travelers at the Delta Air Lines international terminal in Atlanta’s largest airport no longer have to show their passports multiple times. They just look at a camera for identification and move along. Delta says this gets people through the terminal and onto planes faster.

CAROLYN THOMPSON/AP PHOTO

SAFER AT SCHOOL? Facial recognition systems may become a new security tool at some U.S. schools.

SCANNED AT SCHOOL

In the wake of several deadly shootings at schools in the U.S., some educational institutions are interested in using facial recognition to better protect students. In 2018, officials in Lockport, New York, installed the technology in some of the city’s public schools. As of press time, the system hasn’t yet gone live. But if it does, its cameras would scan people’s faces at school entrances and in hallways. The images would be compared with a database of individuals who aren’t allowed on campus, like suspended students. The software can also detect if someone is holding a gun. If a match is found for a face or a weapon, school officials are notified so they can decide whether to take action, like calling the police.
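
As a rough sketch of the decision logic described above (not Lockport’s actual software), the example below checks recognized faces against a blocklist and treats a detected weapon as its own alert. The IDs and messages are invented; real face and weapon detection would come from separate computer-vision components.

```python
# A rough sketch of the alert logic described in the article, not Lockport's real system.
# The IDs below are invented; face and weapon detection would happen elsewhere.

BLOCKLIST = {"suspended_student_17", "banned_visitor_03"}  # hypothetical database entries

def review_camera_frame(identified_people, weapon_detected):
    """Return alert messages for school officials based on one camera frame."""
    alerts = []
    for person_id in identified_people:
        if person_id in BLOCKLIST:
            alerts.append(f"Match found: {person_id} is not allowed on campus.")
    if weapon_detected:
        alerts.append("Possible weapon detected; officials should review the footage.")
    return alerts

# Example: one recognized person is on the blocklist, no weapon seen
print(review_camera_frame(["suspended_student_17", "unknown_visitor"], weapon_detected=False))
```

Note that software like this only flags a possible match; in the system the article describes, it is still school officials who decide whether to act.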

Critics, though, say there’s no evidence this technology makes schools safer. Groups that work to protect basic human rights, like privacy, have concerns too. “I’m worried about how schools are conditioning young people to expect that everything they do is going to be monitored and tracked by authorities,” says Kade Crockford. She’s the director of the Technology for Liberty Program at the American Civil Liberties Union of Massachusetts. After pushback from civil liberties groups and parents, New York’s department of education is now reviewing Lockport’s facial recognition system to determine how it should best be used.

Using facial recognition to assess student behavior is another point of controversy. Outside of China, this technology has mainly been used for online courses. Cameras in students’ computers monitor their faces and eye movements as they watch an instructor remotely. The software then notifies the teacher when students aren’t paying attention and generates quizzes so they can review material they may have missed. The program’s creators believe this will help students do better on tests and also help instructors refine their lessons to make them more engaging.

Some watchdog groups, though, are critical of facial recognition programs that attempt to read how people are feeling based on their expressions. They say the technology isn’t backed by enough scientific research and is too simplistic, given the complexity of human emotions.

SKEWED SYSTEM

Another problem with facial recognition is how often it makes errors. The technology is far more accurate at recognizing people than it used to be—but not everyone. It has a harder time correctly identifying women of color and sometimes fails to notice them at all. Researchers at the Massachusetts Institute of Technology Media Lab have helped highlight these flaws. One of their studies used software to analyze photos of famous African American women, including former first lady Michelle Obama and tennis star Serena Williams. The result: The program incorrectly labeled them as men.

These issues exist because all people—including facial recognition software developers—have unconscious biases, or ingrained stereotypes, that affect their behavior, according to Meredith Broussard, an artificial intelligence expert at New York University. Facial recognition software relies on artificial intelligence—a computer’s ability to perform tasks normally associated with human intelligence—to learn to recognize faces. But the databases of images used by the software have typically contained more white people than those of color. “People who created this didn’t notice, because they were mostly white men,” Broussard says.
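
A tiny made-up example shows why a lopsided image database matters: if most of the test photos come from one group, the software can look accurate overall while still failing far more often for a smaller group. The numbers below are invented for illustration and are not from the MIT study.

```python
# Invented numbers for illustration only, not results from the MIT Media Lab study.
results = {
    # group: (number of test photos, number identified correctly)
    "larger group in the database":  (800, 792),
    "smaller group in the database": (100, 65),
}

total_photos = sum(n for n, _ in results.values())
total_correct = sum(c for _, c in results.values())
print(f"Overall accuracy: {total_correct / total_photos:.0%}")  # about 95%, which looks good

for group, (n, correct) in results.items():
    print(f"{group}: {correct / n:.0%} correct")  # 99% vs. 65%: the gap hides in the average
```

That is why researchers report accuracy for each group separately rather than relying on a single overall number.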

One way to improve the situation is to make the teams working on facial recognition software more diverse. Until that happens, Broussard worries, biases in the technology could have serious real-world consequences, such as leading police to pursue the wrong suspects or mistakenly arrest someone who’s innocent.

BILL O’LEARY/THE WASHINGTON POST/GETTY IMAGES

FORGET YOUR TICKET: At Dulles airport in Virginia, passengers get their faces scanned—not their boarding passes.

SURVEILLANCE STATE?

Civil liberties groups have strongly opposed the use of facial recognition by law enforcement agencies, as well as in places like schools and airports. In addition to concerns over privacy and accuracy, these groups worry about the U.S. becoming a surveillance state—a country that closely monitors its citizens.

That’s already happening in China. Its government uses facial recognition cameras to create a giant surveillance network that not only curtails crime but also keeps citizens in line. Police, for example, use the technology to publicly shame people caught committing minor crimes. If people cross the street illegally, police can identify them using facial recognition and post their information on giant electronic billboards to embarrass them.

In the U.S., some police departments have also tested facial recognition to gather information about people on the streets, saying the goal is to identify individuals wanted by police. New York is testing a counterterrorism system that scans people crossing bridges and tunnels into New York City. While those kinds of actions are meant to make people feel safe, they can have the opposite effect.

Being watched all the time can cause people to be fearful, says Jeramie D. Scott, a senior lawyer at the Electronic Privacy Information Center in Washington, D.C. Fear, he argues, can pose a serious threat to basic rights, such as freedom of speech. For instance, if the government could use facial recognition to identify protesters, people might be less likely to openly object to policies. “Being anonymous allows freedom of thought,” says Scott. “It lets you not worry about every single thing you do being scrutinized by an authority figure.”

FACING THE FUTURE

As facial recognition becomes more widely used, many people want to ensure the technology won’t be abused. They’re calling on lawmakers to create rules to protect people’s rights. A few states already have regulations to safeguard consumers. They prevent businesses, like stores or device manufacturers, from collecting or selling facial recognition data without customers’ permission. Some cities have banned or are considering banning the technology’s use by government agencies, like police departments.

Meredith Broussard, the artificial intelligence expert, believes the facial recognition debate is part of a larger discussion. She says people need to think more critically about technology. Too often, says Broussard, people believe it can solve everything. But technology has limits to what it can do, and it might be the wrong tool for certain problems. Broussard offers this advice to young people: “Educate yourselves about technology so that you can empower yourselves to create the kind of world that you want to live in.”
