Privacy Threat: Study Finds That AI Can Predict Your Political Orientation From A Blank Face Image

If you have a photo posted to the internet, you are vulnerable

New York Post

(New York Post) Researchers are warning that facial recognition technologies are “more threatening than previously thought” and pose “serious challenges to privacy” after a study found that artificial intelligence can be successful in predicting a person’s political orientation based on images of expressionless faces.

A recent study published in the journal American Psychologist says an algorithm’s ability to accurately guess one’s political views is “on par with how well job interviews predict job success, or alcohol drives aggressiveness.”


Lead author Michal Kosinski told Fox News Digital that 591 participants filled out a political orientation questionnaire before the AI captured what he described as a numerical “fingerprint” of their faces and compared them to a database of their responses to predict their views.

“I think that people don’t realize how much they expose by simply putting a picture out there,” said Kosinski, an associate professor of organizational behavior at Stanford University’s Graduate School of Business.

“We know that people’s sexual orientation, political orientation, religious views should be protected. It used to be different. In the past, you could enter anybody’s Facebook account and see, for example, their political views, the likes, the pages they follow. But many years ago, Facebook closed this because it was clear for policymakers and Facebook and journalists that it is just not acceptable. It’s too dangerous,” he continued.


“But you can still go to Facebook and see anybody’s picture. This person never met you, they never allowed you to look at a picture, they would never share their political orientation … and yet, Facebook shows you their picture, and what our study shows is that this is essentially to some extent the equivalent to just telling you what their political orientation is,” Kosinski added.

For the study, the authors said the images of the participants were collected in a highly controlled manner.

“Participants wore a black T-shirt adjusted using binder clips to cover their clothes. They removed all jewelry and – if necessary – shaved facial hair. Face wipes were used to remove cosmetics until no residues were detected on a fresh wipe. Their hair was pulled back using hair ties, hair pins, and a headband while taking care to avoid flyaway hairs,” they wrote.


The facial recognition algorithm VGGFace2 then examined the images to determine “face descriptors, or a numerical vector that is both unique to that individual and consistent across their different images,” the study said.

“Descriptors extracted from a given image are compared to those stored in a database. If they are similar enough, the faces are considered a match. Here, we use a linear regression to map face descriptors on a political orientation scale and then use this mapping to predict political orientation for a previously unseen face,” the study also said.
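The pipeline the study describes — extract a numerical face descriptor, fit a linear regression from descriptors to an orientation scale, then apply that mapping to an unseen face — can be sketched roughly as follows. This is a minimal illustration with synthetic data: the random vectors stand in for real VGGFace2 descriptors and questionnaire scores, and the dimensions are assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: face descriptors are simulated as random
# vectors for 591 participants (the study's sample size); 2048 is an
# assumed descriptor dimension, not one reported by the authors.
n_participants, descriptor_dim = 591, 2048
descriptors = rng.normal(size=(n_participants, descriptor_dim))

# Simulated self-reported political orientation on a continuous scale.
orientation = rng.normal(size=n_participants)

# Fit a linear regression (closed-form least squares) mapping
# descriptors onto the orientation scale.
X = np.hstack([np.ones((n_participants, 1)), descriptors])  # intercept column
weights, *_ = np.linalg.lstsq(X, orientation, rcond=None)

# Use the fitted mapping to predict orientation for a previously
# unseen face descriptor.
new_descriptor = rng.normal(size=descriptor_dim)
prediction = float(np.concatenate([[1.0], new_descriptor]) @ weights)
```

With more descriptor dimensions than participants, `lstsq` returns the minimum-norm solution; a real pipeline would regularize and cross-validate, but the structure — descriptors in, orientation score out — matches what the study reports.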

The authors wrote that their findings “underscore the urgency for scholars, the public, and policymakers to recognize and address the potential risks of facial recognition technology to personal privacy.” They added that an “analysis of facial features associated with political orientation revealed that conservatives tended to have larger lower faces.”

