Researchers from Denmark and Sweden tested this theory using computational neural networks, algorithms that mimic the structure and function of the human brain. They then applied Microsoft's Face API, which makes use of such a network, to analyze thousands of pictures of people. The results of their study were published in Scientific Reports last March.
Study first author Stig Hebbelstrup Rye Rasmussen of Aarhus University and his colleagues also used Face API to measure the emotional state of the photos' subjects. The program additionally drew on other algorithms to rate the attractiveness and masculinity of the candidates.
They found that when analyzing a person's picture, the AI method of deep learning can determine a person's political belief with 61 percent accuracy. Right-wing politicians were more likely to have happy facial expressions in photos. In contrast, people with neutral facial expressions were more likely to identify as left-wing.
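To make the 61 percent figure concrete, here is a minimal sketch of how a binary (left/right) classifier's accuracy is scored. This is not the study's actual code; the model outputs and labels below are made-up stand-ins used purely to illustrate the arithmetic (at 61 percent accuracy, roughly four predictions in ten are wrong).

```python
# Hypothetical evaluation sketch: scoring a binary ideology
# classifier's accuracy. The data below are invented stand-ins,
# not taken from the study.

def accuracy(predictions, labels):
    """Return the fraction of predictions that match the true labels."""
    correct = sum(p == t for p, t in zip(predictions, labels))
    return correct / len(labels)

# 0 = left-wing, 1 = right-wing (toy data for illustration only)
labels      = [0, 1, 1, 0, 1, 0, 1, 0, 1, 1]
predictions = [0, 1, 0, 0, 1, 1, 1, 0, 0, 1]

print(accuracy(predictions, labels))  # 7 of 10 correct -> 0.7
```

The study's reported 61 percent is this same ratio computed over its photo samples: meaningfully better than the 50 percent a coin flip would achieve, but still wrong about four times in ten.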
The paper also found that more attractive female politicians were more likely to be conservative. For men, however, neither attractiveness nor masculinity was tied to political ideology.
"Most clearly we see that both male and female right-wing composites appeared happier than their left-wing counterparts," the paper noted. It added that women who showed contempt on their faces were more likely to be left-leaning, though it noted this was a rare occurrence.
"Our results confirmed the threat to privacy posed by deep learning approaches," the researchers wrote. "Using a pre-developed and readily available network that was trained and validated exclusively on publicly available data, we were able to predict the ideology of the pictured person roughly 60 percent of the time in two samples."
While links between attractiveness and political ideology are nothing new, the study's findings reveal how powerful AI can be at deducing that information from a single photograph.
Byron York, chief political correspondent for the Washington Examiner, remarked that the study may not be the most serious paper ever published on the matter. But given its topic, he warned that AI could impact the political arena – especially the 2024 presidential election. (Related: We must stop militant liberals from politicizing artificial intelligence.)
"The implication is conservatives are happier [and] better looking, and so conservatives think it's a great study. Liberals don't think so [because] it can only determine the [political affiliation] 61 percent of the time, meaning four out of 10 times it's going to be wrong," York said during an appearance on "The Evening Edit" on Fox Business.
"This study maybe is not the most serious thing in the world. But AI is a serious issue in politics. There are serious issues behind AI in pretty much every area of life, and politics is one of those areas."
According to the journalist, AI could lead to discrimination against people based on their political viewpoint, and the use of artificial intelligence to create deepfakes and fake videos for political ads is a big deal. He cited how the Republican National Committee (RNC) created an advertisement using AI-generated images months ago. The ad suggested that if incumbent President Joe Biden were reelected, the city of San Francisco would collapse and the southern U.S. border would be overrun by illegal immigrants.
"They did actually create images suggesting those things and it was all AI. So, it is something that both probably industry and government are going to have to address. And I think that you're going to get some of the candidates addressing that issue in the campaign," York said.
Check out FutureTech.news for more stories about AI.
Watch the full interview of Byron York on "Evening Edit" below.
This video is from the NewsClips channel on Brighteon.com.
Sources include: