Facebook’s Algorithm Shows Gender Bias, Says Study

NewsGram Desk

Facebook's ad delivery system is biased against women, showing them different job listings than it shows to men, a new study has revealed. Researchers at the University of Southern California found that Facebook's ad delivery system discriminates against women, reports The Verge.

The team of researchers bought ads on Facebook for delivery driver job listings that had similar qualification requirements but were for different companies. The findings showed that the platform delivered the Instacart delivery job ad to more women and the Domino's delivery job ad to more men. According to the researchers, Instacart has more female drivers, while Domino's has more male drivers.

"Facebook's ad delivery can result in the skew of job ad delivery by gender beyond what can be legally justified by possible differences in qualifications," the researchers wrote, "thus strengthening the previously raised arguments that Facebook's ad delivery algorithms may be in violation of anti-discrimination laws."

In a similar experiment on Microsoft-owned LinkedIn, the researchers found that the professional networking platform showed the Domino's listing to as many women as it showed the Instacart ad.

A Facebook spokesperson said in a statement that the company's system takes into account "many signals to try and serve people ads they will be most interested in, but we understand the concerns raised in the report".

"We've taken meaningful steps to address issues of discrimination in ads and have teams working on ads fairness today. We're continuing to work closely with the civil rights community, regulators, and academics on these important matters," the spokesperson was quoted as saying. This is not the first time Facebook has faced allegations of gender bias in its algorithms.

In 2017, a joint investigation by the US-based non-profit organization ProPublica and The New York Times found that companies including Verizon, Amazon, Goldman Sachs, Target, and Facebook placed recruitment ads limited to particular age groups.

Another ProPublica probe found that Facebook allowed housing advertisers to target audiences by race and exclude minorities, raising questions about whether the company was in compliance with federal fair housing rules that prohibit such discrimination. Facebook, however, called it a "technical failure". (IANS/JC)
