Twitter abandons image cropping algorithm after finding bias

San Francisco (AFP)

Twitter said on Wednesday it was ditching an automated image cropping system after its review found a bias in the algorithm controlling the feature.

The messaging platform said its review found the algorithm gave “unequal treatment based on demographic differences,” favoring white people and men over Black people and women, and showed an “objectification” bias that focused on a woman’s chest or legs, described as a “male gaze.”

The news comes a month after Twitter announced the launch of an algorithmic fairness initiative as part of an effort to reduce biases on its platform introduced by automation.

Twitter introduced a so-called saliency algorithm in 2018 to crop images, with the aim of keeping photo sizes consistent in the timeline and focusing on their most important elements.

Rumman Chowdhury, Twitter’s director of software engineering and a specialist in ethics and artificial intelligence, said the company determined after its review that cropping decisions are best left to users.

“We looked at the tradeoffs between speed and consistency of automated cropping with the potential risks we saw in this research,” she said in a blog post.

“One of our conclusions is that not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people.”

The announcement comes amid heightened concerns about advanced algorithms that may provide biased results due to a lack of data on minorities or other factors.

This week, Amazon said it was extending its moratorium on law enforcement use of its facial recognition technology, over concerns that the technology’s flaws could amplify racial bias.

Twitter’s initiative calls for “taking responsibility for our algorithmic decisions” in order to ensure “equity and fairness of outcomes,” according to the company.
