Liz Kelley of Twitter has admitted that the company has “more analysis” to do in order to tackle racial bias in the algorithm it uses. As reported by Engadget, this all comes after an experiment suggested Twitter’s algorithm prioritized photos based on race.
Potentially overhauling its algorithms is the last thing Twitter needs on its plate right now, given how busy the company already is dealing with election issues.
Twitter has announced additional security measures for political accounts in order to keep them safe in the run-up to the election. The company, alongside Google, has also begun using new methods to try to tackle misinformation on its platform.
Twitter’s algorithm may be racially biased
Twitter claims that it still needs to do more work to better understand the algorithm’s flaws. The original experiment, carried out by cryptographic engineer Tony Arcieri, involved attaching photos of Barack Obama and Mitch McConnell to a tweet.
It found that Twitter seemed to exclusively highlight McConnell’s face, leading to accusations of racial bias. The only time Obama’s face appeared was when Arcieri inverted the colors.
Others have run further studies and experiments to investigate the issue. They inverted the names and the order of the photos, but this seemed to have no effect. A high-contrast smile also did nothing to change the prioritization.
However, computer scientist Matt Blaze found that the app used did make a slight difference. His testing found that TweetDeck was more neutral than the official Twitter app.
Twitter to do more work to investigate algorithmic racial bias
Kelley did say that Twitter had checked the algorithm for racial bias prior to this, and that the company “didn’t find evidence” of it at the time. However, that stance seems to have changed in light of these accusations.
She also said that Twitter would open-source its algorithm studies, the idea being to help others “review and replicate” the findings.
There is no guarantee that Twitter will be able to solve this issue. However, it does demonstrate the very real problem of algorithmic racial bias. In this case, it could serve to push certain people out of the limelight and keep others in it.
Fortunately, this issue has finally been noticed; however, if it cannot be fixed, Twitter faces a real problem. Hopefully, the company will be able to create a workaround or fix that makes for a more neutral and balanced algorithm.