Twitter Unleashes Hackers on Photo Algorithm

With the help of ethical hackers, Twitter wants to uncover flaws in its cropping tool, which has shown a clear preference for women and white people.

When you post a photo on Twitter, it is cropped on your timeline to fit the feed. In the fall of last year, however, several examples of oddly cropped photos surfaced, and the algorithm behind the tool appeared to favor women and white people noticeably more often.

Twitter examined the tool in the months that followed and confirmed in May that the algorithm does indeed favor those groups. Partly for this reason, automatic cropping is no longer applied in the smartphone app, but the company's ambition is to tackle the underlying problem.
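
In broad strokes, saliency-based cropping works like the minimal sketch below (a hypothetical crop_by_saliency in Python, not Twitter's actual code): the window with the highest summed saliency wins, so any skew in the saliency model carries straight into the chosen crop.

```python
import numpy as np

def crop_by_saliency(image: np.ndarray, saliency: np.ndarray,
                     target_h: int, target_w: int) -> np.ndarray:
    """Return the target-size window of `image` with the highest summed saliency.

    Brute-force scan for clarity; a real system would use an integral
    image or stride the search. If the saliency model scores some faces
    higher than others, this selection inherits that bias directly.
    """
    h, w = saliency.shape
    best_score, best_yx = -np.inf, (0, 0)
    for y in range(h - target_h + 1):
        for x in range(w - target_w + 1):
            score = saliency[y:y + target_h, x:x + target_w].sum()
            if score > best_score:
                best_score, best_yx = score, (y, x)
    y, x = best_yx
    return image[y:y + target_h, x:x + target_w]
```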

Twitter is now running a week-long bug bounty to uncover errors in the system. Through HackerOne, you can earn between $500 and $3,500 if you can show where the automatic cropping goes wrong. Twitter is also releasing a paper on the subject and has, in the meantime, shared the tool's code on GitHub.

Initially, it was assumed that computer-driven decisions would be more objective, but in recent years more and more examples have emerged of AI that is anything but. This can happen because the dataset a tool is trained on is not representative, or because the tools were developed by, for example, predominantly white men.
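
One common way to surface such skew is to compare a model's accuracy per demographic group; a large gap suggests the training data under-represents some groups. The sketch below is illustrative only; the function name and the data are invented, not taken from Twitter's paper.

```python
from collections import defaultdict

def accuracy_by_group(preds, labels, groups):
    """Accuracy per demographic group; large gaps hint at skewed training data."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for pred, label, group in zip(preds, labels, groups):
        totals[group] += 1
        hits[group] += int(pred == label)
    return {g: hits[g] / totals[g] for g in totals}

# Invented example: the model is far less reliable for group "B".
print(accuracy_by_group(
    preds=[1, 1, 0, 1, 0, 0],
    labels=[1, 1, 0, 0, 1, 1],
    groups=["A", "A", "A", "B", "B", "B"],
))  # {'A': 1.0, 'B': 0.0}
```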

Sometimes, though, the problem involves data that at first glance seems independent of gender or skin colour. Last year, for example, students in the UK were graded by an algorithm because the coronavirus pandemic made exams impossible.

It later turned out that the algorithm relied on previous performance and on the postcodes of earlier students, so that students from certain areas systematically received lower scores.
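
A toy sketch can show how that plays out; it is not the actual UK grading model, and the 50/50 weighting and historical data here are invented. Anchoring a grade to an area's past results drags every student toward that area's average, regardless of individual ability.

```python
from statistics import mean

# Invented historical scores per area; the weighting below is also
# illustrative, not the real grading formula.
historical_scores = {
    "AREA_LOW":  [52, 55, 58],   # area with weaker past results
    "AREA_HIGH": [78, 82, 85],   # area with stronger past results
}

def predicted_grade(area: str, teacher_estimate: float) -> float:
    """Blend a teacher's estimate with the area's historical average."""
    return 0.5 * teacher_estimate + 0.5 * mean(historical_scores[area])

# Two equally strong students (teacher estimate 80) end up far apart:
print(predicted_grade("AREA_LOW", 80))   # 67.5
print(predicted_grade("AREA_HIGH", 80))  # ~80.8
```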
