A German project investigating whether Instagram is politically polarizing and promotes misinformation or hate speech has been shut down, after Facebook pressured the researchers to comply or face legal action.
Berlin-based AlgorithmWatch was investigating how content appears on Instagram in the run-up to the German elections. The aim was to find out whether the Facebook-owned platform pushes certain topics more than others and how disinformation or hateful messages are surfaced.
The researchers did this via an add-on that about 1,500 users voluntarily installed and which monitored what appeared in their Instagram feeds. The investigation had entered its second year, and Facebook was repeatedly asked to respond to the interim findings. But instead of responding, the company instructed the researchers to stop the project and delete the data.
That was not a casual request. If AlgorithmWatch did not stop, Facebook would pursue a ‘more formal engagement’. AlgorithmWatch says it does not have the resources to fight a legal battle with Facebook, so it deleted the data. “We were bullied until we stopped our Instagram monitoring,” it said in an open letter asking the EU for better protection of such projects.
In a conversation with the Finnish broadcaster YLE, researcher Nicolas Kayser-Bril explains that the research found, among other things, that politicians’ posts containing a lot of text were shown less often than images.
“We were a long way from discovering the secrets of the algorithm, but Facebook attacked us anyway. It shows that they either have something to hide or are concerned about what could threaten their partial monopoly,” Kayser-Bril told YLE.
Facebook tells YLE that it has never made legal threats against the organization, but that it does not want researchers to violate its terms of use. In recent years, the company has repeatedly scuttled independent research projects. Recently it shut down a project of the New York University Ad Observatory, suspending the researchers’ accounts. A similar project by VRT and De Tijd was already blocked in 2019.
Facebook usually counters such criticism by saying that these tools violate its terms of use (drafted by Facebook itself). Instead, it points to its own ad library and to initiatives in which it provides researchers with data directly. However, it recently emerged that the data it supplied contained errors, rendering the work of those researchers largely worthless. The ad library is also regularly criticized for omitting details and advertisements.
Added to this is last week’s revelation that Facebook knows very well how harmful its platforms are. Based on internal documents, the Wall Street Journal reported that Facebook is well aware that Instagram damages teenage girls’ self-esteem and that the platform has been linked to suicide attempts. Yet the company never shared those findings with the world until the newspaper made them public.
In short, it seems that Facebook prefers not to have independent observers on its platform who use research and facts to point out problems with disinformation or political bias in the algorithm. Meanwhile, the company knows that much is going wrong internally and stays silent about it. And when it does hand data to researchers, that data turns out to be flawed.