Facebook is making it harder for journalists and researchers to access and collect data about its services.
Corin Faife, a data reporter on the Citizen Browser project, said his team needs to be able to parse Facebook's HTML in order to see what's relevant in the data they collect, which includes shares. Facebook is now injecting "dummy text" into that code. "They just made the code more confusing," Faife said. "We can't prove that they're doing it deliberately, but it seems like a reasonable inference."
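To illustrate the kind of obstacle Faife describes, here is a minimal sketch of how injected dummy text can defeat naive scraping. The markup below is hypothetical, not Facebook's actual code: a label is split across spans with hidden junk inserted between the fragments, so a parser that simply concatenates all text gets garbage unless it filters out hidden elements.

```python
from html.parser import HTMLParser

# Hypothetical example markup: the word "Sponsored" is split across
# spans, with hidden dummy text injected between the fragments.
# Real Facebook markup is far more complex; this only sketches the idea.
OBFUSCATED = (
    '<span>Sp</span>'
    '<span style="display:none">abc123</span>'
    '<span>onsored</span>'
)

class VisibleText(HTMLParser):
    """Collects text content, skipping elements styled display:none."""

    def __init__(self):
        super().__init__()
        self.hidden_depth = 0  # > 0 while inside a hidden element
        self.parts = []

    def handle_starttag(self, tag, attrs):
        style = (dict(attrs).get('style') or '').replace(' ', '')
        # Track nesting so everything inside a hidden element is skipped.
        if self.hidden_depth or 'display:none' in style:
            self.hidden_depth += 1

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        if not self.hidden_depth:
            self.parts.append(data)

parser = VisibleText()
parser.feed(OBFUSCATED)
print(''.join(parser.parts))  # -> Sponsored
```

A scraper that only reads raw text would see "Spabc123onsored"; each new layer of injected markup forces researchers to update filters like this one, which is why such changes raise the cost of independent data collection.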
Lindy Wagner, a spokesperson for Facebook, denied that the company was thwarting researchers. "We constantly make code changes across our services, but we did not make recent code changes to block these research projects. Our accessibility features largely appear to be working as normal; however, we are investigating the claimed disruptions," Wagner said in a statement.
The HTML change has a significant impact on how researchers collect automated data from Facebook News Feed posts. It's a method that academic researchers, including New York University's Ad Observatory, rely on to monitor trends at a larger scale. In 2020, NYU researchers built a browser extension to study the ads Facebook surfaced to users, only to be shut down and expelled from the platform this August for supposedly violating Facebook's terms around user privacy.
"I don't fully understand why Facebook has taken such a hostile stance," said Laura Edelson, a researcher at NYU. "Many researchers in the field (including me) have gone out of our way to try to work with Facebook. I have always provided my security findings to the company before I present them publicly, so they have the opportunity to improve their systems before vulnerabilities are publicly known."
Edelson and her team used the Ad Observer tool to identify ads promoting QAnon and other far-right militias, revealing that Facebook had failed to recognize 10 percent of the political ads on its platform. Edelson, Damon McCoy, and Paul Duke remain banned from Facebook.
"I would still rather work with the platform. I see the problem of disinformation and hate on Facebook as a social problem that affects everyone, and we'd be able to solve it a lot faster by working together. This is why I continue to call on Facebook to reinstate my account. I want to work with them," she said.
Despite the Federal Trade Commission's response that Facebook was wrong to block the NYU group on the basis of its user privacy agreement, the company has not restored the accounts. The Facebook spokesperson could not say when Edelson and her colleagues' accounts would be reactivated, or whether any plan or investigation is in progress.
Facebook's lack of transparency in sharing data or information about its algorithms has been criticized repeatedly. The company's biggest scandal came in 2018, when it emerged that Cambridge Analytica, a data firm hired by Donald Trump during his 2016 campaign, had accessed the information of 50 million Facebook users. Since then, the company has only tightened its grip on access to data and its control over sharing it with academics, especially when the information happens to be unflattering.
"The research we've published has been kind of embarrassing for Facebook," Faife said. "As a company, they are not very good with transparency if it inconveniences them. There is a point where we need to see independent researchers have access to Facebook, because it is so big, larger than some countries in some cases. That deserves greater scrutiny."
Facebook has also concealed its own findings about harms its services have caused. Recent investigations by the Wall Street Journal shed light on Facebook's internal research showing that Instagram is harmful and toxic for teenage users, and that its platform is home to drug cartels, human traffickers, and anti-vaccination sentiment. Many, including regulators, are calling on Facebook to release those files and make more of its research available to the public.
"They are being roped in from all sides," said Cory Doctorow, an advisor to the Electronic Frontier Foundation. "They don't like what researchers find when the conclusions are very unflattering, and Facebook's just really bad at gathering data on itself. They are a company that is committed to appearance, not doing good."
Madelyn Webb, associate research director at Media Matters, said that monitoring Facebook has been difficult for years. Even when the company is willing to share data, much of it is incomplete, inaccurate, or "riddled with caveats," she said.
"Every single project we do on Facebook is us trying to scrounge. When you see researchers doing what I do, a lot of that is manually produced. We have only a tiny, tiny fraction, and the only reason we have access to that is because we have begged for it," Webb said. "They are doing this because the information doesn't look good for them."
Facebook has made it difficult for outside researchers to do their jobs in other instances as well. Earlier this year, AlgorithmWatch in Germany said it stopped a project after Facebook cited privacy concerns about its crowdsourced research into Instagram's political content. Princeton University researchers studying misinformation and election ad targeting also halted their project over concerns about Facebook's right to review their research before publication.
Facebook has only recently begun to share information about what content it surfaces to users in the News Feed. Mark Zuckerberg, Facebook's CEO, also approved Project Amplify, a strategy that promotes positive stories about Facebook in the News Feed. Webb says these actions are not transparency.
"There are some modicums of Facebook trying to improve their PR by granting access that is incomplete," Webb said. "You can't give somebody half a car and call that a gift. I wouldn't say Facebook has made any real commitment to transparency. If anything, they've gotten better at spinning."