Social platforms are being forced to grapple with the implications of these changes.
“The invasion of Ukraine is setting the stage for all future geopolitical events of this magnitude,” said Wasim Khaled, CEO of threat and perception intelligence platform Blackbird.AI. “And it is clear social media will remain a key territory in the landscape of war.”
Social platforms have proven they can act much more quickly than they did on previous issues, showing that tech leaders are prepared to take a stand when pressed. Experts believe platforms must now become more proactive in detecting conflict and imminent threats, because the tools and tactics for combating misinformation matter more than ever in modern warfare. Let’s take a closer look at those takeaways and what they mean for social media companies.
Get comfortable with the role of publisher
Following the invasion, Meta’s Facebook and Instagram, TikTok, Twitter, Google and others began restricting Russian ads and state-run media outlets, including RT and Sputnik. That contrasts with their responses to domestic controversies such as the Jan. 6, 2021 attack on the U.S. Capitol or lies about the 2020 presidential election, where major platforms mostly banned specific pieces of content rather than suspending the actors or organizations behind them.
“The excuse always was, ‘We aren’t the arbiters of truth,’” said Subbu Vincent of Santa Clara University’s Markkula Center for Applied Ethics. “That justification does not emerge in the Russia-Ukraine case if you see how [companies] have acted quickly.”
Take the Capitol insurrection. Right-wing extremists used Facebook to spread calls for the overthrow of the U.S. government. Watchdog groups and leaked reports have shown how social media companies downplayed their roles in the violence, and it wasn’t until after the Capitol attack that Facebook and Twitter decided to ban former President Donald Trump’s accounts. Meanwhile, extremist groups on Facebook spread misinformation about COVID-19 even as the company ignored its own internal research.
Social networks have tended to be cautious about content moderation, responding forcefully only when forced into publishing decisions. Acting as a publisher requires taking greater responsibility for content than acting as a platform, even though platforms can still monitor what is being posted.
There’s an element of self-interest here for the tech companies as well, Vincent explained. They want to be on the right side of history as they watch the conflict unfold in Ukraine.
“They are seeing consequences for human lives and human agency down the line,” Vincent said. “This is about democratic human agency. … [Companies] are pulling out because the consequences will get worse if they stay in.”
Representatives for Meta, Twitter and TikTok didn’t respond to requests for comment.
Prepare in peacetime, so you’re ready in wartime
Responses in Ukraine have also confirmed that tech giants are fully capable of making hardline pro-democracy decisions, just on their own terms. Whether it’s employing fact-checking teams or establishing an oversight board, tech leaders have plenty of resources to proactively remove misinformation and ensure safety on their platforms “when properly motivated,” said Khaled of Blackbird.AI.
“The goal since the onset of the invasion has been to disempower Russian propaganda artists from having another tool in their war chests,” Khaled added. “And while success is being demonstrated on this front, it is being done reactively, which still gives room for error and demonstrates a lack of preparedness.”
As the world’s largest social network, Facebook says it has spent more than $13 billion on safety and security efforts since the 2016 U.S. election. Even so, the company has failed to get it right, particularly in overseas conflicts such as the persecution of the predominantly Muslim Rohingya ethnic group in Myanmar. In 2017, as anti-Rohingya content and ethnic violence spread on Facebook in Myanmar, the company failed to detect the problem, and much of the hate speech was removed only after the fact. The company cited technical and linguistic challenges, but it wasn’t until the following year that Facebook ramped up hiring of Burmese speakers, created human rights policies and banned military officials.
In Ukraine, there’s still concern that earlier propaganda campaigns are driving current narratives and skewing the public’s perception online, Khaled said. According to reports, pro-Russian rebel groups have used Facebook to recruit fighters and spread messages defying sanctions, and TikTok users have shared videos of Russian paratroopers, purportedly from the invasion, that were actually recorded in 2015.
“The future of conflict, which is happening now, includes real-time TikTok and Twitter updates, engagement on social media, memeification, influencers swaying public opinion — essentially engagement from everyone, everywhere at every moment in time,” Khaled added.
Tools that track distorted information, for example, will be crucial for predicting major events before they occur. The next step is “formulating new playbooks, so when the time comes guesswork and trial and error are not the leading efforts that governments are relying on,” Khaled added.
Blackbird’s intelligence platform, for instance, measures the intent behind a disinformation campaign instead of just tracking what’s occurring. It has been flagging narratives containing Russian disinformation and geopolitical concerns to find out where they are propagating. To stay ahead of potential dangers, social platforms will need to build advanced capabilities like these themselves or outsource them.
As tech companies try to combat misinformation and violent content, they’re also fighting to keep their services running on the ground, and they don’t yet have a clear plan for doing so at scale. Shared standards and industry practices are crucial to misinformation efforts and to supporting and funding journalism.
Platforms must make real-time decisions about information that can blur the lines between reality and fiction, such as posts that downplay the invasion of Ukraine. But it’s clear that tech companies would rather not make those calls on their own, which is evident in how they approached the COVID-19 pandemic: Twitter has added fact-checking labels to COVID misinformation, and Facebook has established an outside fact-checking program that includes 80 organizations examining content across its apps.
“Disinformation on social media is blurring the lines of what’s actually happening, making it difficult to distinguish what side is the right side,” Khaled added. “Global conflict is occurring on land and online, and individuals are armed and empowered with smartphones making this event [in Ukraine] unlike any other in history.”
Roskomnadzor, Russia’s internet censorship agency, has blocked access to Facebook and Instagram. The move came after Meta made some exceptions to its rules for content calling for violence against Russian invaders. It’s unclear whether Meta-owned WhatsApp will also get booted, as that app is far more popular in the country than either Facebook or Instagram.
People are also flocking to YouTube, the world’s second-most visited website (after its parent, Google), according to Hootsuite, to understand the conflict in Ukraine. The platform announced earlier this month that it would remove videos denying, minimizing or trivializing the invasion, in addition to its restrictions on Russian state-funded media channels.
“It’s within these companies’ power to hold themselves accountable when combating fake news, or else they become vulnerable to public scrutiny and risk losing credibility going forward,” said Robin Zieme, chief strategy officer at YouTube ad company Channel Factory. “Fake news and misinformation is not only uncomfortable for bystanders, but extremely harmful for the parties impacted.”
Zieme’s firm is a YouTube partner currently working with Ukraine’s Ministry of Digital Transformation to identify and block channels and keywords spreading misinformation. These lists are shared with media outlets and brands to help them find legitimate sources that can counter Russian misinformation, such as inaccurate reports about casualties. “Social media is almost unavoidable, and right now it’s allowing us to watch what is happening in Ukraine from the sidelines,” Zieme said.
This raises the question of whether the response to Ukraine could influence how platforms handle future events like the midterm elections or the 2024 presidential election. In the case of Ukraine, whole nations and international groups took a united stand against Russia, imposing sanctions almost immediately and halting business operations in the region. While that made it easier for Western tech giants to defend their choices on ethical grounds, there is doubt that they will make similar calls about future issues that hit closer to home.
Vincent says it best: “Now that the world is burning, it is easier to make ethical decisions.”