November 22, 2017

Are Facebook and Google news organizations?

The UK government of Prime Minister Theresa May is examining the feasibility of regulating Facebook and Google as news organizations rather than as tech companies. This move would establish a higher bar of responsibility for what appears on their sites.

Current law treats search and social platforms as technology sites, which means that these companies are not responsible for the content that is posted online. Regulating these channels as news organizations, on the other hand, means that they would carry some responsibility for what is posted—the theory being that if the responsibility for content is shifted to these companies, they will be more proactive in ferreting out illegal activity.

At issue for the UK is the ability of extremist and terrorist groups to use these tools to coordinate violent activity. This is a debate that has been simmering for years as the extent to which terror and extremist groups use the internet to recruit and plan has become apparent.

This change, if it comes to pass, could significantly alter the dynamic of the web as we know it. Why? Because one of the topics in the UK’s crosshairs is the proliferation of “fake news” pieces, which officials recognize have the potential to mislead the public.

The European Union has been far more aggressive in addressing a range of thorny issues that sit at the nexus of privacy and information. Google has fought hard against a 2014 European Court of Justice (ECJ) ruling known as “the right to be forgotten.” The ruling is essentially a recognition that, over time, information can become obsolete or incorrect—and that private individuals have a right to have older, outdated information removed from the web. It has forced Google and other platforms to develop IP-address-specific protocols in order to comply with each country’s standards.

Google has not had much success in challenging this ruling. European courts have generally ruled against the search giant, requiring it to remove older links—even if the original material was accurate.

So the path to regulating technology companies—at least in the EU and Canada—is already being laid. The question now is how wide that path will be, and whether the regulations will be far-reaching enough to alter the global practices of the platforms in question.

Why this matters

Content published in newspapers and other print publications is typically governed by laws covering libel. Despite the recent trend of dismissing stories as “fake news,” there are standards and processes required to verify facts and ensure that published content is accurate. Most publications take great pains to use words like “allegations of” or “alleged” if parts of a story have proven difficult to fact-check, or if claims are disputed. In short, because of the legal standards involved, most reputable news organizations have significant editorial checks and balances in place.

Now, think about how content circulates on Facebook and Google. There is no formal editorial chain between a story and its publication. Stories are posted by groups—some of which are news organizations and some of which are not—and are shared by individuals who find the content compelling. There is no editorial control, and no check on accuracy beyond whatever the organizations posting the information choose to exercise before pushing it to Facebook. Google’s challenge is to determine what constitutes “real” news content and prioritize those sites in search results. But trusting algorithms to serve up news has proven problematic.

If the UK moves ahead with this effort, and assuming any court challenges go the way of the regulators, how would Google and Facebook—or even Twitter—go about policing content posted on their sites? It would require a fundamental shift in how these companies view themselves and how they operate. Instead of agnostic tech platforms, they would have to validate shared content to ensure accuracy—or potentially be held accountable for the spread of false information.

The difficulties inherent in launching such a reclassification, and the legal questions that go along with it, are most likely why executives at these companies are being so forceful in denying the fairly obvious reality that these platforms do, in fact, shape how news is delivered to the public.

A Business Insider piece reports that Sheryl Sandberg, among others at Facebook, argues that because the platform doesn’t employ any journalists, it cannot be considered a media company. This is a head-scratching distinction, considering that when you log on to Facebook, the first thing you see is what the company calls a “news feed,” and that Facebook trains algorithms to surface informational content in that feed matching what you have reviewed and liked in the past.

The UK is ahead of the US in examining how these platforms shape public opinion, and whether that means they should bear additional responsibility. At some point this issue will need to be addressed by politicians and regulators in the US. The question is: are Google, Facebook, and others prepared for the possible outcomes of these discussions if it means they will be held accountable for the content they give prominence to on their platforms?


About The Author

Jennifer Zingsheim Phillips is the founder of 4L Strategies and has worked in communications and public affairs for just over 20 years. Her background includes work in politics, government, lobbying, public affairs PR, content creation, and digital and social communications and media analysis.
