The UK's Digital, Culture, Media and Sport select committee has published its long-awaited report on “fake news”, with recommendations for holding social media platforms accountable for how fake news is spread.
While the report's recommendations apply to tech companies as a whole, Facebook comes in for the most criticism. According to the report, Facebook knowingly broke privacy and competition laws.
Throughout the report, Facebook is frequently named and shamed, taking the brunt of the negativity and used as the poster child for how social media companies have dodged culpability to date.
If Facebook and other companies don't shape up, “democracy is at risk,” says Damian Collins, chairman of the UK's Digital, Culture, Media and Sport select committee.
Here are six things the UK government report highlighted:
- Tech Companies Should Have to Sign up to a Compulsory Code of Ethics
- UK Citizens Should Get Access to Their ‘Inferred’ Data
- Voluntary Guidelines Don’t Work
- The UK Government Doesn’t Rate Zuckerberg
- Russia is Very Interested in the Report
- “Fake News” is Not a Term the UK Government Wants to Recognize
1. Tech Companies Should Have to Sign up to a Compulsory Code of Ethics
One of the main takeaways from the report is that tech companies aren't doing enough to protect their users by stemming the flow of misinformation and limiting more harmful and hateful content.
The paper recommends that responsibility for regulation no longer be left to the companies themselves. Instead, it should be handed to an independent body, which would oversee a code of ethics drawn up by a team of technical experts.
The idea is that this would provide the UK government with a clear set of rules in writing that all tech companies would need to adhere to. If they don’t adhere, they would be seen as breaking the law, and be dealt a hefty fine.
This is no small undertaking, and the paper also suggests that a levy be applied to tech companies in order to finance this new organisation.
In stark contrast to this additional cost for companies, the paper also backs a separate report on safeguarding the future of journalism. That report suggests that online newspapers and magazines should be zero-rated for VAT, in order to redress the balance between news publishers and social media platforms, and to encourage outlets to develop paid-for digital services.
2. UK Citizens Should Get Access to Their ‘Inferred’ Data
Europe has strict rules about the sort of data that companies are allowed to collect and hold about EU citizens. Under the General Data Protection Regulation (GDPR), companies must obtain users' explicit consent before collecting and handling their data.
While this covers a huge range of personal data, it doesn't account for “inferred data”. This could include information such as political parties you have liked on Facebook. This information is still highly useful to companies, which can use it to target potential customers, and it's not protected.
When giving evidence to Congress in April 2018, Mark Zuckerberg stated, “You should have complete control over your data.” Zuckerberg said that the ultimate owners of data were the customers, who could delete any information held about them at will. However, the advertising profile that Facebook builds up about each user cannot be amended by the user, calling into question just how much control the public really has.
3. Voluntary Guidelines Don’t Work
If you’re wondering why the report suggests a compulsory code of ethics rather than a voluntary one, there’s a simple reason: voluntary codes don’t work. The report quotes MP Margot James, who stated:
“There have been no fewer than fifteen voluntary codes of practice agreed with platforms since 2008. Where we are now is an absolute indictment of a system that has relied far too little on the rule of law.”
The paper recommends emulating governments that have enforced stricter rules. It singles out Germany as an example of where the system seems to work. Tech companies there are required to remove hate speech within 24 hours, or get saddled with a €20 million fine. With one in six of Facebook’s moderators now stationed in Germany, it seems that the company has taken this threat seriously.
The paper also questioned Facebook’s perception of regulation and authority. While on the surface, Facebook appears to agree with such recommendations, its actions don’t necessarily back this up. Ashkan Soltani, former Chief Technologist to the US Federal Trade Commission, observed, “There is a contemptuousness – that ability to feel like the company knows more than all of you and all the policy makers.” Soltani referenced the California Consumer Privacy Act, which Facebook championed in public, but lobbied against behind closed doors.
4. The UK Government Doesn’t Rate Zuckerberg
Perhaps unsurprisingly, the report has few compliments for Mark Zuckerberg himself. The committee is, naturally, rather sore about being stood up by him. But he also catches flak for his apparent ignorance of the Cambridge Analytica scandal. The fact that he allegedly had no knowledge of the project, despite his position as CEO, “displays the fundamental weakness of Facebook in managing its responsibilities to the people whose data is used for its own commercial interests,” according to the paper.
Its most damning view on Zuckerberg, though, has to be the following quote:
“Even if Mark Zuckerberg doesn’t believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world. Mark Zuckerberg continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.”
The report also blames Zuckerberg and Facebook’s management for deliberately being aloof and sidestepping questions. It criticized the ‘experts’ from the company attending the UK government hearing, who appeared to have not been properly briefed and couldn’t answer pertinent questions. Instead, they promised to follow up with letters, which also didn’t give satisfactory responses.
5. Russia is Very Interested in the Report
There’s an interesting aside in the report, which in part focuses on the proliferation of misleading information, and how it may have played a vital role in the 2016 Brexit referendum.
It might not be too surprising to learn that the finger is pointed at Russia for the role it played. The report reveals that Kremlin-aligned media’s anti-Europe rhetoric racked up 134 million impressions on Facebook during the referendum period. For context, content from the UK-based Vote Leave and Leave.EU sites managed 33 million and 11 million impressions respectively.
Russia is taking a keen interest not only in the UK’s electoral process, but also in the findings of the report itself. Before this latest, final paper, an interim report was published in 2018 with some initial recommendations. The UK Government issued its own response to that report, which attracted far more overseas interest than usual.
According to the DCMS committee, the majority of views of that response, at 63%, came from overseas IP addresses. Usually, around 80% of views would be UK-based.
(The committee also published a breakdown of views of the Government's response to the interim report by city.)
6. “Fake News” is Not a Term the UK Government Wants to Recognize
The report was originally titled “Fake News”, but the paper is quick to point out its subtle retitling to “Disinformation and ‘Fake News’” (note the extra quotes). The reason is that it views “fake news” as a loaded term, acknowledging that “US President Donald Trump has described certain media outlets as ‘The Fake News Media’ and being ‘the true enemy of the people’”.
The paper’s suggestion that the term was misleading was echoed by the UK government, which has since recommended defining “disinformation” as “the deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain.” It doesn’t quite roll off the tongue.
It notes that the term “misinformation” implies an inadvertent sharing of false information.
While that's easily done, the UK government's findings suggest there's a huge problem of perfectly deliberate misinformation happening on Facebook's watch, and new regulations to curb its spread can't come soon enough.