The past few weeks have been significant for the regulation of internet misinformation and disinformation (D&M): Elon Musk agreed to buy Twitter largely in order to change its approach to D&M; the US government announced – then suspended – a Disinformation Governance Board to oversee some D&M; the European Union finalized landmark new internet laws, some of which regulate D&M; and former President Obama abandoned his long-standing hands-off approach and called for government regulation of D&M.
No exchange better illustrates the difficulty of defining D&M than the recent one between President Biden and Amazon founder/Washington Post owner Jeff Bezos. In response to Biden’s tweet “Do you want to cut inflation? Let’s make sure the richest companies pay their fair share,” Bezos replied: “The newly created Disinformation Board should review this tweet, or maybe they need to form a new Non Sequitur Board instead.” An industry and global regulatory structure is emerging to tackle internet D&M, but how difficult is the task?
Two points about regulating D&M are important to note. The first is that most pleas for regulating D&M apply only to very large platforms, usually defined as those with many millions of subscribers, leaving smaller platforms less regulated. This is the approach taken by the European Union, many countries and several US states. The second is that regulation of D&M would be added to a set of pre-existing Internet content regulations covering areas that have been regulated or banned both on and off the Internet for centuries – including copyright infringement, child pornography, false advertising, defamation, threats of immediate harm, obscenity, sedition and more. These areas have a long history of national definition, refinement, legislation and litigation.
The majority of content moderation taking place on internet platforms today concerns these existing forms of illegal/regulated content, and definitions are generally similar across countries.
Regulating or banning D&M breaks new ground by addressing previously far less defined categories – politics, health, science and the like – and by trying to do so on a global scale.
When looking at something this complex, it’s often best to start at the beginning – and the beginning was July 3, 1995, the day everyone standing at the checkout counter of an American grocery store ran into the now-famous cover of Time Magazine: a young boy at a computer keyboard, clearly in total shock at what he saw on the screen, under a huge headline blaring “CYBERPORN”.
An explosion of political concern over content on this new medium called the Internet ensued, leading to groundbreaking Internet content laws, rules and regulations, which shielded the major Internet platforms from liability for content created by others and allowed platforms to moderate content however they wished, with virtually no oversight or liability.
As I explained in an earlier piece, almost all of this early attention to internet content concerned cyberporn, and it unequivocally established the government’s right to regulate internet content. Previously, the government’s role in managing content on computer bulletin boards and in chat rooms was much less clear.
Twenty-seven years later, few are talking about regulating cyberporn: the focus is almost entirely on D&M – but those early cyberporn laws laid the foundation for government regulation of D&M, and they raise some of the same tough questions.
Most notably, if governments or platforms ban D&M, they have to define with some precision what D&M is – and isn’t – just as governments in the last century tried to define obscene pornography. Defining D&M precisely today is much more complicated than defining obscenity in the 1900s, because major internet platforms span countless countries, societies, religions, jurisdictions and languages. Hence the temptation to fall back on Justice Potter Stewart’s 1964 definition of obscene pornography – “I know it when I see it” – and to rely on “fact checkers” instead of judges to call out D&M “when they see it.”
Not surprisingly, there is no generally accepted definition of “disinformation” or “misinformation,” although many definitions of disinformation rest on the concept of “false” and of misinformation on “misleading.” Webster defines disinformation as “false information deliberately and often covertly spread (as by the planting of rumors) in order to influence public opinion or obscure the truth” and misinformation as “incorrect or misleading information.” Sometimes establishing truth or falsity is easy, but we all know that often it isn’t. My fourth-grade teacher illustrated this by showing us a partially filled glass and asking whether it was “half full” or “half empty”… we immediately split into our respective camps. In seventh grade, we learned in debate club that advocates emphasize the truthful facts that support their position and discount the truthful facts that do not.
In a much more refined way, President Obama explained that “any rules we devise to regulate the distribution of content on the Internet will involve value judgments. None of us are perfectly objective. What we consider unshakable truth today may turn out to be totally wrong tomorrow. But that doesn’t mean that some things aren’t truer than others, or that we can’t draw lines between opinions, facts, honest mistakes and intentional deceptions.” Sometimes, as Obama explained, what is considered true or false can change. As evidence of the evolving truth about internet D&M, Evelyn Douek of the Knight First Amendment Institute recently described in Wired magazine how multiple D&M rulings were later revised or even reversed.
Regardless, dozens of governments have criminalized or regulated internet D&M and made major platforms liable for illegal D&M posted by third parties. According to the Poynter Institute, posting “false information” on internet platforms is already a crime in many countries, with more on the way. In these cases, governments – through their courts or bureaucracies – will decide what is and is not disinformation or misinformation. At the same time, public pressure is mounting on internet platform business leaders to regulate or ban D&M more actively, outside of (or in violation of?) government regulations.
Either way, governments and business leaders face a very difficult task.
NOTE: This post has been updated from the original to correct the date of the Time magazine cover mentioned in the sixth paragraph.
Roger Cochetti provides consulting and advisory services in Washington, D.C. He was a senior executive at Communications Satellite Corporation (COMSAT) from 1981 to 1994. He also led public Internet policy for IBM from 1994 to 2000 and later served as Senior Vice President & Chief Policy Officer for VeriSign and Group Policy Director for CompTIA. He served on the State Department’s Advisory Committee on International Communications and Information Policy during the Bush and Obama administrations, has testified numerous times on Internet policy issues, and has served on advisory committees for the FTC and several UN agencies. He is the author of the Handbook for Mobile Satellite Communications.