The use of online platforms as a major source of news has created significant challenges for the electoral process in the United States. Central to this problem is that online digital intermediaries – such as Facebook and Google – have provided a platform for reaching voters with unprecedented efficiency and effectiveness. Studies have shown that 68% of US adults receive their news through such online platforms. Moreover, fake news is more capable of reaching people than factually correct information. Users of these platforms – individuals, companies, and even foreign states – have capitalized on this development. As a result, these platforms have been instrumentalized to shape public opinion, arguably influencing the electoral process.
Historically, Congress has taken a hands-off approach to regulating political information on online platforms. In doing so, it has followed the Supreme Court’s interpretation of the 1st Amendment, under which freedom of speech must remain unfettered. On this view, a marketplace of ideas must be upheld in which all opinions may flourish, even false ones. Confronted with the serious possibility of judicial review, established in the seminal Marbury v. Madison, Congress has remained passive. Recent developments, however, call for a regulatory approach. A Senate Intelligence Committee report, for example, established that throughout the 2016 presidential election, disinformation campaigns flooded online platforms through digital political advertising. This article argues that until Congress takes regulatory action, American democracy will remain at the mercy of self-regulatory initiatives.
The question then becomes: what possibilities, if any, does the United States Congress have at its disposal? The first and most obvious answer would be to draft a statute containing content-based restrictions criminalizing the dissemination of ‘fake news’. In light of the extensive body of United States Supreme Court case law protecting 1st Amendment freedom of speech, this does not seem a viable option. To start with, in Ashcroft v. American Civil Liberties Union the Court ruled that content-based restrictions on 1st Amendment freedom of speech carry a presumption of unconstitutionality. Moreover, in Reed v. Town of Gilbert, the Supreme Court held that statutory content-based restrictions on freedom of speech must be subjected to the strict scrutiny test. This test imposes two cumulative hurdles that must be satisfied for a content-based regulation to comply with the 1st Amendment: the statute must serve a compelling interest, and it must be narrowly tailored to achieve that interest. Cases such as United States v. Alvarez show just how difficult this is.
Congress, however, is not prevented from passing legislation regulating the procedure (rather than the content) through which fake news in the electoral process can be countered. For example, in Citizens United v. FEC the Supreme Court voted 8-1 to uphold the disclosure requirements of the Bipartisan Campaign Reform Act of 2002 (BCRA). Moreover, as the Court stated in McConnell v. FEC, transparency and disclosure requirements are a justified limitation of the freedom of speech when they strive to achieve goals such as providing voters with information, countering corruption, or collecting the information needed to enforce any content-based restrictions.
A congressional regulatory approach could, therefore, introduce procedural safeguards requiring online platforms to comply with transparency and disclosure requirements. Although far from comprehensive, such a statute could introduce important amendments signaling the beginning of the regulation of online platforms. This would respond to the pressing need to preserve American democracy evidenced by the 2016 and 2020 elections, and would begin to close the gaping hole in the current regulation of fake news on online platforms.
 Brian Beyersdorf, ‘Regulating the Most Accessible Marketplace of Ideas in History: Disclosure Requirements in Online Political Advertisements after the 2016 Election’ (2019) 107 California Law Review 1061, 1063.
 Simone Chambers, ‘Truth, Deliberative Democracy, and the Virtues of Accuracy: Is Fake News Destroying the Public Sphere?’ (2021) 69 Political Studies 147, 149.
 Ibid. 150.
 Beyersdorf (n 1) 1094.
 Ibid. 1063.
 Ibid. 1064.
 Marbury v. Madison, 5 U.S. (1 Cranch) 137, 178 (1803).
 United States Senate Select Committee on Intelligence, Report on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election, Volume 2: Russia’s Use of Social Media with Additional Views, 8 October 2019.
 Reed v. Town of Gilbert, 135 S. Ct. 2218, 2227-2228 (2015).
 Citizens United v. FEC, 130 S. Ct. 876, 898 (2010).
 Brown v. Entertainment Merchants Association, 131 S. Ct. 2729, 2738 (2011).
 Citizens United v. FEC, 130 S. Ct. 876, 886 (2010).
 McConnell v. FEC, 124 S. Ct. 619, 690 (2003).
 Ellen P. Goodman and Lyndsey Wajert, ‘The Honest Ads Act Won’t End Social Media Disinformation, But It’s A Start’ (2017) <https://ssrn.com/abstract=3064451> accessed 14 May 2021.