Testimony before the Texas Senate Committee on State Affairs
May 29, 2024
Daniel Cochrane
Senior Research Associate, Tech Policy Center
The Heritage Foundation
Chairman Hughes, Vice-Chair Paxton, and members of the committee, thank you for the opportunity to testify on this important topic. My name is Daniel Cochrane, and I am a senior research associate in the Tech Policy Center at The Heritage Foundation.
In my testimony today, I will focus on two key points. First, I will address the specific methods Big Tech uses to influence our elections and evade laws regulating corporate political activity. Second, I will suggest several investigatory strategies and policy options to rein in Big Tech’s power to shape public discourse.
Large Internet companies like Alphabet, Meta, TikTok, and X wield enormous power over what we see and say online.REF Google search commands 90.68 percent of the global search market.REF Meta, with both Facebook and Instagram, controls around 74 percent of the global social media market.REF TikTok has over 150 million active users in America, while X (formerly Twitter) has around 27 million daily active users in the U.S. alone.REF The sheer scale and reach of these platforms give them a sizeable advantage in shaping political narratives. As the research presented by my fellow witnesses suggests, these companies are far from neutral participants.REF They are biased, partisan actors with both the will and the means to shape political discourse in accordance with their own values and agendas.REF
Political Information Manipulation: How Big Tech Is Skirting Election and Campaign Finance Laws
Manipulating Internet searches or user feeds to ensure that results, ads, or opinions disproportionately favor one political candidate or party is akin to making a political contribution.REF But conventional corporate political contributions and related activities are strictly regulated. Under federal law, corporations are prohibited from using their resources to directly support political candidates.REF Even corporate political action committees (PACs) are generally only permitted to contribute funds voluntarily donated by senior executives and are still subject to both contribution limits and reporting requirements.REF Similarly, the Texas State Election Code prohibits corporate entities from making expenditures to support certain political activities, including “political consulting to support or oppose a candidate; telephoning or telephone banks to communicate with the public; brochures and direct mail supporting or opposing a candidate; partisan registration and get-out-the-vote drives…voter identification efforts, voter lists, or voter databases…”REF
Despite these restrictions, Big Tech platforms use their enormous power to shape elections and political information in ways that are not only unethical, but likely run afoul of the Texas State Election Code. Specifically, platforms use reactive, proactive, and cooperative methods to publish or control information in “non-uniform” ways that “may influence users’ opinions and choices at the ballot box.”REF
Reactive Methods of Manipulation. These shape political information environments by restricting the use, reach, or perceived credibility of speakers and their speech. They include speaker- or speech-based restrictions (such as banning, removing, or restricting accounts or the ability to post or interact with content), restrictions on content reach (such as limiting certain content’s visibility or distribution to other users, that is, “shadow banning”), and restrictions on the perceived credibility of speech or speakers through labels or fact-checking. When platforms define prohibited speech, like “hate” or election “misinformation,” in a manner that favors or disfavors content from or about a political candidate or party, they are tipping the political scales—regardless of whether they acknowledge doing so.
Proactive Methods of Manipulation. These entail influencing voters’ perceptions, preferences, or behavior through feed manipulation, search results, auto-generated content suggestions, and targeted prompts, notifications, or experiences like partisan go-vote reminders.REF Platforms such as Google are also deploying new methods of social manipulation like “prebunking,” designed to psychologically “inoculate” voters against “mis” or “dis” information.REF Such methods are used to structure and personalize each user’s information environment to reinforce certain political values across the entire “user experience.” But in the context of election-related information, this is little different from mailing brochures, conducting partisan get-out-the-vote drives, or carrying out voter identification efforts. It could also be considered polling in favor of or against a political candidate, which is not a permissible activity for corporate entities under the Texas State Election Code.
Cooperative Methods of Manipulation. These involve Big Tech platforms partnering with other actors to surveil and analyze political dissent for the purpose of learning to shape or counter it through overt and covert means. The Twitter Files and reports from the Foundation for Freedom Online revealed that Twitter (now X) had data-sharing agreements with outside censorship groups—such as the Election Integrity Partnership and Virality Project—which allowed those groups to covertly surveil users, monitor their political speech, and push for censorship when users expressed doubts about COVID-19 and the 2020 election.REF Moreover, if Big Tech platforms censor or boost content at the request of a campaign, political party, or affiliated entity, this could violate requirements in both state and federal law that corporate political activities be carried out independently of political campaigns.
At bottom, platforms influence the political information ecosystem in a multitude of ways, using methods that sometimes transcend conventional “censorship” or “deplatforming.” Worse, they can do so in ways designed to evade liability under election and campaign finance laws that regulate corporate political activity.
A Proposal for Reining In Big Tech’s Political Information Manipulation
While there is no silver bullet for addressing Big Tech’s political manipulation, there are a number of short-term oversight, investigatory, and legal responses available to state officials, in addition to several longer-term policy remedies.
In terms of immediate responses, the committee should investigate whether platforms are in compliance with existing election and campaign finance laws. Such inquiries should look beyond mere representations by corporate officials. Rather, the committee should consider using its subpoena power to examine all internal policies and practices related to the 2024 U.S. election, including:
- The technological methods, tools, and “blacklists” used by Big Tech platforms to censor and restrict accounts, news, and opinions in order to manipulate the political information ecosystem and intervene in or influence election outcomes.
- Platforms’ policies, commitments, and internal guidance around political content and elections. The committee should specifically look to policies that give these platforms wide discretion to censor disfavored political speech (such as “misinformation,” “disinformation,” and “hate speech”) or alter information environments in pursuit of political or social aims.REF
- Potential collusion among Big Tech, governments (state, federal, and foreign), and NGOs or contractors working to manipulate the election and news information ecosystem under the guise of combating “disinformation,” “misinformation,” or “hate speech.”REF
- Potential violations of state or federal laws that require corporate disclosure of political activities or that prohibit such activities altogether.
Long-term policy remedies should prioritize robust transparency standards and ensure that users have meaningful and informed choices about how their information environments are moderated, curated, and structured. The following ideas would be excellent steps in the right direction:
- Designate certain technical interventions by large Internet platforms that knowingly favor or disadvantage a political candidate, party, or advocacy organization as a prohibited or reportable activity under state election law.REF
- To the extent that platforms convey election information to their users, require that they disclose all factors used to target that information and to determine its visibility or discoverability to individuals.
- Require platforms to provide detailed disclosures about how they enforce their content policies, as well as all ad hoc changes made to algorithms that moderate or otherwise regulate the flow of political content, including the rationale for each change.
- Require platforms to publicly disclose requests by governments or private entities to censor accounts or connected content as well as the platform’s response to each request.
- Mandate that platforms disclose all “blacklists” used to restrict accounts or content and require them to notify any user added to or removed from such lists, including the specific reasons for the user’s inclusion and options to appeal.
- Consider requiring the largest Internet platforms to regularly assess and mitigate the potential for their algorithms to impede the free flow of political information during election years. Assessments should specifically examine the impact of algorithms on the ability of users to communicate and receive information from or about a public official, political candidate, or election on an equal basis regardless of viewpoint, religion, or political ideology.
Conclusion
With the 2024 election on the horizon, it is critical for states to use every tool at their disposal to rein in these companies’ unchecked power over our political information ecosystem. I am deeply encouraged by the committee’s attention to this important issue and welcome any opportunity to engage with you or your staff further. Thank you again for your time and the opportunity to testify today.
This testimony has been lightly edited for style.
*****
The Heritage Foundation is a public policy, research, and educational organization recognized as exempt under section 501(c)(3) of the Internal Revenue Code. It is privately supported and receives no funds from any government at any level, nor does it perform any government or other contract work.
The Heritage Foundation is the most broadly supported think tank in the United States. During 2023, it had hundreds of thousands of individual, foundation, and corporate supporters representing every state in the U.S. Its 2023 operating income came from the following sources:
- Individuals: 82%
- Foundations: 14%
- Corporations: 1%
- Program revenue and other income: 3%
The top five corporate givers provided The Heritage Foundation with 1% of its 2023 income. The Heritage Foundation’s books are audited annually by the national accounting firm of RSM US, LLP.
Members of The Heritage Foundation staff testify as individuals discussing their own independent research. The views expressed are their own and do not reflect an institutional position of The Heritage Foundation or its board of trustees.