Guest post: David Hoffman on “Protecting Democracy (and us!) from Big Tech”

The growing role of social media in the Ukraine conflict is garnering increasing notice in the press, with some suggesting that, given its significant influence on world opinion, we are seeing the “first social media war.”  A recent post, Are we ready for war in the infosphere?, explored some of the implications of this development, and we’ll be exploring that more in the future.

In today’s post Duke professor (and LENS conference participant!) David Hoffman highlights that the use of social media has “required private sector technology companies to make decisions about the degree to which they will collaborate with both the Russian and Ukrainian governments.”  He notes some of the decisions that have already flowed from the crisis.

This relates to a multi-disciplinary project that has been underway at Duke for over a year working to “analyze the policies, processes, and accountability structures of technology platforms.”  David says that in addition, the team has “researched the governance frameworks of industries, such as finance, environment, and oil and gas, to identify best practices.”  He also makes this exciting announcement:

As the first release of our team’s research findings, we have launched a three-episode mini-series on the Duke Sanford School of Public Policy’s podcast Ways and Means.  The series is a joint production with the Debugger podcast from the Sanford Cyber Policy Program that explores the impact of technology on the lives of individuals and democracy.

In the podcast, my colleague Bob Sullivan explains that we have reached a point where governments often do not have the tools and resources to properly hold large technology platforms accountable. Bob’s interviews with experts highlight the challenges that many platforms face as they attempt to enforce rules to promote greater safety and cybersecurity, while facing criticism from those who are negatively impacted by those decisions.

David’s essay is an easy way to quickly get up to speed on the key issues that the rise of social media’s power presents to governments, the tech companies, and each of us.

Protecting Democracy (and us!) from Big Tech

by David Hoffman

The Battle for Democracy 

With the Russian invasion of Ukraine, a technology front in the battle for democracy has emerged. As part of that battle, technology platforms have become fundamental components of both Russian attacks and Ukrainian efforts to defend against them. Those defensive efforts have involved not just governments; they have also required private sector technology companies to make decisions about the degree to which they will collaborate with both the Russian and Ukrainian governments.

Social media platforms have made decisions to de-platform Russian propaganda accounts and remove inaccurate content. Meta and TikTok have blocked Russian state media from being displayed in Europe. Microsoft has shared cybersecurity threat and vulnerability information with various governments to help protect them against Russian attacks. Google has removed tags from its maps of Russia, Ukraine, and Belarus due to concerns that the user-generated content had been used by the Russian military to coordinate air attacks.

However, technology platforms and stakeholders have also refused to take actions to work against the Russian attack. This includes several cryptocurrency exchanges such as Binance and Kraken that refused to block trading from Russian accounts even though that trading may undermine sanctions.

It also appears that ICANN has refused a Ukrainian request to revoke Russian website domain names, which would likely take a large portion of the Russian internet offline. Meta’s Nick Clegg tweeted, “(t)he Ukrainians have also suggested that we remove access to Facebook and Instagram in Russia. However, people in Russia are using FB and IG to protest and organize against the war and as a source of independent information.”

Russia has also made demands of technology platforms. It has asked TikTok to protect children on its platform from war videos from inside Ukraine. Meta refused a Russian demand that it cease fact-checking state-owned media outlets.

From “Arbiters of Truth” to “Arbiters of Democracy” 

None of these technology platform decisions is a neutral determination. Each of these decisions is based on the values of the platform’s owners and executives. Technology platforms are well past denials of being arbiters of truth. Now, these companies not only make decisions of veracity, but they also make political decisions. We should now consider them “arbiters of democracy.”

This is the issue that the NSO Group in Israel has faced as it has come under criticism for licensing its Pegasus software to governments that it should have known would use it to harm democratic institutions such as a free press. On a daily basis these platforms make critical decisions about which countries they will work with and the extent of that collaboration. Companies are not standing on the sidelines waiting to see whether democracy will survive. Instead, they are playing in the game, they are writing the rules, and they are acting as the referees.

The potential consequences of companies’ involvement raise the question of how technology companies can demonstrate that they are making responsible decisions. It also presents the additional dilemma of how governments can hold those technology platforms accountable for their impact on democratic institutions.

Researching What Works in Other Industries

These two questions have been the driving force behind a Duke University research project over the past year. Our multi-disciplinary research group is comprised of faculty and students from public policy, law, business, engineering, and computer science. The team has worked to analyze the policies, processes, and accountability structures of technology platforms. Additionally, the team has researched the governance frameworks of industries, such as finance, environment, and oil and gas, to identify best practices.

As the first release of our team’s research findings, we have launched a three-episode mini-series on the Duke Sanford School of Public Policy’s podcast Ways and Means. The series is a joint production with the Debugger podcast from the Sanford Cyber Policy Program that explores the impact of technology on the lives of individuals and democracy.

In the podcast, my colleague Bob Sullivan explains that we have reached a point where governments often do not have the tools and resources to properly hold large technology platforms accountable. Bob’s interviews with experts highlight the challenges that many platforms face as they attempt to enforce rules to promote greater safety and cybersecurity, while facing criticism from those who are negatively impacted by those decisions.

A Path Forward

As democracies around the world explore what can be done to assist Ukraine in the conflict, technology companies will continue to be presented with difficult, far-reaching decisions. What platforms should be restricted? What content should be removed and what should be kept up? What apps should be removed from a platform? What threat intelligence should be shared and with whom? What cybersecurity tools should be provided to which governments? What efforts should be made to ensure that Russian citizens have access to the truth, despite government-imposed restrictions? At some point we will also need to ask how these platforms can demonstrate that they are making these decisions in a responsible manner, and how governments can hold those companies accountable.

In the final episode of the podcast series, we offer recommendations to promote objective and responsible decision making by technology platforms, as well as recommendations for increased supervision of these platforms. Our team will continue this research and explore these accountability framework proposals in more detail. Please listen to the podcast and provide us with your feedback, which will help guide our efforts.

About the author

David Hoffman is the Steed Family Professor of the Practice of Cybersecurity Policy at the Sanford School of Public Policy. He formerly served as Associate General Counsel, Director of Security Policy, and Global Privacy Officer for Intel Corporation. Hoffman currently chairs the Civil Liberties and Privacy Panel for the Director’s Advisory Board for the US National Security Agency. He also chairs the board of the Center for Cybersecurity Policy and Law, and serves on the Advisory Boards for the Future of Privacy Forum and the Israel Tech Policy Institute. Hoffman also founded and chairs the board of the Triangle Privacy Research Hub, which highlights and fosters cybersecurity and privacy academic research done in the North Carolina Research Triangle.

Remember what we like to say on Lawfire®: gather the facts, examine the law, evaluate the arguments – and then decide for yourself!