EFF: Our EU Policy Principles: User Controls
As the EU is gearing up for a major reform of key Internet regulation, we are introducing the principles that will guide our policy work surrounding the Digital Services Act (DSA). We believe the DSA is a key opportunity to change the Internet for the better: to question the attention-capture paradigm that so fundamentally shapes our online environments, and to restore users' autonomy and control. In this post, we introduce policy principles that aim to strengthen users' informational self-determination and thereby promote healthier online communities that allow for deliberative discourse.
A Chance to Reinvent Platform Regulation
In a few months, the European Commission will introduce its much-anticipated proposal for the Digital Services Act, the most significant reform of European platform regulation in two decades. The Act, which will modernize the e-Commerce Directive, the backbone of the EU's Internet legislation, will set out new responsibilities and rules for online platforms.
EFF supports the Commission's goal of promoting an inclusive, fair and accessible digital society. We believe that giving users more transparency and autonomy to understand and shape the forces that determine their online experiences is key to achieving this goal. Currently, there is a significant asymmetry between users and powerful gatekeeper platforms that control much of our online environment. With the help of opaque algorithmic tools, platforms distribute and curate content, collect vast amounts of data on their users and flood them with targeted advertisements. While platforms acquire (and monetize) a deep understanding of their users, both on an individual and collective level, users are in the dark about how their data is collected, exploited for commercial purposes and leveraged to shape their online environments. Not only are users left uninformed about the intricate algorithms that govern their speech and actions online; platforms also unilaterally formulate and change community guidelines and terms of service, often without even notifying users of relevant changes.
The DSA is a crucial chance to enshrine the importance of user control and to push platforms to be more accountable to the public. But there is also a risk that the Digital Services Act will follow in the footsteps of recent regulatory developments in Germany and France. The German NetzDG and the French Avia bill (which we helped bring down in court) show a worrying trend in the EU to force platforms to police users' content without counter-balancing such new powers with more user autonomy, choice and control.
EFF will work with EU institutions to fight for users’ rights, procedural safeguards, and interoperability while preserving the elements that made Europe’s Internet regulation a success: limited liability for online platforms for user-generated content, and a clear ban on filtering and monitoring obligations.
Principle 1: Give Users Control Over Content
Many services like Facebook and Twitter originally presented a strictly chronological list of posts from users' friends. Over time, most large platforms have traded that chronological presentation for more complex (and opaque) algorithms that order, curate and distribute content, including advertising and other promoted content. These algorithms, determined by the platform, are not necessarily centered on satisfying users' needs, but usually pursue the sole goal of maximizing the time and attention people spend on a given website. Posts with more "engagement" are prioritized, even if that engagement is driven by strong emotions like anger or despair provoked by the post. While users can sometimes return to the chronological stream, the design of platforms' interfaces often nudges them back to the algorithmic feed. Interfaces that mislead or manipulate users, including "dark patterns", often contravene core principles of European data protection laws and should be addressed in the Digital Services Act where appropriate.
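To make the shift concrete, here is a minimal, illustrative sketch (the field names and score weights are invented for illustration, not taken from any real platform) of how the same set of posts is ordered chronologically versus by engagement:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int  # seconds since epoch
    likes: int
    comments: int
    shares: int

def chronological_feed(posts):
    # The original model: newest posts first, no scoring involved.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    # An engagement-maximizing model: posts that provoke the most
    # reactions rise to the top, regardless of when they were posted.
    # The score is blind to *why* people reacted; a post that enrages
    # readers counts exactly like one that delights them.
    def score(p):
        return p.likes + 2 * p.comments + 3 * p.shares
    return sorted(posts, key=score, reverse=True)
```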
Platforms' algorithmic tools leverage their intimate knowledge of their users, assembled from thousands of seemingly unrelated data points. Many of the inferences drawn from that data feel unexpected to users: platforms have access to data that reaches further back than most users realize, and are able to draw conclusions from both individual and collective behavior. Platforms thus often make assumptions about users' preferences based on data points that appear entirely unrelated to those preferences. This may shape (and often limit) the ways in which users can interact with content online, and can also amplify misinformation and polarization in ways that undermine the transparent, deliberative exchange of information on which democratic societies are built.
Users do not have to accept this. There are many third-party plugins that re-frame social platforms' appearance and content according to people's needs and preferences. But right now, most of these plugins require technical expertise to discover and install, and platforms have a strong incentive to hide and prevent user adoption of such independent tools. The DSA is Europe's golden opportunity to create a friendlier legal environment to encourage and support this user-oriented market. The regulation should support interoperability and permit competitive compatibility, and should establish explicit, enforceable rules against over-aggressive terms of service that seek to forbid all reverse-engineering and interconnection. Beyond the Digital Services Act, the EU must actively support open source and commercial projects in Europe that offer localized or user-empowering front-ends to platforms, and help foster a vibrant and viable market for these tools.
Giving people, as opposed to platforms, more control over content is a crucial step toward addressing some of the most pervasive problems online that are currently poorly managed through content moderation practices. User controls should not raise the threshold of technological literacy needed to traverse the web safely. Instead, users of social media platforms with significant market power should be empowered to choose the content they want to interact with, and to filter out content they do not want to see, in a simple and user-friendly manner. Users should also have the option to decide against algorithmically-curated recommendations altogether, or to choose other heuristics to order content, as sketched below.
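As a rough sketch of what such controls could look like, the toy model below (the field names and example data are hypothetical, not any platform's actual API) lets the user, rather than the platform, pick both the ordering heuristic and the filters:

```python
def build_feed(posts, ordering="chronological", blocked_terms=()):
    # posts: list of dicts with "text", "timestamp" and "engagement" keys.
    # ordering: the heuristic the *user* picked; chronological is the
    # default rather than a buried, hard-to-find option.
    # blocked_terms: terms the user has chosen never to see.
    orderings = {
        "chronological": lambda p: -p["timestamp"],
        "engagement": lambda p: -p["engagement"],
    }
    visible = [p for p in posts
               if not any(t in p["text"].lower() for t in blocked_terms)]
    return sorted(visible, key=orderings[ordering])

# Example: a plain timeline with unwanted content filtered out.
feed = build_feed(
    [{"text": "Lose weight fast!", "timestamp": 2, "engagement": 900},
     {"text": "Community meetup on Friday", "timestamp": 1, "engagement": 4}],
    blocked_terms=("lose weight",),
)
```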
Principle 2: Algorithmic Transparency
Besides being given more control over the content with which they interact, users also deserve more transparency from companies to understand why content or search results are shown to them, or hidden from them. Online platforms should provide meaningful information about the algorithmic tools they use in content moderation (e.g., content recommendation systems and tools for flagging content) and content curation (for example, in ranking or downranking content). Platforms should also offer easily accessible explanations that allow users to understand when, for which tasks, and to what extent algorithmic tools are used. To alleviate the burden on individual users to make sense of how algorithms are used, platforms with significant market power should allow independent researchers and relevant regulators to audit their algorithmic tools to make sure they are used as intended.
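One way to picture this kind of transparency is a structured, machine-readable explanation attached to every moderation or curation decision. The record below is a purely hypothetical sketch; its fields are not drawn from any existing law, standard, or platform API:

```python
import json

# Hypothetical explanation record for a single curation decision.
explanation = {
    "item_id": "post-4711",
    "action": "downranked",
    "decided_by": "automated",         # "automated", "human", or "hybrid"
    "system": "engagement-ranker-v3",  # which algorithmic tool acted
    "main_signals": ["predicted_engagement", "account_age"],
    "policy_reference": "community-guidelines#section-4",
    "appeal_url": "https://platform.example/appeals/post-4711",
}
print(json.dumps(explanation, indent=2))
```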
Principle 3: Accountable Governance
Online platforms govern their users through their terms of service, community guidelines, or standards. These documents set out the fundamental rules that determine what users are allowed to do on a platform, and what behavior is constrained. Platforms regularly update those documents, sometimes in minor and sometimes in major ways, and usually without consulting users or notifying them of the changes. Users of such platforms must be notified whenever the rules that govern them change, must be asked for their consent, and should be informed of the consequences of their choice. They should also be provided with a meaningful explanation of any substantial changes in a language they understand. Additionally, platforms should present their terms of service in a machine-readable format and make all previous versions easily accessible to the public, as the sketch below illustrates.
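As a minimal sketch of what a machine-readable, versioned terms-of-service archive would enable (the documents and dates below are invented; Python's standard difflib module is used purely for illustration), anyone could mechanically compare any two versions:

```python
import difflib

# Hypothetical archive: every version is retained, dated and diffable.
tos_versions = [
    {"version": "2019-03-01",
     "text": "Users may post content they own.\n"},
    {"version": "2020-11-15",
     "text": "Users may post content they own.\n"
             "We may remove content at our discretion.\n"},
]

previous, latest = tos_versions[-2], tos_versions[-1]
diff = difflib.unified_diff(
    previous["text"].splitlines(keepends=True),
    latest["text"].splitlines(keepends=True),
    fromfile=previous["version"],
    tofile=latest["version"],
)
# A precise, auditable record of exactly what changed and when.
print("".join(diff))
```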
Principle 4: Right to Anonymity Online
There are countless reasons why individuals may not want to share their identity publicly online. While anonymity used to be common on the Internet, it has become increasingly difficult to remain anonymous online. In the hope of tackling hate speech or "fake news", policymakers in the EU and beyond have proposed duties for platforms to enforce the use of legal names.
For many people, however, including members of the LGBTQ+ community, sex workers, and victims of domestic abuse, such rules could have devastating effects and lead to harassment or other forms of retribution. We believe that, as a general principle, Member States should respect the will of individuals not to disclose their identities online. The Digital Services Act should affirm users' informational self-determination in this regard as well, and introduce a European right to anonymity online. Terms of service that deviate from this right should be subject to fairness control.