The landscape of social media and the dynamics of platform governance have shifted significantly in recent years, particularly with Elon Musk's acquisition of Twitter and its subsequent rebranding as X. X recently released its first transparency report since the acquisition, raising questions about the company's approach to transparency, data measurement, and content moderation. The report offers an opportunity to assess how changes in policies, practices, and operational metrics have altered the way the company communicates its actions to the public.
Prior to Musk's takeover in 2022, Twitter had established a norm of biannual transparency reports covering similar ground to X's latest effort. These reports provided detailed statistics on content takedowns, accounts reported for violating community guidelines, and responses to government requests for user data. The final report under the Twitter brand, covering the second half of 2021, ran to 50 pages of data and insights. In sharp contrast, X's new report is just 15 pages and lacks the interactive features that once let users explore the data at a granular level.
Transparency reports are vital accountability tools, so the drastic reduction in length and depth from Twitter to X raises questions about the motivations behind the change. While X may present its findings as a concise summary, that brevity makes it harder for readers to grasp the true scope of the company's actions and the implications for platform governance.
Dissecting the numbers in the latest report makes the inconsistencies in data collection and interpretation glaringly apparent. The 2021 report indicated that 11.6 million accounts were reported, with 1.3 million suspended; X now reports a staggering 224 million total reports, leading to 5.2 million suspensions. Beyond the dramatic jump in scale, the two figures count different things, one tallying reported accounts and the other individual reports, which prompts skepticism about whether the metrics of the two eras are comparable at all.
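To make the mismatch concrete, the following sketch (ours, not drawn from either report) computes the suspension rates implied by the quoted figures. The variable names are illustrative, and the percentages should be read as a caution rather than a comparison, since the two denominators count different units:

```python
# Headline figures quoted above. Note the unit mismatch: Twitter's 2021
# report counted *accounts reported*, while X's report counts *total
# reports*, so the two ratios below are not directly comparable.
twitter_2021 = {"accounts_reported": 11_600_000, "suspensions": 1_300_000}
x_report = {"total_reports": 224_000_000, "suspensions": 5_200_000}

rate_2021 = twitter_2021["suspensions"] / twitter_2021["accounts_reported"]
rate_x = x_report["suspensions"] / x_report["total_reports"]

print(f"2021: {rate_2021:.1%} of reported accounts were suspended")  # ~11.2%
print(f"X:    {rate_x:.1%} of reports led to a suspension")          # ~2.3%
```

Even setting aside the unit mismatch, the implied action rate drops by roughly a factor of five, which is exactly the kind of shift that cannot be interpreted without knowing how the counting methodology changed.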
Moreover, certain categories of flagged content show drastically different outcomes. In 2021, reports for hateful content made up nearly half of all reports and led to significant action against violators. By contrast, X reports action taken against only 2,361 accounts for similar infractions. This drop could stem from a fundamental shift in policy on hate speech, an observation supported by comments from Theodora Skeadas, a former member of Twitter's public policy team. With the definitions and rules around hate speech rewritten, content that was previously violative may simply no longer be flagged, producing discrepancies in the reported numbers.
Musk's acquisition has ushered in significant changes to the platform's policies, notably the scaling back of rules on COVID-19 misinformation and revised definitions of hate speech. These shifts undercut the utility of transparency reports, since a moving policy framework makes it difficult to measure the user experience consistently over time. Staffing cuts, especially among trust and safety personnel, further complicate enforcement of whatever guidelines remain in place.
Furthermore, X's user base has declined since the acquisition, raising the question of whether changes in the reported metrics reflect shifts in user behavior or the effectiveness of enforcement strategies. As Skeadas notes, interpreting data against evolving policies and declining user engagement can yield misleading conclusions about platform dynamics.
As X charts its course forward, the implications of its recent transparency report are central to understanding the evolving nature of social media governance and accountability. The interplay of shifting policies, inconsistent data presentation, and fluctuating user dynamics underscores the complexity that researchers, policymakers, and users face when assessing the health of the platform.
Moving forward, it will be essential for X not only to maintain transparency but also to ensure that its reports are comprehensive enough to reflect the actual state of content moderation and user safety. Ultimately, the company's approach could set precedents for how transparency is handled across the wider tech landscape, shaping user trust and platform integrity for years to come.