Article 21 of the DSA introduces a new framework for resolving disputes over online content moderation. This article examines the first four certified Online Dispute Settlement bodies (ODS-bodies): their structures, processes, and early experiences. It explores the potential impact of the new system, highlighting both its promising start and potential systemic flaws that warrant attention as case numbers increase. The analysis also considers the role of Digital Services Coordinators (DSCs) in overseeing these bodies and the cost-bearing mechanism, which places the financial burden almost entirely on platforms, regardless of the outcome.

Summary

  • Four Online Dispute Settlement bodies (ODS-bodies) are now certified under Article 21 of the DSA.
  • These bodies offer EU citizens low-threshold access to an independent review of platform content moderation decisions.
  • Each body has its own structure, fee schedules, and areas of focus.
  • Most bodies are run by legal professionals, with their dispute-resolution work kept independent from their management functions.
  • The cost-bearing mechanism places the financial burden almost entirely on platforms, even if they win the dispute.
  • Early observations show a significant number of cases where platforms' decisions have been overturned.

What makes this novel or interesting

  • This is a new framework for online dispute resolution and a novel approach to reviewing content moderation decisions.
  • The article provides on-the-ground insights from the very first ODS-bodies and their initial operations, offering unique and timely information.
  • The cost structure, which puts the financial onus on platforms, is a significant departure from traditional dispute resolution models and has potentially far-reaching implications.
  • The analysis of potential systemic flaws and long-term challenges provides a critical perspective on the nascent framework.

Verbatim Quotes

  • Certification and Oversight: "Not so much is known about the certification processes. But it seems that DSCs have been acting fast, thus helping to make Art. 21 DSA an early success."
  • Fee Structure and Cost-Bearing: "The fees and the cost-bearing mechanic of Art. 21 DSA are the provision's most disruptive aspect. ODS-bodies will charge overall-fees, which are supposed to cover ODS-bodies' total costs."
  • Platform Engagement: "Most ODS-bodies found platform engagement, for most parts and with differences between platforms, to look promising. Interestingly, some platforms are pretty often reversing moderation decisions immediately after learning of the ODS proceeding ('immediate remedy'), while other platforms would not do so at all."

How to report this in the news

The EU has introduced a new system for handling disputes between users and online platforms over content moderation, similar to a small claims court for the internet. If a user believes a platform wrongly removed their post or account, they can now appeal to an independent body. This is a significant shift, putting the onus on platforms to justify their actions. However, there are concerns about the financial burden placed on platforms and about whether it might make them overly cautious in their content moderation, potentially over-removing content.

Detailed Recap

1. For Business Leaders of Regulated Platforms

  • Financial Implications:
    • The cost-bearing mechanism of Art. 21 DSA places the financial burden predominantly on platforms, regardless of outcome. Budget for these costs, which can range from €95 to €700+ per case depending on the ODS-body and complexity.
    • "Immediate remedy" (reversing the moderation decision before a final ODS decision) often results in lower fees, but repeated use could incentivize over-caution and impact free speech. Evaluate the cost-benefit of this strategy carefully.
    • Explore potential cost reductions through Mutual Administrative Assistance Agreements (MAAAs) with ODS-bodies like ADROIT, or efficient data transfer methods with ODS-bodies like User Rights.
  • Operational Adjustments:
    • Establish clear internal processes for handling Art. 21 DSA disputes. Designate a team responsible for liaison with ODS-bodies and ensure they are familiar with each body's rules of procedure and fee structure.
    • Train content moderation teams on the implications of Art. 21 DSA and the increased scrutiny of their decisions.
    • Monitor ODS-body decisions and identify trends that may indicate areas where content moderation practices need adjustment.
  • Strategic Considerations:
    • The current system incentivizes platforms to reverse decisions to avoid higher fees. Consider the long-term implications of this approach for your platform's content moderation policy and its impact on users.
    • Engage with ODS-bodies proactively. Building relationships and understanding their procedures can streamline the dispute resolution process.
    • Pay close attention to evolving interpretations of Art. 21 DSA and its implementation by DSCs to adapt your strategies.
  • Competitive Landscape:
    • Different platforms are engaging with Art. 21 DSA with varying degrees of proactiveness. Observe competitor strategies and adapt accordingly to maintain a competitive edge while minimizing financial and reputational risk.
  • Long-Term Risks: The high volume of potential cases could become a significant financial burden (see the illustrative cost sketch below). Monitor the evolution of the system and advocate for more balanced cost-sharing mechanisms if necessary.
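
To make the potential cost exposure concrete, the following is a minimal back-of-envelope sketch. The €95 to €700 per-case fee range is taken from the discussion above; the monthly case volumes, the share of cases resolved by "immediate remedy", and the assumption that remedied cases carry no fee are illustrative assumptions, not figures from the article.

    # Illustrative (hypothetical) estimate of annual Art. 21 DSA cost exposure.
    # The EUR 95-700 per-case fee range is taken from the text above; every other
    # number here (volumes, immediate-remedy share, reduced fee) is an assumption.

    def annual_ods_cost(cases_per_month: int, fee_per_case: float,
                        immediate_remedy_share: float = 0.3,
                        immediate_remedy_fee: float = 0.0) -> float:
        """Rough annual cost, assuming a share of cases ends via 'immediate
        remedy' at a lower (here: zero) fee and the rest at the full fee."""
        contested = cases_per_month * (1 - immediate_remedy_share)
        remedied = cases_per_month * immediate_remedy_share
        return 12 * (contested * fee_per_case + remedied * immediate_remedy_fee)

    if __name__ == "__main__":
        for monthly_cases in (100, 1_000, 10_000):  # hypothetical volumes
            low = annual_ods_cost(monthly_cases, fee_per_case=95)
            high = annual_ods_cost(monthly_cases, fee_per_case=700)
            print(f"{monthly_cases:>6} cases/month -> EUR {low:,.0f} to EUR {high:,.0f} per year")

Under these assumptions, 1,000 cases per month already implies annual exposure of roughly €0.8 to €5.9 million, which illustrates why the cost-bearing mechanism is singled out as the provision's most disruptive aspect.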

2. For General Counsels of Regulated Platforms

  • Legal Framework Interpretation: Stay informed about evolving interpretations of Art. 21 DSA by DSCs and ODS-bodies. This includes understanding the scope of "systemic shortcomings" and how DSCs evaluate them. Monitor case law developing around ODS-body decisions.
  • Liability Management: Art. 21 DSA introduces a new layer of legal risk for platforms. Develop strategies to mitigate potential liability arising from content moderation decisions and subsequent ODS-body rulings. This includes robust documentation of decision-making processes and adherence to platform policies.
  • Cost Management: Work with finance and operations teams to develop a budget for Art. 21 DSA-related costs. Explore strategies for minimizing costs, such as negotiating MAAAs with ODS-bodies, streamlining internal processes, and strategically utilizing the "immediate remedy" option. Analyze cost trends and advocate for legislative changes if the financial burden becomes unsustainable.
  • Policy Review and Alignment: Ensure platform content moderation policies are clear, comprehensive, and compliant with relevant laws. Regularly review and update these policies to align with evolving interpretations of Art. 21 DSA and best practices.
  • Dispute Resolution Strategy: Develop a clear strategy for handling Art. 21 DSA disputes. This includes determining when to utilize "immediate remedy," when to defend moderation decisions, and how to present evidence effectively to ODS-bodies. Consider factors like the specific ODS-body, the nature of the dispute, and the potential costs involved.
  • Data Privacy Considerations: Balance the need to provide context and feedback to ODS-bodies with user data privacy obligations. Develop procedures for securely sharing data while complying with GDPR and other relevant regulations. Be aware of requirements around data localization and transfer for certain ODS-bodies.
  • Contractual Implications: Review existing contracts with users and third-party providers to ensure compatibility with Art. 21 DSA requirements. Consider incorporating clauses related to dispute resolution and data sharing.
  • Regulatory Engagement: Maintain open communication with DSCs and other relevant regulatory bodies. Engage in consultations and provide feedback on the implementation of Art. 21 DSA to advocate for platform interests and shape future policy developments.
  • International Coordination: If your platform operates in multiple jurisdictions, coordinate legal strategies across regions. Be aware of potential conflicts between Art. 21 DSA and other regulations.
  • Precedent Setting: ODS-body decisions may create legal precedents that impact future content moderation practices. Closely monitor these decisions and analyze their implications for your platform's legal strategy.
  • Long-Term Strategic Planning: Art. 21 DSA is a relatively new framework that is likely to evolve. Develop a long-term legal strategy that anticipates change and positions your platform to navigate the evolving regulatory landscape effectively.

3. For Trust & Safety Teams of Regulated Platforms

  • Increased Scrutiny: Your content moderation decisions are subject to independent review by ODS-bodies. Ensure decisions are well documented, consistent with platform policies, and justifiable under relevant laws, so that they can withstand that review.
  • Procedural Awareness: Familiarize yourselves with the rules of procedure of each ODS-body, including deadlines, evidence requirements, and available remedies.
  • Collaboration with Legal: Work closely with the legal team to ensure alignment between content moderation practices and the legal requirements arising from Art. 21 DSA.
  • Emphasis on Accuracy: The focus of ODS-body reviews appears to be on the application of platform policies and relevant laws. Ensure consistent and accurate application of these policies.
  • ODS-Body Specifics: Be aware of the different specializations and limitations of each ODS-body. Some focus on specific platforms or types of content. Direct disputes to the appropriate body to avoid delays or rejections.
  • Early Intervention: Some platforms reverse decisions as soon as they are notified of an ODS proceeding ("immediate remedy"). Be prepared to review and, where appropriate, reverse decisions swiftly to minimize costs.
  • Feedback Loop: Analyze overturned decisions to identify potential areas for improvement in content moderation guidelines and training. Use this feedback loop to refine processes and ensure compliance.

4. For Lawmakers

  • Oversight Effectiveness: Monitor how Digital Services Coordinators (DSCs) are implementing their oversight role. Evaluate whether current practices are sufficient to address systemic shortcomings in the Art. 21 DSA framework.
  • Cost-Bearing Mechanism: The current system puts a disproportionate financial burden on platforms. Assess the long-term implications of this and consider whether a more balanced cost-sharing model is needed to ensure fairness and avoid unintended consequences.
  • Impact on Content Moderation: Assess whether the “immediate remedy” mechanism is incentivizing over-removal of content by platforms. Consider adjustments to the fee structure to discourage this behavior.
  • ODS-Body Diversity: Evaluate the range of expertise and approaches among ODS-bodies. Consider if standardization or specialization is necessary to ensure consistent and high-quality dispute resolution.
  • Procedural Harmonization: Assess the differences in procedures and rules of evidence across ODS-bodies. Explore opportunities for harmonization to ensure fairness and predictability for both platforms and users.
  • Systemic Flaws: Monitor for unintended consequences such as “over-put-back” or “under- and overblocking” and develop strategies to mitigate these issues.
  • Resource Allocation: Ensure DSCs have adequate resources to effectively oversee ODS-bodies and address the growing number of disputes. Consider mechanisms for information sharing and coordination between DSCs.
  • Long-Term Sustainability: The system is still in its early stages. Continuously monitor its effectiveness, address challenges, and adapt legislation as necessary to ensure the long-term sustainability and fairness of the framework.