November 2024 proved to be a significant month for debates on the regulation of digital platforms in Brazil. It was when the Supreme Federal Court (STF) began to examine platforms' liability for user-posted content, an analysis that by the end of January 2025 had yet to be concluded. The municipal electoral cycle had also just ended, with important lessons for monitoring the social networks of candidates, some of whom persistently opened new accounts whenever the original ones were suspended.
It was also in that month that 28 specialists from the public sector, academia, private enterprise, and civil society gathered to discuss the topic. The discussion followed the Chatham House Rule, created by the UK's Royal Institute of International Affairs in 1927, under which participants may debate freely, with the understanding that the identity of whoever expressed a particular analysis or opinion will not be disclosed. This allows everyone to express their views without compromising their professional positions.
The meeting "Duty of Care and Internet Platforms" was part of the research project "Platform regulation in Brazil and in the UK: designing and enforcing 'duty of care' frameworks," funded by the ISPF ODA Challenge-Oriented Research Grants, from the British Academy, with the support of the International Science Partnerships Fund from the British government.
It was the result of a partnership between Insper and the University of Sussex and produced a policy brief (a technical document that presents recommendations to improve or implement public policies) signed by Francisco Brito Cruz, Beatriz Kira, and Ivar A. Hartmann, published in January under the title "Duty of Care and Regulation of Digital Platforms: A Brazilian Overview."
"The moment is opportune to debate ways to regulate digital platforms, and the document has proven even more relevant after Meta updated its usage policy," says Hartmann, who is an Associate Professor at Insper, holds a master's and doctorate in Public Law, and is a former Program Director of the project Supremo em Números at the Getulio Vargas Foundation (FGV). In early January, Mark Zuckerberg's company indeed announced the end of third-party fact-checking in the United States and the launch of a feature called "community notes," a model already used on X, formerly Twitter.
"This document captures the pulse of Brazilian society regarding best practices on digital platforms. We sought to understand what is consensual and what is not," reports Professor Hartmann. "What no one questions is: Brazil needs to take a position. Even with the Supreme Court's decision, Congress needs to act, and the government only has this year of 2025 to take a stand, because 2026 will be an election year, and sensitive topics like this are not usually approved."
The Fake News Bill, as Bill 2630 became known, seemed to be moving toward creating a legal framework for the topic. But over the past year, after many changes to the text, it became clear that the political conditions to approve the proposal no longer existed.
"Perhaps the bill would have advanced if it had had a committee of jurists, as happened with the legislation on Artificial Intelligence (AI). In the vacuum left by Congress, the Superior Electoral Court (TSE) has made progress, as has the Supreme Federal Court, but the Legislature needs to take a position. If the government manages to pass a law for social networks, it will be a significant victory." There are reports that the Executive is seeking ways to expedite a bill that both satisfies Congress and leaves a legacy on the topic. It is in this direction that the article published in January aims to contribute.
What the report points out is that the criteria set by the companies themselves are unstable, as evidenced by Meta's recent decision. "It is not feasible to rely solely on the decisions of large corporations. We need a response from the public sector. The civil liability norm is not enough; it is necessary to define what the rules are and who will oversee them," says Hartmann.
There are legal references, especially from the United Kingdom, where the partnership with Insper originated, to anchor more assertive decisions, he says. "The most important thing is to pay attention to content recommendation. It is not just about removing and suspending illegal posts. What matters is ensuring that widely shared content is subject to rules on transparency, limits, and accountability, both for those who publish and for the social networks themselves," points out the professor.
"Care for posts spreading incorrect information and fueling prejudices today is even more important than any legislation on AI and drones. The damage caused by false posts shared by millions of people is enormous because it favors brainwashing the population," he says.
The policy brief generated from the November meeting highlights relevant issues regarding this topic. The main aspects discussed were:
* In the United Kingdom, the Online Safety Act, adopted in 2023, imposes a series of duty of care obligations on platforms, requiring them to implement robust systems and processes to identify and mitigate risks. This legislation marked a new milestone in the regulation of digital platforms, intensifying the debate on the implementation and effectiveness of models based on the concept of duty of care.
* In Brazil, a theoretical gap persists regarding how a regulatory regime based on this concept should be designed, interpreted, and implemented within the Brazilian legal framework.
* Civil liability for third-party-generated content is only one slice of a broader governance debate. Voices from different sectors argue that discussing the individual removal of content following court orders (or other liability triggers) is insufficient to address systemic risks.
* Participants in the workshop engaged in a discussion on the resurgence of the State's role in crafting and articulating a public policy for digital platforms more sophisticated than the provisions in the Brazilian Civil Rights Framework for the Internet (MCI), in favor of the public interest. It was argued that self-regulation (mainly through content moderation mechanisms) has proven insufficient to address emerging risks (both within and outside the electoral context) and that it is necessary to build more robust regulatory capacities beyond judicial control over damages caused by user-generated content.
* One path suggested by some workshop participants was the structuring of a new independent regulatory body with the technical capacity to oversee and ensure compliance with due diligence norms.
* What is the most adequate institutional design? Participants pointed out the complexity and difficulty of defining and implementing new duties and obligations, especially without a regulatory body and clear parameters. A lack of specificity can create legal uncertainty and prevent platforms from functioning as society expects, leading to unforeseen and undesired effects.
* Participants expressed concern that updates to the MCI or new regulatory layers could compromise the exercise of freedom of expression on the internet, especially by incentivizing arbitrary content moderation by large digital platforms or other intermediaries, given their natural aversion to judicial or administrative risks.
In summary, the article concludes that the debate was complex and multifaceted, reflecting the difficulty of finding an effective regulatory model that is balanced and respects fundamental rights in the country. The discussion yielded four conclusions.
1. To advance a duty of care framework for digital platforms in Brazil, it is necessary to update the MCI or create new layers of legislation to address the systemic risks generated by the new business models of digital platforms, including the dissemination of misinformation and hate speech, amplified by algorithms and paid boosting. In this field, the institution of transparency and risk detection and mitigation duties emerges as particularly relevant. Electoral legislation also needs to adapt to these new digital realities.
2. It is necessary to work on the legal elaboration of the "duty of care" with a focus on its administrative dimension, in contrast to civil liability. The debate pointed to the existence of sources and possible starting points in Brazilian law but also highlighted that the elaborations applicable to the case of digital platforms are still preliminary.
3. The debate made clear the gap in overseeing new regulatory layers and operationalizing a duty of care. At the same time, there is a dispute between existing authorities, uncertainty in the distribution of competencies, and risks to rights. The consensus is that the authorities assigned this task must have technical expertise and independence from the government and economic power.
4. The participation of civil society is fundamental in building an effective and democratic regulatory model. Co-regulation instruments and supervised dialogue with platforms are also of interest, with performance tracking metrics.