In a world where traditional social media platforms dominate the digital conversation, will decentralized alternatives emerge as a bulwark against censorship, or become a breeding ground for hostile speech?
TinTucBitcoin spoke with Anurag Arjun, co-founder of the blockchain infrastructure project Avail, about how decentralization can transform online speech and governance.
In October, X (formerly Twitter) suspended the Hebrew-language account of Iranian Supreme Leader Ali Khamenei for “violating platform rules.” The controversial suspension reignited global debates about the power centralized platforms hold over public discourse.
Many people wondered: can a nation’s supreme leader really be barred from commenting on air strikes occurring within his own territory?
Political sensitivities aside, the same thing happens to ordinary creators every day in far less dramatic contexts. In the second quarter of 2024, YouTube removed about 8.19 million videos flagged by its automated systems, while human flaggers accounted for only about 238,000 removals.
In response, decentralized platforms such as Mastodon and Lens Protocol are gaining traction. Mastodon, for example, surged to 2.5 million active users after Elon Musk’s takeover of Twitter in late 2022. These platforms promise to redistribute control, but that promise raises complex questions about censorship, accountability and scalability.
“Decentralization doesn’t mean censorship-free—it’s about transferring control to user communities while maintaining transparency and accountability,” Anurag Arjun, co-founder of Avail, told TinTucBitcoin.
Decentralized platforms aim to eliminate corporate influence over online speech: users define and enforce their own moderation standards. Unlike Facebook or YouTube, which face accusations of algorithmic bias and shadow banning, decentralized systems claim to foster open dialogue.
However, while decentralization removes monopoly control, it does not guarantee fairness. A recent Pew Research Center survey found that 72% of Americans believe social media companies hold too much power over public discourse.
This skepticism applies to decentralized systems, where governance must maintain transparency to prevent louder voices from dominating the dialogue.
“Distributed governance ensures that no individual or company arbitrarily decides what can or cannot be said, but it still requires safeguards to balance diverse viewpoints,” Arjun explained.
Without centralized oversight, decentralized platforms depend on community moderation. The approach aims to ensure inclusivity but risks fragmentation when consensus cannot be reached. Mastodon instances, for example, often enforce different moderation rules, confusing users and splintering communities.
Wikipedia is a notable example of successful community-led moderation. It relies on roughly 280,000 active editors to maintain millions of pages globally, and its transparent, collaborative editing process builds trust while protecting free expression.
“Transparency in governance is crucial. It prevents exclusion and builds trust among users, ensuring everyone feels represented,” said Arjun.
Decentralized platforms face the challenge of balancing freedom of expression with controlling harmful content such as hate speech, misinformation and illegal activity. A prominent example is the controversy surrounding Pump.fun, a platform that lets users livestream to promote meme coins.
Abuse of this feature has led to harmful broadcasts, including threats of self-harm related to cryptocurrency price fluctuations.
“This highlights an important point. Platforms need layered governance models and evidence-based verification mechanisms to tackle harmful content without becoming dictatorial,” explained Arjun.
The obvious answer is artificial intelligence. AI tools can flag harmful content with up to 94% accuracy, but they lack the nuanced judgment needed for sensitive cases. Decentralized systems therefore need to combine AI with human-led moderation and transparent processes to be effective.
So the question remains: how do you protect people from harm, or enforce any rules at all, without prior agreement on what constitutes fraud? And how will a community restructure itself if it is to police itself successfully?
New governance and censorship risks
Decentralized governance distributes decision-making but also introduces new risks. Voting systems, while participatory, can marginalize minority opinions, replicating the very problems decentralization was meant to eliminate in the first place.
For example, on Polymarket, a decentralized prediction platform, majority voting has sometimes suppressed opposing views, demonstrating the need for safeguards.
“In an age when centralized control of information is a systemic risk, prediction markets provide a way to cut through false narratives and see the unvarnished truth. Prediction markets are technology that preserves freedom and drives social progress,” one blockchain researcher commented on X (formerly Twitter).
Transparent appeal mechanisms and checks on majority power are essential to prevent new forms of censorship.

Decentralized platforms also prioritize user privacy, giving individuals control over their data and social graph.
This freedom boosts trust, as users are no longer at the mercy of corporate data mishandling, as in Facebook’s Cambridge Analytica scandal in 2018, which exposed the data of 87 million users. In 2017, 79% of Facebook users trusted the company with their privacy; after the scandal, that figure dropped to 66%.
However, privacy can complicate efforts to address harmful behavior. The challenge is to keep decentralized networks secure without compromising their core principles.
Arjun explains, “Privacy cannot come at the expense of accountability. Platforms must adopt mechanisms that protect user data while ensuring fair and transparent moderation.”
Legal and regulatory concerns in decentralized social media
A key challenge for decentralized platforms is addressing legal issues such as defamation and incitement. Unlike centralized systems such as X, which receives around 65,000 government data requests each year, decentralized platforms lack clear mechanisms for responding to legal demands. Arjun emphasized the importance of cooperation between platform founders and legislators.
“Engaging with regulators can help establish guidelines that protect user rights while preserving the spirit of decentralization,” he said.
In authoritarian regimes, decentralized platforms provide opportunities to resist censorship. During the Mahsa Amini protests in Iran, for example, a government-implemented internet shutdown affected 80 million users, highlighting the need for anti-censorship networks. Although decentralized platforms are more difficult to shut down, they are not immune to external pressure.
“Decentralization provides powerful tools for resistance, but individual users remain vulnerable. Platforms must develop additional protections to shield them from repression. Decentralization began as a movement to empower users. To further that vision, platforms must prioritize inclusivity, transparency and technological innovation,” Arjun concluded.
Overall, the future of decentralized social media depends on overcoming these obstacles with creativity and collaboration. If successful, decentralized platforms could redefine the dynamics of online speech, creating a freer and more sustainable ecosystem for expression.
The question is not whether decentralization can work but whether it can evolve to balance freedom with responsibility in the digital age.