Decentralized social network Bluesky has announced a new wave of moderation upgrades aimed at strengthening user safety, improving transparency, and ensuring consistent enforcement of its Community Guidelines. The updates arrive as part of the platform’s latest app release (v1.110), which also introduces a dark-mode app icon and redesigned reply controls.
The company says the changes are necessary as Bluesky experiences rapid user growth and an increasingly diverse community. “Clear standards and expectations for how people treat each other” are essential, the platform noted in a statement.
Expanded reporting tools and stricter enforcement tracking
One of the biggest updates is the expansion of user reporting categories from six to nine, giving people more precise options when flagging harmful content. New categories include Youth Harassment, Bullying, Eating Disorders, and Human Trafficking, aligning the platform with emerging safety regulations such as the U.K.’s Online Safety Act.
Bluesky has also upgraded its internal moderation tools to track violations and enforcement actions more effectively in a central system. While the company says it is not changing what it enforces, the improved tooling is expected to make enforcement faster, more consistent, and more transparent.
A new strike system with severity rankings
Bluesky’s revamped strike system now assigns every violation a severity rating, which determines the enforcement action:
- Critical-risk content → permanent ban
- Medium- or high-risk content → temporary penalties
- Multiple accumulated violations → potential permanent account removal
Users will now receive clearer notifications explaining:
- What guideline they violated
- The severity level assigned
- How many strikes are on their record
- How close they are to an account-level action
- Suspension duration and end date
All enforcement decisions can be appealed, Bluesky added.
A platform trying to avoid X-style toxicity
The changes follow a recent controversy in which a user was suspended over a comment referencing a Johnny Cash lyric—an incident critics said highlighted inconsistent moderation. Bluesky says the new tools will reduce misunderstandings by giving moderators better context and users clearer guidance.
The platform has also been under scrutiny from its own community. While many users see Bluesky as a healthier alternative to X/Twitter, the company faces pressure to address harmful content without compromising its goal of building an open, decentralized network that hosts diverse viewpoints.
Regulatory pressure shaping platform design
Bluesky’s moderation push also reflects rising global regulation targeting online safety. Earlier this year, the company temporarily blocked access in Mississippi, citing an inability to comply with the state’s age-assurance law—which could impose fines of up to $10,000 per non-compliant user.
With more governments introducing rules to protect minors and combat harmful content, Bluesky says establishing transparent, scalable moderation systems is now essential for growth.
As the platform continues to expand, these updates mark one of its most comprehensive efforts yet to balance openness with safety—while trying to avoid the pitfalls that have plagued mainstream social networks.
