As Election Day in the U.S. nears, social networking startup Bluesky, now flush with new capital, hopes to demonstrate that its platform can serve as a more trusted, fact-checked alternative to Elon Musk’s X. While the latter is dominated by Musk’s support for the Trump campaign, Bluesky tends to lean left, thanks to its influx of disgruntled former Twitter users who don’t like the platform’s new direction. Now, with the U.S. elections upon us, Bluesky faces its biggest test yet: handling misinformation that could mislead users during these critical national events, including posts meant to disrupt the voting process or those using new technologies, like AI, to confuse the voting public.
While X’s other competitor, Meta’s Threads app, has distanced itself from politics — even going so far as to stop recommending political content to users — Bluesky has capitalized on the demand for a real-time social network that prioritizes such discussions. And with X’s recent changes to the block function angering some users, Bluesky may be poised to once again benefit from another X exodus as users make the switch.
To manage its election operations, Bluesky earlier this year hired a notable former Twitter leader as its head of Trust and Safety, Aaron Rodericks. Already experienced with the policies, tools, and teams needed to manage election safety at Twitter, where he co-led its Trust and Safety team, Rodericks once made headlines as the target of a right-wing campaign on X after announcing on LinkedIn that he was looking to hire more staffers for the 2024 election season. The exec later lost his job at X when Musk cut half the election integrity team after promising to expand it.
Now at Bluesky, the Rodericks-led team announced how it’s preparing to handle the U.S. presidential election, including by reviewing content for potential misinformation as well as other unconfirmed reports and claims.
In a series of posts on Bluesky, the Bluesky Safety team detailed its plans for election safety, reminding users that they can report misleading, illegal, or urgent content to Bluesky’s moderation service by clicking on the three-dot menu next to any post or account. There will also be a priority queue in its system for any election-related reported posts.
To keep the process “safe and accurate,” Bluesky says it will also remove any content that “encourages or glorifies intimidation or disruption in voting, tabulation, or certification.” It also plans to label posts containing misleading claims about voting, such as incorrect voter ID requirements, as well as manipulated media.
Meanwhile, “emerging” reports related to the election that can’t immediately be verified will be labeled as “unconfirmed.” For instance, if someone reports long lines at their polling place or other incidents at the polls, these will likely be labeled unconfirmed at the time of the assessment. (The company didn’t share whether and how it would update these labels if national media later confirmed the reports, however.)
The company says its plans to moderate the platform extend beyond Election Day, too, as it will work to identify and address any disruptions to the “peaceful transition of power” as well.
Plus, Bluesky says it is reserving the right to roll out more safeguards in the days to come, if needed, to ensure election safety on the platform.
Unlike X and Threads, where moderation is handled by the company alone, Bluesky’s decentralized promise is that anyone can run their own Bluesky server and their own moderation service. Users can also subscribe to multiple moderation services to customize their feed to their liking.
“Our online experience doesn’t have to depend on billionaires unilaterally making decisions over what we see,” the company explained in March. “On an open social network like Bluesky, you can shape your experience for yourself.” In other words, if you don’t like how Bluesky is running its app, you can build your own. And if you don’t like Bluesky’s moderation choices, you can build your own independent service instead.
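The stackable moderation model described above can be sketched in a few lines of Python. This is a conceptual illustration only, not the real AT Protocol API: `ModerationService`, `FeedClient`, and the label names are hypothetical stand-ins for Bluesky’s independent labelers and a client’s subscription preferences.

```python
from dataclasses import dataclass


@dataclass
class Post:
    uri: str
    text: str


class ModerationService:
    """A hypothetical independent labeler: attaches labels
    (e.g. 'misleading', 'unconfirmed') to posts it has reviewed."""

    def __init__(self, name: str):
        self.name = name
        self._labels: dict[str, set[str]] = {}

    def label(self, uri: str, label: str) -> None:
        self._labels.setdefault(uri, set()).add(label)

    def labels_for(self, uri: str) -> set[str]:
        return self._labels.get(uri, set())


class FeedClient:
    """A user's client: subscribes to any number of moderation
    services and hides posts carrying labels the user filters out."""

    def __init__(self, hide_labels: set[str]):
        self.hide_labels = hide_labels
        self.services: list[ModerationService] = []

    def subscribe(self, service: ModerationService) -> None:
        self.services.append(service)

    def visible_feed(self, posts: list[Post]) -> list[Post]:
        visible = []
        for post in posts:
            # Merge the labels applied by every subscribed service.
            labels = set().union(
                *(s.labels_for(post.uri) for s in self.services)
            )
            # Show the post only if none of its labels are filtered.
            if labels.isdisjoint(self.hide_labels):
                visible.append(post)
        return visible
```

The key design point is that labeling and filtering are separate: any number of services can independently label the same post, and each user decides which services to subscribe to and which labels should hide content, rather than one operator making that call for everyone.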
The Bluesky moderation team has also been expanded with additional hires following two recent surges that brought more users to the service. Though the company hasn’t said how large its moderation team is today, CEO Jay Graber hinted at its size in an interview on Nilay Patel’s Decoder podcast in March, when she remarked, “We’re about 18 across engineering and ops, and then we have about that number on support and moderation.”