
Mastodon: what it is and how the social network in which users decide what is allowed works



The chaos following Elon Musk’s purchase of Twitter has users migrating to Mastodon, which last Sunday passed two million monthly active users. “The future of social media doesn’t have to belong to a billionaire, it can be in the hands of its users,” Mastodon wrote on Twitter. One of the main differences between this platform and Twitter is content moderation, in which users themselves set the rules and play an essential role.

How does Mastodon work?

“Mastodon is not just a website like Twitter or Facebook, it is a network of thousands of communities run by different organizations and individuals that provide a seamless social networking experience,” say its creators. It is a decentralized, open-source platform in which users are grouped into communities or “instances” with varied themes and characteristics: from those devoted to technology, games, activism, or art, to those for transgender people or for advocates of privacy and freedom of expression.

After downloading the application, you can choose which instance you want to be part of. There is also a website that lets you search for communities by language and number of users. A search turns up examples such as an instance “for everyone who follows ethical journalism,” one “on horror themes for all mutants,” or one “for geeks of all stripes.”

Once inside an instance, the workflow and the app’s interface are very similar to Twitter’s. You can follow other users and reply to, share, or favourite their posts. There are also some differences: posts on Mastodon can contain up to 500 characters, compared with 280 on Twitter.
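Every instance exposes the same REST API, which is part of what makes the experience feel uniform across servers. As a minimal sketch, the following Python snippet publishes a post through that API; the instance URL and access token are placeholders, and on a real instance you would first register an application and obtain a token in the account settings.

```python
# Minimal sketch: publishing a post via Mastodon's REST API.
# INSTANCE and ACCESS_TOKEN are placeholders, not real credentials.
import requests

INSTANCE = "https://mastodon.social"   # any instance exposes the same API
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"     # obtained from the instance's settings

def post_status(text: str) -> dict:
    """Publish a post (capped at 500 characters on a default instance)."""
    if len(text) > 500:
        raise ValueError("Default Mastodon instances cap posts at 500 characters")
    resp = requests.post(
        f"{INSTANCE}/api/v1/statuses",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        data={"status": text},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    post_status("Hello from the Fediverse!")
```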

How is content moderated?

To understand how content moderation is performed, it is important to know how Mastodon works. Each instance has its own rules and its own moderation team. “In theory, there are as many sets of rules as there are instances, since each one has its own servers, its own administrators and, therefore, its own rules,” explains Fernando Suárez, president of the General Council of Professional Associations of Computer Engineering.


This means that “the moderators are the users themselves, but within their own instance.” As a result, one type of speech may be allowed on one server but not on another. If someone ignores an instance’s rules, its administrators can silence them (hiding their account from other users on that server) or expel them.
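Mastodon ships an admin API for exactly these actions. The sketch below, under the assumption of a hypothetical instance and an admin-scoped token, shows how the silencing and expulsion described above map onto that API:

```python
# Hedged sketch of per-instance moderation using Mastodon's admin API.
# The instance URL, token, and account ID are hypothetical placeholders.
import requests

INSTANCE = "https://example.social"    # hypothetical instance
ADMIN_TOKEN = "ADMIN_ACCESS_TOKEN"     # requires admin write scopes

def moderate_account(account_id: str, action: str, note: str = "") -> None:
    """Apply a moderation action: 'silence' hides the account from other
    users on this server; 'suspend' effectively expels it."""
    assert action in {"silence", "suspend"}
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/accounts/{account_id}/action",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
        data={"type": action, "text": note},
    )
    resp.raise_for_status()

# e.g. moderate_account("12345", "silence", "Repeated rule violations")
```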

Mastodon’s unstoppable growth has been accompanied by some moderation failures. “In 13 years on Twitter they’ve never taken down a post of mine, and in three weeks on Mastodon they’ve already removed something of mine for racism and sexism,” one user reported on Twitter. The removed post read as follows: “For the past few years I have been reading almost exclusively books by non-male, non-white, non-straight authors (or a combination thereof) and loved it, but now I think I need to go back to white heterosexual authors to study the psychology of rich white men who have power over us all.”

The user had posted the message on mastodon.social, a public instance run by Eugen Rochko, Mastodon’s original developer, and a small team of colleagues. Mastodon acknowledged that the post had been deleted by mistake: “We recently hired more moderators and the post in question was misinterpreted to mean something it didn’t.” Rochko announced a week ago that he had hired content moderators, but every instance is different and many lack the resources to pay moderators.

Is it possible to guarantee that there will be no racist or homophobic messages on Mastodon?

“Because each instance is self-managed, what happens on an instance is whatever those who created it want,” says Fernando Muñoz, partner at Grupo Raíz Digital, an agency that specializes in moderating digital content. In theory, all Mastodon servers commit to “active moderation against racism, sexism, homophobia, and transphobia.” “Users must have the confidence that they are joining a safe space, free from the white supremacy, anti-Semitism, and transphobia of other platforms,” Mastodon explains.

But if the moderators are the users themselves, can a group of racists create an instance and allow hate messages or hoaxes to be posted on it? “It can be done, but it goes against the rules for creating Mastodon instances,” Muñoz replies. This problem is not new: Gab, a far-right social network known as a meeting place for white supremacists, became the largest Mastodon instance in 2019, according to Vice.


Mastodon released a statement criticizing Gab’s use of the platform and saying it would do everything it could to isolate that instance: “It was not a move we welcomed; however, the license under which we publish our software allows anyone to use it as they see fit, as long as they keep the same license and make their modifications public.”

How can other instances act against this type of content?

Kit Walsh, an attorney with the Electronic Frontier Foundation (EFF), a nonprofit organization dedicated to defending civil liberties in the digital world, points out that “no single party controls the use of the technology and can impose restrictions on content, but each forum or email server will have moderators who control that particular community.”

If there is a problematic instance, the administrators of other communities can block it so that their users cannot see or interact with its content. “This kind of server-level blocking is very common and is part of how each node [or instance] is configured,” Walsh says. Some people share information about problematic instances “with the hashtag #fediblock and explain why they should not be trusted.” Searching for this hashtag on Mastodon turns up users reporting instances for “insults and racism” or for “posting harmful content.”
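The instance-level blocking Walsh describes (often called defederation) is also exposed through Mastodon’s admin API in recent releases. The sketch below shows, under the same placeholder assumptions as before, how an administrator might block a domain and how anyone can read the public #fediblock hashtag timeline:

```python
# Sketch of instance-level ("domain") blocking and of reading the
# #fediblock hashtag. INSTANCE and ADMIN_TOKEN are placeholders.
import requests

INSTANCE = "https://example.social"    # hypothetical instance
ADMIN_TOKEN = "ADMIN_ACCESS_TOKEN"

def block_domain(domain: str, severity: str = "suspend") -> None:
    """Defederate from a problematic instance so local users no longer
    see or interact with its content ('silence' limits, 'suspend' blocks)."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
        data={"domain": domain, "severity": severity},
    )
    resp.raise_for_status()

def recent_fediblock_posts(limit: int = 5) -> list[str]:
    """Read the public #fediblock hashtag timeline (no auth needed)."""
    resp = requests.get(
        f"{INSTANCE}/api/v1/timelines/tag/fediblock",
        params={"limit": limit},
    )
    resp.raise_for_status()
    return [post["content"] for post in resp.json()]
```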

The fact that some communities can block others can, according to Muñoz, lead, for example, to several instances with fascist content interacting with each other but not with the rest, “thereby increasing the filter bubble.” Walsh notes that some servers take a cautious approach, only accepting connections from a “white list” of instances they know and trust. The ideal, according to the EFF, is to find an instance whose moderation policies you agree with. Users can switch at any time to another that better fits their preferences or has more active moderation.


You can follow EL PAÍS Tecnología on Facebook and Twitter or sign up here to receive our weekly newsletter.


