Are platforms just throwing more algorithms at a problem algorithms haven't solved?
Four major social media companies yesterday pledged to work toward improving women's safety on their platforms. Facebook, Google, TikTok, and Twitter signed the pledge in response to recommendations from a 120-member working group convened by the Web Foundation.
The commitments are a step in the right direction, but it's unclear whether they will do much to curb online abuse. "I give the Web Foundation a lot of credit for trying to create the conditions for this effort, because not much has happened over time," Sarah Sobieraj, a professor at Tufts University and a faculty associate at the Berkman Klein Center for Internet and Society, told Ars. "Looking at the commitments, it's hard to know what they will mean or look like in practice," she said. "It's very open-ended. A lot depends on what the platforms decide to do."
The commitments consist of eight tactics, split evenly between two categories: curation and reporting.
The companies pledged to:
Give women better ways to curate their safety online by: using simple and accessible language throughout the user experience; providing easy navigation and access to safety tools; and reducing the burden on women by proactively reducing the amount of abuse they see.
Improve reporting systems by: offering users the ability to track and manage their reports; building greater capacity to address context and/or language; providing more policy and product guidance when users report abuse; and establishing additional ways for women to access help and support during the reporting process.

The commitments aren't very specific, and users may be left wondering how fully they will be implemented; social media companies are accustomed to being given the benefit of the doubt when their efforts fall short. As written, the commitments give platforms a great deal of latitude.
Several of the commitments emphasize giving users more control over what they see along with better reporting tools. Many social media platforms have convoluted reporting processes that often result in no action being taken against harassers. A moderator reviewing a report, for example, may misunderstand the context of a post or fail to grasp the meaning of coded language. These shortcomings mean that enforcement of current policies is often uneven.
"These mechanisms for blocking or reporting are absolutely essential," Sobieraj said. "But you still have to see or read the content that's being sent to you. For heavily targeted women, that's a lot of work. It's time-consuming, but it's also deeply distressing."
"Not only does it put the onus on recipients, but senders know that recipients have to read [the posts] in order to [report them]. [Harassers] are still getting the satisfaction. That cycle really needs to be disrupted."
Some women have outsourced the management of their social media accounts to cope with the abuse. But that is a drastic and costly step that remains out of reach for the vast majority of women on social media. "We see that the abuse is particularly severe for women of color, women from religious minorities, queer women, and so on," Sobieraj said. "If those people start withdrawing from participation in public spaces, it's not only unfair to them, it's unfair to the rest of us. It is no coincidence which voices we lose. You may not feel this is an important issue if we're talking about pop culture, but it matters enormously if we're talking about politics. And I would argue it matters in the case of pop culture, too."

Promoting proactive solutions
Rather than placing the responsibility for combating abuse primarily on users, social media platforms should become more proactive about online violence, Sobieraj said. For one, repeat offenders who haven't yet been banned could be tracked, effectively requiring the platform to intervene. She also suggested that platforms could screen the stream of content directed at heavily targeted users and hold it for moderation before it reaches them. Moderating in these cases would lift the burden from targeted users, Sobieraj said, so that they and others wouldn't have to see the content; it would also deprive abusers of the attention that motivates many of them. Even with such protections, the vast majority of content would still flow freely through public social feeds.
Of course, these solutions would require platforms to take a more active role in moderating content. Companies may be contemplating that kind of enforcement; the fourth commitment under "curation" could be interpreted that way. But history has shown that social media companies are reluctant to moderate content.
The other commitments are useful but not revolutionary. To keep pace with them, platforms are unlikely to have to change much; these companies can simply tweak existing systems that are sorely in need of improvement. The result may be more algorithmic tools thrown at problems that algorithms have not yet solved.
Facebook and Twitter vow to fight abuse of women, but leave plenty of room for failure