How to Fix Facebook: Treat Its Algorithms as Authors

Should social media companies continue to enjoy blanket protection from liability for the content posted on their sites? That is an open question. What is clear is that we are dealing with an outdated law: Section 230 of the U.S. Communications Decency Act of 1996, which provides those protections, was never written with today's social media in mind. In the meantime, we need to hold social media companies accountable, and amending this law may be a good way to do so.

Section 230 dates from the era of chat rooms and America Online, and was designed largely to shield providers like AOL from liability when users defamed one another in conversation. The authors of the act never imagined that algorithms would one day curate content and promote it based on whether it drives engagement. But that is exactly how sites like Facebook and Twitter work today. This new reality requires a change in how we think about social media companies, and in how the law applies to them.

Few deny that social media algorithms can fuel hatred and promote extremist ideologies, even contributing to disasters such as the one we saw at the U.S. Capitol on January 6. Yet Facebook and Twitter bear no legal responsibility for the content and opinions expressed on their platforms. Any of us who post there remain liable for anything we say that is illegal or libelous, but the platform itself is shielded, thanks to Section 230.

But today, social networks can and should be considered the true creators of this content. That is because they write the algorithms that curate users' feeds, determining what we see. They often promote the most inflammatory views by giving them greater exposure. Their core business goal is to generate page views and hold user attention. So if social media companies' algorithmic choices determine what the user sees, I believe they should be regarded as the writers and editors of the content in our feeds. Networks should therefore be held accountable for that curation.

AOL may have defined the consumer internet of its day. The fact is, though, that most of what people did there was communicate: via email, instant messaging, and conversations on message boards and in chat rooms. Communication was so central to the business that at one point Steve Case thought the company's entire future was as a telecommunications company, and he joined the boards of telephone companies to learn more.

After Section 230 passed, AOL could not be sued or prosecuted if someone misbehaved in a chat room. One of the key ways AOL won the war with its better-heeled competitor Prodigy was by allowing more topics in its chat rooms. AOL let teenagers talk about sex; Prodigy did not.

AOL relied on an army of volunteer chat-room hosts, though there were paid staff as well. The only compensation those volunteers received was free service. Because they were volunteers, they could not be held to strict standards, so another part of Section 230, written with companies like AOL in mind, stated that AOL, or any other provider, could not be held liable for failing to moderate. At the time, that was hard to argue with.

With nearly three billion users, Facebook has an enormous amount of content to curate. Early on, Mark Zuckerberg's close friend and early collaborator Adam D'Angelo figured out how to make a huge volume of content simultaneously available to users. But that created a problem: how do you choose what each user should see?

The solution was an algorithm: rules applied in software that select which content appears in front of which user. Algorithms are soulless. Told to maximize clicks, they will maximize clicks in whatever way works best. Algorithms have no sense of good behavior. Balance and reason do not always win out with an algorithm, even if someone tries to design one that way.
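To make the point concrete, here is a minimal, purely illustrative sketch of click-maximizing feed ranking. Nothing in it comes from any real platform: the Post class and its predicted_click_probability field are hypothetical stand-ins for whatever engagement model a site might use.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click_probability: float  # hypothetical model score in [0, 1]

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order candidate posts so the most clickable come first.

    A toy version of 'maximize clicks' as a ranking objective;
    not a description of any real platform's code.
    """
    return sorted(candidates,
                  key=lambda p: p.predicted_click_probability,
                  reverse=True)

feed = rank_feed([
    Post("calm-explainer", 0.03),
    Post("outrage-bait", 0.21),
    Post("friend-photo", 0.08),
])
print([p.post_id for p in feed])
# ['outrage-bait', 'friend-photo', 'calm-explainer']
```

Note what is missing: nothing in the objective rewards balance or reason, so the sort order never considers them.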

As many observers have noted, fear and anger are what keep most social media users engaged. That was a key point of Netflix's hit documentary The Social Dilemma. Emotion engages. Moreover, research shows that angry people are less able to distinguish truth from fiction.

The algorithm itself will never be emotional. It simply works out, mechanically, what makes you angry and emotionally involved, if that is what ultimately keeps you engaged, and engagement is the objective the algorithm was designed to maximize.
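One way to see how that happens without any emotion in the machine is a toy training loop. Everything below is an assumption for illustration: a single hypothetical "outrage" feature, made-up engagement labels, and a one-weight logistic model. If outrage happens to correlate with engagement in the data, an ordinary learning rule pushes the weight on outrage up, mechanically.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical (outrage_score, user_engaged) pairs in which angrier
# content happens to attract more engagement.
data = [(0.9, 1), (0.8, 1), (0.7, 1), (0.2, 0), (0.1, 0), (0.3, 1)]

weight, learning_rate = 0.0, 0.5
for _ in range(200):
    for outrage, engaged in data:
        prediction = sigmoid(weight * outrage)
        # Standard logistic-regression gradient step toward the
        # engagement label; the objective is engagement, nothing else.
        weight += learning_rate * (engaged - prediction) * outrage

print(f"learned weight on outrage: {weight:.2f}")  # clearly positive
```

The model is not angry; it has simply learned that, in this data, outrage predicts the one thing it was told to predict.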

Under the law, what is an algorithm, especially a sophisticated one like those Facebook uses to order the stories in users' feeds? I believe the answer is simple: such an algorithm is an author, and should be treated as one. Indeed, in some venues the big tech companies themselves argue that their algorithms' output is copyright-protected, which implies that the algorithm, or its creator, is the author of the resulting content.

Treating algorithms as authors could have significant consequences, because authors bear legal responsibility for what they write. Section 230 does not protect them. And recently, in a written Supreme Court statement, Justice Clarence Thomas asked whether the courts have gone too far in extending the protections granted by Section 230.

If it were held responsible for the choices its algorithms make, a company like Facebook would have to operate differently. If it could be sued whenever an inflammatory piece of content, such as incitement to violence or illegal hate speech, appeared in someone's news feed, the company would have no choice but to find ways to keep such content from being promoted.

If Congress or the courts declare that a provider's algorithms are authors, and the provider therefore accountable for the content they promote, that could solve many of the problems plaguing social media.
