Supreme Court examines Twitter’s responsibility in terrorist attacks

Washington, Feb 22 (EFE).- The US Supreme Court on Wednesday began hearing a case examining whether Twitter, the firm owned by Elon Musk, is responsible for aiding in the commission of a terrorist attack by failing to properly remove content posted by organizations such as the Islamic State.

The social network was sued along with Facebook and Google (as the owner of YouTube) by the family of Jordanian citizen Nawras Alassaf, who was killed on Jan. 1, 2017, in an Istanbul nightclub by Abdulkadir Masharipov, a terrorist who attacked the nightspot and murdered 39 people.

The plaintiffs allege that because the terrorist organization used these platforms to recruit members, issue threats, disseminate propaganda, and sow fear and intimidation among the civilian population, the tech companies bear responsibility for instigating the attack.

In the plaintiffs' view, the tech firms provided material support to the IS by supplying the infrastructure and services that enabled it to promote and carry out its terrorist activities, and by failing to monitor and proactively remove terrorist content.

They are relying on the Anti-Terrorism Act (ATA) and the Justice Against Sponsors of Terrorism Act (JASTA), which permit victims of terrorism to file lawsuits alleging primary and secondary liability against any entity that aids in the commission of a terrorist act.

The high court justices will have to rule on whether, under the ATA, social platforms that host user content have aided in the commission of a terrorist act by allegedly failing to filter and remove content posted by terrorist organizations.

In the Wednesday hearing, Twitter attorney Seth Waxman focused his defense on the argument that failing to do everything possible to enforce Twitter's own rules and policies prohibiting this kind of harmful content is not equivalent to "knowingly providing substantial assistance" to those posting violent content.

He said that the plaintiffs had not claimed that Twitter had provided “substantial assistance, much less knowing substantial assistance, to that attack or, for that matter, to any other attack,” going on to say that it was undisputed that Twitter “had no intent to aid ISIS’s terrorist activities.”

“What we have here,” he said, “is an alleged failure to do more to ferret out violations of a clear and enforced policy against assisting or allowing any postings supporting terrorist organizations or activities,” but that did not amount to “aiding and abetting an act of international terrorism.”

If the Istanbul police chief had come to Twitter saying his officers had been monitoring three user accounts whose owners appeared to be planning some kind of terrorist act, and Twitter had failed to investigate, then the firm would have borne responsibility for whatever attack those users carried out, he said.

The tech firm owned by magnate Elon Musk says that the fact that the Islamic State used the platform does not constitute knowing assistance, a stance shared by the Joe Biden administration.

According to Deputy Solicitor General Edwin Kneedler, the government's representative, the firm cannot be held liable under the ATA because Congress has indicated that the law does not reach so broadly as to inhibit the legitimate and important activities of companies, organizations and others.

But in the opinion of several of the high court's justices, Twitter "knew all that" and "did nothing" about it, as progressive Justice Elena Kagan put it.

How can it be said that Twitter did not provide substantial assistance, Kagan asked, adding that the social network is in fact providing service to people with the explicit knowledge that they are using the platform to promote terrorism.

As Nitsana Darshan-Leitner, an attorney for the Nawras Alassaf family, told reporters after the hearing, the suit seeks to end the “immunity of the social networks.”

“Every terror attack begins and ends on the social media. The social media have been immune for too many years. They felt that they’re untouchable, and therefore they allowed the terror organizations to use them as a tool that they never had before and cannot do without,” she said.

The Wednesday session came a day after the high court put Google in the dock to evaluate whether the tech giant is responsible for the recommendations its algorithms make to users about other content to view, in a case with implications for freedom of expression.

Google, which owns YouTube, was sued by the family of Nohemi Gonzalez, a 23-year-old American of Mexican origin who was killed in the November 2015 terrorist attacks in Paris staged by the Islamic State in which a total of 130 people lost their lives.

In the opinion of attorney Keith Altman, the Supreme Court needs to take up and rule on both cases because they ask companies to act reasonably and responsibly in administering content rather than simply saying there is nothing they can do to restrict harmful material.

The high court session on Tuesday was the first time the nine justices had examined Section 230 of the 1996 Communications Decency Act, a law enacted when the internet was in its infancy that shields online platforms from lawsuits claiming they are responsible for information supplied by another party.
