Modern technology gives us many things, but it also raises serious concerns about children's safety online.

Social media giants face heat in child safety lawsuits



Children's online safety has become a major topic of discussion over the past six months. Giant social media platforms such as ByteDance, Meta, Snap, and Alphabet are now facing the heat. Their platforms are claimed to be addictive and unsafe for children, and concern about their negative effects on mental health has brought them to court several times.

42 US states, along with several schools in the US, filed the lawsuit, claiming that the social media platforms are causing emotional and psychological harm to minors. The lawsuit examines the profound effects that Instagram, Facebook and other sites have on the social and psychological realities of young Americans. In total, the platforms face more than 140 lawsuits and individual cases.

Shocking discoveries are on the way

Recently, US District Judge Yvonne Gonzalez Rogers denied the companies' requests to dismiss the child addiction lawsuits. Most of these lawsuits were filed by schools and various states across the country.

The lawsuit combined more than 100 cases filed in early 2022 after Frances Haugen, a Facebook whistleblower, revealed Instagram's negative impact on teenagers' mental health.

Another Meta whistleblower, Arturo Bejar, pointed to the company's policies and added that the platform is fully aware of the harm it causes to children but has failed to act. According to Bejar, Meta provides users with "placebo" tools that don't address the issues affecting teens. He claimed the company misrepresents the frequency of harm to users, particularly children.


The lawsuit focuses on design changes

The lawsuit focuses primarily on the application of product liability law to online platforms, demanding improved warnings and design changes. The decision pointed out that Instagram, Facebook, YouTube, Snapchat and TikTok will face liability despite Section 230 of the Communications Decency Act and the First Amendment.

Section 230 states that online platforms should not be treated as publishers of third-party content. This means that social media platforms cannot be held liable if a user posts something illegal or offensive on their platform. The big platforms were trying to gain immunity under this very provision.

However, Judge Rogers rejected the platforms' bid for blanket immunity under Section 230. The court held that the platforms are responsible for their own design choices: they did not provide adequate parental controls that parents could use to limit their children's screen time.

In her ruling, Judge Rogers added that the plaintiffs' complaints do not fall within the category of protected speech or expression. Rather, they concern problems such as the lack of strict age verification, inadequate parental controls, and the difficulty of deleting accounts.

The plaintiffs added that it is not the content but the design features that are to blame for the mental health problems. "Addressing these defects would not require that defendants change how or what speech they disseminate," Judge Rogers wrote.

Why does the case matter?

It's unusual for multiple states to work together to sue tech giants for harming consumers. However, these collective efforts show that states are taking harm to children seriously and pooling their resources to fight social media platforms, just as they fought Big Pharma and Big Tobacco.

Many lawmakers around the world are fighting to regulate the use of Instagram, Facebook and other platforms for the benefit of children.

In recent years, Utah, California and Britain have passed laws to improve privacy and safety protections for young people. Utah passed a law that automatically disables social media notifications at night to avoid disturbing children's sleep. However, lawsuits over children's online safety in the US are moving quite slowly as the tech giants work hard to get them dismissed.

Recently, a court document noted that Mark Zuckerberg, CEO of Meta, rejected various initiatives to improve the platform for children and adolescents. José Castañeda, a spokesperson for Google, argued that the claims are false. He also said the company offers family- and kid-friendly content on YouTube along with strong parental control features.

The other platforms have not responded yet. In recent years there have been many lawsuits claiming that social media platforms are harmful to children. However, many of these cases, including one over harassment on Grindr, did not gain traction in court and were dismissed.

Recent studies have shown the many ways in which online platforms can harm mental health, and lawmakers are under pressure to enact laws that protect children, such as age verification requirements. While it is not yet clear whether the online platforms are legally responsible for the damage, this lawsuit may open the door to stronger safety claims in the future.



VIA: TomsGuide.com
