Instagram revamps safety features for teens as pressure from Congress mounts — here are all the changes the social media giant is making

Mark Zuckerberg’s Instagram unveiled what was billed as a major overhaul of its safety features for children on Tuesday — a move that online watchdogs quickly attacked as an effort to avoid an impending congressional crackdown on the social media giant.

Instagram said it will automatically put users under the age of 18 on “teen accounts” and block people who don’t follow them from viewing their content or interacting with them.

It will also silence Instagram app notifications for teen users between 10 p.m. and 7 a.m. and send “timeout reminders” prompting teens to close the app after 60 minutes of use each day.

Mark Zuckerberg issued a stunning apology to the families of victims of online harm in January. AP

Parents will be able to see which accounts their child has messaged recently, set daily time limits and block teens from using the app during specific time periods.

Additionally, users under the age of 16 will need parental permission to make changes to their account security settings.

Meta’s announcement was met with skepticism from online safety groups, some of which said the improvements are insufficient.

Tech Oversight Project director Sacha Haworth said parents “should ignore Meta’s latest hollow announcement,” adding that the company has a history of “broken promises and changing policies” about online safety.

“Meta can push as many child-focused ‘features’ for ‘teens’ as it wants; it won’t change the fact that its core business model is based on profiting off children and teenagers becoming addicted to its products — and American parents are wise to the ruse and are demanding legislative action,” Haworth said in a statement to The Post.

The overhaul was announced as the bipartisan Kids Online Safety Act — a landmark bill that would impose a legal “duty of care” on Instagram parent Meta, TikTok and other social media firms to protect children from harm online — gains momentum in Congress.

In July, the Senate passed KOSA and another bill, COPPA 2.0, by an overwhelming 91-3 vote. COPPA 2.0 would ban targeted advertising to minors, bar social media platforms from collecting their data without consent and give parents and children the ability to delete their information.

The House Energy and Commerce Committee is set to mark up the bills Wednesday — a key procedural step that would pave the way for a floor vote in the near future.

Another watchdog, the Tech Transparency Project, argued that Meta has “claimed for years that it is already implementing” versions of the features detailed in Tuesday’s announcement.

For example, Meta first announced plans to make teenagers’ accounts private by default and limit their interactions with strangers as early as 2021, according to previous blog posts.

The group also noted that some of the online safety experts who praised Meta’s changes in the company’s blog post work for organizations that have received funding from the company.

“Not only is Meta repackaging these efforts as new after claiming for years that it was already implementing these safety tools, it’s also trotting out Meta-funded voices and presenting them as independent experts,” the Tech Transparency Project wrote on X.

Instagram on Tuesday announced revised safety features for kids and their parents. paint drop – stock.adobe.com

Fairplay for Kids, one of the groups leading the charge to pass KOSA, criticized Meta’s announcement as an attempt to sidestep a meaningful legislative crackdown.

“Making minors’ accounts private by default and turning off notifications in the middle of the night are safeguards that Meta should have implemented years ago,” said Fairplay CEO Josh Golin. “We hope that lawmakers will not be fooled by this attempt to obstruct the legislation.”

“The Kids Online Safety Act and COPPA 2.0 will require companies like Meta to ensure that their platforms are safe and privacy-protective for young people at all times, not just when it’s politically expedient,” Golin added.

Alix Fraser, director of the Council for Responsible Social Media, had a similar view of the announcement.

“The simple fact is that this announcement comes as congressional pressure is mounting and support for the bipartisan Kids Online Safety Act continues to grow,” Fraser said. “It wouldn’t be the first time Meta made a promise to avoid congressional action and then never followed through or quietly backed down.”

Internet safety groups accused Meta of trying to avoid a legislative crackdown. New Africa – stock.adobe.com

The Post contacted Meta for comment.

Policymakers have singled out Meta for failing to protect children from “sextortion” scams and other forms of online sexual abuse.

Critics have also accused apps like Instagram of fueling a youth mental health crisis with negative outcomes ranging from anxiety and depression to eating disorders and even self-harm.

Last fall, a coalition of state attorneys general sued Meta, alleging the company relied on addictive features to hook children and boost profits at the expense of their mental health.

In January, Zuckerberg issued a stunning apology to the families of victims of online abuse during a tense hearing on Capitol Hill.

Despite its easy passage in the Senate, KOSA’s ultimate prospects in the House remain uncertain, with some critics on both sides of the aisle raising concerns about the impact on free speech online.

In July, US Surgeon General Vivek Murthy called for a tobacco-style “warning label” to be implemented on social media apps to raise awareness of their potential mental health risks, including depression and anxiety.
