Blue Diamond Web Services

Your Best Hosting Service Provider!

April 24, 2025

Parents who lost children to online harms protest outside of Meta’s NYC office

Meta may have managed to kill a bipartisan bill to protect children online, but parents of children who have suffered from online harm are still putting pressure on social media companies to step up.

On Thursday, 45 families who lost children to online harms — from sextortion to cyberbullying — held a vigil outside one of Meta’s Manhattan offices to honor the memory of their kids and demand action and accountability from the company. 

Many were dressed in white and held roses, signs that read “Meta profits, kids pay the price,” and framed photos of their dead children, a scene that contrasted starkly with the otherwise sunny spring day in New York City.

While each family’s story is different, the thread that holds them together is that “they’ve all been ignored by the tech companies when they tried to reach out to them and alert them to what happened to their kid,” Sarah Gardner, CEO of the child safety advocacy group Heat Initiative, one of the event’s organizers, told TechCrunch.

One mother, Perla Mendoza, said her son died of fentanyl poisoning after taking drugs he purchased from a dealer on Snapchat. She is one of many parents with similar stories who have filed suit against Snap, alleging the company did little to prevent illegal drug sales on the platform before or after her son’s death. Mendoza found her son’s dealer posting images advertising hundreds of pills and reported the account to Snap, but she says it took the company eight months to flag it.

“His drug dealer was selling on Facebook, too,” Mendoza told TechCrunch. “It’s all connected. He was doing the same thing on all those apps, [including] Instagram. He had multiple accounts.”  

The vigil follows recent testimony from whistleblower Sarah Wynn-Williams, who revealed how Meta targeted 13- to 17-year-olds with ads when they were feeling down or depressed. It also comes four years after The Wall Street Journal published The Facebook Files, which showed the company knew Instagram was toxic for teen girls’ mental health even as it downplayed the issue in public.

Parents of children lost to online harms left an open letter to Meta CEO Mark Zuckerberg outside Meta’s office in NYC, April 24, 2025. Image Credits: Rebecca Bellan

Thursday’s event organizers, which also included advocacy groups ParentsTogether Action and Design It for Us, delivered an open letter addressed to Zuckerberg with more than 10,000 signatures. The letter demands that Meta stop promoting dangerous content to kids (including sexualizing content, racism, hate speech, content promoting disordered eating, and more); prevent sexual predators and other bad actors from using Meta platforms to reach kids; and provide transparent, fast resolutions to kids’ reports of problematic content or interactions. 

Gardner laid the letter on a pile of rose bouquets outside Meta’s office on Wanamaker Place as protesters chanted, “Build a future where children are respected.”

Over the past year, Meta has implemented new safeguards for children and teens across Facebook and Instagram, including working with law enforcement and other tech platforms to prevent child exploitation. Meta recently introduced Teen Accounts to Instagram, Facebook, and Messenger, which limit who can contact a teen on the app and restrict the type of content the account holder can view. More recently, Instagram began using AI to find teens who lie about their age to bypass those safeguards.

“We know parents are concerned about their teens having unsafe or inappropriate experiences online,” Sophie Vogel, a Meta spokesperson, told TechCrunch. “It’s why we significantly changed the Instagram experience for teens with Teen Accounts, which were designed to address parents’ top concerns. Teen Accounts have built-in protections that limit who can contact teens and the content they see, and 94% of parents say these are helpful. We’ve also developed safety features to help prevent abuse, like warning teens when they’re chatting to someone in another country, and recently worked with Childhelp to launch a first-of-its-kind online safety curriculum, helping middle schoolers recognize potential online harm and know where to go for help.”

Gardner says Meta’s actions don’t do enough to plug the gaps in safety.

For example, Gardner said, despite Meta’s stricter private messaging policies for teens, adults can still approach kids who are not in their network through post comments and ask them to approve their friend request. 

“We’ve had researchers go on and sign on as a 12- or 13-year-old, and within a few minutes, they’re getting really extremist, violent, or sexualized content,” Gardner said. “So it’s clearly not working, and it’s not nearly enough.”

Gardner also noted that Meta’s recent changes to its fact-checking and content moderation policy in favor of community notes are a signal that the company is “letting go of more responsibility, not leaning in.”

Meta and its army of lobbyists also led the opposition to the Kids Online Safety Act, which failed to make it through Congress at the end of 2024. The bill had been widely expected to pass in the House of Representatives after sailing through the Senate, and would have imposed rules on social media platforms aimed at curbing the addiction and mental health harms critics say the sites cause.

“I think what [Mark Zuckerberg] needs to see, and what the point of today is, is to show that parents are really upset about this, and not just the ones who’ve lost their own kids, but other Americans who are waking up to this reality and thinking, ‘I don’t want Mark Zuckerberg making decisions about my child’s online safety,’” Gardner said. 

Keep reading the article on TechCrunch


Threads officially moves to Threads.com and updates its web app

Instagram Threads, Meta’s newest social network and X competitor, is officially relocating from the website Threads.net to Threads.com. The transition will coincide with a handful of quality-of-life improvements for the Threads web app, including features to more easily access custom feeds, saved posts, and likes, as well as other tools for creating new columns, copying posts for resharing, finding your favorite creators from X on Threads, and more.

Meta initially launched the app in July 2023 at Threads.net because a Sequoia-backed startup building a Slack alternative owned the Threads.com domain at the time. (That startup sold to Shopify last year.)

In September 2024, Meta acquired the Threads.com domain name and later began redirecting the URL Threads.com to Threads.net.

Starting today, Meta says, the redirect is reversed: users will no longer be sent from the .com to the .net; it now works the other way around.

Going forward, if you type in Threads.com in your browser, you’ll go directly to your Threads home screen without being redirected. Meanwhile, those who type in Threads.net will be redirected to the URL Threads.com.
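For anyone curious to check the flip for themselves, here is a minimal sketch, not anything Meta publishes, that simply follows each domain and prints where it ends up. It assumes Python with the third-party requests library installed; the output just reflects whatever the servers return at the moment you run it.

# Minimal redirect check (illustrative only).
# Assumes the third-party `requests` package is installed: pip install requests
import requests

for start in ("https://www.threads.net", "https://www.threads.com"):
    # allow_redirects=True (the default) follows the redirect chain;
    # resp.url is the final URL a browser would land on.
    resp = requests.get(start, allow_redirects=True, timeout=10)
    print(f"{start} -> {resp.url} (status {resp.status_code})")

After the change described above, you would expect the .net address to land on a threads.com URL rather than the reverse.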

The change gives Meta a more prominent and more memorable URL for a social app that now reaches more than 320 million monthly active users, per Meta’s most recent public earnings announcement in January. The rebrand of sorts may also help the app better compete with its rival X, which has a similarly simple, memorable domain name.

In addition to this change, Instagram head Adam Mosseri on Thursday announced a few other minor updates coming to the Threads web app, which is often used by creators.

He said users will now see their custom feeds appear in the web app in the same order as they appear on the mobile app. Plus, users will now be able to access their liked and saved posts via the main menu instead of having to create a pinned column to see them.

Image Credits: Threads

Another new addition lets users copy a Threads post as an image instead of having to screenshot it, which Meta says will make it easier to share Threads posts in other apps, like Instagram.

Threads users will also now be able to add a column by clicking a new column icon on the right side of the screen.

And they’ll be able to click a plus (+) button in the bottom-right corner to open a new window and compose a post.

Image Credits: Threads

There’s also a new feature that allows people to find and follow the same creators they previously followed on X. Introduced earlier this month, it works by having users download an archive of their X data and upload it to Threads.

Those who previously had access to the feature were shown a pop-up saying they could now “Find popular creators from X.” The feature remains in testing, Meta says.

Keep reading the article on TechCrunch


Mark Zuckerberg really wants to make Facebook cool again

In an ongoing antitrust case, the Federal Trade Commission alleges that Meta holds a monopoly on “personal social networking services” and should have to spin off Instagram and WhatsApp.

Throughout the proceedings, several internal messages and plans have come to light. In one, Meta CEO Mark Zuckerberg considered wiping all Facebook users’ connections to rekindle their love for the platform. While the idea never came to fruition, it shows how much power Zuckerberg wields at the wheel of Facebook, Instagram, and WhatsApp.

Keep reading the article on TechCrunch


Facebook cracks down on spammy content by cutting reach and monetization

Facebook will begin lowering the reach of accounts that share spammy content and making them ineligible for monetization, Meta announced on Thursday. The company says it is also stepping up efforts to remove Facebook accounts that coordinate fake engagement or impersonate others.

The move comes as Meta CEO Mark Zuckerberg has promised a return to “OG Facebook.” The social network’s plan to crack down on spammy content could be seen as an attempt to return to Facebook’s glory days when users’ feeds were filled with authentic content from real people.

Meta admits that some accounts on its platform try to game the algorithm to increase views or gain unfair monetization advantages, which results in spammy content flooding users’ feeds. To remedy this, it’s cracking down on accounts that exhibit certain types of spammy behavior.

This behavior includes sharing posts with overly long captions and an excessive number of hashtags, as well as posting captions that are unrelated to the content, such as an image of a dog with a caption about airplane facts.

Image Credits: Facebook

Meta says that while the intention behind these sorts of posts isn’t always malicious, it does lead to spammy content that ends up overshadowing original content from creators.
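To make the kind of pattern Meta describes a bit more concrete, here is a purely hypothetical toy heuristic, not Meta’s actual detection logic, that flags captions which are both very long and stuffed with hashtags. The function name and thresholds (looks_spammy, MAX_HASHTAGS, MAX_CAPTION_CHARS) are made up for the sketch.

# Hypothetical illustration only -- not Meta's detection system.
import re

MAX_HASHTAGS = 10         # made-up threshold
MAX_CAPTION_CHARS = 1000  # made-up threshold

def looks_spammy(caption: str) -> bool:
    """Flag captions that are both very long and hashtag-stuffed."""
    hashtags = re.findall(r"#\w+", caption)
    return len(caption) > MAX_CAPTION_CHARS and len(hashtags) > MAX_HASHTAGS

# A short caption with two hashtags is not flagged:
print(looks_spammy("Sunset at the beach #travel #sunset"))  # False

A real system would also have to judge whether a caption actually matches the media it accompanies, which is far harder than counting characters and hashtags.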

Facebook will also target spam networks that coordinate hundreds of accounts to share the same spammy content, making them ineligible for monetization.

To crack down on fake engagement, Facebook will reduce the reach and visibility of comments that it detects as fake engagement. Plus, Facebook will start testing a comments feature that will allow users to signal which comments are irrelevant or don’t fit in the context of the conversation. 

Image Credits: Facebook

In addition, Facebook announced that it’s updating its comment management tool to detect and auto-hide comments from people who may be using a fake identity. Creators will also be able to report impersonators in the comments. 

Today’s announcement comes a few weeks after Facebook introduced a revamped “Friends” tab that only showcases updates from friends, without any other recommended content. Both the new Friends tab and the crackdown on spammy content suggest Facebook is trying to clean up users’ feeds and surface content that people actually want to see.

It’s not a surprise that Facebook is looking to return to “OG Facebook,” especially since recently uncovered emails from 2022 showed that Zuckerberg was concerned that the social network was losing cultural relevance.

Keep reading the article on TechCrunch

