Alpha Leaders
Innovation

What The Meta And YouTube Ruling Means For You

By Press Room · 31 March 2026 · 11 Mins Read
Last week brought a landmark moment in tech history, when a Los Angeles jury found that social media giants Meta and YouTube were negligent in designing features that harmed a 20-year-old’s mental health.

The seven-week trial kicked off in February, more than two years after California-based “Kaley G.M.” sued the platforms in 2023, alleging that using social media at a young age led to her mental health problems such as body dysmorphia and depression. She also sued TikTok and Snap, both of which confidentially settled just hours before jury selection was due to begin.

Lawyers representing Kaley argued that the platforms hook young users with features such as infinite scrolling, autoplay videos and beauty filters. The jury found that the platforms intentionally built addictive products that harmed her health, assigning 70% of the responsibility for Kaley’s harms to Meta and 30% to YouTube. She was awarded a total of $6 million in damages — an outcome likely to have crucial ramifications for thousands of similar lawsuits now making their way through U.S. courts.

The ruling came on the heels of a New Mexico jury determining that Meta was liable for $375 million in damages after State Attorney General Raúl Torrez alleged the platform’s features enabled predators and pedophiles to exploit children.

“This damning verdict is a landmark moment in recognizing the harm caused by tech giants in the manipulative designs of their social media platforms,” said Erika Guevara-Rosas, senior director of research, advocacy, policy and campaigns at Amnesty International.

“For years, social media companies including Meta and YouTube have profited from targeting children and young people with addictive design features that prioritize engagement over wellbeing. They have deliberately built into their platforms features such as infinite scroll, autoplay, and persistent notifications that are engineered to ‘hook’ young users into compulsive use.”

Meanwhile, Meta and Google said they disagreed with the verdict and intended to appeal.

A spokesperson for Google said in a statement to the media, “This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”

How The Features In Question Work

Infinite Scroll is perhaps the most discussed feature in the trial. Rather than presenting users with a paginated feed that has a clear end point, infinite scroll loads new content continuously as users scroll downward, removing any natural stopping point. The concept was first introduced in 2006 by designer Aza Raskin, who has since expressed regret about its consequences. The feature exploits what behavioral psychologists call “variable-ratio reinforcement” — the same mechanism that makes slot machines addictive. Users never know whether the next post will be brilliant or boring, and that uncertainty triggers compulsive seeking. As former Facebook Engineering Director Arturo Béjar testified, internal communications suggested some Meta employees raised concerns about growing “reward tolerance” among users — the phenomenon where users need more and more content to feel the same engagement.
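For readers who want a concrete picture of that reward schedule, the dynamic can be sketched in a few lines of Python. The reward probability below is an arbitrary illustrative value, not a figure from the trial or from any platform; the point is only that rewarding posts arrive at unpredictable intervals.

```python
import random

def variable_ratio_feed(num_posts, reward_probability=0.25, seed=42):
    """Simulate a feed where each post is 'rewarding' with fixed probability.

    Because the user cannot predict which post will pay off, the gaps
    between rewards vary unpredictably -- the variable-ratio schedule
    behavioral psychologists associate with compulsive checking.
    """
    rng = random.Random(seed)
    rewards = [rng.random() < reward_probability for _ in range(num_posts)]
    # Record the gap (number of posts scrolled) before each rewarding post.
    gaps, since_last = [], 0
    for rewarding in rewards:
        since_last += 1
        if rewarding:
            gaps.append(since_last)
            since_last = 0
    return gaps

gaps = variable_ratio_feed(num_posts=100)
print("posts scrolled between rewarding posts:", gaps)
```

Unlike a fixed-ratio schedule (every Nth post rewarding), the gaps here vary from one reward to the next, which is precisely the uncertainty that keeps users scrolling.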

Autoplay is the feature that automatically starts playing a video the moment a previous one ends — or when a user’s scroll brings them to a video in their feed. Autoplay extends viewing sessions significantly, because users often watch at least part of a video before deciding whether to stop — and by that point, the next one has already begun. YouTube, which is primarily a video platform, relies on autoplay heavily. Platforms typically use it in conjunction with recommendation algorithms that serve up content predicted to maximize a user’s time on the platform.

Interestingly, according to Béjar, users “hated [autoplay]”.

“They found it disruptive,” he said. “The result was that more people watched more videos and advertisers were happy, but users were unhappy.”

Mark Lanier, who led the lawyers representing Kaley, likened endless scroll and autoplay to getting free tortilla chips at a restaurant — and not being able to stop eating them.

Algorithmically Timed Notifications are a mechanism designed not merely to inform users of activity, but to pull them back into the app at psychologically optimal moments. Notifications exploit the brain’s fear of missing out, inducing a kind of anxiety that can only be relieved by opening the app. The platforms’ own research, according to internal documents introduced as evidence, tracked patterns of compulsive use. When a notification arrives, it creates an almost reflexive urge to check — and checking reinforces the scroll cycle.

Beauty Filters received specific attention because of Kaley’s diagnosis with body dysmorphia. Instagram’s filters allow users to alter their appearance — smoothing skin, changing eye shape, slimming faces — in photos and videos.

“When I got a bunch of likes, I was really happy,” Kaley said in court. “If I didn’t get a lot of likes, I would feel I shouldn’t have posted it, I was ugly.”

After Lanier unspooled a long banner of photos she had posted on Instagram, she testified that “almost all” had been taken with one of the filters that allow people to change their appearance.

Kaley testified that she did not feel bad about her body until she began using Instagram with its filters. Zuckerberg himself was questioned during testimony about why he retained beauty filters despite internal research flagging their negative impact on young girls’ body image. Behavioral addiction expert Mark Griffiths told the court that receiving likes — often driven by filtered appearance — can release dopamine, the chemical associated with pleasure and motivation, creating a cycle of validation-seeking.

Recommendation Algorithms underpinned all of these features. The platforms’ AI systems analyze every scroll, pause, like, share and repeat visit to build a detailed model of what will keep a given user on the platform longest — and then serve content accordingly. The plaintiff’s case argued that for vulnerable young users, these algorithms do not optimize for wellbeing — they optimize for engagement, which can mean serving progressively more extreme, emotionally triggering or appearance-focused content to users who show a susceptibility to it.
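As a rough illustration of the engagement-first objective the plaintiffs described, here is a minimal ranking sketch. The signal names, weights and example posts are invented for illustration and do not reflect any platform’s actual system; what matters is that nothing in the objective measures wellbeing.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_watch_seconds: float  # model's estimate of time-on-platform
    predicted_like_prob: float      # estimated probability of a like
    emotional_intensity: float      # 0..1, proxy for 'triggering' content

def engagement_score(post: Post) -> float:
    # An engagement-optimizing objective weighs predicted time and
    # interaction; user wellbeing appears nowhere in the formula.
    return (post.predicted_watch_seconds
            + 30.0 * post.predicted_like_prob
            + 10.0 * post.emotional_intensity)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Serve first whatever the model predicts will keep the user longest.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("calm_tutorial", 20.0, 0.10, 0.1),
    Post("outrage_clip", 45.0, 0.30, 0.9),
    Post("friend_update", 15.0, 0.60, 0.2),
])
print([p.post_id for p in feed])
# → ['outrage_clip', 'friend_update', 'calm_tutorial']
```

Even in this toy version, the emotionally intense clip outranks the friend’s update and the calm tutorial — the “progressively more extreme” drift the plaintiff’s case described falls out of the objective itself.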

Together, these features were characterized in court as an interlocking system that — particularly for young people whose prefrontal cortexes are still developing — can create genuine compulsive use patterns and hamper health. The jury concluded that Meta and Google knew this, failed to adequately warn users and prioritized engagement over safety.

What Can Users Do Now?

The verdict confirms what many parents, educators and mental health professionals have suspected for years: the addictive qualities of social media platforms are, in significant part, a design feature, not a bug. The platforms are built by teams of behavioral scientists and engineers whose explicit goal is to maximize the amount of time you spend on the app. A Wall Street Journal investigation, “The Facebook Files,” found that at least one major social media firm knows far more than it admits about the various harms its products inflict.

The terminology itself complicates matters, including how users frame the issue for themselves. A 2025 peer-reviewed study published in Scientific Reports urges caution about how broadly the term “addiction” is applied. Researchers Ian Anderson and Wendy Wood at the University of Southern California surveyed a national sample of 1,204 adult Instagram users and found that only 2% met clinical criteria for addiction risk on a standard symptom scale — and yet 18% believed themselves to be addicted. The gap, the researchers argued, is itself a problem: when users label their behavior as addiction (rather than habit, which heavy social media use can also be), they experience lower perceived control, higher self-blame and more failed attempts to cut back.

The researchers caution that broad use of addiction language, amplified by media coverage and litigation, may be creating a self-defeating cycle — making users feel more helpless rather than more empowered to change.

Another large-scale study approaches that point from a different angle.

A major 2026 longitudinal study published in JAMA Pediatrics tracking nearly 101,000 Australian students across grades 4 through 12 over three years found that the relationship between social media use and wellbeing is not a straight line — it’s a curve. Moderate users fared best, and both the heaviest users and non-users showed worse wellbeing outcomes, though the risks shifted depending on age and sex.

For girls, moderate use became the most protective pattern from middle adolescence onward. For boys, the findings pointed in a different direction: by late high school, boys who used no social media at all showed worse wellbeing outcomes than even the heaviest users — suggesting that for teenage boys, total abstinence may carry its own social costs, likely tied to isolation from peer networks that increasingly exist online. The authors are careful to note that the findings are observational and cannot establish causation, but the study’s scale — nearly 174,000 data points — makes it a strong examination of this question, and its central message is a meaningful complication to simple “less is more” prescriptions. The goal, it suggests, should be helping young people find a sustainable middle ground, not eliminating social media use entirely.

Ultimately, for children and teenagers, whose brains are still developing the impulse control and self-regulation capacities that adults rely on to moderate their behavior, the risk is high. Kaley testified that she was unable to stop using Instagram even when she tried — because the platform’s notification systems and algorithmic rewards were specifically designed to override that kind of self-directed restraint.

Here is what the research suggests users can actually do:

Turn off all notifications from social media apps. Notifications are not primarily there to deliver helpful information and reminders — they are designed to pull you back into the app at moments when the algorithm predicts you are most susceptible. Disabling them removes one of the most powerful re-entry mechanisms.

Replace infinite scroll with time-limited access. Research recommends treating social media the way you would treat any other scheduled activity — specific windows of time rather than background availability throughout the day. Apps are significantly more difficult to resist when they live on the home screen of a phone you carry everywhere; moving them off the home screen and setting defined time blocks creates a meaningful barrier.

For parents of younger children: The minimum age for social media accounts on major platforms is 13. Since companies do not effectively enforce age minimums, having direct conversations with children about why these platforms are designed the way they are — and what specific mechanisms they use — may give young people a framework for understanding what’s happening, rather than simply being told “less screen time.”

Use available parental tools critically. Parental controls exist on all major platforms. Screen time limits, content restrictions, and account linking are available on Instagram, YouTube, TikTok, and Snapchat. However, experts caution that these tools are imperfect and often easy for teenagers to circumvent — and that they do not alter the underlying algorithmic architecture. They are a useful layer of protection, but not a complete solution.

Watch for warning signs. Victoria Burke, a therapist who treated Kaley and testified at the trial, said that for her client, social media and her sense of self “were closely related” — what happened online would “make or break her mood.” Signs that social media use may be becoming problematic include: significant emotional distress when access is removed; withdrawal from in-person relationships; use of the platforms to cope with difficult emotions, which can deepen rather than relieve distress; disrupted sleep; and a deteriorating sense of self-image connected to online feedback.

New Beginnings?

If history is any guide, what happened in March 2026 could usher in a new era. For years, one of Big Tech’s premises was that the attention of its users is a product to be harvested and sold, and that the tools used to harvest it are neutral features of a communication service. A jury of twelve in Los Angeles has now rejected that premise. They heard what the companies knew, when they knew it and what they chose to do with that knowledge — and they concluded it amounted to malice.

Whether that conclusion survives appeal, Congress acts or a global settlement emerges is so far unknown, but the industry’s argument that it bears zero responsibility for the human cost of its engineering choices is no longer the clean legal defense it once was.


© 2026 Alpha Leaders. All Rights Reserved.