Media Society | Feb 27, 2026

From Public Service to Private Agenda: Who's Engineering Your Worldview Now?

Your grandparents' TV was designed by sociologists trying to improve society. Your streaming feed is designed by algorithms optimizing for engagement. This isn't conspiracy—it's documented media studies. Here's what changed and why it matters.


Lee Foropoulos

11 min read


You're three episodes deep into a new series. Something feels off. Not the story—that's fine. It's the messaging. Every character seems to exist to make a point. Every plot beat reinforces a particular worldview. You can't quite articulate what's happening, but you sense it: someone is trying to shape how you think.

Here's the uncomfortable truth: television has always been social engineering. The difference isn't whether you're being influenced—it's who's doing the influencing and what they're optimizing for.

The Broadcast Era: When Engineers Were Transparent

In 1966, a television producer named Joan Ganz Cooney received a grant to study whether TV could be used to educate preschoolers. The result was Sesame Street—not entertainment that happened to be educational, but a deliberate social intervention designed by child psychologists, educators, and researchers to close the educational gap between poor and middle-class children.

They didn't hide this. They published research papers. They invited academics to study whether it worked. The engineering was the point.

Broadcast TV operated under public interest mandates—and sociologists took that seriously

Norman Lear took a different approach with the same transparency. His shows—All in the Family, The Jeffersons, Maude—weren't subtle. He put a racist, sexist character (Archie Bunker) on screen specifically so audiences would see their own prejudices reflected back. In interviews, Lear was explicit: he wanted to change how Americans thought about race, gender, and class.

The after-school special became a genre unto itself: explicit social messaging about drugs, teen pregnancy, bullying, and peer pressure. Nobody pretended these were just entertainment. They were interventions.

The Science Behind It

This wasn't random do-goodery. Researchers like George Gerbner developed Cultivation Theory—the idea that heavy television viewers gradually adopt television's version of reality as their own. Watch enough crime dramas, and you'll overestimate real-world crime rates. See enough happy families, and you'll internalize those norms.

Albert Bandura's Social Learning Theory showed that people model behavior they see on screen, especially when that behavior is rewarded. Television wasn't just reflecting society—it was actively teaching it.

The Overton Window on Screen

Television has always been a primary tool for shifting the Overton Window—the range of ideas considered acceptable in public discourse. Show something enough times, and it stops being radical. This works in both directions.

The key point: in the broadcast era, this engineering was often done by people with training in sociology, psychology, and education. They operated under FCC mandates requiring broadcasters to serve the "public interest." They published their methods. They invited scrutiny. You could disagree with their goals, but you knew what they were.

The Streaming Shift: Private Control, Hidden Mechanisms

Then came Netflix. Amazon. Disney+. Apple TV+. HBO Max. The streaming revolution didn't just change how we watch television—it changed who controls the narrative and what they're optimizing for.

No Oversight, No Mandate

Broadcast networks operated under FCC licenses that required them to demonstrate they were serving the public interest. Lose that license, lose your business. It wasn't perfect accountability, but it was something.

Streaming platforms have no such obligation. They're not broadcasters—they're software companies delivering content over the internet. No FCC license. No public interest mandate. No required transparency about their editorial decisions.

The people deciding what gets made, what gets promoted, and what gets buried aren't sociologists or educators. They're executives, product managers, and—increasingly—algorithms.

Your "recommendations" aren't neutral—they're optimized for engagement metrics

Optimizing for Engagement, Not Outcomes

Here's the fundamental shift: broadcast TV, whatever its flaws, was often designed with social outcomes in mind. Educate children. Challenge prejudice. Discourage drug use.

Streaming platforms optimize for a single metric: engagement. Time on platform. Content consumed. Subscriptions retained. The question isn't "Will this make society better?" It's "Will this keep people watching?"

This isn't a conspiracy—it's just business. Netflix has publicly discussed how their algorithms work. They track what makes you pause, rewind, binge, or abandon. They A/B test thumbnails, titles, and even scene sequences. Every piece of data feeds back into recommendations designed to maximize your viewing time.
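The thumbnail testing described above is, at its core, a bandit problem: show the variant that performs best, while occasionally trying alternatives. Here is a minimal epsilon-greedy sketch of that idea. The variant names, click rates, and parameters are all illustrative assumptions, not any platform's actual system.

```python
import random

def choose_thumbnail(stats, epsilon=0.1):
    """Usually show the best-performing variant; occasionally explore."""
    if random.random() < epsilon:
        return random.choice(list(stats))  # explore a random variant
    # exploit: pick the highest observed click-through rate
    return max(stats, key=lambda v: stats[v]["clicks"] / max(stats[v]["shows"], 1))

def record(stats, variant, clicked):
    stats[variant]["shows"] += 1
    stats[variant]["clicks"] += int(clicked)

stats = {v: {"shows": 0, "clicks": 0} for v in ("close_up", "action_shot", "ensemble")}

# Simulate 10,000 impressions against hidden "true" click rates (made up).
true_ctr = {"close_up": 0.05, "action_shot": 0.12, "ensemble": 0.08}
random.seed(42)
for _ in range(10_000):
    v = choose_thumbnail(stats)
    record(stats, v, random.random() < true_ctr[v])

# The loop converges on whichever image people click most,
# whether or not it represents the show honestly.
print({v: s["shows"] for v, s in stats.items()})
```

Note what this optimizes: clicks, nothing else. A misleading thumbnail that gets clicked beats an honest one that doesn't.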

The Death of Shared Experience

In the broadcast era, television created shared cultural moments. The finale of M*A*S*H was watched by 105 million people—simultaneously. The next day, everyone at work was talking about the same thing.

Streaming has fragmented that experience entirely. You and your neighbor might both subscribe to Netflix, but you're watching completely different content, surfaced by algorithms based on your individual data profiles. There's no shared cultural conversation because there's no shared cultural experience.

This fragmentation makes it harder to recognize when narratives are being pushed—because you can't compare notes with anyone watching the same feed.

The New Manipulation Playbook

Streaming platforms have developed sophisticated methods for keeping you engaged. Understanding these mechanisms is the first step to resisting them.

The Binge Model

Releasing entire seasons at once isn't about convenience—it's about exploiting your psychology. Auto-play starts the next episode before you've decided to watch it. Cliffhangers are engineered to trigger just enough anxiety that closing the app feels wrong. The platform is designed to make stopping feel harder than continuing.

This isn't accidental. Independent research has linked heavy binge-watching with depression and loneliness in some viewers, yet platforms continue optimizing for it because it drives engagement metrics.

The Illusion of Choice

Your streaming interface presents itself as infinite choice. In reality, algorithms dramatically narrow what you see. The top row—the content that gets clicked most—is carefully curated based on what the platform wants you to watch, weighted by what will keep you on the platform longest.

You're not browsing a library. You're being funneled through a recommendation engine that knows more about your viewing psychology than you do.
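The funneling effect is easy to see in miniature: a catalog of thousands collapses to a handful of surfaced titles, ranked by a single score. This toy sketch uses random placeholder titles and scores; "predicted watch-time" is an assumed metric, not any platform's disclosed formula.

```python
import random

# A catalog of 5,000 titles, each with a made-up "predicted watch-time" score.
random.seed(1)
catalog = {f"title_{i}": random.random() for i in range(5000)}

# The "top row" is just the ten highest-scoring titles; the rest
# effectively don't exist for this user.
top_row = sorted(catalog, key=catalog.get, reverse=True)[:10]

print(f"{len(catalog)} titles in the catalog, {len(top_row)} in the top row")
```

The interface looks like a library; the ranking function decides which 0.2% of it you ever scroll past.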

The Filter Bubble Problem

Algorithms learn your preferences and serve you more of the same. Over time, this creates filter bubbles—you only see content that reinforces your existing worldview. The platform isn't challenging you; it's confirming you.
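The reinforcement loop behind filter bubbles can be sketched as a rich-get-richer process: recommend in proportion to past clicks, and each watch makes the next recommendation of the same genre more likely. The genres and starting weights below are illustrative assumptions.

```python
import random
from collections import Counter

random.seed(7)
genres = ["drama", "comedy", "documentary", "thriller"]
clicks = Counter({g: 1 for g in genres})  # uniform starting weights

def recommend(clicks):
    """Sample a genre with probability proportional to its click history."""
    total = sum(clicks.values())
    r = random.uniform(0, total)
    for g, c in clicks.items():
        r -= c
        if r <= 0:
            return g

for milestone in (100, 1000, 5000):
    while sum(clicks.values()) < milestone + len(genres):
        g = recommend(clicks)
        clicks[g] += 1  # watching reinforces the weight for next time
    share = max(clicks.values()) / sum(clicks.values())
    print(f"after {milestone} recommendations, top genre holds {share:.0%} of the feed")
```

No one programmed a preference for any genre; the skew emerges from the feedback loop alone, and early accidents of viewing history get locked in.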

Outrage as Engagement

Controversial content drives engagement. People watch things that make them angry. They share things that outrage them. They talk about content that triggers strong emotional reactions—positive or negative.

This creates a perverse incentive: platforms benefit from content that generates controversy, even if that content is socially divisive. The algorithm doesn't care whether engagement is healthy—only that it exists.

Lifestyle as Product

Modern streaming content increasingly functions as lifestyle marketing. Characters don't just live in apartments—they live in aspirationally decorated apartments with identifiable brands. They don't just wear clothes—they wear curated wardrobes. The line between content and advertisement has become nearly invisible.

This isn't new (product placement has existed for decades), but the sophistication has increased dramatically. Entire shows are built around lifestyle aesthetics designed to make you want things.

Who Decides Now?

In the broadcast era, you could identify who was engineering your content. Networks had names. Shows had creators. The FCC published regulations. Researchers published studies.

Today, the decision-makers are more opaque:

  • Small executive teams with enormous cultural influence but no public accountability
  • Algorithms that even their creators don't fully understand
  • Shareholders prioritizing quarterly growth over long-term social impact
  • International content deals that shape narratives to satisfy multiple governments simultaneously

When you ask "who decided this should be promoted?" the answer increasingly is "a recommendation engine optimizing for engagement metrics." That's not reassuring.

"We are in the process of creating what deserves to be called the idiot culture. Not an idiot subculture, which every society has bubbling beneath the surface and which can provide harmless fun; but the culture itself."
— Carl Bernstein, "The Idiot Culture," The New Republic, 1992 (and it's only accelerated since)

Equipping Yourself

This isn't doom-posting. Once you understand how the system works, you can make better choices.

Awareness Is Defense

The first step is simply recognizing that you're being influenced. Every recommendation is a decision someone (or something) made. Every trending show got there through mechanisms designed to capture attention. You can still enjoy content—but watch it with your eyes open.

Diversify Intentionally

Break the algorithm by actively seeking content outside your recommendations. Watch foreign films. Read books from perspectives you disagree with. Seek out documentaries with clear sourcing. Don't let a recommendation engine define the boundaries of your cultural experience.

Question Promoted Content

When something is heavily promoted, ask why. What makes this show worth a billboard campaign? Why is this documentary trending? Who benefits from this narrative becoming popular? You don't have to become paranoid—just curious.

Embrace Slow Media

Books. Long-form journalism. Documentaries with clear sourcing and transparent agendas. These formats reward attention rather than exploiting it. They're designed to be finished, not binged infinitely. Neil Postman's Amusing Ourselves to Death remains essential reading on this topic—written in 1985, it's more relevant now than ever.

Reclaim Shared Experience

Watch with other people. Discuss what you're seeing. Compare notes on what's being pushed to different feeds. The fragmentation of streaming makes individual manipulation easier—community makes it harder.

Teach Media Literacy

If you have kids, teach them to question screen narratives from an early age. Not cynicism—curiosity. "Why do you think they showed it that way?" "Who made this and what might they want?" These questions become automatic with practice.

The Bottom Line

Television has always been social engineering. The question isn't whether you're being influenced—it's by whom and toward what end.

The broadcast era had problems, but it also had transparency, oversight, and people trained in social science making content decisions. The streaming era has replaced that with algorithms optimizing for engagement, executives answering to shareholders, and no public interest mandate whatsoever.

You can't opt out of this system entirely—media is how we understand our world. But you can engage with it consciously. Question what's promoted. Diversify your sources. Discuss what you watch with others. Read books.

The screen that raised your grandparents was designed by people who at least claimed to care about society. The screen raising the next generation is designed by people who care about engagement metrics.

That's worth thinking about. Preferably with the TV off.


Lee Foropoulos

Business Development Lead at Lookatmedia | Fractional Executive | Software Veteran

Lee is a seasoned software veteran and product manager who's spent too many hours studying why recommendation engines work the way they do. He writes about technology, media, and the systems that shape how we think.

