We Need a Social Platform That Puts People First

Why platforms must serve users, not algorithms

Women send me messages every week: screenshots of strangers rating their bodies, telling them to smile, or threatening them.

LGBTQ friends show me slurs posted under harmless photos.

Black colleagues forward job rejections that read like code for “not our culture.”

Abuse piles up while the feed keeps rolling.

Every major network sells ads. Rage keeps users scrolling. Profit sets the rules. The rest of us fight for daylight.

Who Gets Pushed Aside

When engagement rules, the smallest voices disappear first.

The algorithm smells “controversy” and pushes hate to the top.

Women, queer communities, racial minorities, and older workers drown in attacks while the platform charges advertisers billions in blood money.

  • Harassment: Four in ten U.S. adults report abuse online. One in four face stalking or threats.
  • Women silenced: Nearly three quarters of women journalists endure online violence.
  • LGBTQ targets: Two thirds of LGBTQ adults expect hostile policy shifts to hit their daily lives.

Algorithms Keep the Merry-Go-Round Spinning

Researchers checked short‑form video searches last year. Two out of three clips showed bias. Type in a slur and the feed served jokes and insults about women, Black people, and Muslims. Most results had nothing to do with the words you typed.

Platforms also erase peaceful speech.

From October to November 2023, Meta pulled 1,050 Palestine‑related posts. Human Rights Watch checked them. Nine out of ten showed no call for violence. Mothers shared grief, journalists streamed bombings, aid workers posted hospital footage. All gone without clear appeal.

The pattern repeats.

TikTok muted #BlackTikTokStrike for two days in 2022.

LGBTQ creators say the word “lesbian” still halves their reach.

In 2024 the Electronic Frontier Foundation found auto‑moderation flagged one in five posts on reproductive rights as “graphic” even when doctors wrote them.

Each removal drains trust. Families lose memorials. Activists lose evidence. The public loses context.

This scrub-for-profit habit is not neutral. It favors the bland post that calms advertisers. Years of that whitewash helped lead us here: a fascist back in the White House and half the country blind to the harm Donald Trump and the Republican Party keep doing.

Silence enough voices and ignorance wins elections.

Follow the Money

Social media ad spend will top $247 billion this year, up 14% from 2023. Meta alone made $134 billion in 2023, more than every U.S. newspaper earned at its peak. ByteDance hit $120 billion. YouTube’s $31 billion out‑paced every cable network.

Profits soar, yet trust keeps falling.

Edelman’s 2025 Trust Barometer shows confidence in social media at 42 percent, the lowest of any media category. Users hand over data and attention while executives cash stock bonuses and slash safety teams.

The gap grows wider: money climbs, moderation budgets shrink, and the public pays the price.

Real Stories Behind the Numbers

  • Layla, a queer Filipino activist, lost her account twice after posting Pride videos.
  • Mark, 62, was told he was “too old to matter” on a career post. The flagged comment stayed live for two days.
  • Amal, a journalist, watched her live stream freeze during a raid. The clip vanished without notice.

These are not edge cases. They form the pattern.

What a People‑First Platform Needs

Picture a space where you know exactly how far your words travel.

A small user group, elected and rotated, handles moderation in daylight. You can flip a switch to see posts in the order they land, no mystery math driving the feed.
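The "flip a switch" idea is small enough to sketch. This is only an illustration, not a real platform's API; the `Post` shape and `build_feed` name are invented for the example, and a plain like-count sort stands in for whatever ranked feed a real system would run.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    created_at: float  # unix timestamp, illustrative
    likes: int = 0

def build_feed(posts, chronological=True):
    """Newest-first when the reader flips the 'no mystery math' switch;
    otherwise a bare engagement sort stands in for a ranked feed."""
    if chronological:
        return sorted(posts, key=lambda p: p.created_at, reverse=True)
    return sorted(posts, key=lambda p: p.likes, reverse=True)
```

The point is that the toggle is one boolean, visible to the user, with no hidden weighting behind it.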

Ads stay at a set limit and the revenue ledger is open for anyone to read.

Slurs and threats get caught before they hit your screen.
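A first-pass filter for that can be trivially simple. This sketch assumes a public, community-curated blocklist (the terms below are placeholders); real moderation needs context-aware review behind it, and a keyword match is only the gate that holds a post until a human looks.

```python
# Placeholder terms; a real blocklist would be curated in the open
# by the elected moderation group.
BLOCKLIST = {"slur-term", "threat-term"}

def pre_screen(text, blocklist=BLOCKLIST):
    """Return 'hold' so a flagged post waits for human review before
    display, 'show' otherwise. Word matching only, no context."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return "hold" if words & blocklist else "show"
```

The design choice that matters is the default: flagged content waits, instead of staying live for two days while a report sits in a queue.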

New voices break through because the system lifts small accounts every day.
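One way to lift small accounts is to damp engagement scores by audience size, so a like on a hundred-follower account weighs more than a like on a million-follower one. The formula below is an assumption for illustration, not a claim about how any platform ranks today.

```python
import math

def rank_score(likes, author_followers):
    """Damp raw engagement by audience size so likes on small
    accounts count for more, letting new voices surface."""
    return likes / math.log10(10 + author_followers)
```

With this weighting, ten likes on a brand-new account outrank ten likes on a hundred-thousand-follower account, which is the whole point.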

Can We Build It?

Yes. The pieces already sit on GitHub: servers, apps, safety tools. We can wire them together today. No single company owns the feed.

Money sounds big, but it isn’t. Investors put $300 billion into tech last year. One half of one percent, $1.5 billion, would run servers and pay a lean safety crew for ten years. We don’t need that much to start.

A few thousand people could chip in the price of a streaming subscription and fund a live beta. Two mid‑size foundations could carry us the rest of the way.

The blueprint is ready. We just have to say yes.

Who Joins?

This project is for anyone who opens an app and feels smaller when they close it.

  • Developers who miss the joy of building tools that help, not hook.
  • Writers and artists exhausted by the algorithm lottery.
  • Women who are done muting their own voice to dodge trolls.
  • LGBTQ friends tired of being flagged for existing.
  • Seniors and workers pushed aside by age filters and pay‑to‑play reach.
  • Brands that would rather earn trust than chase rage clicks.

If you have ever clicked “report” and watched nothing happen, you belong here. If you still believe public spaces can feel safe and useful, you belong here. Bring your code, your story, your stubborn hope.

A Practical Next Step

I need your voice before we write a single line of code. Tell me in the comments:

  • What hurts most on today’s platforms?
  • What must a new one fix first?

If this thread fills with clear stories and sharp ideas, we will schedule a live call in mid‑August and begin drafting the public charter together. Want in? Tell me. Bring skill, curiosity, and a calm ego.

Why Bother?

We lose whole evenings to feeds that serve rage before truth. My daughter turns 14 next year; the algorithm already knows her name. Your kids will stand in that same storm. If we see the damage and stay silent, we become part of it.

What future do you want? One where voices travel farther than ads and your child joins a feed that won’t train them to hate.

I would rather build one honest room than keep begging the old owners to fix theirs. A test server, a hundred users, and rules we draft in daylight: that is enough for a start. If it works, we scale. If it fails, at least we tried with clear eyes.

Share one change that would make you leave the big sites and join us. Scroll the replies, find a wish that matches yours, and add a fix. A thread can become a plan. A plan can become code.

Platforms should lift speech, not crush it for cash. The data shows harm. The stories show cost. Let’s build the alternative instead of begging for scraps.

I’m ready. Are you?