
Social media. It’s become a trap full of bots, bad actors, misinformation, fake profiles and anonymity. It’s a mental-health hazard and a propaganda snake pit. Many consider it an evil force best ignored. This reality has come about in tiny bites, over decades, as we, the internet pioneers, circle each other in networks that increasingly undermine human connection as much as they enable it.
Must it be this way? Where did we go wrong? How do we fix it?
These are the questions we ask as we roll out Media.com, frame by frame, feature by feature. Our fundamental premise is that ignorance of a given technological force breeds trepidation and the sense that the force controls us. We’re here to provide a better way than social media has offered so far — not only to connect with each other, tell stories and keep informed, but also to quench the thirst so many of us have for reliability, accountability and transparency in the content we consume.
We consider this to be a glaring void the internet’s many pioneers overlooked in their great race to connect billions of nodes to each other – real or not, identifiable or anonymous – as they sought eyeballs or at least potential eyeballs, at almost any cost.
Perhaps initial generations of social media failed in this regard. Disorder was the rule. Connection was the pitch. Engagement was the measure. Advertising revenue was the bottom line. Comprehension came later.
We, the innocent consumers, saw the shiny new thing and hopped in. We drove and drove until the thing broke down, in the middle of nowhere, with ominous clouds approaching. And here we are: anxious. Unable to fix the thing because, yes, it turns out selfies and untraceable incendiary comments don’t fix much. We’ve gone a little cybersick in the process, within a media metaverse that lacks the coherence to calm itself down.
We aim to learn from and repair the era that brought us here. We seek to rediscover optimism about how the internet and social media might enhance, but not dictate, our personal and business lives. Doing so requires full comprehension and discussion of social media’s effects and mechanics.
Simple. Coherent. Authentic. Trustworthy. Mature. Anxiety-free.
These are some of our guideposts. This is where a new way to think about “social media all grown up” begins — with intention.
Certainly a more mindful internet is possible, following in the tradition of a medium built to find improvements from within, as media and tech build atop each other, new version after new version. That’s our position — for the record. We can do this.
The first challenge: verification
Generations raised on smartphones and online networks seem to grasp these pitfalls more clearly than most. Internet ethnographer Katherine Dee describes the current era as shaped by “an internet-native sadism,” in which depression, nihilism and sleeping with one’s phone are common experiences for many under 40.
Indeed, millions now bear battle scars from a system that was perhaps never built to manage the forces of online exploitation. We often blame smartphones, but a smartphone is just hardware — a portal to the internet, wired by media in all its forms. We’re addicted to the content, not the device. A television is only as powerful as the shows and content it broadcasts; otherwise, it’s just an inert plastic box. A phone is the same.
In developing our vision for a next-generation social media platform, we prioritized verified identity. We examined how open systems can be exploited and concluded that anonymity — and the bots that feed on it — had to go.
We also implemented an age threshold (18) and built trust and safety into the foundation of our platform. Our Trust Center is at the core of this effort, ensuring oversight on issues such as dangerous organizations, adult sexual abuse, hate speech, misinformation, copyright violations and high-risk users. See our Trust Center for more details.
Verification is our first step toward a trusted online universe. Many online seem to regard anonymity as fundamental to being online, and we don’t question its appeal. Anonymity has its place, especially for those caught in oppressive societies and situations. But that’s not us. We are the first major social media community to validate each and every user. We understand the need for a safe anonymous space for some, but the other guys have that covered.
For individuals, a valid government-issued photo ID (a passport or driver’s license) is required. For a business profile, we verify an authorized officer before pursuing additional documentation under “Know Your Customer” and “Know Your Business” standards.
This means every piece of content on Media.com has a traceable source. We believe candid conversations and provocative content can coexist with this model. We believe internet commerce and social engagement online function better when humans — not bots — are in control. And we insist that a town square isn’t a town square if it’s full of cardboard cutouts with fake names pretending to be human. After all, isn’t that what computer games are for?
This is real social media, built on trust, safety, opportunity and imagination. A space where fragmented apps, paranoia about online interactions, and distrust dissolve into a sustainable joie de internet — a love of the internet. Yes, we believe it’s possible.
Higher-quality engagement
Susannah Page-Katz, our lead on content moderation, previously served as director of trust and safety at Kickstarter, where she drafted its AI guardrail policy. Crucially, this policy didn’t take a black-and-white stance on AI content creation; instead, it acknowledged AI as a media tool while ensuring creators remained transparent about its use.
The idea was to exercise control over a whiz-bang technology while implementing a flexible approach that respects a creator’s inherent freedom to create and protects work from those who would steal or plagiarize it. To control bot software, not bow to its supremacy. The internet, safe to say, is full of shortcuts — to fame, knowledge, art, journalism, wealth and so on. Page-Katz is working on a number of forthcoming tools that address fact-checking and identity.
The trend is moving in this direction. A 2023 YouGov survey of 1,000 adults, cited by Bloomberg Law, found that 62% of respondents believe platforms should require real names and identity verification. Europe leads with tough policies that Media.com follows, and several American states have enacted laws to “prohibit impersonation for purposes of harassment, intimidation, threat or deception to facilitate contact.”
Freedom of speech is a respected pillar of our approach, within reason. Does an anonymous burner-bot have the right to say whatever it wants? Not really. Not in our space. We’d point out that such entities don’t really “speak” at all. They mimic speech; they fake it. And that is not a protected right, last time we checked.
“These laws aren’t an attempt to limit what anyone can say,” wrote Bruce Heiman, a partner at the law firm K&L Gates, for Bloomberg. “They aim to prevent false identification.”
Page-Katz echoed Heiman. “We’re verifying the individuals behind the account, not the information they might share,” she said. “That’s the safety piece.”
Christy Grace Provines, our global head of brand, encourages new visitors to explore the Trust Center to gain a comprehensive understanding of our approach. Provines emphasizes that every feature we introduce is designed to “do less harm” and “be less toxic.”
“Our Trust Center outlines our reason to exist,” she said. “And verification matters because there are no bots on the platform. That’s our aim.”
We’re building a platform where engagement is higher quality, with fewer trolls, less spam and reduced toxicity. Hate speech, doxing and threats will trigger oversight. The goal is to create a digital space where accountability, trust and meaningful interactions thrive.
Bit by bit, we’re on a mission to redefine social media. We’ve identified the void and set about filling it. We’ve established a safe online space, and we invite you to be a part of it. We hope you’ll join us in this mission.