Social media as a concept can definitely be fixed. Just stop doing algorithms, period.
Strip the platforms of any and all initiative. Make them dumb pipes. Stop pretending that people want to use social media for entertainment and news and celebrities. Stop trying to turn it into interactive TV. Stop forcing content from outside of my network upon me. Make the chronological feed the only option.
Social media is meant to be a place where you get updates about the lives of people you follow. You would visit several times a day, read all new updates, maybe post your own, and that's it. The rest of the time, you would do something else.
Social media as a vessel for diverse discussion is a tall order. It’s too public, too tied to context, and ultimately a no-win game. No matter how carefully you present yourself, you’ll end up being the “bad guy” to someone. The moment a discussion touches even lightly on controversy, healthy dialogue becomes nearly impossible.
Think of it this way: you’re hosting a party, and an uninvited stranger kicks the door open, then starts criticizing how you make your bed. That’s about what it feels like to try to “fix” social media.
A lot of talk goes into how Facebook and other social media platforms use algorithms to encourage engagement, which often means outrage bait, fake news, rabbit holes, and so on.
But here's the thing ... people CHOOSE to engage with that, and users even produce that content for social media platforms for free.
It's hard to escape that part.
I remember trying Bluesky, and while I liked it better than Twitter, I was disappointed that it was just Twitter, but different. Outlandish short posts, the same lame jokes and pithy appeals to our emotions, and so on. People on there want to behave the same way they wanted to on Twitter.
Social media is the new smoking...
Widespread adoption before understanding the risks - embraced globally before the mental health, social, and political consequences were fully grasped, especially for young people.
Delayed but significant harm - gradual impacts like reduced attention span, increased anxiety, depression, loneliness, and polarization.
Corporate incentives misaligned with public health - social media companies design platforms for maximum engagement, leveraging psychological triggers while downplaying or disputing the extent of the harm.
I'd like to see more software that amplifies local social interactions.
There are apps like Meetup, but a lot of people just find it too awkward. Introverts especially do not want to meet just for the sake of meeting people, so they fall back on social media.
Maybe this situation is fundamentally not helped by software. All of my best friendships organically formed in real-world settings like school, work, neighborhood, etc.
Why can't it be fixed? Just remove algorithms and show only subscribed content in chronological order. That's how most of the early platforms worked and it was fine.
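For what it's worth, the mechanics really are that simple. A minimal sketch in Python (hypothetical Post type and data, not any real platform's API): the feed is just the followed accounts' posts, sorted by time.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        text: str
        created_at: datetime

    # Only posts from accounts you follow, newest first -- no ranking, no injected content.
    def chronological_feed(followed: set[str], posts: list[Post]) -> list[Post]:
        return sorted(
            (p for p in posts if p.author in followed),
            key=lambda p: p.created_at,
            reverse=True,
        )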
I work in survey research and I'm rather appalled at how many people would rather survey a sample of AIs than a sample of people and claim they can come to some valid conclusion as a result.
There are many ways AIs differ from real people, and any conclusions you can draw from them are limited at best -- we've had enough bad experiments done with real people:
https://en.wikipedia.org/wiki/Stanford_prison_experiment#Int...
This seems somewhat disproven by the existence of places like this? Strict moderation really does work wonders to prevent some of the worst behaviors.
Not that you won't have problems, even here, from time to time. But it is hard to argue that things aren't kept much more civil than in other spots?
And, in general, avoiding direct capital incentives to drive any questionable behavior seems a pretty safe route?
I would think this would be a lot like public parks and such. Disallow some commercial behaviors and actually enforce rules, and you can keep some pretty nice places?
> Only some interventions showed modest improvements. None were able to fully disrupt the fundamental mechanisms producing the dysfunctional effects.
I think this is expected. Think back to newsgroups, email lists, web forums. They were pretty much all chronological or maybe had a simple scoring or upvoting mechanism. You still had outrage, flamewars, and the guy who always had to have the last word. Social media engagement algorithms probably do amplify that but the dysfunction was always part of it.
The only thing I've seen that works to reduce this is active moderation.
The problem is people.
As a species we are greedy, self-serving, and short-sighted.
Social media amplifies that, and we are well on our way to destroying ourselves.
> They then tested six different intervention strategies...
None of these approaches offer what I want, and what I think a lot of people want, which is a social network primarily of people you know and give at least one shit about. But in reality, most of us don't have extended social networks that can provide enough content to consistently entertain us. So, even if we don't want 'outside' content (as if that were an option), we'll gravitate to it out of boredom, and our feeds will gradually morph back into some version of the clusterfucks we all deal with today.
Social media isn't the problem, people are the problem, and we're still working out how to fix them.
If you could plug into the inner thoughts of millions of people around the world at once, it would not be pleasant.
Social media has turned out to basically be this.
Social media is a few people selling the data of many people looking at content made by some people selling something.
There is also research and promotion of values going on, and the thing as a whole is entertaining and can be rigged or filtered at various levels by all participants.
It's kind of social. The general point system of karma or followers applies, and people can have a career and a feeling of accomplishment to look back on when they retire. The cosmic rule of "anything, too much, no good" applies.
It’s not really broken but this is the age of idiots and monsters, so all bets are off.
Do all of these points apply to the traditional media funhouse mirror that we love to hate, too?
> "The [structural] mechanism producing these problematic outcomes is really robust and hard to resolve."
I see illegal war, killing without due process, and kleptocracy. It's partly the media's fault. It's partly the peoples' fault for depending on advertising to subsidize free services, for gawking, for sharing without consideration, for voting in ignorance.
Social media reflects the people, who can't be "fixed" either.
If you're annoyed with all of these people on here who are lesser and more annoying than you, then stop spending so much time at the bar.
Can the bar be fixed?
Social media in a profit-seeking system can't be fixed. Profit-seeking provides the evolutionary pressure to turn it into something truly destructive to users. The only way it can work is via ownership by a benevolent non-profit. However, even that would likely give in to corruption if given enough time. Outlawing it completely, as well as regulating the algorithmic shaping of the online experience, is probably the inevitable future. Unfortunately, it won't come until the current system causes a complete societal fracture and collapse.
Point-to-point communication between every human on Earth and every other human on Earth flattens the communication hierarchies that used to amplify expertise and a lot of other behaviors. We created new hierarchies, but they are mostly demagogues pandering to the middle. Direct delegation is sort of like trying to process an image without convolution: nobody knows what anyone else thinks, so we just trust that one neuron.
Any interesting work on using LLMs to moderate posts/users? HN is often said to be different because of its moderation; couldn't you train an LLM moderator on similar rules to reduce trolls, ragebait, and low-effort posts at scale?
A big problem I see is that users posting in good faith are unable to hold back from replying to bad-faith posts, a failure to follow the old "don't feed the trolls" rule.
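Roughly what I have in mind is a classification pass over each new comment, something like the sketch below. The rule text is paraphrased (not HN's actual guidelines) and call_llm is a placeholder for whatever model API you'd actually use.

    # Hypothetical sketch of an LLM moderation pass; the rules are paraphrased and
    # call_llm is a stand-in for a real model API.
    RULES = """Flag a comment if it is a personal attack, obvious ragebait,
    trolling, or a low-effort post that adds nothing. Otherwise approve."""

    def call_llm(prompt: str) -> str:
        raise NotImplementedError("plug in your model of choice here")

    def should_hold_for_review(comment: str) -> bool:
        verdict = call_llm(
            f"{RULES}\n\nComment:\n{comment}\n\nAnswer with exactly FLAG or APPROVE."
        )
        return verdict.strip().upper().startswith("FLAG")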
Here is the moment of appreciation for HN, a social media site that arguably doesn't need to be fixed.
> ...the dynamics that give rise to all those negative outcomes are structurally embedded in the very architecture of social media. So we're probably doomed...
No specific dynamics are named in the remainder of the article, so how are we supposed to know if they're "structurally embedded" in anything, let alone if we're doomed?
I'm reading Tim Urban's book, "What's Our Problem?"
It definitely explains the different types of thinking that make up our current society, including social media. I haven't gotten to the part yet where he suggests what to do about it, but it's a fascinating insight into human behavior in this day and age.
The main reason it can't be fixed is that political and corporate operators and propaganda bots have taken over. There is always an agenda seeking supremacy running through social media threads, even for mundane topics.
There are some social media networks that promise to fix this - for example, https://izvir.org is one of them.
I'm skeptical of proving stuff about new-social-media with LLMs, because LLMs themselves are [presumably] trained on quite a bit of existing-social-media text.
ugh this again.
>Can we identify how to improve social media and create online spaces that are actually living up to those early promises of providing a public sphere where we can deliberate and debate politics in a constructive way?
They really pump up what is effectively a message board (Facebook, Twitter), a video website with a comment/message feature (YouTube, TikTok), or an instant messenger with groups (WhatsApp). NONE OF THIS IS NEW.
Social media can be fixed, it's just that the incentives are not aligned.
To make money, social media companies need people to stay on as long as possible. That means showing people sex, violence, rage and huge amounts of copyright infringements.
There is little advantage in creating real-world consequences for bad actors. Why? Because it hurts growth.
There was a reason the old TV networks didn't let any old twat with a camera broadcast stuff on their network: they would get huge fines if they broke decency "laws" (yes, America had/has censorship, hence why The Simpsons say "whoopee" and "snuggle").
There are few things that can cause company-ending fines for social media companies, which means we get almost no moderation.
Until that changes, social media will be "broken"
Well, you can't by definition fix something that is a rigged game. Social media exists to maximise the ad dollar, not to benefit you.
The solution is to disengage online. Just get off. I look forward to a social-media-less retirement.
really good article on that topic here https://www.lookatmyprofile.org/blog/social-media-apps-engin...
I'm honestly really tired of having to read through so much bloat in these types of articles. Can't they just elaborate on exactly the thing in the title? Do they have to spend paragraphs telling stories?
I think this problem is partly due to greedy algos and partly due to these sites being so large they have no site culture.
Site culture is what prevents mods from having to step in and sort out every little disagreement. Modern social media actively discourages site culture, and post quality becomes a race to the bottom. Sure, it's harder to onboard new users when there are social rules that need to be learnt and followed, but you retain users and have a more enjoyable experience when everyone follows a basic etiquette.
> Ars Technica: I'm skeptical of AI in general, particularly in a research context, but there are very specific instances where it can be extremely useful. This strikes me as one of them, largely because your basic model proved to be so robust.
You can't accuse them of hiding their bias and contradictions.
How can a single paper using a tech that is unproven (for this type of research) disprove such (alleged) skepticism?
People bending over backwards to do propaganda to harvest clicks.
What do they mean by "fixed"? Wasn't social media, from day one, about gossip, self-promotion, and gaslighting? They excel at that, so one could say they serve their purpose.
It's very misguided to pretend that social media mobs could replace "the press". There is a reason the press exists in the first place: to inform critically, instead of passing along hearsay.
> these platforms too often create filter bubbles or echo chambers.
I thought the latest research had debunked this and showed that the _real_ source of conflict with social media is that people are forced out of their natural echo-chambers and exposed to opinions that they normally wouldn't have to contend with?
The study is based on having LLMs decide to amplify one of the top ten posts on their timeline or share a news headline. LLMs aren’t people, and the authors have not convinced me that they will behave like people in this context.
The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There’s no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented[0] with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.
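For reference, the setup as described boils down to a loop like the rough sketch below (my own reading of it; llm_choose and the data shapes are assumptions, not the authors' code).

    # Each round, every LLM agent sees the top ten posts on its timeline plus the
    # day's headlines, then either reposts one, shares a headline, or stays passive.
    def run_round(agents, timelines, headlines, llm_choose):
        new_posts = []
        for agent in agents:
            visible = timelines[agent][:10]                    # choice restricted to the top ten
            action, item = llm_choose(agent, visible, headlines)
            if action == "repost":
                new_posts.append((agent, item))                # amplify an existing post
            elif action == "share":
                new_posts.append((agent, item))                # introduce a news headline
            # "passive" contributes nothing this round
        return new_posts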
I mostly use social media to share pictures of birds[1]. This contributes to some of the problems the source article[2] discusses. It causes fragmentation; people who don’t like bird photos won’t follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict[3].
[0] https://www.socialmediatoday.com/news/internal-research-from...
[1] https://social.goodanser.com/@zaktakespictures/
[2] https://arxiv.org/html/2508.03385v1#S3
[3] https://social.goodanser.com/@zaktakespictures/1139481946021...