Study: Social media probably can't be fixed (arstechnica.com)
176 points by todsacerdoti 22 hours ago | 168 comments
Zak 15 hours ago [-]
The study is based on having LLMs decide to amplify one of the top ten posts on their timeline or share a news headline. LLMs aren’t people, and the authors have not convinced me that they will behave like people in this context.

The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There’s no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented[0] with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.
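
To make that concrete, the decision loop seems to be roughly the following (my own illustrative Python sketch, not the authors' code; the ask_llm stub, option names, and demo data are all placeholders I made up):

    import random

    def ask_llm(persona: str, prompt: str) -> str:
        """Stand-in for the LLM call that drives each agent.
        Here it just picks randomly so the sketch runs on its own."""
        return random.choice(["repost:3", "share_news", "pass"])

    def step(agent: dict, timeline: list[dict], headlines: list[str]):
        """One decision for one agent: amplify one of the top ten timeline
        posts, share a news headline, or stay passive."""
        top10 = timeline[:10]
        prompt = (
            f"You are {agent['persona']}. Your timeline:\n"
            + "\n".join(f"{i}: {p['text']}" for i, p in enumerate(top10))
            + "\nHeadlines: " + "; ".join(headlines)
            + "\nAnswer with repost:<index>, share_news, or pass."
        )
        choice = ask_llm(agent["persona"], prompt)
        if choice.startswith("repost:"):
            idx = int(choice.split(":", 1)[1])
            if idx < len(top10):
                return {"author": agent["id"], "text": top10[idx]["text"], "repost": True}
        elif choice == "share_news":
            return {"author": agent["id"], "text": random.choice(headlines), "repost": False}
        return None  # passive this round

    # Tiny demo round with made-up agents, posts, and headlines.
    agents = [{"id": i, "persona": f"persona {i}"} for i in range(3)]
    timeline = [{"text": f"post {i}"} for i in range(12)]
    headlines = ["headline A", "headline B"]
    for agent in agents:
        print(step(agent, timeline, headlines))

Even granting that loop, my objection stands: the interesting part is whatever real people would do in the blanks a sketch like this leaves out.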

I mostly use social media to share pictures of birds[1]. This contributes to some of the problems the source article[2] discusses. It causes fragmentation; people who don’t like bird photos won’t follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict[3].

[0] https://www.socialmediatoday.com/news/internal-research-from...

[1] https://social.goodanser.com/@zaktakespictures/

[2] https://arxiv.org/html/2508.03385v1#S3

[3] https://social.goodanser.com/@zaktakespictures/1139481946021...

lemming 11 hours ago [-]
> LLMs aren’t people, and the authors have not convinced me that they will behave like people in this context.

This was my initial reaction as well, before reading the interview in full. They admit that there are problems with the approach, but they seem to have designed the simulation in a very thoughtful way. There really doesn't seem to be a better approach, apart from enlisting vast numbers of people instead of using LLMs/agent systems. That has its own problems as well of course, even leaving cost and difficulty aside.

> There’s no option to create original content...

While this is true, I'd say the vast majority of users don't create original content either, but they still end up shaping the social media environment through the actions the researchers did model. Again, it's not perfect, but I'm more convinced that it might be useful after reading the interview.

fao_ 4 hours ago [-]
> > There’s no option to create original content...

> While this is true, I'd say the vast majority of users don't create original content either, but still end up shaping the social media environment through the actions that they did model. Again, it's not perfect but I'm more convinced that it might be useful after reading the interview.

Ok, but this is by design. Other forms of social media, places like Mastodon etc., have a far, far higher rate of people creating original content.

Zak 10 hours ago [-]
I'm not sure the experiment can be done other than to try interventions on real users of a public social media service as Facebook did in the article I linked. Of course people running those services usually don't have the incentives to test harm reduction strategies and certainly don't want to publicize the results.

> the vast majority of users don't create original content

That's true now, at least most of the time, but I think that's in large part because of design and algorithmic decisions by the platforms to emphasize other types of content. Early Facebook in particular was mostly original content shared between people who knew each other. The biggest problem with that was it wasn't very profitable.

nradov 10 hours ago [-]
Nonsense. The vast majority of my Facebook friends post at least some original content.
lemming 9 hours ago [-]
Fortunately we don't have to rely on your anecdata; people actually study this stuff:

https://news.gallup.com/poll/467792/social-media-users-incli...

> U.S. adults commonly engage with popular social media platforms but are more inclined to browse content on those websites and apps than to post their own content to them. The vast majority of those who say they use these platforms have accounts with them, but less than half who have accounts -- and even smaller proportions of all U.S. adults -- post their own content.

https://www.pewresearch.org/internet/2019/04/24/sizing-up-tw...

> Most users rarely tweet, but the most prolific 10% create 80% of tweets from adult U.S. users

https://www.pewresearch.org/internet/2021/11/15/the-behavior...

> The analysis also reveals another familiar pattern on social media: that a relatively small share of highly active users produce the vast majority of content.

nradov 8 hours ago [-]
That's junk science and doesn't refute the specific point I made. Facebook users are far more likely to post original content than X users. It might just be some blurry backlit vacation photos but it is original content.
smackeyacky 8 hours ago [-]
They post, but it doesn't get read; all their friends' feeds are just swamped with crap like theirs is.
epgui 2 hours ago [-]
But then we’re back to blaming the algorithm.
Zak 35 minutes ago [-]
Algorithmic choices are likely a major contributor to the phenomenon. If posting vacation photos on Facebook gets interactions from friends and family, more people will do it. If it doesn't, fewer people will.
grishka 5 hours ago [-]
Social media as a concept can definitely be fixed. Just stop doing algorithms, period.

Strip the platforms of any and all initiative. Make them dumb pipes. Stop pretending that people want to use social media for entertainment and news and celebrities. Stop trying to turn it into interactive TV. Stop forcing content from outside of my network upon me. Make the chronological feed the only option.

Social media is meant to be a place where you get updates about the lives of people you follow. You would visit several times a day, read all new updates, maybe post your own, and that's it. The rest of the time, you would do something else.

dragonwriter 4 hours ago [-]
> Stop pretending that people want to use social media for entertainment and news and celebrities

People actually want media, social and otherwise, for exactly that.

> Stop forcing content from outside of my network upon me.

There are social media apps and platforms that don't do that. They are persistently less popular. People, by and large, do want passive discovery from outside of their network, just like they do, in aggregate, want entertainment and news and celebrities.

> Make the chronological feed the only option.

Chronological by what? Original post? Most recent edit? Most recent response? Most recent reaction?

TheOtherHobbes 3 hours ago [-]
People want addictions. The answer is to regulate addictive algorithms, not to give them more things to be addicted to.

But addictions are wonderfully useful politically, so that's unlikely to happen.

The point is simple - an algorithm is a form of meta-content. It's curated and designed, not neutral. And so are its commercial, psychological, and political effects.

Currently SM companies have been allowed to use algorithms with little or no oversight. The emphasis has been on regulating privacy, not influence.

In the same way the media need a Fairness Doctrine restored to bring sanity and quality back to journalism, algorithm providers need to be able to demonstrate an equivalent for their platforms.

This is very much against the spirit of the times, but that spirit is algorithmically created - which just proves the point.

If you're thinking "Yes, but government..." - how do you know that's a spontaneous original thought, and not something you've been deliberately conditioned to believe?

grishka 3 hours ago [-]
Yes, people want addictions. So give them addictions. Just don't sacrifice actual social media for that. Ideally, in my view of the world anyway, there would be separate commercial addiction-focused platforms like TikTok, and separate non-addictive, pure social media platforms like the fediverse, preferably run by nonprofits.
tgv 2 hours ago [-]
> People want addictions.

I think that's incorrect. Many addicts despise their addiction. A better way to look at it is: people can get addicted easily. Nobody gets addicted to paper press lubricant. The addiction is initiated by a positive experience, which often means pleasure. Paper press lubricant doesn't provide that, but alcohol and Facebook do.

It may not be much of a distinction, but sometimes it helps to think about/see it in another way.

gyomu 5 hours ago [-]
> Strip the platforms of any and all initiative. Make them dumb pipes. Stop pretending that people want to use social media for entertainment and news and celebrities. Stop trying to turn it into interactive TV. Stop forcing content from outside of my network upon me. Make the chronological feed the only option.

Congrats, you now have platforms no one will care about, as attention spans get sniped by competitors who want to maximize engagement and don't care about your arbitrary rules (i.e., literally what happened 15 years ago).

rsolva 2 hours ago [-]
I like to think that humanity is starting to build up some amount of immunity and aversion towards algorithmic feeds. Non-tech friends and family in my life (Norway and the Netherlands) express a rapidly growing discomfort with being on Google's and Meta's platforms; some have gone cold turkey, others are actively formulating an exit strategy.

I dropped out of these platforms some years ago and have been very happy having the Fediverse (and HN!) as my only social media. It is just the right amount of engagement and impulse for me. I do not check my feeds compulsively, but occasionally – and the people I follow are a diverse bunch, giving me food for thought and keeping me up to date with topics and software projects.

It is still a niche place to hang out, but I'm OK with that. Now and then, friends get curious enough to join and check it out.

xanderlewis 5 hours ago [-]
> you now have platforms no one will care about

I would care, and I imagine there are others who would too. I don’t use social media anymore (at all!) because of this. If I could have the chronological feed restored and no intrusion of other content I’d redownload immediately. There must be a market for this.

gyomu 4 hours ago [-]
There are plenty of options; take your pick. The latest one that I've been hearing about is https://retro.app, there are others.

The issue of course is that your friends won't be on it, most of them won't sign up even if you beg them, and most likely none of you will be using the service anymore 6 months from now.

grishka 5 hours ago [-]
What do you mean by "no one"? There is definitely demand for such a platform.
rickdeckard 4 hours ago [-]
It's a dilemma.

There might be demand, but this "platform A" will be in competition with a dopamine-focused engagement "platform B" which can also host updates from "the lives of people you follow".

The majority of people will then have both installed but spend more time on "platform B" as it is actively engaging them.

Platform A will again end up in competition for user attention with Platform B, as it needs money to operate its business, etc.

Now if Platform A asks for a subscription fee to fund their non-engagement social media platform, how many of these users described above will pay and not simply prefer the free "platform B"?

How will such churn affect users willing to PAY for "Platform A" if the people whose "life they want to follow" have completely moved to "Platform B"?

Funny enough, as a European I could use WhatsApp as this "Platform A", as it has features to share status updates, pictures, etc. as part of your profile. Everyone I know has WhatsApp; no one is using those features.

So in essence, this Platform A already exists in Europe but doesn't work as "social media" because people don't engage with it...

grishka 4 hours ago [-]
> The majority of people will then have both installed but spend more time on "platform B" as it is actively engaging them.

And why would that be a problem? Most people also spend more time sleeping than using social media, so what? Let them be. Give them a tool that they would decide how to use best to suit their lifestyle.

> Platform A will again end up in competition for user-attention with Platform B, as it needs money to operate their business etc.

It would not, because it would not be run by a commercial organization. At this point I'm convinced that it's impossible for a sane social media platform to exist unless it's decentralized and/or run by a nonprofit. As soon as one touches venture capital, enshittification becomes only a matter of time.

rickdeckard 1 hours ago [-]
> And why would that be a problem?

I'm referring to the journey that will move people away from "Platform A" again because of Platform B. That's a problem to solve because the value of the social network to an individual is largely the PEOPLE on that network.

> It would not, because it would not be run by a commercial organization.

Agree, taking "platform A" out of the profit-wheel could help. But still, in a normal adoption scheme you need to make it worthwhile for a critical mass of people to use it for OTHER people to also consider it.

In the end I believe you will need another external driver to solve this (i.e. restricting for-profit social media), because that other platform "has all the people and the dopamine".

-

There is a simpler parallel situation one can observe: people trying to move from WhatsApp/FB-messenger/.. to e.g. Signal.

It's JUST about direct messaging, but anecdotally the majority of these transitions fail to complete because not everyone involved is actually on board to abandon the old messenger.

So you succeed and your close group installs Signal and starts to use it with you, but each one is also part of other groups who are still also on the legacy app. Everyone has both apps installed, but slowly communication starts to move back to the legacy app because it's the "superset" of all friend-groups.

baxuz 2 hours ago [-]
You wouldn't, as competitors who use these dark patterns should be regulated out of existence in normal countries.

Every other form of mass media is regulated, for good reason.

slightwinder 1 hours ago [-]
> Social media as a concept can definitely be fixed. Just stop doing algorithms, period.

Algorithms only make them worse, not bad. The flaws are there, because people.

> Stop pretending that people want to use social media for entertainment and news and celebrities.

No, people buy this content because they want it, not because it's there.

> Social media is meant to be a place where you get updates about the lives of people you follow.

Strange claim. Nothing is preventing people from doing exactly this.

grishka 58 minutes ago [-]
> Nothing is preventing people from doing exactly this.

I try my best to do this but it's futile.

See, even if you don't use the algorithm, the algorithm still uses you. So, for example, you can't just discuss something on Twitter with your followers. Sometimes it decides to show your tweet to a bunch of random people holding a radical version of views opposite to yours, and you're forced to use the block button way too damn much. You can't opt out of having your tweets recommended to strangers.

Even when that doesn't happen, many of your followers would still miss your posts if you don't appease the algorithm because they would still use the algorithmic feed — whether knowingly or because of all the dark patterns around that setting (it's very hidden, never sticks, or both).

So no, the very existence of the algorithmic feed is the problem, it ruins the experience for everyone regardless of whether they use it or not.

slightwinder 22 minutes ago [-]
> So, for example, you can't just discuss something on Twitter with your followers.

That is a problem with the platform. Nothing is stopping you from changing the platform. You want it to be different, but don't want to use the different platform that already exists?

> Even when that doesn't happen, many of your followers would still miss your posts if you don't appease the algorithm because they would still use the algorithmic feed

That's still their own "choice". They decide to use Twitter, and they decide to stay on the algorithmic view. But ok, to be fair, Twitter and other big networks are kinda inconsistent about which view they offer and use by default. Forcing them to give more stable control to the user would be good. But I doubt it would fix anything on the grand scale.

> So no, the very existence of the algorithmic feed is the problem

The algorithmic feed exists mainly because there is too much content, so you will miss something anyway. Removing it will fix nothing for most people; it will only change what you miss. People used self-configured algorithms even before social media existed. The demand has always been there.

But of course, we could talk about the actual implementation and its dark leanings.

grishka 2 minutes ago [-]
There usually isn't "too much content" when you only see content from accounts you follow, and especially when there are no logo accounts, only people.
0xDEAFBEAD 4 hours ago [-]
You know the "retweet" feature on Twitter didn't originally exist? Before the feature was implemented, people would just write "RT" followed by the author's username, then paste in the text of the tweet they wanted to retweet.
grishka 4 hours ago [-]
There's nothing wrong with reposts that are made knowingly by people you follow. My issue is with the current dominant social media platforms all focusing on forcing people to see content from outside of their network that they would've otherwise never seen, because neither they nor the people they follow would follow anything like that.
HPsquared 3 hours ago [-]
That's the old vBulletin forum approach. Recommendation algorithms solve the problem of scale when communities get too large. Winding back the clock doesn't work; things evolve to the next stage, especially if people already know what it looks like.
seydor 3 hours ago [-]
it will fix very little. The "problems" of social media are rooted in selfish human behavior; it's like a giant high school. You can't "fix" that because it's ingrained in humans.
skeezyboy 4 hours ago [-]
>Stop trying to turn it into interactive TV.

wait are you talking about social media or sites that play videos?

grishka 4 hours ago [-]
I'm not talking about literal television, but more about something where you don't really get to decide what you see; you go there to get entertained with whatever.
cmckn 4 hours ago [-]
For several years now, they’re mostly one and the same.
skeezyboy 4 hours ago [-]
for people who don't know what they're talking about, sure. but pedophiles and addictive videos are 2 completely different things and it would help if you defined which you're referring to
jstanley 5 hours ago [-]
> Just stop doing algorithms

While we're at it, shall we stop storing data?

grishka 5 hours ago [-]
You know what I meant.
fireflymetavrse 6 hours ago [-]
Why can't it be fixed? Just remove algorithms and show only subscribed content in chronological order. That's how most of the early platforms worked and it was fine.
slightwinder 1 hours ago [-]
That will fix barely anything. Early platforms were also bad, but they were new, filled with people still discovering things and trying out this new world, people who had more hope, a more positive view of the world, a better upbringing and experience from the offline world. This is gone now and won't come back. The global village is settled, and it's burning.
adamors 5 hours ago [-]
Probably because there's no monetary incentive for that, so "can't". It would mean the big social media companies collapsing, because their entire raison d'etre at this point is mass-manipulation.
grishka 5 hours ago [-]
> It would mean the big social media companies collapsing

And what would be the downside of that? :D

fraboniface 3 hours ago [-]
They mention that chronological order increases the amplification of extreme content. They don't seem to have tested only subscribed content though.
luxpir 6 hours ago [-]
I think it really is that simple. Have a discovery channel, a recommendations sidebar, and just stop trying to add "shareholder value" through flawed machine learning attempts. Maintain a useful piece of software: is it too much to ask of an earnings-driven corp? Probably.
skeezyboy 4 hours ago [-]
stop using it then.
skeezyboy 4 hours ago [-]
why do you treat it like absolute voodoo? it's a website that shows you videos, with the same algorithm they've been using for 20 years. it's only a problem now since basically iOS came out and now every single clueless non-technical person is on the internet and discovering decade-old memes for the first time.
fsflover 5 hours ago [-]
> That's how most of the early platforms worked and it was fine.

This is also how Mastodon works today, and it is fine.

unsignedint 14 hours ago [-]
Social media as a vessel for diverse discussion is a tall order. It’s too public, too tied to context, and ultimately a no-win game. No matter how carefully you present yourself, you’ll end up being the “bad guy” to someone. The moment a discussion touches even lightly on controversy, healthy dialogue becomes nearly impossible.

Think of it this way: you’re hosting a party, and an uninvited stranger kicks the door open, then starts criticizing how you make your bed. That’s about what it feels like to try to “fix” social media.

oersted 4 hours ago [-]
I really liked the Circles feature in Google+: you defined groups of friends, and you could make your posts visible to particular groups.

They were not like group chats or subreddits; the circles were just for you, an easy way to determine which of your followers would see one of your posts.

This kind of interaction was common in early Facebook and Twitter too, where only your friends or followers saw what you posted, often just whitelisted ones. It was not all public all the time. Google+ just made that a bit more granular.

I suppose that these dynamics have been overtaken by messaging apps, but it's not really the same thing. It's too direct, too real-time, with all messages mixed in; I like the more async and distributed nature of posts with comments.

Granted, if you really want a diverse discussion and to talk with everyone in the world at once, that's indeed a different problem and probably fundamentally impossible to make non-toxic; people are people.

delichon 12 hours ago [-]
Whitelisting solves the problem for me. I curate every tweet I see with a browser extension. Strangers can't kick down the door. I only see content from my direct follows. It dramatically reduces the stress. Maybe a little like horse blinkers.
Scrounger 10 hours ago [-]
> Whitelisting solves the problem for me. I curate every tweet I see with a browser extension. Strangers can't kick down the door. I only see content from my direct follows. It dramatically reduces the stress. Maybe a little like horse blinkers.

This extension? https://github.com/rxliuli/mass-block-twitter

jmiskovic 2 hours ago [-]
I see how this works for you and many others, but I hate this practice. You are not turning strangers away; you just preemptively shadow-ban everyone.
AuthAuth 11 hours ago [-]
How do you find new things to follow? If everyone did this it would be extremely rare to encounter new content.
owisd 7 hours ago [-]
We'd just go back to human curation: you'd whitelist a few curators you liked; people wanting to promote their content would email a link to a curator; if the curator thought their audience would like it, they'd share it; you'd see it via your whitelist, and if you liked the look of it, you'd whitelist them too.
saberience 3 hours ago [-]
Is this really an issue? I only message on WhatsApp with people I’ve met in real life and I prefer it that way.

I only want to engage with people on Twitter if I specifically add that person to a whitelist. I don't want to be subject to an algorithm that is trying to increase "engagement". I don't want more engagement with Twitter, and I don't want to see random posts which an AI thinks are going to enrage me so I stay on the site longer.

fsflover 5 hours ago [-]
So you are fighting against the platform that you're using. It reminds me of people constantly fighting with their own computer (Windows) to remove ads and crap. In both cases viable alternatives exist which don't require this huge effort.
throwawayq3423 11 hours ago [-]
Do you mind sharing this extension? I would prefer it if it also showed the retweets of people you follow, as that is an endorsement no matter what people say.
Scrounger 10 hours ago [-]
> Social media as a vessel for diverse discussion is a tall order. It’s too public, too tied to context, and ultimately a no-win game. No matter how carefully you present yourself, you’ll end up being the “bad guy” to someone. The moment a discussion touches even lightly on controversy, healthy dialogue becomes nearly impossible.

Worth reading Jaron Lanier's book Ten Arguments for Deleting Your Social Media Accounts Right Now:

https://www.amazon.com/Arguments-Deleting-Social-Media-Accou...

sien 5 hours ago [-]
It's not solved in real life. It's a huge ask that it should be solved on the internet.

Where are good discussions between really different viewpoints anywhere?

skeezyboy 4 hours ago [-]
none of that is the fault of a website. that's been the case for millennia.
incompatible 6 hours ago [-]
"Diverse discussion" is just something I don't want. Of course I've made up my mind about all kinds of things and I don't really need to see opposing points of view as though they are novel thoughts that I've never considered before. Sure, tell me again why your religion or your conspiracy theory proves that the scientific consensus is a hoax. Maybe you'll convince me this time?

I don't mind Mastodon, but I'm pretty selective in who I follow, and diversity of opinions isn't one of my criteria.

squigz 6 hours ago [-]
> I don't really need to see opposing points of view as though they are novel thoughts that I've never considered before

I mean, that's fine, if you think that you can consider all conceivable angles thoroughly, by yourself. I for one welcome opposing views, but I suppose if my idea of that meant "religion or conspiracy theories" I'd probably be avoiding it too.

greenchair 2 hours ago [-]
no need to consider all angles about an issue. not possible anyway. think about how much time you've wasted on HN reading opposing views that add no value.
squigz 2 hours ago [-]
I will do that, right after I spend some time thinking about how much time I've spent on HN reading opposing views that add immense value and wisdom to my life. :)
incompatible 5 hours ago [-]
I follow people I can learn from, not people who try to convince me that everything I already know is wrong. I don't follow people who post misinformation, reject science, or who think that ad hominem attacks are a valid form of debate. There are a lot of them out there!
dlachausse 13 minutes ago [-]
> I don't follow people who post misinformation

A little off topic, but where do you get your news? I am having the hardest time finding credible news sources that aren't full of misinformation or bias to the point of being Soviet Union Pravda levels of propaganda.

The best I've been able to do is pick a few sources that are left leaning and a few that are right leaning and try to glean the truth myself using critical thinking and, for particularly important topics, more in-depth independent research. The problem is that this is very time consuming and exhausting.

squigz 2 hours ago [-]
Fair enough. I guess I'm very lucky in that regard, as I don't run into that type of person very often
dlachausse 13 hours ago [-]
I still think old school linear forums are the best format for online discussion. They’re not perfect by any means, but I think they still beat all the alternatives I’ve tried.
slightwinder 53 minutes ago [-]
Linear is awful. Discussions are always so bad with them. The only advantage is that following the newest message is easier, which still doesn't prevent people from ignoring them if the thread becomes too long.

The main reason why you might think that way is less the format, and probably more the moderation and the lower number of people in those forums. I mean, take this forum here: it is far better than Twitter or any other social media slop. Smaller subreddits, especially with good moderation, are also far better than big subreddits.

dlachausse 19 minutes ago [-]
There are a couple of really big things I like about linear forums. As you said, you can follow the thread of conversation in chronological order. Another big thing for me is that it doesn't have comment karma. Upvotes and downvotes result in echo chambers that suppress alternative viewpoints. PG got it almost right here by making you have to meet a threshold to downvote, but I think it would be better if downvotes didn't exist here at all. It becomes an "I disagree with your opinion" button. Also, threaded discussions encourage people to just hijack one of the top-voted threads instead of replying to a more relevant person further down the line.

Also on the topic of moderation, I feel like less is more. Unless it's spam, child pornography, or threats of violence, it should probably just be left alone. Reddit in particular is extremely over moderated these days. Outside of a handful of subreddits, it is impossible to post conservative views because moderators will ban you for them. Just basic stuff like "I support <Republican candidate> and here's why..." results in a ban. Even just being subscribed to /r/conservative is enough to be automatically banned from several other subreddits. While there are exceptions, most of the moderators are in actuality just a bunch of petty, censorious tyrants.

baubino 12 hours ago [-]
The old school forums also centered around a single topic or interest, which I think helped keep things focused and more civil. Part of the problem with social media is that it wants to be everything for everyone.
AraceliHarker 9 hours ago [-]
The internet has become a primary battlefield for making money, and we can't go back to the days when it was just a non-commercial hobby that people enjoyed. To make money online, it's crucial to spread content as widely as possible, and the most effective methods for this are clickbait and ragebait. That's why the enshittification of the internet was inevitable.
bluefirebrand 11 hours ago [-]
I like old school forums with like an optional chat room for people to sync in real time if they want

The era of sites with a phpBB forum and an IRC channel was really fun for me and I miss it a lot

I made lots of friends that way in the past, close friends, and it's unlike anything I've encountered since then with social media

duxup 21 hours ago [-]
A lot of talk goes into how Facebook or other social media use algorithms to encourage engagement, that often includes outrage type content, fake news, rabbit holes and so on.

But here's the thing ... people CHOOSE to engage with that, and users even produce that content for social media platforms for free.

It's hard to escape that part.

I remember trying Bluesky and while I liked it better than Twitter, for me it was disappointing that it was just Twitter, but different. Outlandish short posts, same lame jokes / pithy appeals to our emotions, and so on. People on there want to behave the same way they wanted to on Twitter.

KaiserPro 21 hours ago [-]
> But here's the thing ... people CHOOSE to engage

Kinda, but they also don't really realise that they have much more control over the feed than they expect (in certain areas)

For the Reels/TikTok/For You Instagram feeds, it shows you subjects that you engage with. It will A/B test other subjects that similar people engage with. That's all it's doing: continual A/B testing to see if you like whatever flavour of bullshit is popular.

Most people don't realise that you can banish posts from your feeds by doing a long press "I don't like this" equivalent. It takes a few times for the machine to work out if it's an account, group of accounts, or theme that you don't like, and it'll stop showing it to you. (Threads, for example, took a very long time to stop showing me fucking sports.)

Why don't more people know this? Because it hurts short-term metrics for whatever bollocks the devs are working on, so it's not that well advertised. Just think how unsuccessful the experiments in the Facebook app would have been if you were able to block the "other posts we think you might like" experiments. How sad Zuckerberg would be that his assertion was actually bollocks?

RiverCrochet 21 hours ago [-]
There's definitely a mass of people who can't/won't/don't get past passive/least-effort relationships with things on screens. These would be the type that in the TV days would simply leave the TV on a specific channel all day and just watch whatever was on, and probably haven't changed their car radio dial from the station they set it to when they bought the car. In modern times they probably have the cable TV they still pay for on a 24-hour news channel and simply have that going all day.

To be fair, in times far past, you really didn't have much choice in TV or radio channels, and I suspect it's this demographic that tend to just scroll down Facebook and take what it gives without much thought other than pressing Like on stuff.

Mouvelie 20 hours ago [-]
Yup. Knowing the exact percentage of those people would be hurtful to my soul, I think, but I suspect they drive a meaningful percentage of business. Like that time when Netflix just started playing shows for people, because some couldn't be bothered to actually choose something to watch?
CrimsonCape 20 hours ago [-]
Transparency would prove or disprove this. Release the algorithm and let us decide for ourselves. In my experience, Instagram made an algorithm change 3-4 years ago. It used to be that my feed was exactly my interests. Then overnight my feed changed. It became a mix of (1) interracial relationship success stories, (2) scantily clad women clickbait, (3) East Asian "craft project" clickbait, and just general clickbait. It felt as if "here's what other people like you are clicking on" became part of the algorithm.
pkamb 5 hours ago [-]
Maybe the TikTok algorithm is better, but the "I don't like this" action on Meta properties just blatantly does not work. I still get the same type of clickbait content no matter how many times I try to get rid of it. Maybe watching other types of Reels would do it, but no thanks.
KaiserPro 3 hours ago [-]
On Facebook, yes, for many stupid reasons. It doesn't have an "I don't like this" function on most stuff, and there are no controls for stopping "non-friend" content injection in the feed.

In Instagram, it's very different.

First there is "snooze suggested content", which gives you a pure follow feed.

However, once you reach the end of that, you go into the "for you" feed, which has one "personality". Then there is the explore page, which has another "personality".

The new reels carousel stuff I think is possibly another personality.

So there are now three places where you need to yeet stuff you don't like.

I noticed that when the reels carousel was introduced they went heavy into thirst traps.

But again, this is a regulation issue. If this was the 1980s, there would be a moral panic causing something like the V-chip to stop "the youth" getting access to soft porn (not that it worked that well). Now it'll be an executive fatwa, which'll be reversed when he gets distracted by something else.

johnnyanmac 4 hours ago [-]
> people CHOOSE to engage with that

In the same way a smoker "chooses" to engage with cigarettes. Let's not underestimate the fact that core human programming is being exploited to enable such behavior. Similar to telling a smoker to "just put the cigarette down", we can't just suddenly tell people on social media to "stop being angry".

>people on [BlueSky] want to behave the same way they wanted to on Twitter.

Yes. Changing established habits is even harder to address. You can't make a horse drink (I'm sure anyone who has ever had to deal with a disengaged captive audience feels this in their soul). While it's become many people's primary "news source", aka the bread, most people came there for the circus.

I don't really have an answer here. Society needs to understand social media addiction the same way it understands sugar addiction: have it slammed in there that it's not healthy and should be used sparingly. That's not something you can fix with laws and regulation. Not something you fix in even a decade.

slightwinder 50 minutes ago [-]
> people CHOOSE to engage with that

Technically correct, but "choice" here is very simplified. The system is unable to understand WHY people engage with something, and in which way. That poisons the pool and enforces certain content and types of presentation.

PaulHoule 20 hours ago [-]
Personally I really enjoy Mastodon and Bluesky but I am very deliberate at avoiding negative people, I do not follow and often mute or block “diss abled” people who complain about everything or people who think I make their life awful because I am cisgender or who post 10 articles an hour about political outrage. The discover page on Bluesky is algorithmic and respects the “less like this” button and last time I looked has 75% less outrage than the following page. (A dislike button that works is a human right in social media!)

Once I get my database library reworked, a project I have in the queue is a classifier which filters out negative people so I can speed-follow without adding a bunch of negativity to my feed; this way I get to enjoy real gems like

https://mas.to/@skeletor

Cross posting that would cure some of the ills of LinkedIn!
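
The classifier doesn't need to be anything fancy. Roughly this kind of sketch is the idea (entirely hypothetical: a toy sentiment() scorer over an account's recent posts and a made-up follow threshold; the real thing would use a proper sentiment model or an LLM call):

    from statistics import mean

    NEGATIVE_WORDS = {"awful", "outrage", "hate", "worst", "disgusting"}

    def sentiment(text: str) -> float:
        """Toy scorer in roughly [-1, 0]; stand-in for a real model."""
        words = [w.strip(".,!?").lower() for w in text.split()]
        hits = sum(w in NEGATIVE_WORDS for w in words)
        return -min(1.0, 5.0 * hits / max(1, len(words)))

    def negativity(recent_posts: list[str]) -> float:
        """Average score over an account's recent posts; lower = more negative."""
        return mean(sentiment(p) for p in recent_posts) if recent_posts else 0.0

    def should_follow(recent_posts: list[str], threshold: float = -0.2) -> bool:
        return negativity(recent_posts) > threshold

    print(should_follow(["Lovely heron by the lake today", "New lens arrived!"]))
    print(should_follow(["This is the worst outrage yet", "I hate everything about this"]))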

Scrounger 10 hours ago [-]
> Bluesky

FWIW, I've been consistently posting quality stuff on Bluesky for the last year, and despite having a few hundred followers, I get ZERO engagement.

People in the Bluesky subreddit tell me it's not a "post and ghost" platform in that you have to constantly interact with people if you want to earn engagement, but that's too time consuming.

In other words, the discovery algorithm(s) on BlueSky sucks.

johnnyanmac 4 hours ago [-]
Maybe it doesn't suck. Others are just better at posting discoverable content than you. (note: "discoverable" =/= "engaging")

If we believe the claim that the discoverability algorithm avoids optimizing for "engagement", who would be more discoverable? The person coming in to show off one high-quality article every 6 months, or the person doing weekly blogs with some nuggets of information on the same topic?

Maybe your article goes viral, but odds are that the weekly blogger will amass more followers, get more comments, and build up to a point where, 99% of the time, they get more buzz on their updates than the one-hit wonder.

immibis 7 hours ago [-]
It's just Twitter 2. It's the same as Twitter, made by the same people who made Twitter, doing the same thing as Twitter in the same way as Twitter, with the same culture as Twitter, plus a fig leaf to decentralisation.
notTooFarGone 21 hours ago [-]
>people CHOOSE to engage with that

brains are wired that way. Gossip and rage bait aren't things people actively decide on; it's subconscious. It's weird to say this is a problem of individuals - propaganda is effective not because people choose to believe it.

prisenco 12 hours ago [-]
Right. When we're talking about the scale of humanity itself, we've moved far past individual actions.

At the scale we're operating, if only 1% is susceptible to these algorithms, that's enough to translate to noticeable social issues and second-order effects.

And it's not 1%.

PaulHoule 21 hours ago [-]
What gets me about some platforms is all the text-in-images and video with senseless motion. I've been dipping my toes into just about any social network where I could possibly promote my photography, and the worst of them all is Instagram, where all the senseless motion drives me crazy.
duxup 21 hours ago [-]
Yeah, I miss GeoCities. The pages were ugly, but they were that user's ugly ... gloriously personal ugly.

Facebook is not my page, it looks nothing like I want... my content is in many ways the least important thing featured.

avgDev 18 hours ago [-]
Current social media have basically found the "bliss point" of online engagement to generate revenue and keep eyes attached to the screen. These companies found a way to keep people hooked, and strong emotions seem to be a major tool.

It really isn't a choice. It is very accessible. Many friends are on social networks, and you slowly get sucked into shorts. Then it becomes an addiction as your brain craves the dopamine hits.

Similar to what Howard Moskowitz did with food.

pixl97 14 hours ago [-]
Another way to put it is, social media is an unregulated drug.
derbOac 21 hours ago [-]
> while I liked it better than Twitter, for me it was disappointing that it was just Twitter, but different

I feel exactly the same way.

I think there needs to be a kind of paradigm shift into something different, probably something that people in general don't have a good schema for right now.

Probably something decentralized or federated is necessary in my opinion, something in between email and twitter or reddit? But there's always these chicken and egg issues with adoption, who are early adopters, how that affects adoption, genuine UX-type issues etc.

johnnyanmac 4 hours ago [-]
Sounds like a return to old-school, long-term forums. They still exist, but there's a reason Reddit and Twitter took over the "forum space". They took the core ideas and injected them with "engagement". In this case, with the voting system of Reddit and the follower system of Twitter. Gamifying the act of interacting with people had effects beyond anyone's comprehension in 2007.
9rx 21 hours ago [-]
> Probably something decentralized or federated is necessary in my opinion, something in between email and twitter or reddit?

So, Usenet? The medium is the message and all that, sure, but unless you change where the message originates you are ultimately going to still end up in the same place.

taeric 7 hours ago [-]
This seems somewhat disproven by the existence of places like this? Strict moderation really does work wonders to prevent some of the worst behaviors.

Not that you won't have problems, even here, from time to time. But it is hard to argue that things aren't kept much more civil than in other spots?

And, in general, avoiding direct capital incentives to drive any questionable behavior seems a pretty safe route?

I would think this would be a lot like public parks and such. Disallow some commercial behaviors and actually enforce rules, and you can keep some pretty nice places?

magzter 7 hours ago [-]
I generally agree that strict moderation is the key, but there's obviously a certain threshold of users and activity where this becomes unfeasible - ycombinator user activity is next to nothing compared to sites like Facebook/Twitter/Reddit. Even on Reddit, you see smaller subreddits able to achieve this.

But just like a public park, if 2 million people rock up it's going to be next to impossible to police effectively.

1718627440 2 hours ago [-]
> there's obviously a certain threshold of users and activity that is hit where this becomes unfeasible

Not really. If 5 people can moderate 1000, surely 5000 can moderate 1 million. Divide et impera, it's not a new idea.

Just keep in mind that in a free market there is supposed to be no profit. If there is, then something is wrong. In this case the companies just don't feel like moderating and following laws.

SkepticalWhale 21 hours ago [-]
I'd like to see more software that amplifies local social interactions.

There are apps like Meetup, but a lot of people just find it too awkward. Introverts especially do not want to meet just for the sake of meeting people, so they fall back on social media.

Maybe this situation is fundamentally not helped by software. All of my best friendships organically formed in real-world settings like school, work, neighborhood, etc.

fellowniusmonk 21 hours ago [-]
I ran a co-working space social club that resolved this issue for many introverts in 2015-2017.

This is at its core a 3rd-places issue; I haven't had the capital to restart it post-COVID.

barbazoo 21 hours ago [-]
That sounds interesting. How did that work, did you rent a place for coworking and then opened it up for the social aspect?
fellowniusmonk 20 hours ago [-]
My tech (I was founding engineer and CTO) company took over a co-working space and expanded it, we ran that portion of it at break even.

We intentionally set out to create a social club/co-working space. A lot goes into it. I'm a non-theist who comes from a multi-generational group of theist church planters (like 100s of churches, just over and over). It's a multi-factorial process with distinct transitions in space-time and community size, where each transition has to be handled so you don't alienate your community's founding members (who will be very different from later members) and are still able to grow.

People don't do it because they can't see the value while they are in the early mess of it. You have to like people to pull it off; you have to NOT be a high-control person who can operate at high control at certain developmental stages. You have to have a moral compass everyone understands and you are consistent with; tech people like 0 trust. You have to create a maximum-trust environment, which means NOT extracting value from the community but understanding that the value is intrinsic in the community.

You have to design a space to facilitate work and play. It's not hard, but you have to get everything right: the community can't have a monoculture, it must be enjoyable/uncomfortable, and you must design things so people can choose their level of engagement and grow into the community. It's easier once it has enough inertia that they understand they are building a thing with real benefits.

Even things like the flow of foot traffic within the space, chokepoints narrowing, these kinds of things all affect how people interact.

AuthAuth 10 hours ago [-]
I've been wanting to set up something like a 3rd place that tries only to break even. I'm unfortunately not a very social person.

Because these 3rd spaces are open to anyone, they'll probably bring people in from internet communities. What do you do when someone comes along and they're not breaking any rules, but it's clear that no one likes them? I've seen it drive entire groups away, but because the person has done nothing wrong I can't/don't want to just say "fuck off kid, no one likes your weird ass".

Physkal 9 hours ago [-]
I would love to hear more about this. I am in need of a 3rd place, but unfortunately the only meetups around here are sports or churches. What did the members in your group do after you shut down? Were you open to members offering donations to keep your 3rd place going?
WaltPurvis 21 hours ago [-]
That's very interesting. Do you have time to elaborate a bit?
mindwok 9 hours ago [-]
This isn't a technology problem. Technology can help accessibility, but fundamentally this is an on-the-ground, social coordination problem.

Functioning, welcoming, and well-run communities are the only thing that solves this. Unfortunately, technology often makes this worse, because it creates such a convenient alternative and also creates a paradox of choice. I.e., people think "when there are 1000 meetups to check out, and this one isn't perfect, I'll just move on to the next one" when actually it's the act of commitment that makes a community good.

abnercoimbre 14 hours ago [-]
I call it technology for touching grass, e.g. look at The Offline Club [0]

[0] https://www.theoffline-club.com/

johnnyanmac 3 hours ago [-]
Indeed, you're describing the lack of a 3rd place. These days, maybe even the lack of a 2nd place as you graduate school and work is now fully remote. Without that societal push towards being in a public spot, many people will simply withdraw to themselves.

A third place would fix this, especially for men, who need "things". You go to a bar for a "thing", and if you meet some others to yell at sports with, bonus. We have fewer "things" for Gen Z, and those things happen rather infrequently in my experience. I'm not sure if a monthly Meetup is quite enough to form strong bonds.

PaulHoule 21 hours ago [-]
I work in survey research and I'm rather appalled at how many people would rather survey a sample of AIs than a sample of people and claim they can come to some valid conclusion as a result.

There are many ways AIs differ from real people, and any conclusions you can draw from them are limited at best -- we've had enough bad experiments done with real people:

https://en.wikipedia.org/wiki/Stanford_prison_experiment#Int...

jerf 21 hours ago [-]
Appalling. The entire question of "fixing social media", for any definition of "fixing", involves not just the initial reaction to some change but the second-and-greater-order effects. LLMs are point-in-time models and intrinsically can not be used for even guessing at second-order effects of a policy over time. This shouldn't have gotten past the proposal phase.
reactordev 21 hours ago [-]
I trust your judgement more than Ars Technica.

For us laymen, the flaw of using AI trained on people for surveys is, well, human. Humans have a unique tendency to be spontaneous, wouldn't you say?

How would a focus group research team approach this when they’re bombarded by AI solutions that want their research funds?

PaulHoule 20 hours ago [-]
The worst problems with people these days seem to be that they don't pick up the phone. Probability-based polls are still pretty good about most things unless they involve Donald Trump -- it seems some Trump supporters either don't pick up the phone or lie to pollsters. Some polls correct for this with aggressive weighting, but how scientific that really is is up in the air.
johnnyanmac 4 hours ago [-]
Yeah. Sadly, thanks to spam, our phones no longer get answered for any unidentified number. These days, if the message is important then they can leave a voicemail... and 90% of those voicemails turn out to be spam as well.

>unless they involve Donald Trump

A sense of shame perhaps. If you ask someone "how often do you brush your teeth" and compare it to more pragmatic testing, you see people have some sense of wanting to give the "right" answer. Even in a zero risk anonymous survey.

lispisok 19 hours ago [-]
>I work in survey research and I'm rather appalled at how many people would rather survey a sample of AIs than a sample of people and claim they can come to some valid conclusion as a result.

A YC company just launched doing exactly that.

https://news.ycombinator.com/item?id=44755654

add-sub-mul-div 21 hours ago [-]
"We trained a model on Twitter and Reddit content and were shocked to discover it generates a terrible community."

It's so weird to live in a time when what you just said needs to be said.

Mouvelie 20 hours ago [-]
Genuine question: are you scared for your job? I see this tendency to use "synthetic personas" growing and frankly, having to explain why this sucks is insulting in itself. Decision makers are just not interested in having this kind of thought argument.
johnnyanmac 4 hours ago [-]
Yes and mostly no. No, because I work in games and I've seen enough people thinking that a "good game" just needs pretty graphics and a facsimile of "fun" to know that AI can't ever simulate this. Most humans can't even seem to do it consistently, on all organizational levels.

But I have a footnote of "yes" because, as you said, decision makers are just not interested in having this discussion about "focus on making fun games". So it will unfortunately affect my job in the short and even medium term. Because so much of the big money in games these days is in fact not focused on making a game, but on trying to either generate a gambling simulator, an engagement trap, or (you guessed it) AI hype. Both to try and claim you can just poof up assets, and to try and replace labor.

Knowing this, I do have long term plans to break out into my own indie route.

PaulHoule 20 hours ago [-]
Not really. Sales is doing better than it ever has since I've been here. For one thing, AI folks want our data. Despite challenges in the industry, public opinion is more relevant than ever, and the areas where we are really unsurpassed are (1) historical data and (2) the most usable web site, the latter of which I am a part of.
naravara 19 hours ago [-]
It doesn't surprise me that they found the emergent behaviors didn't change, given their method. Modifying the simulation to make them behave differently would mean your rules have changed the model's behavior to "jump tracks" into simulating a different sort of person who would generate different outputs. It's not quite analogous to having the same Bob who likes fishing responding to different stimuli. Sort of like how Elon told Grok to be "unfathomably based" and "stop caring about being PC" and suddenly it turned into a Neo-Nazi Chan-troll. Changing the inputs for an LLM isn't taking a core identity and tweaking it; it's completely altering the relationships between all the tokens it's working with.

I would assume there is so much in the corpus based on behavior optimized for the actual existing social media we have that the behavior of the bots is not going to change. The bot isn't responding to incentives like a person would; it's mimicking the behavior it's been trained on. And if there isn't enough training data of behavior under the different inputs you're trying to test, you're not actually applying the "treatment" you think you are.

richardubright 20 hours ago [-]
Wait, what? Is there an article on this? That sounds absolutely insane.
PaulHoule 20 hours ago [-]
Lots of them, for instance https://dl.acm.org/doi/10.1145/3708319.3733685
wizzwizz4 14 hours ago [-]
https://arxiv.org/abs/2508.06950 "Large Language Models Do Not Simulate Human Psychology" is a recent preprint.
thrown-0825 10 hours ago [-]
The problem is people.

As a species we are greedy, self-serving, and short-sighted.

Social Media amplifies that, and we are well on our way to destroying ourselves.

mindwok 9 hours ago [-]
This is true of us as individuals, but importantly, as a society we have far more agency than it sometimes feels like when you watch us all acting out our own individual self-destruction.

Banning CFCs, making seatbelts a legal requirement, making drink driving illegal, gun control (in countries outside the USA), regulations on school canteens. These are all examples of coordination where we've solved problems further upstream so that individuals don't have to fight against their own greedy, self-serving, short-sighted nature.

We do have the ability to fix this stuff, it's just messy.

thrown-0825 8 hours ago [-]
If you don't think societies can be greedy, self-serving, and short-sighted, I don't know what to say.

We have raped this planet into a coma and our children will have to scrape together whatever remains when we are done.

mindwok 7 hours ago [-]
I didn't mean to imply that, they definitely can be those things and far worse. But there are many examples of societal coordination that achieve the exact opposite of that (Scandinavian countries are of course the canonical example).

Things can change.

SoftTalker 21 hours ago [-]
> Only some interventions showed modest improvements. None were able to fully disrupt the fundamental mechanisms producing the dysfunctional effects.

I think this is expected. Think back to newsgroups, email lists, web forums. They were pretty much all chronological or maybe had a simple scoring or upvoting mechanism. You still had outrage, flamewars, and the guy who always had to have the last word. Social media engagement algorithms probably do amplify that but the dysfunction was always part of it.

The only thing I've seen that works to reduce this is active moderation.

mediumsmart 8 hours ago [-]
Social media is a few people selling the data of many people looking at content made by some people selling something.

There is also research and promotion of values going on and the thing as a whole is entertaining and can be rigged or filtered on various levels by all participants.

It's kind of social. The general point system of karma or followers applies, and people can have a career and a feeling of accomplishment to look back on when they retire. The cosmic rule of "anything, too much, no good" applies.

It’s not really broken but this is the age of idiots and monsters, so all bets are off.

627467 14 hours ago [-]
> Ars Technica: I'm skeptical of AI in general, particularly in a research context, but there are very specific instances where it can be extremely useful. This strikes me as one of them, largely because your basic model proved to be so robust.

You can't accuse them of hiding their bias and contradictions.

How can a single paper using an unproven (for this type of research) technology dispel such (alleged) skepticism?

People bending over backwards to do propaganda to harvest clicks.

0xDEAFBEAD 4 hours ago [-]
I'm skeptical of proving stuff about new-social-media with LLMs, because LLMs themselves are [presumably] trained on quite a bit of existing-social-media text.
standardUser 20 hours ago [-]
> They then tested six different intervention strategies...

None of these approaches offer what I want, and what I think a lot of people want, which is a social network primarily of people you know and give at least one shit about. But in reality, most of us don't have extended social networks that can provide enough content to consistently entertain us. So, even if we don't want 'outside' content (as if that were an option), we'll gravitate to it out of boredom, and our feeds will gradually morph back into some version of the clusterfucks we all deal with today.

skeezyboy 4 hours ago [-]
ugh this again.

>Can we identify how to improve social media and create online spaces that are actually living up to those early promises of providing a public sphere where we can deliberate and debate politics in a constructive way?

they really pump up what is effectively a message board (facebook, twitter) or a video website with a comment/message feature (youtube, tiktok) or an instant messenger with groups (whatsapp). NONE OF THIS IS NEW.

seanwilson 8 hours ago [-]
Any interesting work on using LLMs to moderate posts/users? HN is often said to be different because of its moderation; couldn't you train an LLM moderator on similar rules to reduce trolls, ragebait, and low-effort posts at scale?

A big problem I see is that users acting in good faith are unable to hold back from replying to bad-faith posts, a failure to follow the old "don't feed the trolls" rule.
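The mechanics are easy enough to sketch. The snippet below is a toy illustration only, not HN's actual moderation or anything from the article: it asks an OpenAI-style chat model to score a post against made-up HN-ish guideline text and flags it for a human rather than removing anything.

    # Toy LLM-moderation sketch: the guideline text, labels, and model choice
    # are hypothetical; assumes an OpenAI-style chat API.
    import json
    from openai import OpenAI

    client = OpenAI()

    GUIDELINES = (
        "Be kind. Don't be snarky. Assume good faith. "
        "Avoid flamebait, ragebait, and shallow dismissals."
    )

    def moderate(post: str) -> dict:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            response_format={"type": "json_object"},
            messages=[
                {"role": "system", "content":
                    f"You are a forum moderator. Guidelines: {GUIDELINES} "
                    'Reply as JSON: {"verdict": "ok|flag", "reason": "..."}'},
                {"role": "user", "content": post},
            ],
        )
        return json.loads(resp.choices[0].message.content)

    result = moderate("lol, typical idiots in this thread, why do I even bother")
    if result["verdict"] == "flag":
        print("Queue for human review:", result["reason"])

Whether that gets anywhere near the quality of human moderation at HN's scale is the open question, and it does nothing for the second problem of good-faith users who can't resist feeding the trolls.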

burnte 20 hours ago [-]
Social media isn't the problem, people are the problem, and we're still working on how to fix them.
mritterhoff 19 hours ago [-]
I think that's an oversimplification. People have problems, sure, but just like alcohol, social media can and does exacerbate them. The answer to dealing with the former is regulation. I'm not sure that is feasible for the latter.
throwawayq3423 11 hours ago [-]
I guess you could say the problem is that the wrong things are rewarded and amplified, but that just goes back to people.
_DeadFred_ 16 hours ago [-]
Social media leverages the billions spent on marketing over the years, the skills of knowledgeable experts in multiple disciplines, basically thousands of human-years of expertise at manipulating people, against a random person with zero guard up who just wants to chat with their friends or make new ones. That isn't a people problem.
throwawayq3423 11 hours ago [-]
Designing social media as a positive place was, and continues to be, a choice that no one is making, because it's too damn profitable to make a hellhole / attention vacuum that people can't stop using.
positron26 10 hours ago [-]
Point-to-point communication between every human on Earth and every other human on Earth flattens the communication hierarchies that used to amplify expertise and a lot of other behaviors. We created new hierarchies, but they are mostly demagogues pandering to the middle. Direct delegation is sort of like trying to process an image without convolution: nobody knows what anyone else thinks, so we just trust that one neuron.
bentt 19 hours ago [-]
If you could plug into the inner thoughts of millions of people around the world at once, it would not be pleasant.

Social media has turned out to basically be this.

kundi 5 hours ago [-]
There are some social media networks that promise to do so; for example, https://izvir.org is one of them.
flixing 4 hours ago [-]
Really good article on that topic here: https://www.lookatmyprofile.org/blog/social-media-apps-engin...
hansmayer 6 hours ago [-]
Well, you can't, by definition, fix something that is a rigged game. Social media exists to maximise the ad dollar, not to benefit you.
Cortex5936 5 hours ago [-]
I'm honestly really tired of having to read through so much bloat in these types of articles. They can't just elaborate on exactly the thing in the title? They have to spend paragraphs writing stories?
dandaka 3 hours ago [-]
I am asking Claude to summarize
getnormality 14 hours ago [-]
> ...the dynamics that give rise to all those negative outcomes are structurally embedded in the very architecture of social media. So we're probably doomed...

No specific dynamics are named in the remainder of the article, so how are we supposed to know if they're "structurally embedded" in anything, let alone if we're doomed?

ElijahLynn 21 hours ago [-]
I'm reading Tim Urban's book titled "What's Our Problem".

It definitely explains the different types of thinking that are making up our current society, including on social media. I haven't gotten to the part yet where he suggests what to do about it, but it's a fascinating insight into our human behavior in this day and age.

seydor 3 hours ago [-]
What do they mean by "fixed"? Wasn't social media, from day one, about gossip, self-promotion, and gaslighting? They excel at that, so one would say they serve their purpose.

It's very misguided to pretend that social media mobs would replace "the press". There is a reason the press exists in the first place: to inform critically, instead of listening to hearsay.

farceSpherule 21 hours ago [-]
Social media is the new smoking...

Widespread adoption before understanding risks - embraced globally before fully grasping the mental health, social, and political consequences, especially for young people.

Delayed but significant harm - can lead to gradual impacts like reduced attention span, increased anxiety, depression, loneliness, and polarization.

Corporate incentives misaligned with public health - media companies design platforms for maximum engagement, leveraging psychological triggers while downplaying or disputing the extent of harm.

RiverCrochet 21 hours ago [-]
Not an accurate analogy in my opinion, but close.

- Smoking feels good but doesn't provide any useful function.

- Some social media use feels good and doesn't provide any useful function, but social media is extremely useful to cheaply keep in touch with friends and family and extremely useful for discovering and coordinating events.

Fortunately the "keep in touch" part can be done with apps that don't have so much of the "social media" part, like Telegram, Discord, and even Facebook Messenger versus the main app.

mvieira38 20 hours ago [-]
I think most of the social media power users don't connect with friends and family at all through the platforms. Young Gen Zers just scroll Tiktok (or whatever clone they prefer) and share the ones they like through snapchat/discord/telegram/messenger/sms/whatsapp. Some will post stuff for their friends to see through "close friends" or whatever, but it's much less personal than it once was with Facebook groups and whatnot
RiverCrochet 20 hours ago [-]
Agreed. And it's not necessary when you have so many apps. They're using Tiktok for scrolling and Discord when they actually want to chat with their friends.
_DeadFred_ 16 hours ago [-]
'Smoking gets me taking breaks, going outside more, and makes me more social chatting with my coworkers/others on smoke breaks'
bookman117 13 hours ago [-]
[dead]
mvieira38 20 hours ago [-]
This analogy undersells the negative impact of social media. Smoking wasn't a propaganda machine at the hands of a few faceless corpos with no clear affiliation, for example, nor did it form a global spynet
MeIam 20 hours ago [-]
The main reason that it can't be fixed is that it has political or corporate operators, and propaganda bots have taken over. There is always an agenda seeking supremacy running through social media threads, even on mundane topics.
AuthAuth 10 hours ago [-]
I think this problem is partly due to greedy algos and partly due to these sites being so large that they have no site culture.

Site culture is what prevents mods from having to step in and sort out every little disagreement. Modern social media actively discourages site culture, and post quality becomes a race to the bottom. Sure, it's harder to onboard new users when there are social rules that need to be learnt and followed, but you retain users and have a more enjoyable experience when everyone follows a basic etiquette.

KaiserPro 21 hours ago [-]
Social media can be fixed; it's just that the incentives are not aligned.

To make money, social media companies need people to stay on as long as possible. That means showing people sex, violence, rage and huge amounts of copyright infringements.

There is little advantage in creating real-world consequences for bad actors. Why? Because it hurts growth.

There was a reason why the old TV networks didn't let any old twat with a camera broadcast stuff on their network. Why? Because they would get huge fines if they broke decency "laws" (yes, America had/has censorship, hence why The Simpsons say "whoopee" and "snuggle").

There are few things that can cause company-ending fines for social media companies, which means we get almost no moderation.

Until that changes, social media will be "broken"

wk_end 21 hours ago [-]
> Social media can be fixed; it's just that the incentives are not aligned.

So social media can't be fixed. Incentives are what matter.

KaiserPro 4 hours ago [-]
The US does not have a working legislature. It hasn't for possibly ~15 years.

But if you think about how closely network TV was regulated by a government regulator, despite the power that those networks wielded (and against the incumbent radio and newspapers), we know it has happened.

The issue is that government in the USA has been dysfunctional for >30 years, not that regulation is ineffective.

amanaplanacanal 20 hours ago [-]
Incentives can be changed though, through law.
fergie 4 hours ago [-]
> these platforms too often create filter bubbles or echo chambers.

I thought the latest research had debunked this and showed that the _real_ source of conflict with social media is that people are forced out of their natural echo-chambers and exposed to opinions that they normally wouldn't have to contend with?

IAmGraydon 20 hours ago [-]
Social media in a profit-seeking system can't be fixed. Profit-seeking provides the evolutionary pressure to turn it into something truly destructive to users. The only way it can work is via ownership by a benevolent non-profit. However, that would likely eventually give in to corruption if given enough time. Outlawing it completely, as well as regulating the algorithmic shaping of the online experience, is probably the inevitable future. Unfortunately, it won't come until the current system causes a complete societal fracture and collapse.
RiverCrochet 20 hours ago [-]
If enough users are destroyed, advertisers (social media's real customers) won't have sufficient markets for their products, and profits will fall. Social media can't destroy its users and survive.

Seriously though, I disagree. Social media in a profit-seeking system can work if the users are the ones who pay. The easiest way for this to work, now that net neutrality is no longer a thing, is bundling through users' phone bills. If Facebook et al. were bundled similarly to how Netflix, Hulu, and other streaming apps are now packaged with phone plan deals, then the users would be the focus, not the advertisers. This might require that social media be legislatively required to offer true ad-free options, though.

moskie 12 hours ago [-]
I think you're on the right track, but not getting to what I view as the logical conclusion: publicly funded options, free at the point of service to everyone. I've also humored the idea of taking it one level of abstraction further: a publicly funded cloud computing infrastructure, access to which is free (up to a level of usage). People could then choose to use these cloud computing resources to host, say, federated instances of open social networks.

I mean, it will never happen, but I think it's a path that resolves a lot of problems, and therefore a fun thought experiment.

westurner 21 hours ago [-]
Do all of these points apply to the traditional media funhouse mirror that we love to hate, too?

> "The [structural] mechanism producing these problematic outcomes is really robust and hard to resolve."

I see illegal war, killing without due process, and kleptocracy. It's partly the media's fault. It's partly the people's fault for depending on advertising to subsidize free services, for gawking, for sharing without consideration, for voting in ignorance.

Social media reflects the people, who can't be "fixed" either.

If you're annoyed with all of these people on here who are lesser and more annoying than you, then stop spending so much time at the bar.

Can the bar be fixed?

cwmoore 21 hours ago [-]
Sure can!

“No smoking, gambling, or loose women.”

TaDAaaah!

weregiraffe 20 hours ago [-]
No loose men either.
cwmoore 19 hours ago [-]
Someone has to run the place.