Thanks to Arnold Kling for his reply. I think, though, he is much too fatalistic about improving online sociability. Today’s internet was optimized to instantly propagate impulsively or algorithmically created content that goes viral by provoking outrage, thereby generating attention, data, and revenue for advertisers. It would be difficult to design a system worse for human interaction. That leaves a lot of room for improvement.
Pace Kling, I don’t think “centralized curation of content” and content restrictions are most of what will happen as digital media grapple with antisocial behavior, though we’ll certainly see some of those (and have, and should). More important, and more successful, will be human-machine partnerships and platform redesigns that identify toxic behavior and content and help users avoid them. Facebook doesn’t ban fake news items, but it does demote them in news feeds and refer users to better information (reducing fake posts’ share velocity by 80 percent, Facebook says). Google gives preferred placement in search results to fact-checked items and has built a special search engine to make fact-checks easier to find—among many other initiatives. In a recent interview with Sam Harris, Twitter co-founder and CEO Jack Dorsey discusses a host of design changes Twitter is considering to improve users’ experience and behavior. He argues that coping with antisocial behavior will rely more on product design than policy design: that is, on changing how the platforms function to make them more pro-social, rather than regulating users and chasing transgressors. That sounds right to me.
I could go on, but here’s the point: far too many of us have accepted social media’s design flaws as inevitable. That is like accepting Libya and Venezuela as models for our government and economy. Today’s digital infrastructure has barely begun building the institutional guardrails and behavioral incentives that we take for granted in other social forums, and the problem has only just begun to get serious attention. No one says the problem is easy, but fatalism is self-defeating.
On another subject, if this conversation had a “like” button, I’d be jabbing it to endorse Don Downs’s witty and very apt phrase “emotional correctness.” Don, did you come up with that? Cuz it’s brilliant. To be sure, political ideology remains an important driver of campus attacks on free speech, but recent years have seen a shift toward what a political scientist I know calls therapeutic progressivism: a notion of social justice based on assuring the self-esteem and emotional safety of historically disadvantaged minorities. “Emotional correctness,” indeed.