
Wednesday, November 28, 2018

NPR Promotes Internet Censorship Advocates

Yesterday's edition of _All Things Considered_ featured an interview with representatives of several authoritarian groups with names like Data & Society Institute, Counter Extremism Project, and (most ironically) New Knowledge, all of whom want social media platforms to suppress "extreme" content. Of course, they want to decide for the rest of us what's "extreme." No one arguing the opposite case was given a platform, so I've submitted this comment:

In re: "Critics Say YouTube Hasn't Done Enough To Crack Down On Extremist Content": No, we are not comfortable with a paternalistic approach, and we shouldn't be. As John Stuart Mill explained a long time ago:

"He who knows only his own side of the case knows little of that. His reasons may be good, and no one may have been able to refute them. But if he is equally unable to refute the reasons on the opposite side, if he does not so much as know what they are, he has no ground for preferring either opinion... Nor is it enough that he should hear the opinions of adversaries from his own teachers, presented as they state them, and accompanied by what they offer as refutations. He must be able to hear them from persons who actually believe them... he must know them in their most plausible and persuasive form.” (On Liberty)

Social media platforms must justify their oligopoly by acting in the public interest, and that means not censoring content that is legal. What they can do is help keep us out of the "filter bubble" by showing us things we're likely to disagree with, as well as things we agree with, on the subjects that interest us. This doesn't and shouldn't mean targeting any particular content; rather, it means using both people's "likes" and "dislikes" in the algorithm that decides what to show them next.
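
To make that concrete, here is a minimal sketch of what a "bubble-aware" ranking score could look like. Everything in it (the field names, the weighting, the 0.3 dissent weight) is my own assumption for illustration; it is not any platform's actual recommendation code.

```python
# Hypothetical sketch of a "bubble-aware" ranking score. All names, fields,
# and weights are illustrative assumptions, not a real platform's algorithm.

from dataclasses import dataclass


@dataclass
class Item:
    topic_relevance: float  # 0..1: how closely the item matches the user's interests
    agreement: float        # -1..1: predicted agreement, inferred from past likes/dislikes


def bubble_aware_score(item: Item, dissent_weight: float = 0.3) -> float:
    """Rank items mostly by predicted interest, but reserve some weight for
    topically relevant items the user is likely to dislike, so the feed
    includes the other side of subjects the user actually follows."""
    liked_side = item.topic_relevance * max(item.agreement, 0.0)
    contrary_side = item.topic_relevance * max(-item.agreement, 0.0)
    return (1.0 - dissent_weight) * liked_side + dissent_weight * contrary_side


if __name__ == "__main__":
    agreeable = Item(topic_relevance=0.9, agreement=0.8)
    contrary = Item(topic_relevance=0.9, agreement=-0.8)
    off_topic = Item(topic_relevance=0.1, agreement=-0.9)
    print(bubble_aware_score(agreeable))  # high: matches interests and views
    print(bubble_aware_score(contrary))   # nonzero: credible dissent on a followed topic
    print(bubble_aware_score(off_topic))  # near zero: disagreement alone isn't enough
```

The point of the dissent term is that disagreement only earns a slot when the topic is one the user already cares about, so the feed broadens rather than simply irritates.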

But we don't have to wait for the platforms to act. Any individual who wants to put in the time and energy can engage in their own freelance "cognitive infiltration" by finding people who may be getting dangerously "bubbled" and exposing them to contrary information.
