More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”

  • CashewNut 🏴󠁢󠁥󠁧󠁿
    -2 · 2 years ago

    I actually prefer this type of hands-off approach. I find it offensive that people would refuse to let me see things because they deem them too “bad” for me to deal with. I find it insulting that anyone would stop me from reading how to make meth or reading Mein Kampf. I’m 40yo and it’s pretty fucking difficult to offend me, and the idea that I’m going to be driven to commit crime just by reading something is offensive.

    I don’t need protecting from speech/information. I’m perfectly capable and confident in my own views to deal with bullshit of all types.

    If you’re incapable of dealing with it - then don’t fucking read it.

    Fact is the more you clamp down on stuff like this the more you drive people into the shadows. 4chan and the darkweb become havens of ‘victimhood’ where they can spout their bullshit and create terrorists. When you prohibit information/speech you give it power.

    In high school it was common for everyone to hunt for the Anarchist/Jolly Roger Cookbook. I imagine there’s kids now who see it as a challenge to get hold of it and terrorist manuals - not because they want to blow shit up, but because it’s taboo!

    Same with drugs - don’t pick and eat that mushroom. Don’t burn that plant. Anyone with 0.1% of curiosity will ask “why?” and do it because they want to know why it’s prohibited.

    Porn is another example. The more you lock it down the more people will thirst for it.

    Open it all up to the bright light of day. Show it up for all its naked stupidity.

    • mo_ztt ✅
      0 · 2 years ago

      Agreed. I actually came back to this topic specifically to make this exact point, which, for all the time I’d spent on this, I feel like I hadn’t yet said.

      People are adults, generally speaking. It’s weird to claim that you can’t have a newsletter with a literal swastika on it because people would be able to read it but somehow unable to realize that what it’s saying is dangerous violence. Apparently we need someone “in charge” of making sure only the good stuff gets published, and of keeping away the bad stuff, so people won’t be influenced by it. This is a weird viewpoint. It’s one the founding fathers were not at all in agreement with.

      Personally, I do think that there’s a place for organized opposition to slick internet propaganda which pulls people down the right-wing rabbit hole, because that’s a huge problem right now. I don’t actually know what that opposition looks like, and I can definitely see a place for banning certain behaviors (bot accounts, funded troll operations, disguising the source of a message) that people might class as “free speech,” or adding counterbalancing “free speech” in kind to misleading messages (Twitter’s “community notes” are actually a pretty good way of combating it for example). But simply knee-jerking that we have to find the people who are wrong, and ban them, because if we let people say wrong stuff then other people will read it and become wrong, is a very childish way to look at people who consume media on the internet.