Well, there goes any remaining hope of being recommended by Substack. But that is the point: Substack does do recommendations, whether via their own newsletter, or via encouraging newsletters to recommend, or via algorithms. And that makes them a publisher, or, at least, not a neutral platform.
Platformer has been chronicling its uneasiness with the presence of Nazis on Substack and the company’s willingness to monetize said Nazi newsletters. It points out that:
Substack doesn’t want to do that. It wants to be seen as a pure infrastructure provider — something like Cloudflare, which seemingly only has to moderate content once every few years. But Cloudflare doesn’t recommend blogs. It does not send out a digest of websites to visit. It doesn’t run a text-based social network, or recommend posts you might like right at the top.
…
The moment a platform begins to recommend content is the moment it can no longer claim to be simple software.
That last line is key. Substack wants to be seen as software, as a platform, so it can accrue all of the legal and moral protections of being a platform while still profiting from Nazis. In general, that is a reasonable proposition. As the Platformer article points out, no one blames Microsoft if a Nazi uses Word to write their screeds. But that is not the situation for Substack.
Substack’s recommendations, whether they are hand-rolled (like the time one of its owners had a known white supremacist on his podcast and treated him as a reasonable person to recommend newsletters — something he has publicly said he does not regret) or created via algorithm, mean that Substack is putting its hands on the newsletters it hosts to shape the conversation of its readers. It is, in other words, acting as a publisher.
No, it does not matter if some of the recommendations are algorithmically derived. A human being created the rules that went into the algorithm. You cannot evade responsibility for your recommendations by claiming that the computer made you do it. You programmed the computer — it is your responsibility.
Nor does it matter if they have never recommended a Nazi newsletter — something I don’t know whether they have done. By making decisions about what to promote or not promote while hosting Nazis, you open up the possibility that you will profit from recommending Nazis. More importantly, the act of recommending, of steering the conversation, means you are not a neutral platform but a publisher, and should be treated as such.
Substack knows this, of course, which is why it hides behind the idea that debate will defeat Nazis. I dealt with that silly notion here — WWII and the Holocaust definitely proved Nazi ideas abhorrent — but for our purposes now the point is that they know they are publishing and so argue that publishing Nazis is helpful to society. You will never see, for example, a sex worker’s newsletter recommended by Substack, because they ban such material from their platform. They do not make the same editorial choice with Nazis.
Substack wants to make money from Nazis, period, full stop. If they were just software, that might be an unfortunate side effect of the nature of America. Again, no one blames Microsoft for Nazis writing in Word. Microsoft is merely selling a product, not setting up a system where Word tells us to go look at this cool writing. Substack’s growth plan depends in part upon recommendations. If it recommends new newsletters to users that they might enjoy — and yes, eventually that will almost certainly include the Nazi newsletters — then more people will subscribe to more newsletters and Substack’s take will increase. They are making editorial choices in service of their business — the textbook definition of a publisher.
So, it is not an exaggeration to say that Substack is a publisher publishing Nazi newsletters. Morally, and I hope eventually legally, that is precisely what they are.