On Monday, US web infrastructure and security company Cloudflare bowed to increasing pressure and announced it was cutting ties with forum website 8chan.
That pressure followed the latest mass shootings in the US — this time in El Paso, Texas, and Dayton, Ohio — and suggestions the suspected perpetrator of the El Paso attack may have been inspired by 8chan and posted on the forum shortly beforehand.
Founded in 2009, Cloudflare is designed to protect websites from the kind of cyber security attacks that could disrupt them enough to knock them — and their content — off the web.
Previously, it had stayed conspicuously neutral on the kinds of sites it supports, with the exception of white supremacist site Daily Stormer, which it cut off from its services in 2017.
In a statement on Monday, Cloudflare founder and chief executive Matthew Prince stressed that its position is primarily that of a network provider.
“In pursuit of our goal of helping build a better internet, we’ve considered it important to provide our security services broadly to make sure as many users as possible are secure, and thereby making cyber attacks less attractive — regardless of the content of those websites,” Prince said.
However, he described 8chan as “uniquely lawless”.
“That lawlessness has contributed to multiple horrific tragedies. Enough is enough.”
From Prince’s full statement, it’s clear this was not an easy decision for the founder to make, and not one he was wholly comfortable with.
But for any tech provider offering services to external parties, at some point there is going to be a question of who it will do business with, and at what cost.
Entrepreneur or content arbiter?
Speaking to StartupSmart, Andrea Gardiner, founder of Aussie VC firm Jelix Ventures, says she is “completely in agreement” with Prince’s decision, and his lengthy statement about it.
“It’s a pretty tough gig of a founder to play the role of a content arbiter,” she says.
“There’s no political legitimacy for them to make those determinations of what is acceptable or not, and where to draw the line,” she notes.
“On the other hand … when it comes to the point that it’s really fuelling hate and potentially hate-motivated violence, then there’s a moral imperative not to support it,” she adds.
“I just don’t think founders, and these companies, can turn a blind eye to it.”
Prince’s statement also stresses that Cloudflare is not a political or legal body.
“Questions around content are real societal issues that need politically legitimate solutions,” Prince says.
And Gardiner is of the same opinion, saying there has to be a political agenda around what can and cannot be published online.
While big tech companies like Facebook do put a lot of resources into moderating content, she says, they have no legal remit to do so.
“I don’t think it should be Facebook’s responsibility to draw the line. It actually should be enshrined in law,” Gardiner says.
European legislators are already working on laws that begin to tackle this, and that have different considerations for businesses that actually publish content, such as Facebook and YouTube, and those that merely facilitate third-party sites, such as Cloudflare.
“In principle, I’m not a big one for over-regulation, but I think there does need to be some sort of regulatory obligations,” Gardiner says.
Such rules should only pertain to very extreme views, she stresses. And — importantly — they should be defined with significant consultation with the tech industry.
“The politicians would make a hash of it if they tried to do it without that. It wouldn’t be workable. They would end up destroying the tech businesses,” she says.
Indeed, after the terrorist attack in Christchurch earlier this year, new laws were rushed through the Senate, penalising tech companies that host “abhorrent violent material”, without any real consultation with the industry.
Having the government manage legislation to do with moderating content online “would be potentially completely disastrous unless technology companies are the ones helping to propose and develop the solutions”, Gardiner says.
“They’re the only ones that actually understand what is doable and what is not.”
Find your moral code
But, in the absence of any legal framework, M8 Ventures partner Alan Jones calls on startups to place more importance on their own moral code.
Private companies are often expected to be amoral, he says.
“Individuals should be held to a shared moral code, and even our political representatives should adhere to a moral code, but for some reason, private companies get a free pass from acting morally,” he notes.
Society seems to allow businesses to run entirely in pursuit of profit, he adds. That’s not a luxury that’s afforded to anyone else.
While Jones agrees Cloudflare’s severing of services to 8chan was indeed the right thing to do, what is interesting is that Prince felt so uncomfortable doing it.
“We continue to feel incredibly uncomfortable about playing the role of content arbiter and do not plan to exercise it often,” Prince said in the statement.
However, Jones says this was not an editorial decision. It was a moral one.
“I would encourage every startup founder and startup employee to explore the dimensions of their shared moral compass as a company,” he says.
“A moral framework is just as important for a private company as it is for an individual or a government.”
Jones says this is not just an issue for the Cloudflares of the world, or even the Facebooks and YouTubes. It’s something founders should consider from day one.
“The sooner you begin to involve a moral framework in the development of your business, the easier it is.”
This is one of many things in early-stage startups that “seems very important, but not urgent”, Jones explains.
When founders have so many other things to do and so few resources, that can make it “very, very difficult to address”, he adds.
“But it doesn’t get easier. It gets harder the longer you leave it.”
You don’t get a free pass
Jones notes two examples of Australian startup giants that have implemented their moral codes particularly well: Atlassian and Canva.
Canva’s terms and conditions, for example, have a clause saying the startup “does not support and will not tolerate its service being used to discriminate against others, especially when based on race, religion, sex, sexual orientation, age, disability, ancestry or national origin”.
Users are not permitted to use Canva to incite or support discrimination, or to incite violence. And if it suspects they are, it will suspend offending accounts “without notice and liability”.
As a private company, Jones notes, it’s Canva’s prerogative to disallow conduct that doesn’t fit with its own values. And all startups have the power to be just as brutal. Clearly, this approach hasn’t been at the expense of success.
But, that’s not to say startups should take any particular position on anything, Jones says. They should just know where they stand themselves.
“If you want to be a gay-hating gun-toting startup, go for it,” he adds.
“I’m not saying all of Australia’s tech startups have to have the same moral stance as me. That’s not the point at all.”
However, private companies are starting to have more and more responsibility for the services we use on a daily basis.
“So, I don’t think we can afford to continue to give them a free pass from moral obligation,” he says.
“We all want to go home with some money in our pocket, to make all the sacrifices worthwhile. But most of us in the startup industry genuinely do want to try to make the world a little bit better.”