The morality of technology

Last week The Verge published an article about brewing discontent at Signal, the company behind the secure messaging app. The company doesn’t have plans in place for what happens if they discover bad actors (criminal gangs, terrorists, etc.) are using Signal to organise themselves, and many working at Signal are very unhappy with this. Those employees see opportunities to stop that kind of illicit activity, but the leadership team appears to have no desire to take them.

The Verge pulls out a quote from Brian Acton, co-founder of WhatsApp (now departed), who bankrolled Signal and is still active in its day-to-day operations:

“There is no morality attached to technology, it’s people that attach morality to technology,” Acton told Steven Levy for his book Facebook: The Inside Story. Acton continued:

“It’s not up to technologists to be the ones to render judgment. I don’t like being a nanny company. Insofar as people use a product in India or Myanmar or anywhere for hate crimes or terrorism or anything else, let’s stop looking at the technology and start asking questions about the people.”

This reminds me of the argument used against gun control. “Guns don’t kill people,” the argument goes, “People kill people.” It’s a false dichotomy. People with guns kill people. Both are implicated. Take away the guns and you take away the people-with-guns.

The technology we create may be neutral, but we are responsible for its creation, for what it makes hard or easy to do, and for the world we want. If we don’t want our technology to be aiding terrorists then we can choose to make things more difficult for them; we might even be able to stop them. If we are okay with our technology aiding terrorists then let’s say so, and see what happens. If we are okay with our technology aiding terrorists but don’t want to say so, then we are living in denial.

Photo by Brian Evans