Ideally, engineers would work side-by-side with ethicists when developing new technology, according to journalist and academic Waleed Aly.
Take self-driving cars. Engineers building the technology that drives them must tell the car what to do when it faces a collision with a pedestrian. Should the car be programmed to swerve and miss the pedestrian, even if that means killing the driver? Or should the driver’s safety be paramount?
If that were a regular car, the decision would be made in a split second by the driver. In the case of self-driving cars, the ethical conundrum falls on those who build them.
“Who should decide this? Should we leave this to car manufacturers, or software manufacturers, who are going to be creating these cars?” Aly asks.
“Does Google make this decision for us? Is that appropriate? Should it be the government? Should it be the driver? We’re talking about the moral fibre of society really. Is there something ethically different about a split-second decision that can be critiqued afterwards, or one that is made well in advance?
“These are ethical questions that you cannot escape merely by technology.”
Aly was speaking on the topic of ethics in technology at the Above All Human conference in Melbourne on Tuesday. He says it’s important to examine the reasons behind developing new technology, in order to be aware of, and to accept, the ethical implications of such decisions.
“I would just ask you to think about it. If you can ask yourself a series of questions about what it is you’re about to unleash, that perhaps you weren’t asking before, and you could perhaps talk to other people,” Aly says.
“Shouldn’t engineers and ethicists be in the same room when they’re designing these (self-driving) cars? If not, that’s when we have a problem.”
A good ethicist won’t tell developers what to think, Aly says; rather, they will ensure that engineers understand and are comfortable with the implications of their actions.
“We’re not used to having ethical conversations because we don’t have people that we have access to that are very good at them, and we don’t recognise what they look like when we do have them,” he says.
“If you reach a conclusion, it’s not that your conclusion is incontestable, it’s that it’s considered. It’s that you’ve actually thought about what it is you’re doing.”