Many technology companies rely heavily on algorithms to drive decision-making across their businesses. Without this technology, they could not achieve the efficiencies and capabilities that they do.
But can key decisions and judgment calls be handed over to machines that we know cannot think like us and that often make mistakes? Major advertising platforms leave algorithms to make vital decisions every day, which can be frustrating for their clients.
Platforms like Google AdWords and Facebook minimise costs by employing algorithms to pre-approve adverts on their platforms. Although very advanced at processing data, these algorithms inevitably make mistakes when encountering a new ad or image, which can lead to disapprovals for advertisers. This often requires an appeal and a lengthy approval period, resulting in lost time, lost revenue and poor customer service. Advertisers end up being treated like guinea pigs, subject to algorithmic testing.
Algorithms make mistakes on tasks that require human-like judgment; these are functions that humans still need to perform. I’ve seen many cases where appeals against algorithmic decisions were upheld by human reviewers. There are situations, as we shall see, where a judgment call needs to be made by a human, such as questions of public decency. It is absurd that algorithms should be making these calls.
Nursing wear: A debate on public decency
Whether a mother should be able to nurse her baby in public has been at the forefront of public debate. Many argue that there’s nothing more natural than a mother nursing her baby, and public opinion tends to be supportive of breastfeeding in public places. In the US, breastfeeding in public is legal in 49 states. It is also legal in Australia. A debate.org poll has 61% of respondents in favour and 39% against. Although most are in favour, the debate remains front and centre for many.
Without entering the debate over breastfeeding in public itself, it is still important to consider what counts as an acceptable decision in online advertising.
Our client manufactures and sells nursing wear tops that allow women to nurse comfortably and discreetly in public. Newborn babies need to feed every two to three hours, so it is only practical for a mother to be able to nurse them while out and about. As the tagline of the client’s new Facebook ad campaign states: ‘Nurse comfortably & conveniently in public’.
A violation of Facebook advertising policies?
We recently launched a new campaign for this client on Facebook, and as soon as our new ads went live, we received the following notice:
Facebook has told us that our ads are flagged as showing ‘nudity or cleavage’. Looking closely at these three ads, it is very difficult to see any instances of actual nudity or cleavage:
The ads show a mother with a child against her breast; however, there is no nudity or cleavage. On the other hand, there are many ads from other advertisers that are currently running, and so must have been approved, that do show cleavage, for example:
It is unclear to us how ads of Victoria’s Secret models wearing nothing but bikinis are approved to run on Facebook, while ads of women holding babies to their breast in more modest clothing are not.
Who is responsible for the mistakes?
I’ve been running campaigns on advertising platforms for close to eight years now. I’ve seen many cases of adverts disapproved by algorithms only to have the decision overturned by a human reviewer following an appeals process. I’ve even seen instances where the exact same banner ad has been approved in one campaign on AdWords whilst in a second campaign it was disapproved.
Algorithms make mistakes.
The issue isn’t whether nudity in Facebook advertising is okay or not. Nor are we debating whether breastfeeding in public is right or wrong. The issue is whether we can leave these decisions in the hands of algorithms to make. And if we do, who is responsible for the mistakes that are made?
The above example illustrates that when a mistake is made, it can play out with a larger effect than intended. Should we point fingers at Facebook? Is the platform not saying, at least indirectly, that images of women breastfeeding count as nudity and are not fit for the public domain? Even if it was an algorithmic mistake, surely the owner of the algorithm is responsible.
The employment of algorithms is undoubtedly aimed at increasing efficiency, with the end goal of increasing profits. I believe that since this is a profit-making exercise, advertisers should not have to suffer unreasonable disapprovals and long appeals processes.
I also believe that the onus of responsibility for decisions made by machines should lie with the advertising platforms. They need to be held to higher standards of customer service and held accountable when those decisions go wrong.