Porn, kids, trojans and trust – an industry problem

The guys at Unruly held a fantastic event in London this week, titled Trust Talks. The event saw many senior leaders from the sell-side, buy-side, and everyone in-between come together to discuss issues of trust in our industry.

It got me thinking about trust. Specifically, about how gaining trust has helped me, and losing it has hampered me, in my career to date. I believe that trust is earned slowly and lost quickly. Behaviour is a big driver of both.

One of the most frustrating trust-related challenges I have faced in my career is what's collectively known as 'bad ads'. I have seen this problem both while working at a publisher and later at ad exchanges and SSPs, primarily supporting the monetisation of publishers' mobile apps.

Bad ads can mean many things to many people. Some consider a low-resolution creative a 'bad ad', and I would agree, but for different reasons. I am talking about wholly inappropriate content. Take trojan ads, for example: they present to a publisher's site or app as a brand-safe creative, and often bypass blocking tech as a result. Only after the ad has been served does it unpack to deliver a more sinister payload. The fact of the matter is that porn ads are being served on premium content and, worse still, being served to kids. Nothing erodes hard-earned trust faster than having to tell a publisher that you don't know how such an ad was allowed to serve in the first place, and that, despite your tech teams working flat out, they are completely unable to find its origin.


The complex ecosystem of buy-side and sell-side technology, much of which resells among itself, adds to the difficulty of finding the origin of a single creative. It's like searching for a needle in a haystack. Worse, an advertiser can set up a DSP account with as little as £50, so the barrier to entry is very low.
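The industry's main answer to this reseller maze is the IAB's SupplyChain ('schain') object, which attaches each hop in the selling chain to the bid request. As a rough illustration of why it helps, here is a minimal sketch of walking a schain to recover the path a creative took; the field names (complete, nodes, asi, sid, hp) come from the IAB spec, but the example domains and values are made up.

```python
# Illustrative sketch: tracing a creative's supply path via the IAB
# SupplyChain ("schain") object attached to an OpenRTB bid request.
# Field names follow the IAB spec; the domains below are hypothetical.

def trace_supply_path(schain: dict) -> list:
    """Return the ordered list of seller domains (asi) in the chain.

    Raises ValueError if the chain declares itself incomplete, in which
    case the true origin cannot be established from the schain alone.
    """
    if schain.get("complete") != 1:
        raise ValueError("schain incomplete: origin cannot be verified")
    return [node["asi"] for node in schain.get("nodes", [])]


example_schain = {
    "complete": 1,
    "ver": "1.0",
    "nodes": [
        # Nodes run from the system the demand entered first...
        {"asi": "origin-exchange.example", "sid": "pub-123", "hp": 1},
        # ...through each reseller hop on the way to the publisher.
        {"asi": "reseller-one.example", "sid": "res-456", "hp": 1},
    ],
}

print(" -> ".join(trace_supply_path(example_schain)))
```

Of course, this only works when every hop honestly declares itself and the chain is complete; an incomplete or spoofed chain is exactly the haystack problem described above.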

This can cause significant lost revenue for publisher and ad-tech vendor alike. If a publisher spots one of these ads and rightly demands that the ad-tech vendor kill it, the vendor is rarely able to do so, and certainly not quickly. Both parties lose revenue in this instance: the delay is unacceptable to the publisher (because the ad could serve again) but necessary for the ad-tech vendor (reconciling billions of ad requests takes time), and that unacceptable delay often results in the publisher suspending the whole exchange for hours, days, or longer.

As a father of two digitally active kids, I am concerned on both a business and personal level. Before writing this article, I spent a bit of time on Google, searching for some stats …

I found this Pubguard.com article from July 2018 that states ‘1.1 billion instances of sell-side ad-fraud are being served to the most vulnerable people in our society each year’. 1.1 billion? Wow.

So I want to know: how many porn ads are being served to our kids, and how do we stop it? This New York Times article from October 2018 is titled 'Your Kid's Apps Are Crammed With Ads', and one sentence is particularly ominous: 'In apps marketed for children 5 and under in the Google Play store, there were pop-up ads with disturbing imagery.' It doesn't say porn in those words, but I think disturbing imagery can certainly be classified under the moniker of 'bad ads'.

When you consider how huge the mobile in-app gaming sector is, and how many of those apps are games that attract a younger audience, it doesn't take a mathematician to conclude there is a high probability that our kids will have been served something awful at some point.

An article on Information-age.com described this process as ‘AdultSwine’ and said ‘Explicit pornographic adverts have been found on 60 apps – mainly aimed for children – on Google’s Play store’.

When ad-tech vendors can't effectively control what runs through their pipes, and publishers are unwittingly subjecting their users to such disturbing content, there is a big problem to solve. Reviewing every creative does not work perfectly either: the ad often presents as brand-safe and only unpacks its inappropriate content after it has been served into the app.
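To make the review problem concrete, here is a deliberately naive sketch of the kind of static heuristic a creative scanner might apply. Real ad-quality vendors go far beyond this (sandboxed rendering, behavioural analysis), and the patterns below are simplified assumptions rather than a real blocklist; the point is that a creative which defers and obfuscates its payload can look clean to a simple pre-serve check.

```python
import re

# Illustrative only: a naive static scan of creative markup for the kind
# of deferred, obfuscated loading that trojan creatives use to slip past
# a pre-serve review. The patterns are simplified assumptions.

SUSPICIOUS_PATTERNS = [
    r"\beval\s*\(",           # executing dynamically built code
    r"\batob\s*\(",           # decoding base64 payloads at render time
    r"document\.write\s*\(",  # injecting markup after the creative loads
    r"setTimeout\s*\(",       # delaying the payload until after review
]

def scan_creative(markup: str) -> list:
    """Return the suspicious patterns found in the creative markup."""
    return [p for p in SUSPICIOUS_PATTERNS if re.search(p, markup)]


clean = '<a href="https://brand.example"><img src="banner.jpg"></a>'
shady = '<script>setTimeout(function(){eval(atob("..."));}, 5000);</script>'

print(scan_creative(clean))  # []
print(scan_creative(shady))  # flags eval, atob and setTimeout
```

The obvious counter-move for a bad actor is to fetch the payload from a remote server only after serving, leaving nothing suspicious in the markup at all, which is exactly why static review alone keeps failing.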

In an article on Independent.co.uk, an adult advertiser blamed third parties for allowing their porn ad to be served into a kids' app, and the app developer said they had asked all their partners but were unable to find the source. That one article sums up the problem perfectly, I think.

Before getting into this industry, I never thought I would have cause to write the words porn, kids, trojans and trust in that order. This appears to be a big problem though, and one that needs solving.

As always, I would love to spark a debate and hear the thoughts of industry leaders who are in a position to positively shape the future of this challenge. What are your thoughts on the problem and solutions?


Here are the links to the articles I have referenced above:

Pubguard – https://pubguard.com/1-1-billion-trojan-ads/

NYT – https://www.nytimes.com/2018/10/30/style/kids-study-apps-advertising.html

IA – https://www.information-age.com/malware-displays-porn-ads-childrens-apps-123470361/

Indy – https://www.independent.co.uk/news/media/advertising/explicit-porn-advert-banned-after-appearing-in-talking-tom-app-used-by-children-10310701.html

 
