
Looking at how unhealthy products are marketed online

From Afternoons, 1:25 pm on 4 October 2022

Over the last decade, digital platforms have created a new form of advertising known simply as ‘dark’ advertising.

Unlike real-world advertising, dark ads are only visible to the people they’re targeted at.

For products like alcohol, where the public has an interest in how they are being advertised, that is a huge challenge to public accountability, says University of Queensland professor Nic Carah. 

He researches digital media and has looked into how harmful industries market online. 


Because these adverts are unique to each individual, Carah and his team of researchers relied on volunteers who installed a plugin on their computers, essentially scooping up all the ads they saw.

With 2000 people in Australia using the plugin, the research team was able to collect half a million ads in six months.

In another project, young Australians were asked to screenshot ads they saw on their phone. 

Little by little, the larger picture became clear – who the algorithm perceives you to be plays a big role in what ads you see. 

Young men were targeted with sports betting ads, and when it came to alcohol, young women were often shown advertising that was pink and full of botanicals. 

“It’s an extremely data driven model and so it learns things about people and then shapes the flows of ads they see,” Carah says. 

The fuel in the online ad engine is a list of 700-1000 words that a platform has created about you, called ad interest data, which builds an incredibly rich portrait of who you are, he says.

It’s a machine learning model sitting under a platform like Facebook watching everything you do online. 

“For example, what we found was the more alcohol you consume the more alcohol-related words you have in your list – the platform kind of learns who’s a high-volume consumer of alcohol.” 

It will even associate you with moods and colours. 

You then get compared to everyone else on the platform and grouped together with others who have a similar list of words. 

“And then they watch what kind of ads and brands you interact with and figure out, well, other people with a similar list of words might like the same product as you.” 
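The mechanism Carah describes – infer a list of interest words per person, then group people whose lists overlap – can be sketched in a few lines. This is purely an illustrative toy, not the platform's actual system: the user names, interest words, and the use of Jaccard similarity are all assumptions for the example.

```python
# Toy sketch of interest-list matching (illustrative only, not a real platform's model).

def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two word lists: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical "ad interest" word lists a platform might infer from behaviour.
users = {
    "user_a": {"craft beer", "football", "sports betting", "barbecue"},
    "user_b": {"craft beer", "football", "barbecue", "camping"},
    "user_c": {"yoga", "botanicals", "gin", "skincare"},
}

def most_similar(target: str) -> str:
    """Find the user whose word list most resembles the target's."""
    others = [u for u in users if u != target]
    return max(others, key=lambda u: jaccard(users[target], users[u]))

# Ads that user_b interacted with become candidates to show user_a.
print(most_similar("user_a"))  # user_b
```

In a real system the lists run to hundreds of words and the comparison happens inside a machine learning model rather than a hand-written similarity function, but the logic – you see what people with similar lists clicked on – is the same.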

If you’re a young parent, Carah says the algorithm may assume you’ll be stressed out at a particular time of the day, and want a drink, so you’ll start to see alcohol ads. 

“They’re not just working out your moods and vulnerabilities but also quite intimate domestic practices of your everyday life.” 

Carah believes advertising companies need to be accountable to the public. 

“We’ve always had ads in the public and it’s enabled civil society and the regulators to in a sense have some checks and balances on advertisers and what they do – I think that’s the problem. If you take the advertising completely out of public view, you need to figure out some sort of way to put it back in front of the public so we can collectively agree on what’s appropriate.”