The UK’s Betting and Gaming Council (BGC) has announced ground-breaking rules for the British gambling industry aimed at preventing under-18s from seeing its adverts online. The focus now moves to how the giant online advertising platforms, such as Google and Facebook, will facilitate the application of this policy.
In truth, at present, the platforms can only guesstimate the ages of their users, or must take their word for it when they supply a date of birth to open an account. This forces advertisers to apply a wide margin for error in their demographic selection, aiming at those believed to be at least 25, and to apply other targeting safeguards to try to comply with regulations.
In a welcome change of emphasis, the nascent UK trade association for the regulated betting industry, which last year combined separate bodies for the casinos, bookmakers and remote operators, frequently describes itself as a ‘standards body’. It has also acquired the self-regulatory role of the Industry Group for Responsible Gambling, publishing the new rules in the Sixth Industry Code for Socially Responsible Advertising.
From the start of October, BGC members must ensure that all sponsored or paid-for social media adverts are targeted at consumers aged 25 and over, unless the website can prove its adverts can be precisely targeted at over-18s – echoing the “Challenge 25” concept from the hospitality industry (which is not surprising, as the BGC’s accomplished and widely respected chair, Brigid Simmonds OBE, is a former CEO of the British Beer and Pub Association).
The new code also includes a requirement that gambling ads appearing on search engines must make clear that they are for those aged 18 and over. In addition, the adverts themselves must also include safer gambling messages.
A particular focus of the announcement was YouTube. Users will need age-verified accounts before they can view gambling ads, which the BGC argues will guarantee that they cannot be seen by under-18s.
This is perhaps where the theory of the new policy may run up against the practical problem facing all advertisers of age-restricted products – none of the major tech platforms can convincingly distinguish between the children and adults viewing their content. You can open a new Google account, supply a fake date of birth, and then access YouTube with an “age-verified” account.
Most social media sites set a minimum age of 13 – the age at which children in the UK are deemed capable of providing consent to their personal data being processed without a parent also approving – but this is generally enforced based on the claimed date of birth of new users.
The allure of a Facebook or TikTok account is more than enough to persuade a ten-year-old to deduct three from their year of birth. While it may not be the end of the world for them to get an Instagram account a couple of years too soon, the consequence is that these platforms record a false age – and so deem the child to have turned 18 several years too soon.
Former shadow Secretary of State for Culture, and now CEO of the BGC, Michael Dugher appears to recognise this challenge: “It is vital that the big internet platforms honour their responsibilities to protect people online and we hope the Government will use its forthcoming Online Harms Bill to that effect.”
The only safeguard at present is so-called ‘age-assurance’. Platforms use algorithms to score a user’s social profile – their likes, their friendships, their school – to assess if these are out of kilter with their claimed age. It is not clear how many accounts are shut down by such checks – but there is no shortage of research to prove how many underage accounts escape them.
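Purely as an illustration of the heuristic scoring described above, such a check might look like the sketch below. Every field name and threshold here is invented for the example; real platforms draw on far richer signals and machine-learned models:

```python
# Hypothetical 'age assurance' check: flag accounts whose profile
# signals are out of kilter with their claimed age. All names and
# thresholds are invented for illustration.

from dataclasses import dataclass


@dataclass
class Profile:
    claimed_age: int
    school_type: str        # e.g. "primary", "secondary", "none"
    median_friend_age: int  # median age of the account's friends


def looks_underage(p: Profile) -> bool:
    """Return True if the claimed age conflicts with profile signals."""
    # An 'adult' who still lists a school is suspicious.
    if p.claimed_age >= 18 and p.school_type in ("primary", "secondary"):
        return True
    # A claimed age far above the friend group suggests inflation.
    if p.claimed_age - p.median_friend_age > 5:
        return True
    return False


print(looks_underage(Profile(19, "secondary", 13)))  # True
```

As the article notes, such heuristics catch some false dates of birth but plenty of underage accounts slip through, which is why they fall short of genuine verification.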
So for the gambling industry – and others, such as alcohol or unhealthy foods – to comply with age restrictions on their advertising, platforms must now introduce effective, independent age checks. This is neither onerous nor expensive. To begin with, they could invite any users they currently record as being over 18 to complete a short age-verification process.
Then, on their claimed 18th birthday, new adults can be asked to do likewise, with the promise of access to more content previously off limits. This way, the platforms will quickly develop a sub-set of users they have firmly verified as adults – all of whom can safely be served age-restricted advertising.
Only effective age verification, conducted independently and to an agreed standard accepted by regulators, can deliver this goal. Platforms using certified AV technology, such as AgeChecked, will be capable of accurately targeting adult users to the BSI standard PAS 1296, already specified by the Home Office for online alcohol sales.
Gambling operators will also benefit from more efficient marketing, as they can have complete confidence that they are not inadvertently – and pointlessly – paying for ads seen by those too young to gamble. The benefit to the platforms is that they would no longer need to apply a “Think 25” buffer and could safely advertise to 18-24-year-olds as well, offering a wider audience to advertisers of age-restricted goods.
And because age-verification providers adopt a “verify once, use many times” approach, passing an age check will increasingly be invisible to the consumer, with no interruption to their user experience. If they’ve previously ordered beer from a supermarket, the chances are they’ll already have been verified. This new safety-tech sector is moving rapidly to introduce interoperability between AV providers, so they are able to recognise each other’s age checks, further reducing any impact on customers.
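To make the “verify once, use many times” idea concrete, here is a minimal sketch of a reusable, signed “over-18” assertion that a cooperating site could accept without ever learning the user’s identity. The scheme, the shared key, and all field names are invented for illustration; real AV providers use standardised, independently audited protocols:

```python
# Illustrative sketch only: a signed, reusable over-18 assertion.
# The key handling and token format are invented for this example.

import base64
import hashlib
import hmac
import json

SECRET = b"demo-shared-key"  # assumption: providers establish trust via shared keys or PKI


def issue_token(user_ref: str) -> str:
    """Issue a signed assertion that user_ref passed an over-18 check."""
    payload = json.dumps({"ref": user_ref, "over18": True}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig


def check_token(token: str) -> bool:
    """Any cooperating site can verify the assertion without identity data."""
    data, sig = token.rsplit(".", 1)
    payload = base64.b64decode(data)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and json.loads(payload)["over18"]
```

The point of the design is that the token carries only an opaque reference and an age attribute, so the site serving an age-restricted advert learns nothing about who the user is – only that a trusted provider has already checked their age.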
This work, led by the AV providers’ own trade association, of which I am co-chair, will accelerate the scale required to allow any website to confirm the age of the user, without needing to know their identity or access any personal data. That will allow for compliance not only with advertising rules, but also with the new Age Appropriate Design Code, which became law on 2 September and threatens fines of up to four per cent of global turnover for allowing children’s data to be processed in any way that may be harmful to them, mentally or physically.
While the UK government is still planning action to impose a duty of care on websites to protect children more generally, it is innovations like that of the BGC’s new advertising code which are more likely to drive this important first principle for online protection – you need to know with confidence whether each user is a child or an adult.
Alastair Graham is the Founder and CEO of AgeChecked, the online age verification solutions provider. Alastair is an entrepreneur with over twenty years’ experience of launching businesses and solutions into regulated markets. Prior to founding AgeChecked, he spent ten years in the payments industry as co-founder of a prepaid card company and as CEO of a financial institution in the UK.
Alastair has been closely involved with the development of age-checking legislation in the UK. He is Co-Chair of the Age Verification Providers Association and sits on the Digital Policy Alliance Age Verification and Internet Safety Working Group. Alastair sat on the British Standards Institution steering committee that produced the Publicly Available Specification for online age checking.