After the initial lockdowns, in an effort to get businesses like retailers, bars and restaurants back to work, a flurry of digital check-in solutions emerged to help businesses comply with contact tracing requirements. These systems provide QR codes, enable customers to sign in, maintain records and, if needed, disclose personal information to contact tracers. Many also request consent for direct marketing, either as an option or by bundled consent (that is, the customer must agree to receive marketing or they can’t check in). In some states, such as New South Wales (NSW), electronic check-in services are now mandatory.

But what about privacy? A recent study conducted by the Consumer Policy Research Centre found that 94 per cent of Australians are concerned about how their personal data is shared online. Can users trust the cavalcade of new app providers? Are these providers regulated? What happens to the data? Does keeping each other safe mean giving up our right to privacy – to not be tracked unnecessarily or receive unwanted marketing? And how do organisations that are using check-in apps manage the risks created by this new genre of service providers?

What laws apply?

Private sector organisations are covered by Australia’s federal privacy law, the Privacy Act 1988 (Cth) (Privacy Act). However, section 6D states that the Act does not apply to “small business operators”, which it defines as businesses with an annual turnover of less than A$3 million (with some exceptions).

So, do COVID-19 check-in app providers fall within this definition of small business operator? The answer is: it depends. Some are, and so are not regulated by the Privacy Act. As such, they could use check-in data for other purposes – such as for marketing or analytics, to build profiles or to sell to third parties – all without consequences.

Let’s take a closer look.

Paid apps

Generally speaking, paid check-in app providers generate QR codes so users can navigate to the right page. The providers then collect and hold users’ personal data, and disclose it to contact tracers when required. Users generally give explicit consent to this data collection, and the business that is using the provider’s service pays a monthly fee.

Some providers are large, established businesses and may have an annual turnover of more than A$3 million, which excludes them from the category of small business operator and means they must comply with the Privacy Act.

Other providers are startups, and may not draw a large profit, but by running a COVID-19 check-in app they are arguably “disclosing personal information... for a benefit, service or advantage”. This is commonly referred to as ‘trading in personal information’ and excludes them from being considered a small business operator under section 6D(4) unless they have the consent of the affected individuals (section 6D(8)) – which, by and large, they do. Check-in apps usually include a consent request, and no one is being forced to dine at a restaurant or sit in a café – they choose to. Based on all this, these smaller providers probably still qualify as small business operators, exempt from the provisions of the Privacy Act.

Free apps

Some check-in apps are free to use, so the provider is not receiving a “benefit, service or advantage” for disclosing or otherwise dealing with the check-in records. Many of those providers are likely to be startups with low revenue, so they could also be considered small business operators within the meaning of the Privacy Act.

The small business exemption is a significant gap in Privacy Act coverage, allowing organisations to collect large amounts of personal information but remain exempt from privacy regulations.

Australia is unusual in this respect. Other privacy regimes, like the European Union’s General Data Protection Regulation, don’t have a similar carve-out.

How can we make sure privacy rights are protected?

State and Territory governments including NSW, Victoria, South Australia, Western Australia, Tasmania, the Northern Territory and the Australian Capital Territory are now offering their own free check-in solutions. They’re not covered by the Privacy Act, but their handling of personal information is regulated by their respective State or Territory privacy legislation. They also provide users with clear and specific assurances about security, guaranteeing that they will not use the data for secondary purposes like marketing, and promising to delete data after 28 days if it’s not needed for contact tracing.

To date, there hasn’t been any legislative action at the Federal level to ensure that check-in providers are subject to privacy laws. As of 1 January 2021, NSW requires all hospitality venues and hairdressers to use its State-developed free check-in app. Other State and Territory governments strongly recommend that businesses use their check-in apps, but haven’t mandated it. The Australian privacy regulator – the Office of the Australian Information Commissioner (OAIC) – recently concluded its consultation on draft guidance for COVID-19 check-in solutions. The current (draft) recommendation is that businesses choose their check-in apps carefully – specifically, that they use check-in providers that are subject to the Privacy Act – and that other providers should voluntarily opt in to Privacy Act coverage (under section 6EA).

It’s also notable that Australia is currently embarking on a review of the Privacy Act. An Issues Paper has been released to seek community input into the review. One of the questions posed in the paper is whether the small business exemption should be amended. Indeed, recent experiences with the multiple check-in solutions underline why this question is more relevant than ever. It is likely the review will consider the small business exemption in detail: whether it continues to be relevant in the digital age, or whether amending or removing it might create an unreasonable impost on small businesses or stifle innovation. Significantly, the Office of the Australian Information Commissioner itself has recommended that the small business exemption be removed.

Why does it matter?

For any digital service to be effective, the public needs to be able to trust it. Creating the conditions for trust requires transparency, clear rules and clear consequences for breaking the rules. If we can’t establish trusted relationships with users, users may act to protect themselves – by providing false names or contact details, for example, which ultimately makes everyone less safe.

Ensuring privacy rights are protected – and are seen to be protected – is key to building that trust. As we continue to develop digital solutions to address public health problems, trust needs to be a key consideration in the design process.

How can organisations build trust?

The first step is to ensure your organisation is using a trustworthy COVID-19 check-in solution. Customers don’t necessarily distinguish between an organisation and its service providers. If a provider breaches the Privacy Act, it can reflect poorly on the organisation that engaged it. Depending on the solution, you may want to consider assessing the provider’s privacy and security performance, how personal data will flow through the solution and how your organisation will interface with it. This can help you proactively identify risks and build in protections.

If your organisation is looking to build trusted relationships with your customers, you need to start with transparency. Provide your customers with clear, succinct and readable privacy messages whenever you are requesting information, so they understand what data you are collecting, why it is needed and how it will be used. You should back this up with a more comprehensive (but still plain-language) privacy policy, so customers can learn more if they so desire.

Behind the scenes, consider whether your policy framework and internal processes are sufficient. It’s not enough to just ask your customers to trust you; you must be able to demonstrate that you are trustworthy. As an organisation, consider whether you are handling personal information in a responsible and compliant way. Have you implemented policies and processes that enable you to comply with the Australian Privacy Principles, as required by APP 1 (i.e., open and transparent management of personal information)? Are those policies and processes effective?