Yesterday I wrote an article about why I won’t be installing CovidSafe. A day later, I’ve become concerned by the local tech industry’s response to the app, and by journalists reporting on security analyses they have misunderstood.
Whether maliciously designed or not, the CovidSafe app is significantly better at general surveillance than it is at tracing the spread of SARS-CoV-2.
If CovidSafe is to truly be a health app and not a pseudo-surveillance app, then it needs to be rebuilt to maintain privacy using a decentralised model and using a design that enables it to provide trustworthy public health advice.
Countries like Germany have, in the past week, switched their tracing apps to the decentralised model jointly designed by Apple and Google for exactly these reasons. Privacy is what motivated Apple and Google to come together and build this capability deep into their mobile operating systems: governments can build apps on top of it without unnecessarily violating the privacy or security of their citizens.
The Apple-Google model is entirely incompatible with the Australian government’s CovidSafe app as it stands today. While the government are deciding whether to make the source code available for public analysis, some groups have reverse-engineered the iOS and Android CovidSafe apps to take a sneak peek. However, this analysis doesn’t show that the entire system is secure or effective; it only shows that the client-side apps have no significant issues.
I’d have been surprised if the client apps did have any significant issues.
Contrary to media reporting, the security industry has not given CovidSafe the nod, and I expect to see the medical sector come together with the tech/security industries in the coming weeks to build an app that works.
What’s the problem?
Bear with me as I try to make an accessible analogy for the CovidSafe app:
You have a house. The government has passed new laws. Now you must get your locks and keys from the government, and they keep a spare copy of everyone’s key in a special storage facility in Sydney. They’ve hired modern Amazonian warriors to keep guard.
The government promises to keep your spare keys safe and to only use them when necessary. They promise that they won’t change their policy to use your key for other uses in the future.
Last year, they made a new set of laws that said that the local banks (i.e. tech companies), if asked, would have to give the government the keys to your vaults (i.e. your online data) and made it illegal for the banks and their employees to go to the media about it.
The local banks, who specialise in secure vaults, are showing support for the government’s new initiative to issue and keep your house keys as well. It might be a smart decision to toe the line and promote the changes positively, rather than to upset the government.
Foreign banks (i.e. Google and Apple) have said they will produce special secure locks with keys for everyone without giving governments access, because governments shouldn’t be keeping copies of every key. They work together in an unprecedented capacity to design special secure front doors for homes that will allow their customers to be protected, and to give governments only the essential access needed.
Some great people have been able to pull apart the locks to see how they work. The locks seem to function relatively well, albeit not as well as other locks; it’s better than nothing. But we still don’t know anything about how the government are producing or storing the keys, or how easily someone else might steal them.
The government quietly says that they intend to use the new lock and key system to prosecute people more easily. It’s unclear whether new laws will be rapidly introduced, but recently introduced laws have been ambiguous and difficult to interpret, so it’s possible that people acting ethically could be criminally charged.
The media praise the government for having produced some good locks, and continue to push more people to install them. It’s for the safety of the nation, they say, that the government should make your locks and store your keys.
They don’t ask why we’re not using a better solution for fear of backlash.
CovidSafe is Flawed
The above analogy is absurd, but it’s exactly what’s happening today. There are several major flaws in the government’s implementation of CovidSafe, and they’re being brushed aside in the name of public safety.
We all understand that it is safer to keep our own house keys with us, and that there’s no need for the government to produce our keys and locks for us. We know that the government-provided locks wouldn’t be great. We know that there’s no need for the government to keep a copy of all of our house keys.
The situation is the same with our tracing data. While it was a great decision to only record tracing data using Bluetooth rather than tracking data with GPS, this is not enough. The health authorities only need to know who is sick, and who might be sick. They don’t need to know whether one person has been in contact with another. It’s a slight but critical difference.
It’s no argument to claim that corporations already have too much of our data, and so we shouldn’t worry about this. The point is that we need to be more protective of all our personal data, especially because we generally don’t understand the consequences of giving it up until it’s too late.
We need to turn the tide in favour of good data security now.
Elephant in the Room
Our Chief Medical Officer in charge of the project stated that the app won’t record interactions less than 15 minutes long, and this is by design. He says this is because, “We don’t want to give the contact tracers a list of 1000 phone numbers when there are 25 that are much more relevant in terms of potential contact”.
Just as the 5-second rule is not real, there is no 15-minute rule either. The limit exists because of bureaucratic resource constraints, but if the app reports only interactions longer than 15 minutes, we’re going to miss most community infections.
The entire basis of CovidSafe being a public health app is flawed because of this important detail.
CovidSafe will only be good for telling the government who you have spent more than 15 minutes with. It will encourage people to relax social distancing measures as it provides a false sense of security.
If they do, it will directly lead to more and unnecessary deaths.
A Better Solution
There is a better solution available for tracing the disease: the DP-3T model, which inspired the Apple-Google implementation. It’s a model for an app that maintains your privacy and security without the need for complex government oversight. An app built on this model can notify everyone you’ve been in proximity with, not only those you interacted with for longer than 15 minutes, and no added effort is needed from anyone for the system to function.
Here’s a basic (but still complex) explanation on how it works. You have three kinds of keys (think of them like very long random passwords or codes):
- Master key: a very long random number that your phone keeps secret and belongs only to you
- Daily key: Every day, your phone uses a special mathematical function to combine the current date with your master key to produce a ‘daily key’. The function is special because it can’t be reversed to reveal your master key
- Proximity key: Every 10–20 minutes, your phone uses the same kind of one-way function to combine your daily key with the current time window of the day to produce a ‘proximity key’. This rotating key is broadcast via Bluetooth, and your phone records the proximity keys of other devices in range.
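The derivation scheme above can be sketched in a few lines of Python. This is a simplified illustration, not the real DP-3T or Apple-Google construction (which differ in detail); it uses HMAC-SHA256 as a stand-in for the “special mathematical function”, and the labels inside the derivations are arbitrary:

```python
import hashlib
import hmac
import os

def derive(key: bytes, message: str) -> bytes:
    # One-way derivation: the output does not reveal `key`.
    return hmac.new(key, message.encode(), hashlib.sha256).digest()

# Master key: random, kept secret on the phone, never shared.
master_key = os.urandom(32)

def daily_key(master: bytes, date: str) -> bytes:
    # Combine the master key with the current date.
    return derive(master, f"daily|{date}")

def proximity_key(daily: bytes, window: int) -> bytes:
    # `window` = which 10-20 minute slot of the day we are in.
    return derive(daily, f"proximity|{window}")

dk = daily_key(master_key, "2020-04-27")
pk = proximity_key(dk, 42)  # this is what gets broadcast via Bluetooth
```

Because HMAC is one-way, anyone holding `pk` or `dk` cannot work backwards to `master_key`, which is what makes broadcasting the rotating proximity key safe.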
If you have a keen eye you’ll notice that each of the three kinds of keys is related to the others. While you can’t use a proximity key or a daily key to reproduce someone’s master key, you can use someone’s daily key to produce their list of proximity keys.
Let’s say someone tests positive for COVID-19. They then get their app to upload the past few days’ worth of daily keys to a public server. This data is in no way identifiable, and the database is designed to be publicly accessible — like a special noticeboard.
Every day, your phone checks this list of other people’s daily keys, which represent people who are known to have been diagnosed positive. For each daily key on the list, your app uses a special function to produce a set of that person’s proximity keys for the entire day. If any of those generated proximity keys match a proximity key you’ve previously recorded, then you know you’ve been in proximity to someone who has just been diagnosed positive.
Importantly, you don’t need to know anything else about that person. You don’t need any middle party to tell you that you might be infected. The government don’t need to limit the app to 15-minute encounters just because they lack the resources to call everyone about shorter ones. The government don’t need to store data on who you’ve been in contact with.
The government do need to know when someone has tested positive, though, and it would be great if they could find out who has been in contact with someone infected. The app can easily include a feature to notify the health authority so they can still get this data.
The decentralised model is an incredibly elegant solution that performs much better than CovidSafe while maintaining the privacy of citizens. This is what Apple and Google have used as the basis for their impending updates and are currently asking governments around the world to use as a basis for their local tracing apps.
There is no need for CovidSafe to be implemented in the way it has been. It’s ineffective and it violates the privacy and security of Australian citizens.
We need a safe, working app as soon as possible. Real contact tracing is critically important to overcome the spread of SARS-CoV-2.
We have the capability for a solution to do this now, so let’s ditch CovidSafe V1.0 and build a legitimate CovidSafe 2.0 starting today.