The day of reckoning for smartphone contact tracing is here – Fast Company

Contact tracing via smartphones has reached a crucial moment. In early September, Apple and Google announced the release of an app-free COVID-19-tracing program that will alert users when they come into contact with someone infected with the virus. Until now, state public health authorities have had to release their own contact tracing apps built on Apple and Google’s privacy-friendly “exposure notification” technology. With those bespoke apps no longer needed, millions of iPhones and Android phones around the country will ask their owners this month whether they want to enable exposure notifications. What will Americans answer?

Probably a resounding “no.” Despite the huge potential health benefits of smartphone contact tracing—especially if at least 60% of the population participates—many Americans will opt out of exposure notifications because Apple, Google, and the tech industry as a whole have lost our trust. And the only way to rebuild that trust is with new laws.

When did we become so suspicious of the tech industry? Facebook’s Cambridge Analytica scandal marked a turning point for the sector’s general image, but Apple and Google have done little to improve their own standing. Google has faced countless privacy controversies over the years, from scanning emails, to tracking children through education products, to a whole litany of shady dealings within its digital advertising empire. While Apple has long made lofty claims about privacy (with CEO Tim Cook going so far as to call privacy “a fundamental human right”) and has built many privacy and security protections into the iPhone, it has still permitted and profited from some of the biggest privacy offenders running rampant on the App Store. The broader sector faces similar levels of public distrust—according to the 2020 Edelman Trust Barometer, trust in tech sank to new lows this year, and it experienced a sharper drop than any other industry.

Nevertheless, exposure notifications are an impressively privacy-friendly and trustworthy technology. The system uses Bluetooth to approximate the distance between users’ phones, meaning health authorities cannot collect, let alone track, a user’s actual location as they could with GPS. The exposure notification system also doesn’t collect or transmit personally identifiable information, and it uses cryptographically secured temporary identifiers to make sure it can’t be taken advantage of by hackers or data-thirsty advertisers.
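The privacy property described above can be sketched in simplified form: each phone broadcasts short-lived identifiers derived from a secret daily key, and matching happens entirely on-device once an infected user publishes their keys. This is an illustrative sketch only, assuming an HMAC-based derivation; the real Apple–Google protocol uses its own HKDF- and AES-based key schedule defined in the Exposure Notification cryptography specification.

```python
import hmac
import hashlib
import os

# Hypothetical, simplified sketch of rotating-identifier contact tracing.
# Names, interval counts, and the HMAC derivation are assumptions for
# illustration, not the actual Apple/Google protocol.

INTERVALS_PER_DAY = 144  # one identifier per ~10-minute window

def new_daily_key() -> bytes:
    """A fresh random per-device key, regenerated each day."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth.
    Without the daily key, the identifier reveals nothing about the device."""
    msg = interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

def find_exposures(published_keys, heard_ids):
    """After an infected user uploads their daily keys, every phone
    re-derives all possible identifiers locally and checks for matches,
    so no central server ever learns who met whom."""
    heard = set(heard_ids)
    return [
        (key, interval)
        for key in published_keys
        for interval in range(INTERVALS_PER_DAY)
        if rolling_id(key, interval) in heard
    ]

# Phone A broadcasts; phone B records what it hears nearby.
key_a = new_daily_key()
broadcast = rolling_id(key_a, interval=42)
# Later, A tests positive and publishes key_a; B matches locally.
assert find_exposures([key_a], [broadcast]) == [(key_a, 42)]
```

The design choice doing the work here is that only the daily keys of confirmed-positive users are ever published, and everyone else's identifiers remain meaningless random bytes to any observer.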

Despite these precautions, the weak adoption of existing American contact tracing apps—many of which do not yet use exposure notifications—is a sign of the trouble to come. Open up Care19, North Dakota’s official COVID-19 tracking app, and you will see only a few dozen North Dakotans using it at any given time. Rhode Island’s CRUSH COVID RI app is faring only a little better, with around 82,000 downloads (8% of the population). Utah’s Happy Together app, which has so far cost the state over $4 million, has been downloaded by just 2% of Utahns. If usual app adoption trends hold, only a fraction of people who download these apps will ever open them, let alone keep them running in the background.

Apple and Google, with their unparalleled reach and technical expertise, had a chance to build something far more effective, but they were hamstrung in their engineering efforts by a shadow of public distrust. Their greatest hindrance was being limited to using only privacy-friendly Bluetooth signals. Bluetooth is not designed to measure the distance between two people—it is designed to keep devices connected, like speakers to a phone. Google admits in its developer documentation that a Bluetooth signal can be “misleading” for measuring proximity since the signal can be blocked by clothes, bodies, and walls. Exposure notifications would be much more effective if they could use other signals, such as GPS and Wi-Fi, but the public does not trust Apple and Google to collect that information—even though both companies already have access to a colossal amount of data through their ubiquitous apps and mobile operating systems.
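The measurement problem above can be made concrete with a standard log-distance path-loss model, which is how received signal strength (RSSI) is commonly turned into a distance estimate. The parameters below are assumed typical values for illustration, not calibrated constants from any real device; the point is that the same signal reading maps to very different distances depending on what sits between two phones.

```python
# Illustrative log-distance path-loss model: why a Bluetooth RSSI
# reading is a noisy proxy for distance. TX_POWER_DBM and the
# path-loss exponents are assumed values, not real device constants.

TX_POWER_DBM = -59.0  # assumed RSSI measured at 1 meter

def estimated_distance(rssi_dbm: float, n: float = 2.0) -> float:
    """Invert the path-loss model to get distance in meters.
    n = 2.0 approximates free space; clothes, bodies, and walls
    attenuate the signal, which corresponds to a higher n."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * n))

# The same -75 dBm reading, interpreted two ways:
open_air = estimated_distance(-75.0, n=2.0)      # roughly 6.3 m
obstructed = estimated_distance(-75.0, n=3.0)    # roughly 3.4 m
```

Because the exponent `n` depends on obstructions the phone cannot observe, a reading consistent with a safe 6-meter gap in open air could equally come from a much closer encounter through a backpack or a wall—exactly the ambiguity Google's documentation warns about.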

We may never be able to fully trust tech companies, or for that matter any company, to have our best interests at heart. But the right federal privacy regulation could go a long way to ease our concerns. The Senate already has made a weak attempt at this with the recently announced Exposure Notification Privacy Act, but the bipartisan bill is far too specific to the case of contact tracing to meaningfully change how Americans see tech companies. Even laws such as the California Consumer Privacy Act and Europe’s General Data Protection Regulation are more focused on giving people rights over their data than building trust.

Trust is inherently about vulnerability, and to make ourselves vulnerable to tech companies, we need to know that they will act in our best interests. To this end, legal scholar Jack Balkin recommends the law treat tech companies as “information fiduciaries.” A fiduciary is an entity that is legally required to put a client’s interests before its own. A doctor has a fiduciary duty to her patient to provide the best possible care; a stockbroker has a fiduciary duty to her investor to accurately portray how risky an asset might be. For the American public to trust tech companies with something as sensitive as contact tracing, we need not just technical protections, but also new legal guarantees that they will not use their unique power to do us unique harm.

As we head into an uncertain political and ecological future, we will likely face new once-in-a-generation crises, and once again, the tech sector may have an important role to play in our response. But these companies’ technical skills and vast infrastructure will not be enough if the public doesn’t trust them. A fiduciary duty won’t solve all of tech’s problems, just as the tech sector won’t solve all of the world’s problems. But the sooner we can enact laws to make them worthy of our trust, the better.

Gabriel Nicholas (@GabeNicholas) is a tech policy researcher at the NYU School of Law’s Information Law Institute and the NYU Center for Cybersecurity. He is also a fellow at the Engelberg Center on Innovation Law & Policy.