Biometric Technology is Turning Privacy Law Upside-Down

[Image: biometric screening at an airport]

Laws relating to privacy in the United States have traditionally been fairly simple and straightforward. The Fourth Amendment to the Constitution protects against unreasonable search and seizure – the government isn't allowed to send soldiers or police into your house to take your belongings (at least not without a good reason). Laws against trespassing and theft keep people from sneaking onto your property to stare through your windows, or climbing in through those windows to steal your personal journal and use it to blackmail you (also illegal) or sell its contents to the highest bidder. Boundaries were easier in the past: where your property ended and public space began tended to be fairly obvious, and the expectation of privacy you gave up by entering that public space was equally a matter of common sense.

The era of the internet has, of course, changed everything. Nothing online could reasonably be considered "public space," yet even the most private communication between two people takes place in a space belonging to, or administered by, a third party. How that information may be used has been the subject of vigorous debate in recent years. At issue is whether the hosting company is entitled to use any of that information at all; if so, to what degree; and what kind of consent is required for it to do so. The linchpin in the use of this information is the End User License Agreement (EULA), a list of permissions written by the company and agreed to by the user. This is where the user, typically without reading, agrees to anything from the sale of their personal data to the sale of their soul.

Tenuous though its foundations may be, the agreement between a customer and a business is still fairly straightforward. Biometrics, however – the science of establishing identity through biological signatures like fingerprints – has begun to complicate these relationships significantly. Even implicit agreement is highly questionable when someone's movements are recorded as they walk down the street. It's easy to see how the limits of even consensual data collection can be stretched, too. 23andMe, a booming business offering low-cost genetic testing, is assembling an increasingly comprehensive portfolio of genetic data from millions of Americans, with a promise that it will be a trustworthy custodian of such sensitive information.

Returning to ordinary digital data for a moment: the government has been keen to pressure companies like Apple and Google for access to cell phones and online accounts when investigating crimes, and there is nothing to stop it from doing the same with genetic data. That may sound like a good idea, especially given the number of people whose convictions have recently been overturned by new DNA evidence, but some are left wondering – what if that information is used to target protesters?

There is now a well-established relationship between companies like Facebook and 23andMe, which collect and aggregate their users' data for monetization, and the public that uses their services. Yet EULAs are constantly being modified, and implicit consent is being stretched to the point of absurdity. Imagine a real-life relationship like this: you and your spouse live happily together, sometimes sharing a toothbrush or looking through each other's phones, but one day your partner begins collecting your blood and selling your text messages to Comcast. If that happened, it would be time to get advice from a divorce attorney. And yet we allow companies to keep doing much the same with relative impunity. One has to wonder why.

One reason could be that, when used properly, biometrics are safe and reliable. The same qualities that allow fingerprints found on a gun or at a crime scene to help convict someone can also keep your phone or laptop secure if your password is compromised. That matters because most people have terrible password habits, building passwords from some combination of birthdays, graduation dates, and family names – but nobody can have a weak fingerprint, intentionally or otherwise. Biometrics can, in a way, save people from themselves. There are unintended wrinkles, however: when crossing the United States border, for instance, courts have generally treated devices locked with a password as protected by the Fifth Amendment's bar on compelled self-incrimination, while devices locked with biometric data often are not.

In some places, privacy law is beginning to catch up with reality. Illinois has become famous for its fairly comprehensive biometric privacy statute, which companies – even the most unexpected ones – are, predictably, fighting against. Six Flags recently lost a court case over collecting the fingerprints of a teenager who was too young to legally consent, which means Illinois' law will stand for the immediate future. But technology will continue to change, and the public and legislators alike must be ready to adapt with the times.
