Facial Recognition & Privacy

The deployment of facial recognition in commercial applications has grown rapidly, spanning use cases such as photo tagging, authentication, targeted advertising and analytics. While the technology has come under fire as deeply invasive and biased, companies with a genuine need for it should not be dissuaded, provided they implement it within a proper privacy, security and ethics framework.

What is facial recognition?

While the term “facial recognition” is used broadly, it is important to distinguish between the different uses of the technology, which include:

  • Facial detection: Identifying whether a moving or still image contains a face;

  • Facial characterization: Approximating the characteristics — such as gender or age range — of a facial image;

  • Facial recognition authentication: Comparing a facial image to a single stored template to determine whether an individual is who they claim to be; and,

  • Facial recognition identification: Comparing a facial image to a database of stored templates to determine the identity of one or more individuals.

Of the above, facial recognition for identification purposes is the most privacy-invasive. It is this type of facial recognition that has been banned in several U.S. cities, including Boston, San Francisco and Oakland. While those bans apply only to city departments, Portland has gone a step further, also banning its use by private entities in public-facing establishments such as stores and restaurants. (In Canada, the International Civil Liberties Monitoring Group and Open Media have published a widely signed open letter arguing for a ban on use of the technology by federal law enforcement.)

The law governing facial recognition

Unlike parts of the U.S., Canada does not have specific facial recognition legislation. However, the collection, use and disclosure of biometric data by a private sector company is subject to the Personal Information Protection and Electronic Documents Act (PIPEDA).

Companies wishing to implement facial recognition technology should first contemplate s. 5(3) of PIPEDA:

“An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.” (emphasis added)

In evaluating what is appropriate, consider the four factors drawn from R. v. Oakes, [1986] 1 S.C.R. 103:

1. Necessity: Is the proposal rationally connected and demonstrably necessary to an activity?

2. Effectiveness: Is the proposal likely to be effective in meeting the defined objective?

3. Proportionality: Is the loss of privacy proportionate to the importance of the objective?

4. Alternatives: Are there less privacy-invasive ways to achieve the objective?

By way of example, the Ontario Lottery and Gaming Corporation (OLG) runs a self-exclusion program that allows those who no longer wish to gamble to be added to a database of self-excluded individuals. Images of individuals entering casinos are compared against this database using facial recognition; potential matches are flagged and OLG staff notified, while non-matched images are discarded.
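The matching flow described above can be sketched in code. The embedding representation, similarity function, threshold and function names below are illustrative assumptions for the sake of the sketch, not details of OLG's actual system:

```python
def cosine_similarity(a, b):
    """Similarity between two facial embeddings (lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def screen_entrant(entrant_embedding, excluded_db, threshold=0.9):
    """Compare an entrant's facial embedding against the self-exclusion
    database and return the IDs of any potential matches for staff review.

    excluded_db maps person IDs to stored facial templates (embeddings).
    """
    matches = [
        person_id
        for person_id, template in excluded_db.items()
        if cosine_similarity(entrant_embedding, template) >= threshold
    ]
    # Data minimization: if nothing matched, the entrant's image/embedding
    # is discarded rather than retained.
    if not matches:
        del entrant_embedding
    return matches
```

The key privacy property of this design is that non-matched images never enter persistent storage; only flagged matches are surfaced to staff.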

A high-level Oakes analysis might demonstrate that facial recognition under these circumstances is appropriate. For example: (1) An automated system is necessary, as it is unlikely that OLG personnel could recognize by sight each self-excluded individual; (2) It is conceivable that this system would be effective (though it would be up to the OLG to demonstrate this); (3) The loss of privacy for non-excluded individuals is minimal, as their images are not retained, and excluded individuals have consented to the use of their data, making the trade-off proportional to ensuring responsible gaming; and (4) Alternatives, such as requiring casino patrons to provide an identifier upon entry, are arguably more privacy-invasive.

Addressing privacy principles

Companies using facial recognition technology should be prepared to make a strong case that its benefits outweigh the intrusion on privacy. They must take a proactive approach (i.e., Privacy by Design), treating compliance with privacy regulations not merely as a box-checking exercise but as a way to earn the trust of customers, partners and investors.

Prior to implementing this technology, companies should conduct privacy risk assessments, ideally led by privacy experts, to ensure appropriate privacy measures are in place, including, but not limited to, the following:

1. Consent: Evaluate what kind of consent is required (i.e. explicit or implied), which will be dictated by the data being collected and the purpose for which it is being used.

2. Transparency: Companies must provide notice to individuals about the collection, use and disclosure of personal information collected via facial recognition technology. Retention of facial recognition data should also be transparent.

3. Intended purpose: Ensure the data is used for the intended purpose, aligned with the individual’s expectations, and not for an unintended secondary purpose. Individuals’ expectations are contextual, varying with age, sex, demographics and sophistication.

4. Data minimization: Companies should collect as little information as needed to fulfil the purpose for which it is being collected. For example, less invasive facial detection may fulfil the purpose, making more invasive facial recognition unnecessary.

5. Security: Companies must ensure they have appropriate security safeguards to protect the sensitive information they are collecting both from internal and external security risks.

While facial recognition brings with it significant risks, companies that maintain a strong privacy program and can demonstrate accountability will significantly reduce their risk of privacy non-compliance. As the privacy landscape continues to evolve, companies must also be mindful of whom they are targeting and which regional privacy regulations apply to them.

This article was originally published in The Lawyer’s Daily.
