Facial Recognition: Emergence, Threat, and Alternatives

The public at large is aware of the emergence of facial recognition in public spaces, but awareness of the extent and pervasiveness of this technology remains low. The way facial recognition is implemented can be problematic for personal privacy, making it an undesirable alternative to weapons detection technology.

Facial recognition works by scanning a face, for example at a sporting event, and using artificial intelligence to compare that face to an existing database of faces. A match can then be used to determine who an individual is or if they are a person of interest, such as someone with a criminal record. Further action by the venue or authorities can then be determined based on the match.
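In practice, systems like this typically reduce each captured face to a numeric "embedding" and compare it against embeddings stored in the database. The sketch below is purely illustrative, not any vendor's actual pipeline: the function names, toy vectors, and similarity threshold are all hypothetical assumptions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(probe_embedding, database, threshold=0.9):
    """Return the database identity whose stored embedding is most
    similar to the probe, or None if no match clears the threshold."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe_embedding, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy database of precomputed embeddings (illustrative values only;
# real embeddings have hundreds of dimensions).
database = {
    "person_a": [0.1, 0.9, 0.3],
    "person_b": [0.8, 0.2, 0.5],
}

print(match_face([0.12, 0.88, 0.31], database))  # prints "person_a"
```

The threshold is the key operational knob: set it too low and the system misidentifies people; set it too high and it misses genuine matches, which is why accuracy disparities across demographic groups matter so much.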

Challenges with Facial Recognition

Most people inherently recognize the invasive nature of facial recognition technology; however, they likely don’t fully grasp its potentially harmful effects.

The databases behind these systems are extremely large and only continue to grow. A 2016 report by Georgetown Law’s Center on Privacy and Technology found that 117 million Americans are in police facial recognition databases, roughly half of all American adults. This is a privacy issue, as most of these individuals have not consented to having their likeness captured and stored.

Misidentification errors, and the potential for people to be wrongfully singled out, are another cause for concern. The technology has become significantly more accurate in recent years; however, the software behind facial recognition still struggles to correctly identify minorities and other smaller demographic groups. This is largely due to training datasets that underrepresent these groups.

In addition to matching individuals, facial recognition technology can also be used to infer demographic traits. This is the area that best illustrates why facial recognition software can be perceived as threatening from a human rights standpoint. For example, China has used the technology for racial profiling, tracking, and control of Uyghur Muslims. There have also been reports of Israel using it to covertly track Palestinians in the West Bank. Uses like these raise the fear of 1984-style surveillance states, and this can very well happen anywhere. Britain, for example, has no laws overseeing the capture and use of faces for facial recognition, and in the US only five states have laws governing the technology. As it stands, there is virtually nothing stopping any government from scraping faces from social media, adding them to its own databases, and using them for whatever means it sees fit. The question then arises: at what point do the human and privacy costs of facial recognition make it an undesirable alternative to weapons detection systems?

A Better Alternative

There is a way to preserve people’s safety while avoiding the natural consequences of facial recognition: advanced weapons detection scanning technology that identifies threats without violating privacy. HEXWAVE is at the forefront of this emergent category. Instead of scanning faces, Liberty’s technology creates active, real-time 3D images and uses artificial intelligence to learn to distinguish between harmless items, like keys, and actual threats, like guns. The technology also boasts the unique advantage of detecting both metal and non-metal objects.

Privacy is a non-issue with HEXWAVE, because the system doesn’t scan or store any personally identifiable information. The active, real-time 3D radar imaging captures only the items an individual is carrying, irrespective of facial or bodily features, so privacy and sensitive data are protected. The system will also be able to be set up indoors or outdoors, covertly or overtly. It is entirely flexible and will integrate seamlessly with existing security systems, making it an innovative security alternative that addresses the privacy concerns facial recognition software raises.
