What no one tells you about Big Brother
Facial recognition software, coupled with ever-increasing amounts of mass surveillance, is worrying. It's compromising our civil liberties and privacy. In this post I explore the impact, and what we should do about it.
Mass public surveillance is something we associate with authoritarian regimes, where governments and other authorities like to know the whereabouts and activities of all citizens around the clock. This knowledge can prevent citizens from lawfully exercising their rights to criticize those in office, or to protest against specific government policies or corporate practices. For example, Beijing makes heavy use of facial recognition technology to keep track of its people (especially the Uighur Muslims). But outside the alternative reality of a Mission Impossible movie, this type of activity shouldn't happen in the West. After all, we value our civil liberties, and we have civil rights organizations and privacy campaigners to highlight any violations.
These groups all express concern that privacy is being compromised by the use of surveillance technologies. And the sad fact of the matter is that in the UK, facial recognition technology has reached epidemic proportions. According to a report by Big Brother Watch:
We now know that many millions of innocent people will have had their faces scanned with this surveillance without knowing about it, whether by police or by private companies.
Facial recognition software across the UK
In 2019, the Financial Times first reported that facial recognition software was in use in the King's Cross area of London. The zone covers 67 acres of regenerated land around King's Cross station, home to shops, offices, Google's UK HQ and part of St Martin's College. The zone's developer, Argent, says it is trying to keep the public safe. The ICO said:
Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all.
Argent has not revealed what, exactly, it is using the technology for. Let's take a guess. Argent and other companies could scan faces and match them against police watch-lists, or even compile their own watch-lists. They could also pass data to third parties.
Facial recognition software has also been used at:
- Meadowhall shopping centre in Sheffield,
- the World Museum in Liverpool, and
- the Millennium Point complex in Birmingham.
Are you reassured?
Police chiefs try to reassure us. They point out that they already use biometric data such as DNA and fingerprints, and that this new form could enable real-time identification of suspected terrorists or help find missing people. That may be so. But DNA samples and fingerprints are time-consuming to gather, and there are rules limiting their use and how long police can keep them. Facial recognition data is subject to far fewer limitations. For example, you have to request that your face be deleted from a police database. What's more, the technology is unreliable: it struggles to identify dark-skinned people, women and children. The result is that those in authority could misidentify people as suspects, who must then prove their innocence.
Moreover, as the technology becomes more pervasive, it creates more opportunities for snooping and blackmail. Too often, politics and the law have failed to keep up with digital innovation, and they struggle to regulate it retrospectively. This time, we must get robust and transparent guidelines in place before it's too late for us all.
What do you think?