Image: iStock/Borislav

When he was an undercover specialist surveillance photographer with the South Australia Police Force in the 1990s, David Chadwick was responsible for taking photos of suspected criminals and their associates from the backseat of a car, “just like you see in the movies”, he said.

He would return to the station, print his shots, then make multiple copies of the best-quality image to distribute to police officers, alongside dozens of other shots, in the hope of identifying the individual talking to a known criminal.

“I would zoom in, crop, print off 50 copies of that, and I would stick those in the internal dispatch system and I would send them out to every detective agency in the state and say, ‘Right, we need to know who this is’,” Chadwick told ZDNet. “We had collections of criminal records photos, but they were under ‘name’, and we have no idea who this is.

“Then hopefully, at some stage in the next two, three, four, five days we get a response back saying, ‘Hey that looks like John Smith’.”

John Smith could be an old school teacher, a neighbour, or a drug dealer, but once his name was known, Chadwick said, that became a lead and the police work would begin.

Now the director of identity and biometrics for Unisys Asia Pacific, Chadwick argues that the use of biometrics in 2021 is simply a faster, and safer, way of performing the same task.

“What police are doing with facial recognition is exactly what they did without facial recognition,” he said.

“Most of the time, you don’t know if this person has done anything wrong — if they’re coming out of a bank holding a sawed-off shotty and a bag of money, pretty good odds it’s a bad guy, but realistically, it returns essentially ‘I think that’s John Smith’, then police would do police work.”

See also: Australia’s cops need reminding that chasing criminals isn’t society’s only need

BIOMETRICS AND BIAS

The Australian Human Rights Commission in May asked for a moratorium on the use of biometrics, including facial recognition, in “high-risk” areas such as policing and law enforcement, until legislation is in place that guarantees the protection of, among other things, human rights.

Chadwick would argue there needs to be education, not a moratorium.

He said real-life use of biometrics is not at all like what you see on CSI or NCIS.

“‘I’ll hack into the DMV to find a match’ — A. you’ve committed a criminal offence and B. you can’t,” he said. “It will then flash lots of images on a screen and produce one with flashing text saying ‘match’ underneath. Well, no, that’s not how facial recognition works.

“Facial recognition is incredibly good, but it’s only ever a probability of a match.

“Biometrics is a useful little tool in the identity management lifecycle and nothing else. It is all about identity, biometrics is just the sexy stuff.”

Biometrics only anchors the identity, he said; it never returns a result saying, with 100% accuracy, that the person you are looking for is this one. Rather, it pulls a number of images, usually the top 20 matches, and presents them in a random order.

“Unless you pass in a passport quality photo taken by a surveillance operative — I had a joke that if I ever take a perfect quality facial image, I’m burnt, I’ve been seen, because that means they’re looking right at me — this will be off-axis, might be a bit blurry, might be a bit grainy,” he said.

“You’ll get a stream of 20 images and most systems will not show you the best match because if you see one image that’s 99%, that’s likely to bias you.

“You might have two or three possible matches, but the emphasis is on possible. It’s a lead generation device.”
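Chadwick’s description maps onto a standard one-to-many (1:N) search: score the probe image against every enrolled template, keep a shortlist of the closest candidates, then strip the scores and shuffle the order before a human reviews them. The sketch below is a simplified illustration of that idea only; the function name, parameters, and the use of cosine similarity over face embeddings are assumptions, not any agency’s or vendor’s actual implementation.

```python
import random
import numpy as np

def candidate_shortlist(probe_vec, gallery, top_k=20, shuffle=True):
    """Return a shortlist of candidate IDs for a hypothetical 1:N face search.

    probe_vec : 1-D embedding of the probe image.
    gallery   : dict mapping record ID -> 1-D enrolled embedding.
    Scores are withheld and the order randomised so the reviewing
    officer is not anchored to a single high-scoring hit.
    """
    ids = list(gallery.keys())
    mat = np.stack([gallery[i] for i in ids])                 # shape (N, d)

    # Cosine similarity between the probe and every enrolled template.
    probe = probe_vec / np.linalg.norm(probe_vec)
    mat = mat / np.linalg.norm(mat, axis=1, keepdims=True)
    sims = mat @ probe                                        # shape (N,)

    # Keep only the top-k most similar records...
    best = np.argsort(sims)[::-1][:top_k]
    shortlist = [ids[i] for i in best]

    # ...then shuffle and drop the scores before showing a human reviewer.
    if shuffle:
        random.shuffle(shortlist)
    return shortlist
```

In a real system the embeddings would come from a trained face-recognition model and the gallery would sit in a dedicated index, but the behaviour Chadwick highlights is the output: a candidate list, not a single flashing “match”.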

Also raised by the Human Rights Commission, and many, many others, is the possibility of bias in the use of biometrics. According to Chadwick, that isn’t as prominent in Australia.

“Because they use machine learning, it depends on the dataset that you train them on,” Chadwick said. “The Australian passport dataset is wonderfully diverse … most of the training databases in America are filled with correctional datasets, which overrepresent people of colour.”

A MATTER OF TRUST

Making the distinction between facial recognition and mass surveillance, Chadwick said, is important.

“Everybody’s confused,” he said. “You read about how terrible facial recognition is, about how people want it banned, and then they look at their phone and it unlocks and think this is wonderful, then you cross the border and you go, ‘this is fantastic’, without actually understanding this is also biometrics.”

He was pointing to the Australian government’s digital identity play.

The Digital Transformation Agency (DTA) has been working on Australia’s digital identity system for a number of years, going live with myGovID — developed by the Australian Taxation Office — which is essentially a credential that allows the user to access certain online services, such as the government’s online portal, myGov.

Read more: Australia to open digital ID system to private sector with consultation on new legislation

Chadwick would appreciate the DTA referring to this as a digital credential, a first step in correcting any confusion.

“There’s one thing government does really, really badly and that’s sell itself,” he said, noting there needs to be clear, simple communication from government about what it’s actually doing in the space.

“Even the very fact the DTA still calls it a digital identity, the first thing that goes through the average person’s mind is ‘oh you’re creating an identity database’ … It’s not an identity, it’s a credential.”

Chadwick said government needs to lift its game: communicate better and actually gain people’s trust. Industry carries some of the responsibility, too.

“Industry needs to stop selling bullshit, otherwise we end up like China where everyone thinks China has the most unbelievably good facial surveillance system in the world that could pick you out of a crowd and deduct 10 social points because you spat on the ground … it’s utter rubbish,” he said.

He said it is impossible to do accurate, many-to-many facial recognition matching in real time, and the numbers in his example below give a sense of the scale.

“Imagine you’ve got 10 million people in the city, you’ve got to have a database of 10 million people and you’ve got to be scanning this low resolution camera for a thousand faces, so you’re doing a thousand faces to 10 million records, constantly. Sorry, it’s rubbish.

“We need to start telling an accurate and honest story … and understand some people will never believe you, the tinfoil hat wearers will never believe you.”
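To put Chadwick’s figures in rough context, matching a thousand visible faces against a gallery of 10 million records implies about 10 billion template comparisons for a single pass, repeated every time the camera feeds refresh. The snippet below is only back-of-the-envelope arithmetic using his numbers; the frame rate is a hypothetical assumption, not a figure from the article.

```python
# Back-of-the-envelope for the many-to-many scenario Chadwick dismisses.
faces_in_view  = 1_000        # faces visible across the city's cameras (his figure)
gallery_size   = 10_000_000   # enrolled records for 10 million residents (his figure)
frames_per_sec = 10           # assumed refresh rate (hypothetical)

comparisons_per_sweep = faces_in_view * gallery_size
comparisons_per_sec   = comparisons_per_sweep * frames_per_sec

print(f"{comparisons_per_sweep:,} comparisons per sweep")   # 10,000,000,000
print(f"{comparisons_per_sec:,} comparisons per second")    # 100,000,000,000
```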

He also said there needs to be an understanding that the government is not tracking you.

“Police or intelligence agencies tracking — they may well be, but if they are, then you’ve got more problems because they think you’re up to no good,” he said.

“Biometrics isn’t the bad guy; biometrics is in fact a really important way to protect your identity, all this rubbish about identity, hackers getting in and changing your biometrics, oh my god, the Australian passport office has been doing this for 15 years, they’ve kind of got that bit figured out.

“It’s about trust, it’s about trusting the capability, but it’s also about the government being able to trust you are who you say you are, so they can deliver higher-value services to you.”
