Judges have ruled against a shopper who brought a legal challenge against police use of automated facial recognition (AFR) technology.
The court refused the judicial review on all grounds, finding South Wales Police had followed the rules and their use of AFR was justified.
The High Court said this was the first time any court in the world had considered the use of the technology.
The civil rights group Liberty said its client Ed Bridges will appeal the decision.
Liberty had argued the use of the technology was akin to the unregulated taking of DNA or fingerprints without consent, and it is campaigning for an outright ban on the practice.
The judicial review was held in May after Ed Bridges, from Cardiff, claimed his human rights were breached when he was photographed while Christmas shopping.
Lawyer Megan Goulding said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms.
“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.”
Mr Bridges added: “South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent.
“This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”
An official at the Information Commissioner's Office, which had argued during the judicial review that the legal framework for police use of AFR was insufficient, said they would be reviewing the judgment carefully.
They welcomed the finding that the police use of the technology involved processing sensitive personal data.
“Our investigation into the first police pilots of this technology has recently finished. We will now consider the court’s findings in finalising our recommendations and guidance to police forces about how to plan, authorise and deploy any future [facial recognition] systems.
“In the meantime, any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”
Automated facial recognition technology maps faces in a crowd by measuring the distance between features, then compares results with a “watch list” of images – which can include suspects, missing people and persons of interest.
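The matching step described above can be illustrated with a minimal sketch: each face is reduced to a vector of measurements (such as distances between features), and a detected face is flagged if it falls within a chosen distance threshold of any watch-list entry. All names, vectors and the threshold below are hypothetical, not taken from any actual police system.

```python
import math

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_watch_list(face, watch_list, threshold=0.6):
    """Return the closest watch-list entry within the threshold, or None.

    A low threshold means fewer alerts but more missed matches; a high
    threshold raises the risk of false positives, the concern noted below.
    """
    best_name, best_dist = None, threshold
    for name, template in watch_list.items():
        dist = euclidean(face, template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# Illustrative watch list: name -> feature vector (purely made-up values).
watch_list = {
    "suspect_A": [0.32, 0.58, 0.11],
    "missing_B": [0.75, 0.20, 0.44],
}

print(match_against_watch_list([0.33, 0.57, 0.12], watch_list))  # suspect_A
print(match_against_watch_list([0.99, 0.99, 0.99], watch_list))  # None
```

The threshold choice is where accuracy disputes arise: a face that merely resembles a watch-list template can fall inside the threshold and trigger a false positive.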
Concerns have been raised that the technology is intrusive and more likely to return false positives for women and people from ethnic minorities.
Mr Bridges said he had his image captured by the technology a second time at a peaceful protest against the arms trade.
His legal challenge argued the use of the tool breached his human right to privacy as well as data protection and equality laws.
South Wales Police, the Metropolitan Police and Leicestershire Police have used facial recognition in public spaces since June 2015.
South Wales Police had argued its use of AFR was lawful and appropriate.
Mr Justice Swift and Lord Justice Haddon-Cave, who gave their decision on Wednesday, had previously described it as "an important case" that makes "novel and potentially far-reaching" conclusions.