
Judges are set to rule on the first major legal challenge to police use of automated facial recognition (AFR) technology.
A judicial review was held in May after Ed Bridges, from Cardiff, claimed his human rights were breached when he was photographed while Christmas shopping.
The civil rights group Liberty said it was akin to the unregulated taking of DNA or fingerprints without consent.
South Wales Police said its use of AFR was lawful and appropriate.
Automated facial recognition technology maps faces in a crowd by measuring the distances between facial features, then compares the results with a "watch list" of images - which can include suspects, missing people and persons of interest.
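In broad terms, the matching step works like a nearest-neighbour comparison between a captured face and the faces on the watch list. The sketch below is a minimal illustration only, using made-up feature vectors, a hypothetical check_against_watch_list function and an arbitrary distance threshold; it is not a description of how any police system is actually implemented.

```python
import numpy as np

# Illustrative only: real AFR systems derive face "embeddings" from trained
# neural networks. Here each face is a made-up vector of feature measurements.
WATCH_LIST = {
    "suspect_A": np.array([0.12, 0.48, 0.33, 0.91]),
    "missing_person_B": np.array([0.55, 0.21, 0.78, 0.40]),
}

MATCH_THRESHOLD = 0.25  # assumed cut-off; real systems tune this carefully


def check_against_watch_list(face_vector):
    """Return the watch-list identity closest to the captured face,
    if that distance falls under the match threshold; otherwise None."""
    best_name, best_distance = None, float("inf")
    for name, reference in WATCH_LIST.items():
        distance = np.linalg.norm(face_vector - reference)
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance < MATCH_THRESHOLD else None


# A captured face close to suspect_A's vector triggers a match;
# any face far from every watch-list entry returns None (no alert).
print(check_against_watch_list(np.array([0.13, 0.47, 0.35, 0.90])))
```

Where the threshold is set matters: a looser cut-off produces more matches but also more false positives, which is central to the concerns raised about the technology's accuracy.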
Mr Bridges said he had his image captured by the technology a second time at a peaceful protest against the arms trade.
His legal challenge argued the use of the tool breached his human right to privacy as well as data protection and equality laws.
South Wales Police, the Metropolitan Police and Leicestershire Police have used facial recognition in public spaces since June 2015.
There are currently no rules controlling the use of AFR, and concerns have been raised that the technology is more likely to return false positives for women and people from ethnic minorities.
During the judicial review, the Information Commissioner argued the legal framework for police use of AFR was insufficient, with concerns raised that the system could be hacked.
The commissioner's legal team argued that if deployed routinely, without tighter rules, the data could potentially track people's movements and habits.
With 12.5 million images on the police national database - including images of people not found guilty of any crime - it was argued that rules were also needed on which images are included on the watch list.
Mr Justice Swift and Lord Justice Haddon-Cave will give their decision at the High Court in London on Wednesday, in what they have already described as "an important case" with conclusions that are "novel and potentially far-reaching".