Black and minority ethnic people could be falsely identified and face questioning because police have failed to test how well their systems deal with non-white faces, say campaigners.
At least three chances to assess how well the systems deal with ethnicity were missed over the past five years, the BBC found.
Campaigners said the tech had too many problems to be used widely.
"It must be dropped immediately," said privacy rights group Big Brother Watch.
Lost lesson

Several UK police forces have been trialling controversial new facial recognition technology, including automated systems which attempt to identify the faces of people in real time as they pass a camera.
Documents from the police, Home Office and university researchers show that police are aware that ethnicity can have an impact on such systems, but have failed on several occasions to test this.
The Home Office said facial recognition can be an "invaluable tool" in fighting crime.
"The technology continues to evolve, and the Home Office continues to keep its effectiveness under constant review," a spokesman told the BBC.
The ability of facial recognition software to cope with black and ethnic minority faces has proved a key concern for those worried about the technology, who claim the software is often trained on predominantly white faces.
Minutes from a police working group reveal that the UK police's former head of facial recognition knew that skin colour was an issue. At an April 2014 meeting, Durham Police Chief Constable Mike Barton noted "that ethnicity can have an impact on search accuracy".
He asked CGI, the Canadian company managing the police's facial image database, to investigate the issue, but subsequent minutes from the working group do not mention a follow-up.
Facial recognition was introduced on the Police National Database (PND), which includes around 13 million faces, in 2014.
The database has troubled privacy groups because it contains images of people subsequently cleared of any offence. A 2012 court decision ruled that holding such images was unlawful.
The "unlawful" images are still held on the PND. The government is currently investigating ways to purge them from the system.
Despite this, the PND facial recognition system, provided by German company Cognitec, has proved very popular.
The number of face match searches done on the PND grew from 3,360 in 2014 to 12,504 in 2017, Freedom of Information requests to the Home Office have revealed.
In 2015, a team of assessors from the Home Office tested the PND facial search system, using about 200 sample images. They had identified ethnicity information about the sample photos but, once again, failed to use this opportunity to check how well the system worked with different skin colours.
Another chance to check for racial bias was missed last year during trials by South Wales Police of real-time facial recognition software, which was used at sports events and concerts. Cardiff University carried out an assessment of the force's use of the technology.
That study stated that "due to limited funds for this trial", ethnicity was not tested.
Cardiff's report noted, however, that "during the evaluation period, no overt racial discrimination effects were observed", but said this may be due to the demographic make-up of the watch lists used by the force.
In addition, an interim report by a biometrics advisory group to the government considering ethical issues of facial recognition highlighted concerns about the lack of ethnic diversity in datasets.
Under-representation of certain types of faces, particularly those from ethnic minorities, could mean bias "feeds forward" into the use of the technology, it said.
Silkie Carlo, director of campaign group Big Brother Watch, said: "The police's failure to do basic accuracy testing for race speaks volumes.
"Their wilful blindness to the risk of racism, and the risk to Brits' rights as a whole, reflects the dangerously irresponsible way in which facial recognition has crept on to our streets."
The technology had too many problems to justify its use, she said.
"It must be dropped immediately," Ms Carlo added.
Big Brother Watch is currently taking legal action against the Metropolitan Police over its use of automated facial recognition systems.