China Pushing Explicitly Biased Facial Recognition Standards And Local Tech Companies Are Pitching In To Help
Facial recognition tech is plagued by bias, most of it unintentional, which is why it tends to perform more poorly when attempting to recognize minorities and women. Law enforcement doesn't tend to view these problems as bugs, since it, too, operates with many of the same biases. But these failures are usually the byproduct of faulty inputs, ones that can be exacerbated by choices made by end users.
In China, the bias is the point. The Chinese government's persecution of its Uighur population has seen local tech companies tasked with providing surveillance tools that single out Uighur Muslims so the government can more efficiently control them.
Huawei is building a system that provides the government with "Uighur alarms" whenever a suspected Uighur passes in front of one of the government's millions of cameras. According to Huawei, this is still in the testing phase, which means little more than that there's a plan to put it to use. Even if it's no more accurate at identifying Uighurs than it is at identifying criminals, it will likely be considered good enough for real-world deployment. Collateral damage to innocent residents isn't the sort of thing that slows surveillance rollouts in China.