Image filtering tech steps up

Saw this over at Ars Technica.
I've been reading for a while about various attempts to use automated image recognition techniques to block smut, so I wasn't surprised to see this story on News.com about just such a product, ImageFilter, from a company called LookThatUp. What did surprise me, however, was this quote:

The 1-year-old company's first clients were the French police, who have been using an early version of ImageFilter to help find pedophiles since late last year. Detectives nab pedophiles who download child pornography, then hunt for similar photos in other people's hard drives. The French patent office also uses the technology to ensure that patent seekers aren't submitting drawings or photos of previously patented products.

Could it be that this particular product actually works? The previous ones certainly haven't. As someone who took some undergraduate courses in both digital imaging and neural networks, I know firsthand how hard it is to get a computer to recognize even basic shapes, much less to judge whether an image depicts a violent, controversial, or sexual act.
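For a taste of why: earlier filters in this space were widely reported to lean on crude skin-tone heuristics, something like the toy sketch below. This is my own illustration, not anything from LookThatUp, and it flags a beach photo or a close-up portrait just as readily as actual smut.

```python
from PIL import Image

def skin_fraction(path: str) -> float:
    """Toy heuristic: the fraction of pixels falling inside a crude
    RGB 'skin tone' box. Real skin detection is fancier, but even
    done well it can't tell a face from a nude."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    skin = sum(
        1
        for r, g, b in pixels
        if r > 95 and g > 40 and b > 20   # warm, reasonably bright
        and r > g and r > b               # red-dominant
        and abs(r - g) > 15               # not grayish
    )
    return skin / len(pixels)

def is_flagged(path: str, threshold: float = 0.30) -> bool:
    """Flag any image whose skin fraction crosses the threshold,
    which sweeps up portraits, beaches, and deserts along the way."""
    return skin_fraction(path) >= threshold
```

I mean, check out the following claim, and tell me it doesn't sound impossible: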

Before ImageFilter sends out alerts, corporate clients must first designate an "acceptance" rating from 1 to 100. A business that has a high tolerance for offensive images--such as a photo gallery specializing in nude prints--might not want to be alerted until the degree of offensiveness hits 80 or 90. An e-tailer specializing in toys for preschoolers, by contrast, might pick an acceptance rating of 5 or lower. Clients can customize their tolerances for any offensive behavior--nudity, violence, sexual acts or other potentially controversial images.

So they're claiming they can actually rate the degree of offensiveness of an image on a scale of 1 to 100? I don't believe for a second that this can work as advertised. My suspicion is that it will appear to work under carefully selected circumstances, and that once it's put to a real test we'll find it does far more harm than good.
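To be concrete about what's being sold here: strip away the marketing, and the "acceptance rating" is presumably just a per-category confidence score from some statistical classifier, compared against a per-client threshold. Here's a minimal sketch under that assumption; every name in it is hypothetical, and the stub classifier returns random numbers precisely because the real one's output is a guess, not a measurement.

```python
import random

CATEGORIES = ("nudity", "violence", "sexual_acts", "controversial")

def score_image(image_bytes: bytes) -> dict:
    """Stand-in for the vendor's classifier. A real one would emit a
    confidence estimate per category; random numbers here underline
    that the 1-100 figure is a statistical guess, not a measurement."""
    return {cat: random.uniform(0, 100) for cat in CATEGORIES}

def should_alert(image_bytes: bytes, tolerances: dict) -> bool:
    """Alert when any category's score crosses the client's tolerance."""
    scores = score_image(image_bytes)
    return any(scores[cat] >= tolerances.get(cat, 100) for cat in CATEGORIES)

# Per the article: a preschool-toy e-tailer sets everything near 5,
# while a gallery of nude prints might not care until nudity hits 85.
strict = {cat: 5 for cat in CATEGORIES}
lenient = {"nudity": 85, "violence": 60, "sexual_acts": 70, "controversial": 60}

print(should_alert(b"fake image bytes", strict),
      should_alert(b"fake image bytes", lenient))
```

Notice that all the hard, dubious work is hidden inside score_image(); the threshold plumbing around it is trivial, which is exactly why the 1-to-100 framing smells like marketing rather than measurement.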
