
[FYI] (Fwd) FC: "Artificial intelligence" filter blocks news -- but not smut

------- Forwarded message follows -------
Date sent:      	Tue, 20 Jun 2000 09:53:37 -0400
To:             	politech@vorlon.mit.edu
From:           	Declan McCullagh <declan@well.com>
Subject:        	FC: "Artificial intelligence" filter blocks news -- but not smut
Send reply to:  	declan@well.com

********
Some pornographic images BAIR approved as OK and my Perl test-script:
http://www.well.com/user/declan/bair/
********


http://www.wired.com/news/technology/0,1282,36923,00.html

Smut Filter Blocks All But Smut
by Declan McCullagh (declan@wired.com)

3:00 a.m. Jun. 20, 2000 PDT
When Exotrope Inc. introduced its BAIR smut-blocking
software last year, everyone seemed wowed by the company's
claims of intelligent filtering.

New York Governor George Pataki applauded Exotrope's
"state-of-the-art technology," Tucows Network gave BAIR five
stars, and PC Magazine handed the program a coveted editor's
choice award.

But an investigation by Wired News shows that BAIR's
"artificial intelligence" does not work as advertised.

In tests of hundreds of images, BAIR incorrectly blocked
dozens of photographs, including portraits, landscapes, animals,
and street scenes. It banned readers from viewing news
photos at time.com and newsweek.com, but rated images of
oral sex, group sex, and masturbation as acceptable for
youngsters.

Company representatives say they can't explain the program's
seemingly random behavior.

"I agree with you. There's something wrong," says Dave Epler,
Exotrope operations manager. "That's not the way our image
server is supposed to be working."

Exotrope, a privately held firm based in Elmira, New York,
claims to have developed "the industry's most advanced
software system" for intelligently blocking sexually explicit
images. BAIR stands for Basic Artificial Intelligence Routine.

Epler said BAIR's smart-filtering, introduced in March 1999, had
worked in the past. But he was unable to produce any version
of the program that performed as described.

Artificial intelligence experts say training a neural network to
work the way BAIR supposedly does would be impossible.
Anti-filtering advocates go a step further and say Exotrope
hoodwinked journalists and politicians into believing hype about
advanced "artificial intelligence" and "active information matrix"
routines.

"I think all manufacturers of blocking software have suckered
journalists and politicians to some extent by claiming it is more
accurate than it really is," said Bennett Haselton, founder of
Peacefire.org. "This is an unusual case because we're talking about a
product with a zero percent accuracy rate."

One reason reviewers may have been mistaken is that
Exotrope, like its competitors, has assembled a massive list of
off-limits websites.

If an image appears on one of those blacklisted sites, or if it
has words like "sex" in the filename, BAIR automatically
restricts it. It will also block access based on keywords
elsewhere on the Web page.
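
In outline, that pre-filter amounts to a few string checks that
run before any image ever reaches the "neural network." The Perl
sketch below illustrates the mechanism as described above; the
blacklist entries and keyword list are invented stand-ins, not
Exotrope's actual data:

  use strict;
  use warnings;

  # Hypothetical blacklist and keyword list -- illustrative only.
  my %blacklist = map { $_ => 1 } qw(blocked-site.example.com);
  my @keywords  = qw(sex porn xxx);

  # True if the request can be blocked without analyzing the image.
  sub blocked_before_image_analysis {
      my ($host, $filename, $page_text) = @_;
      return 1 if $blacklist{lc $host};                # blacklisted site
      for my $kw (@keywords) {
          return 1 if index(lc $filename, $kw) >= 0;   # keyword in file name
          return 1 if index(lc $page_text, $kw) >= 0;  # keyword on the page
      }
      return 0;   # only now would the image filter itself run
  }

A reviewer who fed the filter well-known porn sites would see
near-perfect blocking from these checks alone, with the image
analysis never actually tested.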

Wired News tested BAIR by creating a Perl program to extract
images randomly from an 87MB database of thousands of both
pornographic and non-pornographic photographs. The program
then renamed each image with a random number for a file name.

The random file names force BAIR to evaluate each graphic with
its "neural network" filtering to determine whether an image is
sexually explicit. The sex-themed images came from
Usenet newsgroups such as alt.binaries.pictures.erotica.female
and alt.binaries.pictures.erotica.male.
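
Declan's actual test script is linked at the top of this message;
the sketch below is only a guess at the shape of that renaming
step, with hypothetical directory names:

  use strict;
  use warnings;
  use File::Copy qw(copy);
  use List::Util qw(shuffle);

  my $src = 'testdb';     # hypothetical path to the image database
  my $dst = 'renamed';    # randomly named copies go here

  opendir my $dh, $src or die "can't read $src: $!";
  my @images = grep { /\.(?:jpe?g|gif|png)$/i } readdir $dh;
  closedir $dh;

  mkdir $dst unless -d $dst;

  # Pick up to 100 images at random and copy each one under a
  # random-number name, so the name reveals nothing about content.
  my $n = @images < 100 ? scalar @images : 100;
  for my $img ((shuffle @images)[0 .. $n - 1]) {
      my ($ext) = $img =~ /(\.[^.]+)$/;
      my $name  = int rand 1_000_000;
      copy "$src/$img", "$dst/$name$ext"
          or die "copy failed for $img: $!";
  }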

The results were dramatic: BAIR inexplicably blocked between
90 and 95 percent of the photographs with no regard for
whether they were sexually explicit or not. Of the ones that
were OK'd, about half were pornographic and half weren't.
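
To put Haselton's "zero percent accuracy" in concrete terms, here
is a back-of-the-envelope illustration (the round numbers are
hypothetical, scaled from the percentages above): given 1,000 test
images, half of them explicit, a filter that blocks 92 percent of
everything indiscriminately would block about 460 of the 500
explicit images -- but also about 460 of the 500 innocent ones --
and roughly half of the 80 images it lets through would still be
pornographic. A coin weighted to come up "block" 92 percent of the
time would produce the same numbers.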

BAIR incorrectly blocked photographs of Yellowstone, the
Baltimore waterfront, Snoopy, boats, sunsets, dogs,
vegetables, and even a Wired News staff meeting.

It rated as acceptable for minors -- even on the most
restrictive setting -- explicit images of oral sex, anal sex,
group sex, masturbation, and ejaculation.

Exotrope officials say they plan to fix the errors within the
next month. BAIR works by funneling Web connections through
Exotrope's proxy server, which the company says is
malfunctioning.
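
For readers unfamiliar with the architecture: a filtering proxy
sits between the browser and the Web, fetches each resource on the
user's behalf, and vets the response before relaying it. Below is
a minimal single-connection Perl sketch, using the standard
HTTP::Daemon and LWP::UserAgent modules with a stand-in classifier
(none of this is Exotrope's code):

  use strict;
  use warnings;
  use HTTP::Daemon;
  use LWP::UserAgent;

  my $proxy = HTTP::Daemon->new(LocalPort => 8080)
      or die "can't listen: $!";
  my $ua = LWP::UserAgent->new;

  while (my $client = $proxy->accept) {
      while (my $request = $client->get_request) {
          # Fetch the resource on the browser's behalf.
          my $response = $ua->request($request);

          if ($response->content_type =~ m{^image/}
              && looks_explicit($response->content)) {
              $client->send_error(403, 'Blocked by filter');
          } else {
              $client->send_response($response);
          }
      }
      $client->close;
  }

  # Stand-in for the image classifier; always approves.
  sub looks_explicit { return 0 }

In a design like this, a misfiring classifier on the central
server breaks browsing for every user at once -- consistent with
the across-the-board blocking described above.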

"We're working through our image server problems as we
speak," says Exotrope's Epler. "We'll have this thing up in less
than 30 days. You caught us at a bad time. I know it works
very well. I did accuracy tests on this thing 30, 60 days ago
and it was perfect. It went south."

When asked if Exotrope has a backup copy of a working
image-recognition routine that was introduced in March 1999
and could be installed on another proxy server, Epler said he
did not. "I certainly don't know of a copy," he said.

Epler also said he could not release a copy of the
image-recognition routine that runs on the server, even under
condition of a nondisclosure agreement.

Dave Touretzky, a senior research scientist in the computer
science department at Carnegie Mellon University, doubts
Exotrope's claims.

"How do you tell the difference between a woman in a bikini in
a sailboat which is not racy and a naked woman in a sailboat?"
Touretzky asks. "The only difference is a couple of nipples and
a patch of pubic hair. You're not going to be able to find that
with a neural network."

"If they don't disclose the training data, there's no way to
figure out what's going on," Touretzky says. "But anyone who
knows anything about neural networks knows there's no way it
can do what they're claiming."

[...snip...]

----------------------------------------------------------------------
POLITECH -- the moderated mailing list of politics and technology
To subscribe, visit http://www.politechbot.com/info/subscribe.html
This message is archived at http://www.politechbot.com/
----------------------------------------------------------------------
------- End of forwarded message -------