The Race to Prevent ‘the Worst Case Scenario for Machine Learning’
“The posture is different today,” said Dr. Portnoff.
Still, she said, “If I could rewind the clock, it would be a year ago.”
‘We trust people’
In 2003, Congress passed a law banning “computer-generated child pornography” — a rare instance of congressional future-proofing. But at the time, creating such images was both prohibitively expensive and technically complex.
The cost and complexity of creating these images had been steadily declining, but that changed last August with the public debut of Stable Diffusion, a free, open-source text-to-image generator developed by Stability AI, a machine learning company based in London.
In its earliest iteration, Stable Diffusion placed few limits on the kind of images its model could produce, including ones containing nudity. “We trust people, and we trust the community,” the company’s chief executive, Emad Mostaque, told The New York Times last fall.
In a statement, Motez Bishara, the director of communications for Stability AI, said that the company prohibited misuse of its technology for “illegal or immoral” purposes, including the creation of child sexual abuse material. “We strongly support law enforcement efforts against those who misuse our products for illegal or nefarious purposes,” Mr. Bishara said.
Because the model is open-source, developers can download and modify the code on their own computers and use it to generate, among other things, realistic adult pornography. In their paper, the researchers at Thorn and the Stanford Internet Observatory found that predators have tweaked those models so that they are capable of creating sexually explicit images of children, too. The researchers demonstrate a sanitized version of this in the report, by modifying one A.I.-generated image of a woman until it looks like an image of Audrey Hepburn as a child.
Stability AI has since released filters that try to block what the company calls “unsafe and inappropriate content.” And newer versions of the technology were built using data sets that exclude content deemed “not safe for work.” But, according to Mr. Thiel, people are still using the older model to produce imagery that the newer one prohibits.