Explained: Image generators need so much data that there is no one at the companies that build them who can keep track of where the data comes from and what is actually portrayed in the source data. No one can take responsibility, which the bros embrace as a feature, not a bug.

That is why I am particularly bothered by generated images of children, as it has previously been shown that large datasets likely contain many images of abuse. The datasets hold so many images that it would take thousands of years to look through them all, which makes it obvious to me that they contain violent images of every kind.

So when an image (or film) is generated in which children appear, in any situation whatsoever, I always wonder: why does that particular child's face appear? Whose child might have fallen victim to the computational model this time… resurfacing pain that refuses to go away when tech companies immortalize it.

And yes, of course this goes beyond children.

This is what makes me truly shudder, amid all the other carnage the software causes.