I don't think it's a question of putting the cat back in the bag.
That kinda implies that there's nothing that can be done about the bad practices that got us here.
It's more about consent and holding companies accountable when they take personal photos, protected personal health information, and yes, entire galleries of an artist's lifetime of work with their crawlers.
Ideally, laws will be drafted so that companies can no longer help themselves to such data without express consent from the affected individuals, so that people (including artists) can opt out of such engines both now and in the future, and so that those who decide to opt in can be properly compensated for their work, both initially and on an ongoing basis (residuals).
This AI technology should've started with public domain artworks from the jump. AI companies should've then employed artists expressly to inform their algorithms. And if these systems are as smart as the companies contend, they can be built robustly enough to handle an artist pulling their work in the future, should they decide to do so.