

Not that I think this is the way, I don’t, but…
With the amount of time it takes to train a completely green person off the street, even at seemingly menial tasks, I'm not sure corporations would actually allow this.
Although they aren't paying a wage, this plan would eat into real production time and materials, and with the "just in time," software-oriented, prefab mindset they have, I think they would still lose money overall.
Sure, they don't have to train people to think anymore, but even operating machinery correctly or following a preset design is rough for a lot of people.
The struggle to find knowledgeable, skilled labor is real, but unless paid people are taking time out of their day to teach these interns the ins and outs of a machine, or how to read plans, said intern wouldn't learn jack squat. Unless the company has time and money to kill, trade school is still required at the very least.
Nah, corporations would never go for it.
Maybe an unpopular opinion, but I feel like anything produced by AI should be somehow watermarked at the source. At this point there's only a handful of companies, so it wouldn't be too hard to have them all insert something easily identifiable into the final product. Something like a microscopic signature in a corner, with model info and date produced…idk. Nothing that ruins the image, but something anyone can see if they look for it.
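Just to make the idea concrete: here's a toy sketch of one flavor of source watermarking, hiding a "model info + date" signature in the least-significant bits of pixel values. This is purely illustrative (pixels are modeled as a flat list of 0-255 ints, and the signature format is made up); real provenance schemes used by vendors are far more robust than this.

```python
# Toy sketch: embed a generator signature into the least-significant
# bits of pixel channel values. Hypothetical, not any vendor's scheme.

def embed_signature(pixels, signature):
    """Write each bit of `signature` (LSB-first per byte) into the
    low bit of successive pixel values. Returns a new pixel list."""
    bits = [(byte >> i) & 1
            for byte in signature.encode("utf-8")
            for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small to hold signature")
    out = list(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # change at most the low bit
    return out

def read_signature(pixels, length):
    """Recover `length` bytes of signature from the pixel LSBs."""
    data = bytearray()
    for byte_idx in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_idx * 8 + i] & 1) << i
        data.append(value)
    return data.decode("utf-8")
```

Each pixel changes by at most 1, so nothing visible is ruined, but anyone who knows where to look can pull the signature back out. That's also the weakness of a scheme this simple: resaving or compressing the image wipes it, which is why the real push would have to happen inside the handful of companies generating the images.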
If nothing else, there should be a large push to inform the public about telltale features to look for (e.g., too many appendages) to help them determine whether something was created by AI. While not foolproof, if it can discount even a portion of the misinformation, imo it's worth the effort.
To me, it seems irresponsible of the companies running the AI to just unleash it upon the world without training us humans to understand what we're looking at. Letting us see how realistic everything is, while letting us know it's been produced by AI, at least helps us comprehend the scope of the matter and adapt to the situation at hand. Especially for those who don't fully grasp what AI can and cannot do.