No. I'm saying it's going to happen. Whether I'm "for it" or not doesn't matter. I'm not for "replacing people", because that's an idiotic and reductive way of talking about change and technological disruption.
What I'm "for" is recognizing reality and dealing with it, instead of being in denial and trying to fight something that's going to happen no matter what. Embrace change, and channel it productively - don't stand in front of it with your fingers in your ears and your eyes closed, hoping it'll just go away. You are not going to stem the tide, and you are not going to build a sandcastle wall to delay it - instead, find a way to make use of the tide, even if it means digging up the beach.
I'm saying it's going to happen. Our job now is to prepare for that. Not to try and defer an inevitable future by artificially propping up an industry that's sure to be disrupted eventually. I'm saying the foundations of the building are going to give way, and you're saying "but for now people need a place to live, so let's add a few more floors to at least give them a roof over their head". Instead you should be finding these people a new place to live.
Now is the time to prepare. I'm not saying the studios are in the right. I'm not saying the union is in the right. I'm saying both sides need to realize this is coming, and find a way to adapt sooner rather than later. The studios need to support that transition, and the unions need to support it, too. Each in different ways.
Because that introduces distorting factors into market dynamics that ultimately make things worse for everyone. There's a reason market-driven economies tend to do better overall, and raise the standard of living more. There needs to be oversight, absolutely - far more oversight than is happening in many places, including the US. But that oversight needs to be regulatory and protective, not directive and deterministic. Governments need to make sure market forces don't go against social interests of wellbeing and equity, not arbitrarily define market forces themselves.
Fundamentally, in a market economy, governments should make sure that people are treated fairly, equitably, and safely in their jobs - no one is guaranteed a job, only that if they have one, they're not being endangered or exploited. And just to be clear, I do think governments should do more to ensure everyone has a dignified standard of living - but they should do that through other means, not through market intervention: things like UBI, assistance programs, etc. We need to dramatically rethink how societies treat work, but that does not include artificially preserving jobs we have the technology to supersede. That's counterproductive and dangerous.
There are two main reasons.
1. It's unenforceable. There is no way to check the creative process. Studios could easily just do bizarre things like hire random people that do nothing but read out AI scripts and pass them off as their own. Nothing could practically monitor this. Nor could you effectively monitor AI training data. These are not feasible safeguards, because anyone in their basement can do this, and they will.
2. It's not in the consumers' best interest. Right now, AI isn't of the same quality as human creative work, but that won't remain the case forever. As it approaches human quality or surpasses it (as it will in some areas, though probably not all), you'll approach a cost-to-quality-to-quantity ratio that's more beneficial to consumers. Which ratio to prefer isn't your choice to make - it's the consumers'. If they would rather see 3 TV series that are a 6/10 than 1 series that's a 9/10, then that's for them to decide - not for anyone else. And it's ESPECIALLY their choice if the question is "would you rather pay $2 for a 6/10 series or $20 for a 9/10 series?". That's how free markets operate, and if you don't allow that to play out, you risk massive corruptive biases by whoever gets to make that choice instead of the consumers.
I'm not sure you know what apathetic means. I'm not apathetic - I'm vigorously, even vehemently representing a position. It disagrees with your position, but that doesn't make it apathetic. I'm the opposite of apathetic in this. I'm unaffected because I'm not a screenwriter, so I'm not sure what that's about; did you mean "uncaring" or something? I care deeply about human wellbeing. I think my position is better for the writers in the long term. I absolutely think the studios should bear some responsibility in ensuring a transition for those jobs that are likely to go away, or at least be dramatically reduced, very soon. But I also think the unions should embrace that, because it's in the best interest of their members. They are doing them a disservice by trying to demand AI be banned or whatever - that doesn't solve the problem, it just puts it off for a bit. I'm a big opponent of band-aid solutions, because they tend to make things worse in the long run, and we are too prone to short-term thinking as it is. Saving 10,000 jobs now only to lose 10,000 jobs later is, to me, worse than losing 2,000 jobs now so we can save 8,000 jobs later by transitioning them into a new field. Because when it all comes down, the former leaves 10,000 people out of a job, and the latter has 8,000 people in new jobs.
You can't save everyone and everything perfectly and have it be all sunshine and rainbows. That's not reality. Hard choices have to be made, and plans have to be put in place for what's going to happen eventually.
No one.
And do you think that kind of technology is just going to go away? Do you think that if we pass laws now, progress on AI will stop and no one will develop it further, and we will just look back on the ChatGPT days going "wasn't that quaint when you could push a button and print out a novel"?
This is not going away. This is going to happen.
Granted, your sci-fi scenario of pushing a button to get an entire MOVIE rather than just a script (which is what this debate is actually about) is much further into the future, but that, too, is going to happen. I'm just saying that we should embrace the fact that this particular genie is out of the bottle, and rather than wasting time trying to figure out how we can hide the bottle for a bit longer so no one notices it's out, we should be thinking of how to change our lives to incorporate that reality productively.
Also: since this is purely about scripts (so far), let me tell you a secret - you can ALREADY make your own movie scripts. Sit down, turn on your word processor of choice, and write it. Nothing and no one can stop you. And yet somehow that does not seem to replace movie visits for you; how come? Maybe your analogy is a LITTLE flawed here, hm?
I trust them as much as I already trust them, because it's not like the process works any differently with humans doing the writing. All that's different is the cost and the scope. But you know who's really making the choices here? Not the studios, or the executives. It's the consumers. They vote with their wallets. Fast and Furious is a successful franchise not because it's quality cinema OR because the studios decided that's "the zeitgeist", but because consumers are willing to pay for it. If they refused to go see it, it'd die instantly. But they don't. Nothing about that changes with AI. In fact, you could argue it gets BETTER for consumers, because they'll have more selection. But they still get to choose what to watch and what not to watch - as it should be. And if consumers prefer films written by people instead of AI, that's fine too, and that's a choice THEY get to make. Not you or me or anyone else. I'm saying give them a choice; you're saying nope, intervene, don't let them choose. Who's putting more trust in some authority there, do you think?
Because you can't stop it. And it's more productive to plan for what's coming, rather than try and delay it by extending the status quo. That status quo is going away. We need to start planning for a new status quo, because if we don't, then it'll cause even MORE problems eventually.
Now is the time to act, because we're not there yet, and won't be for some time to come. The ship is heading for an iceberg, and I'm saying end the party now and start getting people to the lifeboats - you're saying "but what if we just slowed down the ship? Then we could party a lot longer." But that iceberg is not going away, and the ship is going to hit it eventually. And the more you party, the less prepared you'll be when it does. That's the problem we're facing.
That's not my choice to make. It's the market's choice. "Quality" is subjective, and it's up to consumers to decide how much they value it. What I'm saying is that if you have the ability to make 3 new Star Wars movies every year but they won't be as good as 3 you make every 5 years, then that's a choice consumers have - do you want 3 good movies, or 15 not-so-good movies over those 5 years? I don't get to decide that for people. You don't get to decide that for people. THEY get to decide what they'd prefer. You're trying to effectively take such a choice away from people.
You keep making the mistake of thinking that I'm saying these choices are better - I'm saying HAVING OPTIONS is better, not which of those options is better. Consumers get to decide, each for themselves. What I'm saying is it's wrong to withhold the CHOICE between A or B from people, not that I think choosing A over B is better.
My own preference when it comes to quality is irrelevant. Consumers should have choices, and consumers should be responsible in making their choices. My own choice doesn't enter into this discussion, because that's not what it's about. It's about the mechanism of having a choice, not about my choice preference.
The law hasn't caught up with the status quo, let alone the future. There are no robust laws yet for this kind of content production. That's part of the problem.
That's because you're privileging a biological function without thinking about its technological analogues. Think about what "inspiration" is, or "training" for that matter. We do exactly the same thing that AI does, only in a more sophisticated way: we process input information and come up with ways to recombine it in unexpected ways. We read novels and watch movies and look at pictures all day every day, and then we take all that (and more) and come up with new things based on the sum of our experiences. We can't just create something truly new, either - good luck trying to imagine a new color, for example. We're confined by the parameters of our reality, and our experiences within that reality. AI basically does the exact same thing, except we have to manually feed it those parameters, and it's still not nearly as good as we are at recombining things, especially in fragmented form. But it's getting a lot better. Very quickly.
The only real difference is that AI isn't subject to the same limitations, because it's able to process information MUCH more quickly and for much longer periods of time.
And let's say AI were to reach a point where it COULD replace what you do - where a consumer looks at something you made, and something AI made, and goes "you know what, I'd rather have the AI one". Where's the problem in that, other than you now not getting paid by that consumer? How is it different from you vs. another human artist? You can say it's unfair because the AI can make 1,000 things in the time you make 1, but why does that matter to the consumer? They just want to pick the one they like better. And the solution there isn't to go "alright, let's ban AI", because you can't stop it and the consumer will just go on the internet and get the same thing through back channels, or make it themselves on their home computer; the solution is to get you into a new job where you can still make a living doing something that's still valuable. Or, much better, to have done so 5 years before this you vs. AI choice happens in the first place, so you're not unemployed from one day to the next but had a chance to get into a new career 5 years before your old one ended either way.
And you can value that, personally, as much as you like. As can everyone else. And you have to accept that not everyone will value it the same way you do - ESPECIALLY if there's a different price tag on the finished product. That's a choice consumers get to make; you don't get to make it for them. You can try to convince them of the "actual appreciation for the artistic process", you can educate them on why you think it has value, but if they ultimately go "you know what, don't really care, I like the AI one better", that's THEIR CHOICE. You can hate it, but it's theirs to make. And that autonomy is sacrosanct - people get to make their own choices for themselves, and forever should.
That's a subjective interpretation of how AI works, and that's not super useful in a discussion. Tests have already shown that if you don't tell people something is made by AI, they may not be able to distinguish it. People have won art prizes with AI artwork (and then come out and admitted it, and given back the prize, to their credit) - meaning professionals in the business who evaluate works for their artistic value decided that these pieces were "better" than human pieces. You can look down on AI and its production process all you like, but the results seem to speak for themselves. And don't forget: this is BABY AI. It's only just begun. This is nothing, NOTHING, compared to what's to come.
I firmly stand against Benjamin and the idea of "aura" - art isn't a mystical craft endowing objects with power. It exists in the observer first and foremost, not the artist. If something has an effect on you, it's art - who or indeed what made it is irrelevant. That's why a leaf on a pond can move you just as a painting of a leaf on a pond can, even though one is a confluence of nature and coincidence and the other comes from an artist's mind. AI is no different to EITHER.
You're drawing this distinction without good justification. The process is similar, even if the actor isn't. Only the volume is different. And I'm totally onboard with protecting against copyright infringement for both humans and AI - I simply think there should be no special pleading for (or against) either. If a human reading a bunch of novels and then getting inspired to write their own is okay, then so is an AI being fed a bunch of novels and spitting out one - and if you want to complain about plagiarism for either of the two, do it the same way. Plenty of human novelists have been accused of plagiarism, too (and often justifiably so, and often despite their protestations, isn't that right SUZANNE COLLINS?!). Same rules apply - you outright copy, you're doing a no-no. But the mere fact that you've consumed materials does not a copyright infringement make.
Or that they disagree it's unethical, just as they disagree it's unethical for a human writer to read a bunch of novels and then write their own in inspiration. There's a line for copyright infringement, and that line is NOT drawn at "has seen other people's work".
No. Not "sometimes later", but someplace else. That discussion is happening. Just not here.