Merry Christmas 2023 everyone!

Discussion regarding the Outsider webcomic, science, technology and science fiction.

Moderator: Outsider Moderators

QuakeIV
Posts: 210
Joined: Fri Jul 24, 2020 6:49 pm

Re: Merry Christmas 2023 everyone!

Post by QuakeIV »

Well, I think I agree with you; at any rate, I think humans deserve privileged treatment. But this AI thing needs to be handled carefully. In the past, drawing inspiration was seen as good, as long as you weren't making a blatant copy, as you so astutely pointed out. Serving as inspiration to others was an accomplishment on the part of the artist, and I think that's kind of a vital component of producing new artistic things. Currently the courts enforce a not-too-horrendous definition of what is and isn't blatant copying (at least it could be far worse), but I think 'AI' art can pretty easily jump over that bar. Moving the bar on what is and isn't considered copying to try to capture AI models wouldn't even require a change in the law, but it could hurt actual humans, because it's hard to be sure whether art was generated or made by hand. It would be better, in my opinion, to move in a straight line toward what you want, which in my case is 'nobody is overly attached to the existence of these machines except lazy people, therefore they should be legislated against completely separately from the idea of copyright'. Maybe you are quite attached to them and are now annoyed at obliquely being called lazy, but I am not aware of that being the case, and thus that is my opinion.

There are AI company lawyers on one side, but on the other there are also corporate legal departments that exist purely to eliminate competitors via malicious lawfare. If given adequate room, the latter may push things in a direction I would personally consider bad, so as to give themselves more ammunition.

IMO, as long as making these models requires something like half a billion dollars in GPUs, it will be very easy to screw them out of existence. The businesses in question have very little room to maneuver because they stand to lose those assets (for instance, by liquidating them to pay off legal debts), which makes them cumbersome and vulnerable. I think trying to make it a copyright question is hard because copyright targets the distributor of a product, not the tool that made it. You could amend the law to say 'any publicly available model that can be shown to make subjectively similar output to copyrighted material it has no license to is liable for damages' and basically shut this down entirely, but if it's mainly a question of trying to bend the law as it's currently written to the task, I don't think that will end particularly well. It should be new law, or at least an amendment to current law, that exists purely to blow up these companies.

If this comes across as vaguely evil scheming to you, that's probably because it kind of is, but we aren't dealing with a legal system that is actually coherent, intelligent, and capable of holding onto some kind of honest intent. It neither deserves nor responds well to being treated like a person. It's just a machine that does a job and should be treated as such.

Arioch
Site Admin
Posts: 4497
Joined: Sat Mar 05, 2011 4:19 am
Location: San Jose, CA

Re: Merry Christmas 2023 everyone!

Post by Arioch »

Whatever court rulings or new legislation may come, the rule of due process still applies. You can't sanction someone for breaking the law without proving they did so in court (in the United States, at least). An artist can't be punished just because his or her art looks like it was AI generated; there has to be convincing evidence.

I think the current lawsuits focus less on the look of the resulting generated content than on the fact that copyrighted work was used in training without permission or compensation. If the AI companies are going to sell their generated content for profit, then it doesn't seem unreasonable for the copyright holders of the source material used in training to expect some kind of compensation, at the very least. This is not too different from the issue of data collection companies selling personal activity data for profit without permission or compensation to the people whose data is being collected, which is also a question currently before the courts.

It's really not possible to outlaw any kind of software tool; a ban simply isn't enforceable (especially given the international nature of computing). So all they can really do is act against people and entities for illegal use of said tools, since people must exist in a location, which has a definable legal jurisdiction. In addition to being practical, I think that's appropriate; a tool doesn't know what it's being used for. It's the person using it who gives the use context and a moral (or immoral) framework. This worked itself out in what I think is a reasonable way for the question of video copying, and I expect it will do the same for generative AI. But it will probably be a bumpy road from here to there.
