The Screen Lawyer Podcast

AI, Copyright, and the 'NO FAKES Act' #214

Pete Salsich III Season 2 Episode 14

In this episode of The Screen Lawyer Podcast, Pete Salsich dives into a recent U.S. Copyright Office report that highlights the potential inadequacy of current laws to address the challenges brought by AI, particularly in the areas of deepfakes and digital replicas. The episode also discusses the "NO FAKES Act," a bipartisan legislative initiative designed to protect individuals’ likenesses—beyond just celebrities—from unauthorized use in the evolving digital landscape.

Original Theme Song composed by Brent Johnson of Coolfire Studios.
Podcast sponsored by Capes Sokol.

Learn more about THE SCREEN LAWYER™ at TheScreenLawyer.com.

Follow THE SCREEN LAWYER™ on social media:

Facebook: https://www.facebook.com/TheScreenLawyer
YouTube: https://www.youtube.com/@TheScreenLawyer
Twitter: https://twitter.com/TheScreenLawyer
Instagram: https://instagram.com/TheScreenLawyer

The Screen Lawyer’s hair by Shelby Rippy, Idle Hands Grooming Company.

So AI is back in the news. It never really left the news, but its impact on intellectual property, and particularly copyright law, is in the news. And on this episode of The Screen Lawyer podcast, I'm going to spend a little time digging into a report that was just issued at the end of July by the Copyright Office, analyzing a lot of the issues that come up with AI and, in particular, going through what existing laws we have and seeing whether they address the issue. Not surprisingly, the Copyright Office suggests that we need some new laws. Join us. We're going to dig in. Hey there. Welcome to The Screen Lawyer podcast. I'm Pete Salsich, The Screen Lawyer. And today on the podcast, I'm going to spend a little time talking about a topic that we have talked a lot about already here, and that's AI and its impact on copyright law. As you know, a lot of the discussion over the last year or so on this issue has really been more about the legal challenges to the training methods that the large language models, ChatGPT, etc., have been using. And there's no dispute. I don't think anybody's saying there's any issue anymore about whether these large language models are, in fact, accessing thousands, hundreds of thousands of copyrighted works to train the AI engines. But they've been arguing that that type of access to or use of copyrighted materials is fair use. And of course, the rights holders, whether it's books, music, art, movies, etc., are all claiming the opposite: that it's infringement. We're going to see what happens, and we've talked a lot about those issues. But one of the topics we've also dealt with here is the fact that we are waiting for courts to decide. Sooner or later, some court is going to make the first ruling on the fair use issue, that ruling is going to get appealed to a court of appeals, and it may have to go to the Supreme Court. You may get another court ruling in a different direction. That all takes time.
In the meantime, I think this is an example, and we've talked about it before, where what we really need is some new legislation to deal with the world as it is now and to set some parameters, so that we're not waiting for the courts to decide. Once either a court decides or the legislation comes into play, businesses make adjustments to their contracts, their revenue structures, and their business models based on what those parameters are. But for a while now, we really haven't had much in the way of guidance, and that's created this sort of Wild West environment that we have. Just a couple of months ago, on an episode here, we talked about Scarlett Johansson's situation, in which the newest version of ChatGPT was going to use her voice as part of its audio feature. They talked about it, they negotiated, and ultimately she said no, but they came out with something that essentially was an AI copy of her voice, or a person copying her voice. And we talked about how that violated her right of publicity, her name, image, and likeness. But that hinged on the fact that she was very well known and, more importantly, that her voice was so unique and identifiable that she, like Bette Midler, Tom Waits, and a few other celebrities before her, was able to establish a claim under right of publicity law, name, image, and likeness law, on her voice alone. The traditional right of publicity claim involves a person, typically a public figure (it used to be a requirement that the person be a public figure in some way), who has developed a property right in the use of their identity for commercial purposes, essentially for endorsements. Individuals who may not have a property right in endorsements still have a right to privacy. You can't just take random pictures of them, put them in commercials, and use them for commercial purposes.
So we have this existing framework of law that's generally served us pretty well over the years in these various different areas. You can typically spot what the issue is and what law is in play, and the laws vary a little bit from state to state, but generally they have the same structure. Well, AI is changing all that. And in particular, it's the deepfakes. It's the copies that AI can generate that are, frankly, almost indistinguishable from the real thing, whether that's visuals, voice, movement, all of those things. Now they're everywhere. And most of these cases so far have turned on copyrighted works: a book, a piece of art, a song, something like that. Well, here, this is focusing more on deepfakes of people. And it's interesting that on the same day the Copyright Office came out with its report, and we're going to get into a little bit of the recommendations here in a second, a bill that had previously been floated in the Senate was updated. It's a cosponsored, bipartisan bill. We always like to see that; those sometimes actually make it into law. Chris Coons, a Democrat from Delaware, and Marsha Blackburn, a Republican from Tennessee, introduced an updated version of (and I love the names of laws) the Nurture Originals, Foster Art, and Keep Entertainment Safe Act. That's long, so what they really call it is the NO FAKES Act. So let's call it the NO FAKES Act. Either way, it's a proposed new law that would attempt to do some of the things that the Copyright Office says we should do. Whether it's this particular bill, the NO FAKES Act, that ends up getting amended, works its way through Congress, and becomes law, that might be the one. Or there could be something new.
But I think it's interesting that at the same time this is coming up, the Copyright Office issued a report basically saying many of the same things. It's not an accident, because obviously all of this has been going on in public for some time. But let's focus on what the Copyright Office suggests, and I can tell you where it matches up with the NO FAKES Act, the subject matter of the law. These are the recommendations from the Copyright Office. They basically start with a lengthy premise and analysis of existing laws and conclude that they don't work, or they're not sufficient. So this is a statute that's going to target those deepfakes that are so realistic that it's really hard to tell the difference. And it's a narrower protection than the broader name, image, and likeness rights. This is not, at least at present, designed to replace those laws, which are primarily a creature of state law. The other thing that's important is that this recommendation by the Copyright Office, and the NO FAKES Act language proposed by the senators, applies to everybody. You do not have to be a celebrity. You do not have to be a public figure. This would apply to all of us. We all have this inherent right not to have someone make a fake version of us and use it in ways that we don't approve of. It's also interesting that in a lot of these other areas, it's been about commercial use. We talk a lot in the copyright space about the nature of the second use being made. Is it commercial? In which case, almost certainly you're going to have to get permission. Or is it something that might qualify for fair use? In the proposals here, it's not limited to only commercial uses, because a deepfake could be used in a deeply personal way that wasn't trying to make anybody money on the other side. And yet it would still violate these statutes if they come into play.
I think that's an important distinction. Among the things being discussed is that this would be a personal right that each one of us would have: the right not to have our identity used in this deepfake way. And so it would last for the lifetime of the individual. But then there's the question, the same issue that we have state by state with name, image, and likeness laws: do these rights last after death? In some states, for celebrities, they last a long time. That's why you still have to get permission to use Elvis Presley's identity long after he died; you have to get that from his estate. But in other states, that right ends at death. So whether this right ends at death or lasts longer will be something that gets addressed. I would imagine it's going to have some tail afterwards, but we'll see. That'll be an interesting thing that comes up. One thing that's interesting here, and I want to focus on it a little, is the infringing acts. What would constitute an infringement if these laws come into place? In the Copyright Office's suggestion, they focus on a couple of things, and this is important because it highlights some of the issues we've had in discussing these other cases. So in the fair use cases, the large language model training cases, the complaint by the rights holders is that it was the access, the going and getting of the original copyrighted work and pulling it into the training process, that is the problem, not the distribution of some new work, because you can almost never find an exact copy of the old work in a new AI-generated work; it's typically a conglomeration that's been made because the model learned from these works. Well, the Copyright Office, in this new deepfakes world, says the liability should arise from the distribution or making available of an unauthorized digital replica.
And that's the term they use: digital replicas. But liability comes from distribution, not the act of creation alone, which is an interesting distinction, because maybe this is going to tell us what direction the courts might head in on these other cases. It'll be interesting to see. Also, it should not be limited to commercial uses, because harms are often personal in nature, as we just mentioned. But here's a really important thing, and I don't know if this will end up making it into the law: it says liability should require actual knowledge, both that the representation was a digital replica of a particular individual and that it was unauthorized. So who is supposed to have the actual knowledge? Let's unpack that a little bit. Actual knowledge is a very high bar to prove, and you generally can't just infer it. Well, we'll see how the statute is written. Sometimes you can infer actual knowledge from enough evidence showing this was in front of them and they ignored it, but it's still a higher bar. And it's actual knowledge both that the representation was a digital replica of a particular individual and that it was unauthorized. And remember, the liability would come from the distribution. What that means is, suppose I didn't create this digital replica; I don't have the tools, the equipment; my AI engine, whatever, didn't create the deepfake. But if I see a deepfake somewhere and I then use it and distribute it in something, making it available to the public, I have to have had actual knowledge that it was a deepfake, not an actual picture, and that it was an unauthorized use. So it's not going to be that everybody gets sued every time one of these things comes out. You're going to have to prove some actual knowledge, and maybe that's what it should be, if you think about the nature of these things. But again, it's going to be interesting to see how that plays out, because that's a pretty specific legal standard to have to meet.
And what I predict is this: as soon as the DMCA came out years ago and imposed the responsibility for catching infringement on the rights holder, not the YouTubes of the world (as long as YouTube had a notice-and-takedown provision, the rights holder had to find the infringement), very quickly technology came along to enable the rights holder to find that very thing. So what will happen, I predict, is that whatever this law ends up saying, technology will adjust to make it possible for people to catch the infringers. The Copyright Office also says that, just like with the Digital Millennium Copyright Act before it, secondary liability needs to be addressed. So if I'm a platform and this happens on my platform, but I'm not the one who distributed it, the simple fact that I have a platform, YouTube or anything else like that, is not infringement, as long as I have some sort of notice and takedown. This all could end up in an amended or new version of the Digital Millennium Copyright Act. That might be the easiest place for it all to fit, because the safe harbor provisions are already in there. But it's important to note that the Copyright Office believes that type of protection should be in this new law. There's an interesting comment on licensing and assignment. According to the Copyright Office's suggestion, individuals should absolutely be able to license these rights. I can give someone else permission to use a digital replica of me, but I cannot assign away that right. This is a right that stays with me. I can grant permissions from time to time, but I can't give it away. Interesting. Because it is a personal right, I don't know that that's that unique of a situation, but it is an asset that I have. Does that mean that once I die, it goes away? Or maybe it just means my heirs can only license it too. An interesting little nuance there.
Because normally, if you're the owner of an intellectual property right, one of the things you can do is assign it to somebody else: sell it, transfer it, convey it, make those rights go from you to someone else. But the nature of this, I think, is that it is uniquely personal, and therefore you can give permission, but you can't sell it. And then importantly, in both the proposed NO FAKES Act and the Copyright Office's suggestion, they make it very clear that the First Amendment is still going to apply. Meaning these concepts of fair use - parody, commentary, news reporting, those types of things - are still going to be at play. Now, how does that fit into the use of a deepfake? I'm not sure. I mean, I suppose if there's a news story about a deepfake, then of course showing the deepfake on the news will be protected under the news reporting part of the First Amendment. Can I use a deepfake in a documentary? If I'm doing a documentary about how this real person was always the subject of these fake AI replicas, maybe. I'm not exactly sure how it's going to play out, because it's a different thing here than a freestanding copyrighted work, which either is in a new work or it isn't, and then you have the fair use argument over whether it's okay or whether permission was needed. I'm not sure how that's going to apply to this uniquely personal right that is tied to a digital replica of me. But we'll see. It's interesting that that's there. And I think, to some extent, what the Copyright Office is saying here is that they're not proposing a radical reshaping of copyright law. We've talked many times about how the Copyright Act is superseded in certain circumstances by the First Amendment to the Constitution, which protects certain types of speech. And where those two things overlap, if you fit into certain protected speech categories under the First Amendment, that's going to trump somebody else's rights under the Copyright Act.
And I think this is simply the Copyright Office saying most of what we're proposing here already exists in some other pieces; we just need to bring it all together into a whole. That's sort of my initial read. As for remedies: there would be damages, there would be injunctive relief. These all have to be figured out. You can't get statutory damages, because statutory damages are something you only get if you register a copyright in a work, and I don't think we're all going to register our identities on a constant basis every time we change our look. So I don't know the nature of the damages as such; that will be interesting to see. But certainly injunctive relief, going in and getting a court to grant a TRO or a preliminary injunction to get the work taken down, is an obvious remedy if you can prove your case. Criminal liability might be available in some instances; we'll have to see. Also, the Copyright Office says at this point they do not recommend creating a preemptive nationwide law that would get rid of all state laws. We've probably mentioned in the past that the Copyright Act completely preempts all state laws relating to copyright. There is only one law in the United States related to copyrights, the U.S. Copyright Act, and it governs all 50 states. There's nothing different about copyright law in California versus Missouri or New Jersey or anywhere else. But the name, image, and likeness laws are all creatures of state law, and even though they're basically the same, they do vary in different ways in different states, depending on what you have to prove, who has the rights, and how long they last. And at least the initial reaction or suggestion here from the Copyright Office is that this wouldn't trump state laws. This would be a federal law, but the state laws that apply in other areas would still exist.
And courts are very used to analyzing the overlap between state and federal law, so that's not necessarily a problem. So that kind of summarizes where we are. We don't have new law yet. We have a newly proposed law in the Senate that's just beginning to go through the next round of discussions. Whether anything gets done in an election year certainly remains to be seen. Probably not. Although it is bipartisan, and I don't see a strong Republican-versus-Democrat approach to this. It's all of us versus scary tech; that might be where the teams are. But again, that's probably not even a fair characterization, because AI has been doing, and is going to continue to do, some amazing things. But in this area, I think it's interesting that the Copyright Office is actually focusing on something that doesn't have to do with copyrighted works. Instead, it's effectively creating a right in people to prevent other people from creating new works that use digital replicas, deepfakes, without their consent. It's an interesting merging of these two areas of law, but it probably has to happen, because, as we talk about all the time, all these things end up on screens somewhere and they interlock. And it's the overlapping series of different rights involved in any given situation that makes it challenging for lawyers. But that's also where good contracts come into play. That's still going to be true: how you draft your contracts about these types of uses is still going to matter. But I do think we are going to see new legislation, and I think it's likely to be either an amendment or a newer version of the DMCA, or something that functionally replaces it in large part, because there are a lot of similarities here. And that's what happens when new tech emerges that transforms the way people use information, share content, and see themselves. That's where we are now. The courts will be slow to catch up.
Congress, which is not usually very fast, might actually be a little faster. In the meantime, if you're concerned about something, make sure your contracts have the right language. So that's it for today. We are going to continue to watch this; it's ongoing, and we'll update you as we see more. If you like this content and you're listening on audio, find us and follow us wherever you get your podcasts, here at The Screen Lawyer. And if you're watching on the YouTube channel, hit that Like and Subscribe button. We want you to get as much content as you can. We're always happy to see you and be in your ears. And for anything else, you can find us at TheScreenLawyer.com. Thanks. Take care.
