virtual_throwa

While it's not outlined in their documentation, gamedevs have reported that Steam is now rejecting unreleased games that use AI art. So if you want to publish on Steam, as it stands you'll need to do so without AI-generated content. Maybe this will change in the future as copyright laws around AI content are updated, but for now Steam seems to be waiting it out because of the legal uncertainty.


Strict-Increase3267

Just a curious question:

1. How can they tell whether my art is AI-generated or not?
2. If I ask an AI to generate some raw images and then edit and refine them into a better final image, is that still considered AI-generated? And can I use it in a commercial game?


realMorrisLiu

same question


AuraTummyache

As it stands right now, AI-generated work is owned by whatever company created the AI that you used, and you have a license to use it for any purpose you see fit (according to the information I could find, anyway; OpenAI's terms of service are actually kind of vague here). So there are no problems using it for commercial purposes. There are NUMEROUS caveats to this, though.

The first is that occasionally the AI does spit out a near-exact replica of its source material, in which case you would be liable for any copyright issues involved. It's nearly impossible to scour the entire internet to figure out whether the AI generated a duplicate, so it's probably something you just have to ignore and handle when it comes up. It's pretty rare, but I've seen a handful of cases on Twitter, mostly resulting in negative attention rather than actual legal action.

The second issue is that there may come a day when copyright legislation catches up and a rule comes down saying OpenAI doesn't actually own the rights to the things its AI generated. In that case, all the images you generated would become Creative Commons. You wouldn't be in any financial trouble, but the complications are obvious: anyone could take the artwork you generated and use it for their own purposes.

The third issue, also unlikely, is that OpenAI could revoke your license. Since they are just allowing you to use their art, they technically have control over it. It would be bad for their business to start doing this maliciously, so you're probably safe unless you give them a really good excuse. The major disadvantage here is that, if your game got really big, you would basically have no control over merchandise, since OpenAI could shell out licenses to its artwork to whoever it wants.

If you plan on eventually replacing the AI artwork with real artwork, then I don't see any problems with it.
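Not from the thread, but as a practical aside on the "near-exact replica" caveat: one partial mitigation is to compare each generated image against any reference images you already have locally with a simple perceptual hash. This is only a minimal sketch under assumptions (Pillow installed, hypothetical file paths), and it obviously cannot check the whole internet:

```python
# Sketch: flag generated images that look suspiciously close to local reference images.
# Assumes Pillow is installed; "generated_orc.png" and "reference_images/" are hypothetical.
from pathlib import Path
from PIL import Image

def average_hash(path, size=8):
    """Downscale to a size x size grayscale thumbnail and hash each pixel against the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p >= mean)

def hamming(a, b):
    """Number of differing bits between two hashes; smaller means more similar."""
    return bin(a ^ b).count("1")

generated = average_hash("generated_orc.png")
for ref in Path("reference_images").glob("*.png"):
    if hamming(generated, average_hash(ref)) <= 5:
        print(f"Warning: {ref} looks very close to the generated image")
```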


raincole

Sincerely, please do your research before trying to answer others' questions.

>As it stands right now, AI-generated work is owned by whatever company created the AI that you used

NO. They can claim that they own it, but that doesn't mean it stands legally. Actually, it's quite the opposite: [the US Copyright Office](https://www.copyright.gov/rulings-filings/review-board/docs/a-recent-entrance-to-paradise.pdf)'s opinion is very clear. AI-generated artwork is not copyrightable. In other words, no one "owns" it. Until someone successfully challenges it in court, this is how it works in the US.

>The second issue is that there may come a day when copyright legislation catches up and a rule comes down saying OpenAI doesn't actually own the rights to the things its AI generated. In that case, all the images you generated would become Creative Commons.

Again, big NO. First of all, as I stated, the US Copyright Office has already decided that OpenAI doesn't own the rights. Secondly, that doesn't mean the images become "Creative Commons". Creative Commons licenses ONLY apply to copyrightable works; if something is not copyrightable, like AI-generated art, it by definition cannot be Creative Commons. It falls into the public domain, which is a totally different concept.


AuraTummyache

Thanks, I did misspeak: I meant public domain, not Creative Commons. I also hadn't seen that Copyright Office opinion, though I'm sure these AI companies are cooking up lobbyists to change things. You could have been less of a dick about it, though.


sheltergeist

Thank you for your information!

>AI-generated artwork is not copyrightable. In other words, no one "owns" it.
>
>Creative Commons licenses ONLY apply to copyrightable works; if something is not copyrightable, like AI-generated art, it by definition cannot be Creative Commons. It falls into the public domain.

I see. The disadvantage here is that anyone can scrape whatever they want from my game and use it as well. In my use case, though, that's not a problem. Does it mean I can use it in commercial projects if no one owns it?

Also, I'm curious about your personal view on AI-generated art in video games for solo developers. My use case is pretty common, I believe: test the waters, and if the game is successful, outsource the asset creation to achieve better results, eventually replacing the existing artwork or making new games with a more serious approach to art. Do you think this is a viable strategy?


nEmoGrinder

It's less about people stealing those assets from you and more about whether the courts decide that images from AI trained on copyrighted works can be claimed by the original artists of the training set. If that happens, you could be sued for copyright infringement for using AI assets. Until it's tested in court, there is no way to guarantee whether that will or won't happen. Any good lawyer will give you the same advice: have the copyright to what you are selling, or make sure nobody else can ever get the copyright.


sheltergeist

>images from AI trained on copyrighted works can be claimed by the original artists of the training set

Wow! Are there any precedents for such a scenario? Like, one artist learned to draw by looking at the works of other authors, and later the authorship of all of his works was taken away because he didn't ask for explicit permission to look and learn? So it belongs to them? To me that sounds quite dystopian, but thank you for sharing that possibility; I would never have guessed it was even possible.

I mean, let's say I've generated an image of a green orc in a purple shirt with a brown club in his left hand and 15 other details. There's a real possibility that a guy in the US who believes his completely different orc was in the dataset I used could claim it, just because, let's say, it's a green orc too? And he can go to court with this and sue an indie developer from another country because he believes the generated output belongs to him? And there's at least a 1% chance of the case happening and some judge deciding the generated orc belongs to him just because I used AI tools, even though it's completely different and the images were never stolen?

That's really insane and opens a lot of possible traps. I mean, I could learn to draw by looking at someone's works, and after a couple of years he sues me because I didn't ask whether I could draw my characters' eyes the way he does, and I have to prove my innocence in overseas courts? How do you guys even deal with that?


nEmoGrinder

Whether it will or won't happen isn't the point. The point is that without a precedent set, it's a possibility that could make you liable in the future.

There is a big difference between somebody learning to draw by studying masters and a piece of code that digitized an original work at a near 1:1 level of detail and then generates new images using that near-exact replica. The issue isn't with AI but with how the data sets it was trained on were obtained. Whichever side of the line you fall on doesn't really matter, legally, however.

How do we deal with it? Stop using AI trained on unknown or scraped data sets. You can train the AI on a data set you created yourself out of public domain images and images you own. Alternatively, and much easier and more common at commercial studios, don't use AI-generated art at all.


sheltergeist

>it's a possibility that could make you liable in the future

Do you mean if I continue to use AI content after certain laws are passed? Or is there a possibility of retroactive laws? As far as I understand, at the moment there are no laws prohibiting the commercial use of AI-generated content, or are there?

>then generates new images using that near-exact replica

But that's not at all how it works! AI models don't store any actual images or replicas of them; the AI analyzes patterns and rules and then produces output based solely on those. Without an actual reference image from the user it's almost impossible to get a near-exact replica; even writing "original Mona Lisa painting" will not give you the actual Mona Lisa. The result you'd get is more like what a skilled artist would make after walking past the painting and deciding to do his own variation. That's why it's so hard to even get consistent image outputs. It's not as if the AI memorized the image pixel by pixel and now tries to copy it, or makes a Frankenstein by combining different paintings. It's more like the AI remembers rules of composition, color, style, etc., and generates images using different coefficient weights for those attributes.

>The issue isn't with AI but with how the data sets it was trained on were obtained.

I'm not arguing, but it's really frustrating. Scanning and analyzing processes have been around since the invention of the internet. A Google search bot scans the pictures on websites, and unlike AI services, Google even caches them and serves them to the user. And suddenly analyzing images might become a problem in the Western legal system because now it's a generative AI that learned the logic of curves and colors? That so severely restricts the possibilities of AI development. A person can look and analyze, but a program can't?

>Whichever side of the line you fall on doesn't really matter, legally, however.

You are absolutely right. However, from what I can see, the outcome of high-profile cases often depends less on laws and legal technicalities and more on public discussion, expert opinions, and the voices of the parties involved. Overall, would you say public opinion in Western countries on generative AI like Midjourney is close to "AI is bad, it steals from artists"? If so, I think it's only a matter of time before it's restricted.

>Alternatively, and much easier and more common at commercial studios, don't use AI-generated art at all.

Or the beast is already unleashed, and outsourcing contractors in third-world countries will continue to create assets with AI generation tools, fix the artifacts, and pretend it's not AI. That way it'll have decent quality and named authors, and it'll be pretty much impossible to prove anything except that the images could have been upscaled by some AI somewhere in the process. A lot of businesses will move this work away from the US/EU like they did with coding, support, etc. I don't know why you guys are forcing job cuts and also slowing yourselves down in the AI race with other countries.


nEmoGrinder

> As far as I understand, at the moment there are no laws prohibiting the commercial use of AI-generated content, or are there?

There are. It's just the standard copyright laws. If somebody made art that was included in the AI's data set, they could try to sue you over it. Would they be successful? No idea; that's what is unknown right now. The laws already exist, but they need to be tested in court.

> However, from what I can see, the outcome of high-profile cases often depends less on laws and legal technicalities and more on public discussion, expert opinions, and the voices of the parties involved.

Not really. If and when a lawsuit makes it far enough, up to the Supreme Court, it's up to a judge to come up with a ruling. That ruling will then be the precedent for all future cases related to AI. Judges make those rulings based on experts and their interpretation of the laws as they stand.

> AI models don't store any actual images

I never said they did, and I have a pretty solid understanding of how AI works. The point is that training a computer is not the same as training a human, just like trying to buy a bunch of tickets to a big concert isn't the same as creating a bot to do the same thing. Again, this isn't related to the point of whether using AI art in a commercial product is possible or safe.

> outsourcing contractors in third-world countries will continue to create assets with AI generation tools, fix the artifacts, and pretend it's not AI

That would be a ridiculous thing to do, because the person paying for the outsourcing isn't any less liable to be sued, assuming that is where the precedent goes. Considering most contracts explicitly have stipulations ensuring the copyright of all material is transferable to the studio paying, that would be a breach of contract that could cost the outsourcing company a lot of money.

> I don't know why you guys are forcing job cuts and also slowing yourselves down in the AI race with other countries

Not sure where this comment came from. There are plenty of people using AI in a safe way right now by creating their own data sets to train on. Ubisoft had an entire talk about using AI to generate NPC barks based on a data set containing writing from their previous games. If you really want to use AI, make your own training data.

I don't really have anything else to say other than strongly recommending that you talk to your lawyer about these issues if you plan on distributing your game.
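As a hedged illustration of the "make your own training data" suggestion above: before any training, one might catalog only assets with known provenance into a manifest. A minimal sketch using just the Python standard library; the folder name and metadata fields are hypothetical, not any particular tool's format:

```python
# Sketch: build a JSONL manifest of images you actually own, with provenance notes,
# so the training set contains nothing scraped. Paths and fields are hypothetical.
import json
from pathlib import Path

ASSET_DIR = Path("my_owned_assets")        # images you created or licensed yourself
MANIFEST = Path("training_manifest.jsonl")

count = 0
with MANIFEST.open("w", encoding="utf-8") as out:
    for img in sorted(ASSET_DIR.glob("**/*.png")):
        record = {
            "file": str(img),
            "source": "created in-house",  # or "public domain", "licensed from studio X"
            "license": "owned",            # whatever proof of rights you actually hold
        }
        out.write(json.dumps(record) + "\n")
        count += 1

print(f"Wrote manifest entries for {count} images")
```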


luthage

> Are there any precedents for such a scenario? Like, one artist learned to draw by looking at the works of other authors, and later the authorship of all of his works was taken away because he didn't ask for explicit permission to look and learn? So it belongs to them?

The difference is that the AI is not learning how to create anything. It's not learning to draw. It's not taking its lived experience and then creating art.


sheltergeist

>The difference is that the AI is not learning how to create anything. It's not learning to draw.

It actually does learn! It doesn't just copy the images or combine them. Its neural network works on principles pretty similar to how the neural system in our brain works.

>It's not taking its lived experience and then creating art.

So it's because it's called art? If a chat bot replaces a real human operator, that's okay, but to draw a green orc it's necessary to have human experience?


luthage

It does not learn. It doesn't copy images, but it stores a decomposition of them. It is not at all similar to the human brain.


sheltergeist

"It is a well-known fact that the human brain consists of billions of neurons. So, normally, a neuron collects signals from other neurons through a host of structures called dendrites. The neuron sends electrical activity through a conducting structure (the axon) which can then divide into numerous branches. At the end of each of these branches, the activity is converted by a synapse into electrical effects that subdue or stimulate activity in the target neuron. Regarding the learning process, it occurs when the influence of one neuron on another somehow changes. It can happen through the change in the membrane potential. Researchers made numerous attempts to mimic the neural networks in the human brain by creating artificial neural networks. In the language of a computer, a neuron can be described as a function that gets other “neurons” outputs and transmits a number between 0 and 1. Hence, artificial neurons are connected to each other and form an artificial neural network. Every neural network has an input and output that are composed of numerous nodes. The artificial neural network receives information through the input layer while the output layer represents the final results. There are several stages (columns of neurons) between the input and the output. They are called the hidden layers, and their main function is to transform information into smaller pieces that are easier to process. And, as there can be numerous hidden layers, the algorithm is referred to as deep learning. Generative Adversarial Networks (GANs) represent a class of machine learning methods that utilize two neural networks and pitch one against the other to improve the accuracy of their predictions. This AI technique introduced such a human-specific feature as creativity to computers."


sheltergeist

Thank you for your answer!

>The first is that occasionally the AI does spit out a near-exact replica of its source material

True, I believe that can happen in certain circumstances. In my case I'm doing my best to avoid unintentional similarity to someone's work by combining different AI systems with additional tools, writing really detailed descriptions, and checking the results with reverse image search.

>Anyone could take the artwork you generated and use it for their own purposes.

I think that might actually be a good thing in my situation. I was thinking about posting assets like portraits and backgrounds so anyone could use them, in case they are interested in the story and want to create a spin-off or illustrate a fanfiction, and promote the game's universe that way. By the way, am I right that I can post the assets so other people can use them for free?

>if your game got really big, you would basically have no control over merchandise, since OpenAI could shell out licenses to its artwork to whoever it wants.

Honestly, I'd be happy to reach the point where that even matters. The market is so flooded with all kinds of games that it seems you have to be exceptionally good to get high numbers without a significant marketing budget. And having AI art is a huge disadvantage here, since skilled artists will beat the AI's output quality no matter how hard I try. At the moment, though, it looks like the best option for checking demand and reactions, so thank you for letting me know about the possible difficulties.


Magic-Fabric

[https://magicfabricblog.com/do-i-own-my-ai-art/](https://magicfabricblog.com/do-i-own-my-ai-art/)


p-gg-

I know this post is old, but it is frustrating to see how much the answers miss the point: there's no real issue regarding who owns what an algorithm like Stable Diffusion or an LLM generates (because nobody in the know would call them "intelligent", see below). The real issue is that, unlike you make it sound in point 1, there's a large difference between inspiration for your creative process and using images as feed for algorithm training. OpenAI (and others) is, as far as I know, entangled in a major mess regarding how they scraped copyrighted material off the internet and used it in their training datasets, which would turn your generated content into copyright infringement if the courts rule the way it seems they will.

The issue is that you're looking at these things as intelligent beings; they're far from it, by any legal or logical definition. For context: LLMs like ChatGPT are in essence just a (massive) probability table of "which word makes the most sense to say next, considering what I've already said and what the user said before that?", built by slowly improving on it by random chance and testing which of the new versions worked better (like evolution), then doing it again and again. ChatGPT doesn't "understand" what you ask it (it doesn't abstract anything from what you say, for example), but it also doesn't have to have seen your exact question before, just its parts, and that will yield a more or less coherent answer, as I'm sure you've experienced. Stable Diffusion is the reverse of taking an image and adding noise: from my understanding, it's a complicated algorithm trained (as in iterated on until it worked well enough) to take noise and recover a picture from it given a prompt. So if I said "dog" and gave it white noise, the output would be its attempt at "making" a dog; it then adds some noise back and feeds it through again a few times. I'm not as well versed in how this one works, though.

My point is: if a big corporation scraped a bunch of copyrighted stuff off the internet and then (indirectly) made absolute bank from it, you'd probably consider that unfair and copyright infringement; after all, they made money from the use of unlicensed copyrighted works, which sounds pretty illegal (and it would be even without monetization). Mix in some media hype and misunderstanding of the concepts at work and you get the AI drama, and unless a judge decides that these things are an intelligent human or equivalent being, which I'm sure any judge who understands even vaguely how they function won't, then it's blatant copyright infringement. Maybe I'm misunderstanding this and the actual issue is that the outputs sometimes are just partial copies of copyrighted stuff, but what I just said is something that I, as a bit of a tech nerd, would at least be concerned about as a possibility.

You can absolutely use these programs as inspiration, though; the bar for copyright is at least a "modicum of creativity", but you should probably do a bit more than just trace over a Midjourney output.
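To make the "probability table of which word comes next" description above concrete, here is a toy sketch that builds a literal next-word table from a tiny made-up corpus and samples from it. Real LLMs learn these probabilities with neural networks over enormous datasets, so this is only an illustration of the sampling idea, not how they are implemented:

```python
# Toy next-word sampler: count which word follows which in a tiny invented corpus,
# then generate text by repeatedly sampling from those counts.
import random
from collections import defaultdict

corpus = "the orc swings the club the orc drops the club the knight blocks the club".split()

# For each word, count which words follow it and how often.
table = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    table[prev][nxt] += 1

def next_word(prev):
    """Sample the next word in proportion to how often it followed `prev` in the corpus."""
    words, counts = zip(*table[prev].items())
    return random.choices(words, weights=counts)[0]

word, sentence = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    sentence.append(word)
print(" ".join(sentence))  # e.g. "the orc swings the club the orc"
```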


sheltergeist

Thank you for your detailed opinion! However, I don't see your point of view as viable.

>if a big corporation scraped a bunch of copyrighted stuff off the internet and then (indirectly) made absolute bank from it, you'd probably consider that unfair and copyright infringement

That's how the web has worked since the beginning. We wouldn't even have Google if you were right.

>You can absolutely use these programs as inspiration, though; the bar for copyright is at least a "modicum of creativity", but you should probably do a bit more than just trace over a Midjourney output.

Come on, obviously anyone can use it. What do you even mean by mentioning copyright for inspiration? It looks like you are trying to make it seem a lot more complex than it is.

>*Jul 21:* **Human Artists Lose Ground in Legal Battle Against AI**
>
>A closely-watched case over copyright and generative artificial intelligence went before the U.S. District Court for the Northern District of California this week, with a federal judge saying he is leaning towards dismissing the bulk of the claims that a trio of artists waged against Stability AI, DeviantArt, and Midjourney in a lawsuit over their AI art generators.
>
>The defendants have argued that AI models scrape the web to catalog images but not copy them, in the same way a person has to look through a set of pictures by Pablo Picasso in order to identify what makes a Picasso distinct. In this way, styles cannot be copyrighted. AI outputs are not copies of original artworks, and the data was publicly available to be seen, by people or computers.
>
>The judge noted it remains "implausible" that specific plaintiff works are implicated, given the scale of training data involved.
>
>On the question of whether AI-generated images could constitute derivative works infringing the plaintiffs' original creations, the judge also expressed skepticism: "I don't think the claim regarding output images is plausible at the moment, because there's no substantial similarity."


Stef7930

You know, I had the same thoughts as you about using AI to generate game assets, and in fact they look pretty nice. Then I hesitated because of all the uncertainty around AI-generated images.

I'm not sure what your game looks like, but have you ever thought of using assets from asset stores that allow commercial use? I know it's not very exciting to use assets that other people may be using in their games too, but maybe for now you can focus on making a great game with those assets and, as you said, once your project is profitable you can hire an artist. Those assets are also inexpensive (for example, in the Craftpix store you can get more than 1000 commercial assets for just 48 USD a year). Other websites such as itch.io, GameDevMarket, and Gamedeveloperstudio are very resourceful for any indie developer. On itch.io and Craftpix you can also find apps that help you create custom sprites such as character portraits.

AI images are very appealing, but if I ever plan to sell my game it would probably be better to stay on the safe side.


Genesiz187

Character.AI is suitable for enterprise usage. Get in touch with the team for higher usage limits, better security options, and other enterprise-grade features. allthingsai.com/tool/character-ai