This is something I used to be excited for, but I've only been losing interest the more I hear about AI. What are the chances this will lead to moving character arcs or profound messages? The way LLMs are today, the best we can hope for is Radiant Quests Plus. I'm not sure a game driven by AIs rambling semi-coherently forever will be more entertaining than something written by humans with a clear vision.
There are some fundamental obstacles to that. For instance, I don't want a game AI that simply does what I tell it to do; I want to be surprised and presented with situations I haven't considered. LLMs, however, replicate the language and symbol patterns they were trained on. Their tendency is toward cliche, because cliche is the most expected outcome of any narrative situation.
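That tendency falls straight out of how text is generated. A toy sketch, with an entirely invented probability distribution standing in for a real model's next-token probabilities: greedy (low-temperature) decoding always returns the most expected continuation, i.e. the cliche, while raising the sampling temperature buys surprise at the cost of coherence.

```python
import random

# Invented next-"event" distribution for "the hero confronts the villain".
# Real LLMs work over tokens, but the principle is the same:
# decoding favors the most probable (most expected) continuation.
continuations = {
    "the villain monologues, then they fight": 0.55,  # the cliche
    "the hero hesitates and is captured":      0.25,
    "the villain surrenders unexpectedly":     0.15,
    "the hero defects to the villain's side":  0.05,  # the surprise
}

def greedy(dist):
    """Pick the single most likely continuation (low-temperature decoding)."""
    return max(dist, key=dist.get)

def sample(dist, temperature=1.0):
    """Sample with temperature; higher values flatten the distribution."""
    weights = [p ** (1.0 / temperature) for p in dist.values()]
    return random.choices(list(dist), weights=weights)[0]

print(greedy(continuations))       # always the cliche
print(sample(continuations, 2.0))  # more surprising, but also more random
```

Turning the temperature up doesn't add understanding, it just adds randomness, which is exactly the "surprise vs. nonsense" trade-off at issue.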
There is also the matter that LLMs ultimately have no real understanding of, or opinions about, the world and its themes. They can give us descriptions of trees, and diffusion models can give us pictures of trees, but they don't know what a tree is. They lack the experiential and emotional capacity to make up their own minds about what a tree is and represents; they can only use and remix our words. For them to say something unique about trees, they basically have to try random things until something sticks, with no real basis of their own. We don't have true general AI with that level of understanding and introspection.
I suppose sufficiently advanced and thorough modelling might give them the appearance of these qualities… but at that point, why not just have the developers write those worlds and characters? Sure, that content is far more limited than the potentially infinite output of an LLM, but as you wring endless content out of an LLM, you're most likely going to drift outside the scope of any parameters and back into cliches and nonsense.
To be fair, though, that depends on the type of game we're talking about. I doubt an LLM-driven Baldur's Gate would come anywhere near the real thing. But I suppose it could work for a game like Animal Crossing, where we don't mind the little characters constantly rambling catchphrases and nonsense.
I mostly agree, but I think that in some cases cliche is exactly what we need. Used well, AI could generate the background dialogue that generic NPCs have in open-world games.
Overall I think AI is nowhere near advanced enough to be used at large scale in gaming, but it'll probably get there in 5 to 10 years if it keeps advancing at this rate.
The main issue I see is that you need specialized hardware to run neural networks natively, and personal PCs generally don't have it, so you're stuck with always-online machine learning or pre-processed data.
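The pre-processed route is the easiest to picture: generate NPC barks offline, wherever the GPUs live, then ship only the text so the game itself never touches a neural network. A hypothetical sketch, where `fake_llm` stands in for whatever offline generation tool a studio might use (the function names and lines are invented for illustration):

```python
import random

def pregenerate_barks(generate_line, contexts, lines_per_context=3):
    """Build a bark table offline. `generate_line` stands in for an LLM call."""
    return {ctx: [generate_line(ctx) for _ in range(lines_per_context)]
            for ctx in contexts}

def bark(table, context):
    """Runtime lookup: plain data, no special hardware required."""
    return random.choice(table[context])

# Stand-in for the offline generation step (invented line for illustration).
fake_llm = lambda ctx: f"[{ctx}] Lovely weather for it, eh?"

table = pregenerate_barks(fake_llm, ["market", "rain", "festival"])
print(bark(table, "rain"))
```

The trade-off is obvious, though: pre-generated lines are just a bigger pool of canned dialogue, so you lose any real-time reaction to the player, which is most of the pitch for AI NPCs in the first place.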
AI couldn't even do that a year or so ago; give it time and it'll get there.