Half-Life: AI project shows how the shooter could look “photorealistic”

Is this what the future of game graphics looks like?

The legendary shooter Half-Life is now a little over 25 years old. Given the revolutionary status of the debut title from the then up-and-coming studio Valve, fans regularly launch new projects around it.

Popular modifications usually focus on improved graphics, using ray-tracing effects to give Half-Life a new lease of life. A clip currently making the rounds on TikTok goes one step further: artificial intelligence turns Half-Life into a “photorealistic” adventure – but see for yourself!

The clip was created with Runway ML’s AI model “Gen-3 Alpha”, which is used here as a video-to-video model. When introducing it, the company described Gen-3 Alpha as a “significant step towards building general world models” – meaning AI models that can represent situations and interactions as comprehensively as possible.

Using a prompt specifically tailored to Half-Life, the user “Soundtrick” generated the video below. The game’s actual geometry (or any other precise scene data) is not used, however. Instead, “it’s all based on the final frame that the game renders,” as Soundtrick explains.
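For the technically curious, here is a minimal Python sketch of what such a video-to-video workflow looks like conceptually: rendered gameplay footage goes in together with a style prompt, and a re-stylized clip comes out. The endpoint, parameter names and model identifier below are placeholders of our own – the article does not document Runway ML’s actual API – so treat this purely as an illustration of the idea, not as working integration code.

```python
# Hypothetical sketch of a video-to-video workflow: the game's finished
# frames plus a text prompt go to a generative model, which returns a
# re-stylized clip. Endpoint, parameters and model name are illustrative
# assumptions, not Runway ML's real API.
import requests

GAMEPLAY_CLIP = "half_life_gameplay.mp4"  # footage exactly as the game renders it
STYLE_PROMPT = (
    "photorealistic first-person scene in a decaying research facility, "
    "natural lighting, subtle film grain"
)

def stylize_clip(api_key: str) -> bytes:
    """Send the rendered gameplay clip and a style prompt to a
    (placeholder) video-to-video endpoint and return the result."""
    with open(GAMEPLAY_CLIP, "rb") as source:
        response = requests.post(
            "https://example.com/v1/video-to-video",  # placeholder endpoint
            headers={"Authorization": f"Bearer {api_key}"},
            data={"prompt": STYLE_PROMPT, "model": "gen3-alpha"},
            files={"video": source},
            timeout=600,  # generation can take minutes, not milliseconds
        )
    response.raise_for_status()
    return response.content

if __name__ == "__main__":
    video = stylize_clip(api_key="YOUR_API_KEY")
    with open("half_life_photorealistic.mp4", "wb") as out:
        out.write(video)
```

The key point the sketch illustrates is the one Soundtrick makes: the model only ever sees finished video frames and a prompt, never the game’s geometry, materials or other engine data.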

But how realistic is the video, really? As appealing as a photorealistic remake of Half-Life sounds, some hurdles remain. In the roughly three-minute clip on Soundtrick’s YouTube channel, numerous animations come across as wooden or jerky.

Facial expressions and hands in particular still seem too challenging for the AI model to render consistently well. It is also striking that the entire clip contains only “real”, realistic elements; you won’t find Half-Life’s headcrabs here – though we’re not sure we really want to picture the parasitic critters photorealistically anyway.

Processing time is another issue. When Gen-3 Alpha was unveiled, Runway ML stated that generating a five-second clip took around 45 seconds – roughly nine times slower than real time. Real-time generation of this kind, which GPU manufacturer Nvidia, among others, hopes to achieve in the distant future, is therefore still out of reach.

So there is still a very long way to go before we get a photorealistic Half-Life – but would you even welcome such a remake? Which other classics could benefit from a graphical upgrade? Let us know in the comments!