
Poll: Do you care about ray tracing / upscaling?


Total voters: 255
The fact that people are kind of glossing over that they were using two 5090s is pretty insane as well. This slop will never run worth a damn on anything less than a 5090, if that.


This isn't necessarily true.
They're probably at the stage where they're using the freshly trained model with the original number of parameters and weight sizes. For the end-user implementation they'll probably distill the model through a teacher<->student method and reduce weight sizes, e.g. from FP16 to NVFP4, for much higher performance on Blackwell GPUs, while keeping >95% of the original model's accuracy and performance. If this is a dense model, there's a chance they'll gain a lot of performance again by moving to a mixture-of-experts architecture.
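For anyone unfamiliar with the teacher<->student idea, here's a toy sketch of a distillation loss (illustrative Python only, nothing to do with Nvidia's actual pipeline): the smaller student is trained to match the teacher's softened output distribution.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; higher T softens the distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened outputs.

    Minimising this pushes the student's output distribution toward
    the teacher's, which is the core of teacher->student distillation.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that matches the teacher has zero loss...
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
# ...while a mismatched student has a positive loss to train against.
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]))
```

In practice the student also trains on the real task labels, and quantization (FP16 to FP4) is a separate step on top; this only shows the matching objective.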



It'll be a bit like what's done with 20B-parameter LLMs at FP4 getting 98% of the accuracy of the 200B FP16 ones despite being 1/40th the size and running 20x faster on the same chip.
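The size arithmetic behind that checks out: weight storage is roughly parameter count times bytes per weight (ignoring quantization scales and other metadata overhead).

```python
# Back-of-the-envelope check of the "1/40th the size" claim.
FP16_BYTES = 2.0   # 16 bits per weight
FP4_BYTES = 0.5    # 4 bits per weight (scale/metadata overhead ignored)

big_fp16 = 200e9 * FP16_BYTES   # 200B params at FP16 -> 400 GB
small_fp4 = 20e9 * FP4_BYTES    # 20B params at FP4   -> 10 GB

print(big_fp16 / small_fp4)  # 40.0 -> the 1/40th figure
```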
 
Great, another DLSS 5 shill!

sigh

He completely misses the point. People will marvel at the graphics without DLSS 5.

Doc Potato continues to not impress me with many of his hot takes over the years. Clearly he is smart, but on the less technical side of things he comes off as an industry promoter trying to maintain access. He can baffle and wow me with hardware theories and specs I barely understand, but that doesn't sell me on the stuff I can see with my own eyes and deal with when playing the games.

1. This is Rockstar, and not the old Rockstar that did GTA5 and RDR2. This is neo-Rockstar, with all kinds of drama among those working there, both management and the dev team.
2. I have a feeling many will notice if the game doesn't run well, and even then the graphics already look good without this new AI DLSS 5 stuff.
3. See #1. It's Rockstar; they paid $10k to a modder who told them how to fix their years-long buggy GTA Online login problem that led to frequent failures to connect. I know because I put up with it for years. Technical competency is not something I associate with Rockstar games.

Edit: I didn't even think about it till reading the tweets, but GTA6 is console first, so no DLSS 5 till the PC version, a year or so after the console launch (if it makes it out this year even).
 
Edit: I didn't even think about it till reading the tweets but gta6 is console first so no dlss 5 till PC version, a year or so after console launch (if it makes it out this year even).
As per rumour, Helix is out Nov-Dec 2027.
The GTA PC version is out Mar 2028 or after.
 
It does feel like Nvidia realised that AMD will catch up on ray tracing, and since new consoles will focus on it, devs will actually have to optimise this stuff. That will port nicely to PCs, eroding pretty much the only meaningful Nvidia advantage.

It would have been better if they had made some move toward a PhysX v2 or something like that, or just a bunch of exciting new techs like the ones AMD filed patents for. Shame.
 
It does feel like Nvidia realised that AMD will catch up with ray tracing and since new consoles will focus on that devs will actually have to optimise this stuff
It makes sense for them to double down on AI features, it's their focus and strength right now. DLSS is their moat on the consumer side, which is quite ironic considering they initially used it as a performance crutch to get RT implementation going. We may see a role reversal in the future.
 
As per rumour, Helix is out Nov-Dec 2027.
The GTA PC version is out Mar 2028 or after.

Is there any actual confirmation that GTA6 will release exclusively on console this November?
They mentioned PS5 and Series consoles back in 2023, when they originally planned a 2025 release.

In the meantime, we got:
- a delay to May 2026
- another delay to November 2026
- Series consoles collapsing, both in console sales and in software purchases
- lower-midrange GPUs going into laptops (RTX 5060) becoming as powerful as consoles
- upper-midrange GPUs (9060 XT, 5060 Ti) becoming as powerful as the PS5 Pro
- Microsoft announcing their next gen coming in 2027 will run PC games
- PC gaming handhelds taking off


So not only are they getting substantially less revenue from the dead current-gen Xbox; there's a lot more revenue on the table from people able to run GTA6 at PS5/Series X settings on PC, and on top of that there's a bunch waiting for the game to come to PC so they can play it on their handhelds as well.

Even the double-dipping method doesn't make that much sense, as Rockstar knows they're getting most of their revenue from the GTA Online sequel, and for Online to thrive they need the PC crowd which is where most of the whales are living.
 
Is there any actual confirmation that GTA6 will release exclusively on console this November?

Yes

So not only are they getting substantially less revenue from the dead current-gen Xbox; there's a lot more revenue on the table from people able to run GTA6 at PS5/Series X settings on PC, and on top of that there's a bunch waiting for the game to come to PC so they can play it on their handhelds as well.

Even the double-dipping method doesn't make that much sense, as Rockstar knows they're getting most of their revenue from the GTA Online sequel, and for Online to thrive they need the PC crowd which is where most of the whales are living.

GTA Online is probably not going to be fully baked by the time the game releases on console.
 
Looks like Alex bided his time and now comes out with a more nuanced look than DF had initially. Here's another timestamp discussing aspects raised by @NTMBK with regard to accurate light representation from a model that lacks this input.


I haven't watched the video yet, but surely it's better than leather jacket man telling gamers that they are wrong? After saying how great scarcity is for them, and to hell with you peasants, not long ago?

I bet it's a half-assed "we're sorry" while still saying it's good and the future, because they rely on being an Nvidia mouthpiece.
 
With this thread being old enough, even if most gamers would say RT with upscaling is essential now, I think the answer is still "it depends" for consumer cards and gaming.
The acceleration of ML/DL has made RT and ray denoising non-negotiable in computer vision for the automotive and defense sectors, though. I recently made my career move to an OEM that caters to both markets. Let's just say it's not only Nvidia; Qualcomm is there too. But the industry is shaped in Nvidia's image. It is Nvidia's world and everyone is living in it. I personally have had all-AMD home systems going back a long time now, even as AMD makes itself ever more irrelevant for OEMs. Now I have totally accepted that AMD is squarely out of the game, even in terms of the software stack for these industries.
 
Tom Henderson says Ubisoft and Capcom devs were unaware of their games popping up at GTC with DLSS 5:



No, AFAIK there isn't anything official about console exclusivity since 2023 which was a very long time ago.


DF released a damage control video...

If it's fair that we can criticize their video, it must also be fair that they can apologize and redeem themselves.
I was surprised to see Alex speaking ill of anything Nvidia. That was truly shocking to me.
 
I know how I feel about the vibe of cheap, sloppy, glossy AI faces, but what annoys me is what we don't know about DLSS 5 so far. To what extent can you customize how it's applied, on types of materials, foliage, shadows, and faces? And the contrast/sharpening boost, tone mapping, and fake HDR look as well. It could be cheap effects applied after the fact, or it could be amazing custom-trained models for very specific actors/faces and materials. So far we don't know which, and it's hard to trust Nvidia as well.
 
what annoys me is what we don't know about DLSS5 so far
Here is what we know: Nvidia, with their genuinely best-in-the-world devs, worked on it for 3 years, and they currently need an additional 5090 to run it. Just a few optimisations left and it should be ready for autumn...

Could not care less about DLSS 5. AMD, just give us a nice GPU range, including a halo top die size with lots of real FP32 (**** the dual-issue junk) and nice hardware to do RT/PT; that will be the end of Nvidia in gaming.
 
It'll be a bit like what's done with 20B-parameter LLMs at FP4 getting 98% of the accuracy of the 200B FP16 ones despite being 1/40th the size and running 20x faster on the same chip.
I don't care if they can get this shit running on a GT 1030. Doesn't make it any better.
Right now it's a myopic, screen-space, glorified slop filter. Broaden the context to world space and the model size and ms cost explode.

Would have been better if they made some move to make PhysX v2 or something like that, or just a bunch of new exciting techs that AMD filed patents for, shame.
Neural PhysX is the easiest win for them, but they don't care.
I have more faith in AMD and MS defining the standards in DX13 that underpin the future of game design.
 
Bad news hombre, talent doing that is now either at Apple or AMD.
Pretty sure NV did an autogolpe with ML slopcels replacing actual real GFX people wrt driving rendering roadmaps.
They're gonna get blindsided by AMD in non-ML stuff.

It does feel like Nvidia realised that AMD will catch up on ray tracing, and since new consoles will focus on it, devs will actually have to optimise this stuff. That will port nicely to PCs, eroding pretty much the only meaningful Nvidia advantage.
Nah prob more than that. Zero indication yet that NVIDIA is rebuilding architecture from scratch to drive work graphs. Huge cachemem, execution, and scheduling reset on gfx13. A bit of clever compiler-side engineering and benefits can happen on day one within existing pipelines.

Depends on how widespread it'll be, but it's a major concern if devs start doing a separate PT pipeline based on work graphs.

Could not care less about DLSS5 - AMD just give us a nice GPU range, including halo top die size with lots of real FP32 (**** dual issue junk) and nice hardware to do RT/PT, that will be the end of Nvidia in gaming.
Why the hate around dual issue?

Hopefully. They need to teach them a lesson.
A smart AMD would take the chance, push the largest config of AT0 to consumer, and take the perf crown. That only happens if NVIDIA doesn't fix their core scaling woes.
Realistically, AMD won't do it.
 
AMD never misses an opportunity to miss an opportunity in dGPU. They are fortunate Nvidia said hold my beer, or the firestorm over fine wine turning into corked wine would still be dominating the graphics news cycle.
 
Ew! The Resident Evil Requiem girl has the same feel as the countless AI character generators out there. They deserve to be called a slop generator and more.

Please, I'd like to see the day when Nvidia is brought down. I hope the AI bubble pops so hard that their gaming division ends up multiple times bigger than their enterprise division.
Tom Henderson says Ubisoft and Capcom devs were unaware of their games popping up at GTC with DLSS 5:
Developers at CAPCOM tell Insider Gaming that the announcement and the publisher’s involvement were particularly shocking, as CAPCOM has previously been historically very “anti-AI” with projects such as Resident Evil Requiem and other unannounced projects in development. Some at the publisher fear that the DLSS 5 announcement could prompt a change in the publisher’s view on generative AI and its implementation in its games.
They are total bullies. The above is what bullies do. What stops bullies is eating a feast of humbleness. That means reducing their size to a fraction of a fraction of what it was previously.

Go Capcom Go...
 
I'd be curious to see this applied to older games. Anything PS3 era (maybe even PS2 era) has good enough faces that the AI can latch onto and "enhance". Replaying older games from that era with hallucinated graphics slathered on top of parts of each frame would be surreal.

Anything using other art styles, like cel shading, would probably be an absolute riot as well. Even modern titles with bland graphics might be more enjoyable to play after DLSS injects some nightmare fuel into the frame. Tell me the Gollum game wouldn't be better with a far sharper and more detailed Gollum with fuller lips.

Like ray tracing, this is the sort of thing that a game has to be designed around or it won't work. We're at least a decade out from that being feasible.
 
You know, people said similar things about wanting AMD to focus on raster when Turing was announced.
That was 2018, and ray tracing with DLSS 1 was totally unimpressive on Turing.

Now it's 2026 and AMD has enough money to focus on multiple things, including raster too.

Why the hate around dual issue?

Because it does not work anywhere near the 2x it implies? It made the original "CUDA core" claims totally misleading, I'd say even fraudulent, and AMD's implementation is even worse than Ampere's.
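To make that concrete, here's a toy throughput model (made-up illustrative numbers, not a real RDNA3 or Ampere simulation): dual issue only helps on the fraction of instruction slots where the compiler can actually pair two independent FP32 ops, so the advertised 2x is an upper bound that real code rarely approaches.

```python
def effective_flops_per_clock(base_lanes, pairable_fraction):
    """Toy model: dual issue doubles throughput only on the fraction
    of instruction slots where two independent FP32 ops can be paired.
    base_lanes and pairable_fraction are illustrative assumptions.
    """
    return base_lanes * (1.0 + pairable_fraction)

# Marketing assumes every slot pairs (the full "2x")...
print(effective_flops_per_clock(64, 1.0))   # 128.0
# ...while dependency chains and operand restrictions mean real code
# pairs far less often, so the effective gain is much smaller.
print(effective_flops_per_clock(64, 0.3))
```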
 