
Vega/Navi Rumors (Updated)

Status
Not open for further replies.
Yes, but how often do people really care about leaks of the low-end parts anyway? 😛

I suspect we'll start to see some leaks coming out over the weekend and into next week. Hell, not even Apple can keep their products completely secret anymore, so it's pretty unlikely Vega drops without any leaks.
1-2 weeks before launch. If it's Computex, then it's more than 1 month away.
 
Paper launch, maybe. If we were two weeks away from a hard launch, we'd see product box art all over the net right now, along with die shots and a buttload of unverified performance figures.
 
Yes, precisely. Caching is a well-known technique in CPUs, but in GPUs the main memory consumer in a game is not shaders or vertices but high-res textures, so it's not applicable because the data is too big to be cache-efficient...


Wrong. You need the textures in VRAM for a smooth framerate, and as I said, they are the biggest assets of a AAA game.


While I respect your "gut feel", my rational engineering background doesn't believe it.

Well, yes. If the textures required for a particular frame are in system RAM or, god forbid, on the HDD, the game is going to chug.

The GPU technically only needs the textures required to complete each frame in VRAM when it renders that frame, and that takes nowhere near multiple GB of memory. The only reason to preload the VRAM with textures that aren't currently required is smoothness: the game or drivers can't predict what data will be needed next.
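A quick back-of-envelope check on that claim. The per-pixel fetch count and texel size below are my own illustrative assumptions, not numbers from any real engine:

```python
# Rough upper bound on the texel data a single rendered frame can touch.
# Assumed workload: 4K output, ~8 texture fetches per pixel, 4 bytes/texel (RGBA8).
width, height = 3840, 2160
fetches_per_pixel = 8
bytes_per_texel = 4

touched_bytes = width * height * fetches_per_pixel * bytes_per_texel
print(f"{touched_bytes / 2**20:.0f} MiB touched per frame")
```

Even with generous assumptions that lands around 250 MiB, far below the multi-GB VRAM pools on current cards — which is the point about preloading being the real reason VRAM fills up.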

I'm also not sure about your cache argument. An Intel Core i7-7700K has 8MB of L3 cache that runs in the 200-270 GB/s range.
[Screenshot: AIDA64 cache and memory benchmark for the Core i7-7700K]


Modern GPUs have 500-1500 times that amount of memory available, and HBM2 or wide GDDR5X memory subsystems are just as fast or faster. So why can't we apply to large amounts of data in a large, fast memory pool the same methods we use for small amounts of data in a small, fast cache?
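That 500-1500x figure checks out for typical card sizes against the i7-7700K's 8MB of L3 (a quick sanity check, with 4-12 GB as representative VRAM sizes):

```python
# Sanity check on the "500-1500 times" claim:
# VRAM capacity vs. the i7-7700K's 8 MB of L3 cache.
l3_mb = 8
for vram_gb in (4, 8, 12):
    ratio = vram_gb * 1024 // l3_mb
    print(f"{vram_gb} GB VRAM = {ratio}x the L3")
```

4-12 GB gives 512x-1536x, right in the quoted range.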

Basically, it seems like you are saying current GPUs are as efficient as possible with their use of VRAM and no other optimizations to reduce VRAM usage are possible.

That doesn't seem likely to me.
 
The RX 560 will be released in 2 weeks and there are no leaks, only official information. Actually, all previous leaks stated only that it is a 14 CU chip, i.e. an RX 460 rebrand. And we found out that's not true two days ago, when AMD presented the RX 500 cards.

It's a rebrand of a low-end part of a fully known architecture, so the interest in leaks there is pretty small. Vega is totally different. And as you say, it's 2 weeks till launch and we know everything about the card from official information, so there is nothing left to leak. With no leaks at this time, you won't see a Vega hard launch before Computex.
 
A lot of gloom and doom here, but as I said, I fully expect the top Vega card to beat the 1080 Ti in DX12 and Vulkan games. We can see from the Polaris refresh that at higher clocks the 580 gets significantly ahead of the 1060 in DX12 and Vulkan games. Before, it was equal in games like Forza 3, The Division, GOW, and Civ 6; now the 580 is consistently ahead by 3-5 frames, and obviously custom AIB cards with out-of-the-box OCs to 1425MHz and higher offer even better performance.

Sure, the added performance comes at a power cost, about 50W more than the 480 across the board, but it goes to show how fast AMD is under DX12 and Vulkan.

And those are just the closer games, games like Sniper Elite 4, Total War, Hitman, Doom, BF1, etc... perform up to 12-14fps faster.

In games like Doom and BF1, a 580 overclocked to 1470MHz comes within striking distance of the 1070 FE.

The issue will be DX11: we can see that overall, unless a game is optimized to the maximum specifically for AMD cards, AMD tends to lose in DX11 in 90% of games.

So again, I expect the top Vega to be up to 15% faster in DX12 and Vulkan and up to 15% slower in DX11 games. Across the majority of games I expect the difference to be around 7-10% in either direction, depending on API and game.
 
With AMD seeming to have quite a few AAA brands under their belt now, and with them giving out "thousands of dev kits", could we finally start seeing a proper move towards DX12/Vulkan adoption?
 
Hoping it's a derivative similar to this one at least.

[image: a dragster]
You mean a horribly impractical energy-guzzling and screaming monster requiring extensive tuning and warm-up for it to work at all, and even then performing well only for a single, highly limited task for a very short period of time, with a relatively high chance of catastrophic, explosive failure? Yeah, that's not a GPU metaphor that's quite to my liking.

The GPU version of a dragster would be a 1000W power hog that could only run for long enough to finish 3DMark (and only 3DMark - no other software supported!) without breaking/catching fire (most of the time) and would require you to use an iGPU for all other tasks - with a built in LN2 reservoir that would dump onto the cooler when the benchmark run finished. Would that be kind of cool? Sure. Would I want one? Nope.
 
...On those merits Vega the star burns up energy at such a massive rate that the thermals radiating off it can be felt hundreds of lightyears away......

That would make the GPU so hot that it could melt steel beams.
 
True. Then again, stars are incredibly efficient at what they do. As such, it might melt steel beams, but it would run Crysis fast enough to melt your eyeballs as well 😉
 
Gallows humor?
The sad thing is I'm not even confident the product is worth buying...
It's just depressing. I'm just buying it because I have an R9 290, and a 4K freesync screen....

I'm just getting fed up with the AMD ecosystem for high end gamers. 4k 144hz announced for gsync and nothing for amd freesync still talked about.
No high end GPU.

It's just tiring.
 
A 3072 GCN-core chip with 4 GB of HBM2 and 512 GB/s of bandwidth will be enough to play at 4K@60Hz on epic/very high settings.
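For what it's worth, 512 GB/s is exactly what two HBM2 stacks at the spec's full pin speed would give. This is the generic HBM2 arithmetic, not a confirmed Vega configuration:

```python
# Generic HBM2 bandwidth arithmetic (assumed config: 2 stacks at full data rate).
stacks = 2
bus_bits_per_stack = 1024   # HBM2 interface width per stack
pin_speed_gbps = 2.0        # Gb/s per pin at the full HBM2 data rate

bandwidth_gb_s = stacks * bus_bits_per_stack * pin_speed_gbps / 8
print(f"{bandwidth_gb_s:.0f} GB/s")
```

That works out to 512 GB/s, so the rumored figure is at least internally consistent with a 2-stack HBM2 setup.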

Would be nice if they announced a 144Hz 4k HDR FreeSync monitor or two along with the release of Vega.

A boy can dream, can't he?
 
It sucks, but it seems to happen to both vendors every 5-10 years. The same thing happened to NV with the GTX 480. ATI had the DX11 HD 5870 out for 9 months before NV had a DX11 card.

Now AMD is lagging.
 
While I understand the upgrade itch, I'd suggest you spend more time enjoying your games at slightly lower resolution/quality and less time being depressed about not being able to play them at 4k60.
 
OK, well it's not like nVidia offers something for 4k, 144hz either. So, unless there is a 3rd option out there, I guess we are all screwed.
 
If you're not trying to do max settings in everything, Titan Xp indeed provides a 4K 100FPS-140FPS ish experience in a lot of cases.
Heck my 390 can do 4K at 60FPS in BF1 if I drop some settings.

This is one of the greatest myths of 4K gaming IMO. That the hardware to do it doesn't exist. Even a 980 Ti can do it very reasonably.
 

Then again, on those merits, Fermi, Maxwell, Pascal and Volta... they are all dead already
 
What's worse is people claiming something isn't "true 4K" if it's not at ultra/max settings, even without AA.
 
I've been optimistically hoping for a launch alongside Prey, although some discussion on the AMD reddit noted that we would traditionally have seen a boatload of leaks by now if that was the case. So it's looking like:

-AMD have managed to be practically watertight and have kept us all in the dark quite deliberately; Vega gets announced at the start of May for an imminent hard launch. This does tie in with some of the more recent rumors in this thread, but they are rumors. Not saying it's impossible, but I wouldn't bet on it.

Or:

-We get some sort of official announcement or the dreaded paper launch to coincide with Prey, which would be sort of embarrassing, but not as bad as what may come to pass..

Or:

-Prey comes and goes with complete silence on Vega, so much for all that promotion and hype. Paper launch at Computex for availability in late June, just squeezing into H2 2017?

We could be looking at anything between a week and 2 months before we know anything more for sure. Patience is a virtue...
 
Download the AMD FreeSync windmill demo; there's a red-bar test pattern that works really well for telling whether it's working. (It's running on a display on the floor of my store here, and every customer who has seen it and done the on/off test has been amazed at the difference.)

@tential https://community.amd.com/thread/180553

Edit: Found the link!

I should be seeing a lot of screen tearing with AMD freesync enabled right?
Ya, I probably should have the main features of my monitor working... brb.....

Edit: This is why I said to that other user that he may just not have his freesync working if he saw screen tearing with it enabled.....
 
Yeah I showed my wife the windmill demo and she was shocked at the difference.

I'm genuinely worried that Vega will blow me straight out of the 144Hz zone of my FreeSync range and I'll have to enable Vsync, except I don't want input lag.

If that's the case, I'll get a 1440p or 4K ultrawide with FreeSync, mmm, maybe with tasty HDR.
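For anyone wondering what actually happens at the edges of a FreeSync window, here's a rough sketch. The 48-144Hz range is a typical example rather than any specific monitor, and the 2:1 range requirement for LFC is the common rule of thumb, not a guarantee for every display:

```python
# Sketch of variable-refresh behavior at a given framerate.
# Assumed window: 48-144 Hz; assumed LFC rule of thumb: max >= 2 * min.
def vrr_behavior(fps, vrr_min=48, vrr_max=144):
    if fps > vrr_max:
        return "above range: tearing, or Vsync (with its input lag) if capped"
    if fps >= vrr_min:
        return "in range: FreeSync active, no tearing, no Vsync lag"
    if vrr_max >= 2 * vrr_min:
        return "below range: LFC repeats frames to stay inside the window"
    return "below range: tearing/stutter (no LFC on this display)"

for fps in (160, 90, 40):
    print(fps, "->", vrr_behavior(fps))
```

So pushing past 144fps really does mean choosing between tearing and Vsync lag, unless you cap the framerate just under the ceiling.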
 