
AMD Ryzen (Summit Ridge) Benchmarks Thread (use new thread)

Page 175
Status
Not open for further replies.
Trust me, you could be: if a badly taken photo of an AM4 BIOS were to show a 6c/12t part overclocked to 4.7GHz, you would enter a completely different state of hype.

In fact, the mere mention of the words above will prompt some people to ask if it's true.

Meh, before the FX range released there were Intel shills making up false hype by claiming things to be better than they actually were so people would be disappointed.

A badly taken photo normally raises suspicions anyway. It's a good way to hide dodgy photoshopping.

I have high hopes for Ryzen's overclocking throughout the range; I'm even going to invest in my first-ever water cooler to try to push it further, but at this point what will be will be. If it's 3.6 or 4.6, I'll love it regardless.
 
Meh, before the FX range released there were Intel shills making up false hype by claiming things to be better than they actually were so people would be disappointed.

Seriously? Is this the real life?

At any rate, I've been meaning to get a new rig and now I'm pretty sure it will be Zen. My very first build was AMD, but it has been a long time since I could justify an AMD chip.
 
With no benchmark suites revealed beyond a select few (none of the popular gaming or CPU benchmarks) and not much else to go on, I don't think they have received their final revision yet. Come on, give us the info we really want!! I just think they aren't showing it because they can't yet, and we're destined for more launch push-backs. Fun at first, but now I'm getting tired of waiting and wondering.
 
I think the motherboard BIOS problem rumor could really be about sub-optimal XFR performance... Maybe that's the reason not to show single-thread benchmarks... XFR is critical for ST benches...
 
That's the thing: if it was really impressive, they would have told us. That's why I am worried.

Concerned about gaming perf?

http://www.guru3d.com/news-story/fr...ng-sample-amd-ryzen-processor-benchmarks.html

[benchmark chart]


Now just imagine that the retail SKUs will have 20% better ST perf than the early sample used in this review.
 
The number of folks thinking that more cores at slightly lower clocks hurt gaming is astonishing.
The cause is the incompetence of some reviewers who create artificial bottlenecks by testing high-end GPUs at low resolution. They get a data point of no value and, apparently, many people draw false conclusions.
 
The number of folks thinking that more cores at slightly lower clocks hurt gaming is astonishing.
Look at the chart above, man. And those are AAA titles; don't get me started on lower-end titles that are single-thread bottlenecked even with midrange GPUs.
The cause is the incompetence of some reviewers who create artificial bottlenecks by testing high-end GPUs at low resolution. They get a data point of no value and, apparently, many people draw false conclusions.
If you do not know what you are looking at, it is easy to be incompetent, I agree 🙂.
 
Look at the chart above, man. And those are AAA titles; don't get me started on lower-end titles that are single-thread bottlenecked even with midrange GPUs.

If you do not know what you are looking at, it is easy to be incompetent, I agree 🙂.

It would be useful if you made an attempt to get informed before talking.
My entire point was that people like you end up believing the very opposite of the truth.
You are better off with more cores, at realistic resolutions.

If you test a GTX 1080 at 1080p to induce a bottleneck that shows higher clocks getting 165 FPS instead of 150 FPS with more cores, it doesn't help anyone except Intel's marketing.
Do the same tests at 1440p and 4K and the reality changes quite a bit. There are some games that like many cores, some that like larger caches, and very few that are clock-bound at realistic resolutions for the GPU used. You are better off with many cores.
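The bottleneck argument above can be sketched with a toy model: the delivered frame rate is roughly the lower of the CPU-limited and GPU-limited rates. All the numbers below are invented for illustration, not taken from any review.

```python
# Toy model (a simplification, not from the post): delivered frame rate
# is approximately min(CPU-limited FPS, GPU-limited FPS).
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU-limited rates: a high-clock part vs. a many-core part.
high_clock_cpu = 165.0
many_core_cpu = 150.0

# Hypothetical GPU-limited rates for the same card at two resolutions.
gpu_1080p = 200.0  # card barely loaded at 1080p
gpu_4k = 70.0      # card fully loaded at 4K

print(delivered_fps(high_clock_cpu, gpu_1080p))  # 165.0 - CPU bound, gap visible
print(delivered_fps(many_core_cpu, gpu_1080p))   # 150.0
print(delivered_fps(high_clock_cpu, gpu_4k))     # 70.0 - GPU bound, gap gone
print(delivered_fps(many_core_cpu, gpu_4k))      # 70.0
```

At 1080p the chart shows a 10% CPU gap; at 4K the same two CPUs tie, which is the "data point of no value" complaint in a nutshell.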
 
Concerned about gaming perf?

http://www.guru3d.com/news-story/fr...ng-sample-amd-ryzen-processor-benchmarks.html

[benchmark chart]


Now just imagine that the retail SKUs will have 20% better ST perf than the early sample used in this review.

So about +3% in those games at that resolution... IF launch frequency minus the ES tested really is +20%.

(A 6400 -> 6500 step gets +3% in games at that resolution.)
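The "+20% ST yields only +3% FPS" arithmetic can be framed as an Amdahl-style split, where only a fraction f of the effective frame cost scales with single-thread speed. The helper and the calibration below are my own sketch, not from the post.

```python
# Amdahl-style sketch (my own framing, not from the post): if a fraction
# f of the effective frame cost scales with single-thread speed, a 1.20x
# ST uplift gives an overall gain of 1 / ((1 - f) + f / 1.20).
def overall_gain(f: float, st_speedup: float = 1.20) -> float:
    return 1.0 / ((1.0 - f) + f / st_speedup)

# Calibrating against the quoted data point: a ~20% clock bump yielding
# ~3% more FPS implies only ~18% of the frame cost was CPU-sensitive
# at that resolution.
for f in (0.1, 0.18, 0.5, 1.0):
    print(f, round((overall_gain(f) - 1.0) * 100, 1), "% FPS gain")
```

At f = 0.18 the model reproduces the ~3% figure; only a fully CPU-bound game (f = 1.0) would see the whole 20%.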

And imagine what that does to 12V power draw.

I bet Intel slashes prices and launches another model above the 6900K, just as happened with the QX9770.

Sent from HTC 10
(Opinions are own)
 
The cause is the incompetence of some reviewers that create artificial bottlenecks by testing high end GPUs at low resolution.They get a data point of no value and ,apparently, many people draw false conclusions.
1080p is used by some who want to be able to get a high enough FPS to take advantage of high refresh rates. There is also the issue of not masking the performance of CPUs by having the GPU be the main bottleneck.

However, it is true that it's much better to have higher-resolution results to look at as well, for comparison. Having a ton of extra FPS at a lower resolution than you're going to play at is fairly meaningless unless it translates into better minimums and percentiles (smoothness) at the higher resolution, and in that case the higher-resolution testing should show it anyway.
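The minimums-and-percentiles point can be illustrated with a small sketch; the frame-time numbers and the `fps_stats` helper are invented for illustration, with the "1% low" taken as the FPS over the slowest 1% of frames.

```python
# Illustrative sketch (numbers invented): smoothness lives in the slow
# frames, so reviewers report a "1% low" alongside the average FPS.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # Slowest 1% of frames (at least one frame).
    worst = sorted(frame_times_ms)[-max(1, n // 100):]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

smooth = [10.0] * 100            # 100 frames at a flat 10 ms each
stutter = [10.0] * 99 + [60.0]   # same run with one 60 ms hitch

print(fps_stats(smooth))   # (100.0, 100.0)
print(fps_stats(stutter))  # average barely moves, the 1% low collapses
```

One hitch per hundred frames drops the 1% low from 100 to under 17 FPS while the average stays above 95, which is exactly why low-res averages alone can mislead.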
 
It would be useful if you made an attempt to get informed before talking.
My entire point was that people like you end up believing the very opposite of the truth.
You are better off with more cores, at realistic resolutions.

If you test a GTX 1080 at 1080p to induce a bottleneck that shows higher clocks getting 165 FPS instead of 150 FPS with more cores, it doesn't help anyone except Intel's marketing.
Do the same tests at 1440p and 4K and the reality changes quite a bit. There are some games that like many cores, some that like larger caches, and very few that are clock-bound at realistic resolutions for the GPU used. You are better off with many cores.
No, people like you end up believing the very opposite of the truth, because they confuse the games themselves with the GPU benchmarks the games ship with.
 