
AMD Ryzen Gen 2 Set For Q2 2018

Just one side comment: Ryzen 3 is only "free" if every die used for it is capable of no more than 4c. But that isn't going to be the case; I wouldn't be surprised if most chips are R5- or R7-capable. If so, an R3 basically costs AMD as much as RR.
 
Actually, R3 is free if the die would otherwise have been discarded because of:
A damaged L3 cache cell
Three or more failed CPU cores
A failed IF link needed for use in TR or EPYC
A failure in the microcode that makes SMT work
Three or more cores that couldn't make the clock bins needed for the higher-spec parts

Essentially, an R3 die can have just a single minor defect that causes major issues yet still be usable at R3 spec. No foundry process is perfect; there will be defective dies. R3 enables cost recovery while also putting pricing pressure on Intel's lower-spec parts. Keeping a price ceiling on Intel's small-die parts is beneficial for both AMD and the consumer.
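To make the salvage logic above concrete, here is a toy sketch in Python. The defect categories and SKU mapping are my own illustrative assumptions, not AMD's actual binning rules:

```python
# Toy model of salvage binning for an 8-core Summit Ridge die.
# Defect flags and SKU thresholds are illustrative assumptions only.

def salvage_sku(bad_cores: int, l3_ok: bool, smt_ok: bool) -> str:
    """Map a die's defects to the best SKU tier it could still serve."""
    good = 8 - bad_cores
    if good >= 8 and l3_ok and smt_ok:
        return "R7"          # fully enabled 8C/16T part
    if good >= 6 and smt_ok:
        return "R5 6C"       # 6C/12T salvage
    if good >= 4:
        return "R3"          # 4C/4T: tolerates dead cores, broken SMT
    return "scrap"           # fewer than 4 working cores: discard

# A die with 3 dead cores and broken SMT still sells as an R3
# instead of being thrown away -- that's the "free R3" argument.
print(salvage_sku(3, True, False))   # -> R3
print(salvage_sku(0, True, True))    # -> R7
```

The point of the sketch is only that each extra salvage tier turns one more class of defective die into revenue instead of scrap.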

I personally like the idea of desktop RR, but I can see why AMD wouldn't want to produce it either. I imagine it's still ramping to full production at this time, and greater margins can be had in the mobile space.
 
Yes, but for market reasons AMD would be forced to sell good dies as R3s to maintain stock. Replacing those with APUs would let AMD treat the iGPU-less chip as a more limited item, with little savings and little stock, for people who want an "OC" special without an iGPU.

Either way AMD is selling a big chip cheap. But if they keep the DT APU R3s cheap they can at least push volume, giving them an outlet that keeps production costs down if RR isn't adopted as much as it should be in mobile.
 
Maybe on the lower range, with disabled SMT. Guessing:

512 SPs, 4c/4t@3.2/3.6(3.7xfr) at $110.

Something like that would be tremendous value. If I had to guess, I think it's more likely we'll see something like 4C/4T@2.8/3.4/3.45XFR with 4MB L3 and 384 SPs for around $110-120.

That'd still make the 1200 relevant due to higher base clock and more L3 cache.

On the mid and upper side, 640+ SPs and 4c/8t at similar or higher frequencies for $170 and $200+.

I'd be very surprised if there isn't a fully enabled top model, perhaps with 95W TDP. Just imagine. 4C/8T@3.6/4.0/4.1XFR with 704SP coupled with 3200MHz memory. That'd make a very nice daily driver system, without breaking the bank.
 
I'm thinking the demand for the 8-core Threadrippers is limited. Someone sinking ~$400 into a TR board will likely not skimp, and will future-proof the build with at least 12 cores. SR and the rare PR quad-core dies are better disposed of in the low-end market.

I agree the demand is probably limited. Nevertheless there are good arguments for it. I'm most likely pulling the trigger on a Threadripper build, and the argument for me is future work in a related media-production field. I may well need the lanes and M.2 straight into the CPU, plus quad-channel memory, so that's the better platform. But for video, if using DaVinci Resolve for example, the CPU does the compression/decompression of video files while the GPU does the actual visual processing. So if this is a build that isn't guaranteed to get work and is a bit speculative, budget is still an issue, and the trade-off is actually interesting: spend $200 more on a 1920X, or put that $200 toward a more powerful GPU?

Yes, $400 for a board is a lot, but even the $400 8-core only matches the board's cost, and the other CPUs cost more than the board. And it's worth considering that the 8-core CPU doesn't actually seem overpriced; the boards are just pricey. As far as I can tell it's the highest-binned non-server 8-core CPU that AMD has, and it's only marginally more expensive than the 1800X. So we're really paying for the platform, and as such, if one could be 'ok' with an 1800X but needs TR4, then why not be equally 'ok' with a 1900X? And there's still the path to upgrade to 12 or 16 cores later.

Sorry if this was off-topic btw. Just ignore in that case.
 
To be fair, PulseAudio did have really serious problems when it first came out. Like didn't even work half the time.

Yeah it sucks, but most of the major non-hobbyist distros default to it. I'm surprised anyone actually got ALSA running through Mint. Mint should default to Pulseaudio because it's "easy" and because Lennart Poettering but whatever.

Bottom line is I would only expect to see ALSA on maybe Slackware or Gentoo installs. Mint? Really?

Pulseaudio mostly works now. Mostly.

edit: wait Pulseaudio isn't even supposed to use the ALSA kernel drivers, is it?
 
edit: wait Pulseaudio isn't even supposed to use the ALSA kernel drivers, is it?
I don't think so.
 
I've got to say, "Not well".

I built a "test mule" rig, around an AMD A8-9600 APU (Bristol Ridge). In an Asus Prime B350-E micro-ATX mobo. Which allows for overclocking of the BR APUs.

I installed Windows 10, works like a charm, everything works, great!

I installed Linux Mint 18.2 Cinnamon. No HDMI audio. No matter what I do.

I upgraded to Mint 18.3. Still no HDMI audio. Installed the newest kernel and the "daily ALSA" DKMS driver too; still no HDMI audio. Strangely, analog audio doesn't show up in Linux at all either.

Booted Mint 18.3 Mate, no audio on the LiveUSB either.

Basically, it's screwed. No workable Linux audio for me. I have a monitor with speakers, I don't want to have to plug in a separate set of analog speakers every time I want to use a Linux PC.

The weird part is, it's detected perfectly fine, and ACTS like it's actually working. Just ... no sound. Again, Windows 10 works fine. So it's gotta be Linux.

Unsure if it's the board, or the APU itself, that's incompatible with Linux.

And if Linux doesn't support BR (I can't install either the 15.2 Cats or the AMDGPU-PRO driver on Mint 18.3 either), then how is it going to support RR? I suspect that it won't.

Which is sad, these would make great Linux boxes, with an AMD APU, some DDR4, a mobo, and an SSD. Certainly cost- and performance-competitive with Intel's G4560.

Edit: This is all with the default open-source drivers in Mint 18.2 and 18.3. I was not able to get any of the proprietary drivers to install.

I did try an ATI DVI-to-HDMI adapter off the DVI port, and that worked for sound in Windows 10 but not in Linux. It will do 4K though.
Linux kernel 4.15 allows audio over HDMI. Vega is supported; I don't know about BR though.
 
Very good to know. Thanks!

The prebuilt Ubuntu kernels end at 4.13, at least what's visible in the Mint Updater. Is there an easy way to get the latest and greatest? Since Ubuntu modifies kernels heavily, I don't know if it's a good idea to build your own kernel when using Mint/Ubuntu.
 
Rolling releases are latest and greatest.
 
My guess is that, for the 2nd generation, Pinnacle Ridge is limited to 6- and 8-core parts, with Raven Ridge making up the 4-core parts.

Raven Ridge will easily outsell Pinnacle Ridge with Lenovo, HP, and Dell buying Raven Ridge by the crates to put in their boring black box PCs.
 
Depends on what yields will look like. If they have a decent amount of quad core Pinnacles, they won't just throw them away.
 
Raven Ridge with non-functional GPU probably outnumbers Pinnacle Ridge with 4 non-functioning cores.

AMD has accidentally released Ryzen 3 1200 with 8C/8T and Ryzen 5 1400, Ryzen 5 1500X, and Ryzen 5 1600/X with 8C/16T.

Consequently, we know that Summit Ridge has quite good yield and since Pinnacle Ridge is basically Summit Ridge on a refined process, we can expect the same.

Raven Ridge, on the other hand, has a rather large on-die GPU that can easily be non-functional.
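The area argument above can be made concrete with a simple Poisson yield model. The defect density and block areas below are made-up illustrative numbers, not real die data:

```python
import math

# Poisson yield model: P(region is defect-free) = exp(-D * A),
# with defect density D (defects/mm^2) and region area A (mm^2).
# All numbers below are illustrative guesses, not real die data.

D = 0.002            # assumed defects per mm^2 on a mature process
GPU_AREA = 75.0      # assumed area of Raven Ridge's GPU block
CCX_AREA = 44.0      # assumed area of one 4-core CCX

p_gpu_dead = 1 - math.exp(-D * GPU_AREA)    # >=1 defect lands in the GPU
p_ccx_hit  = 1 - math.exp(-D * CCX_AREA)    # >=1 defect lands in a CCX

# A bigger block is simply a bigger target: under any defect density,
# the GPU is hit more often than a single CCX, so "RR with dead GPU"
# should be more common than heavily core-harvested parts.
print(f"P(GPU defective) = {p_gpu_dead:.3f}")
print(f"P(CCX defective) = {p_ccx_hit:.3f}")
```

Whatever the real numbers are, the qualitative conclusion only depends on the GPU block being larger than a CCX.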
 
There's also the question of the performance delta between them. Raven is 14nm and designed for efficient operation; Pinnacle is 12nm and designed to get as much performance out of Summit as reasonably possible. Both might be used for quad cores, with Raven used for the slower SKUs.
 
You have to wonder how many of the best-binned Pinnacle Ridge dies (for higher clocks) would also have four non-functioning cores.
 
Defects that force cores to be turned off don't necessarily mean the silicon is poor quality in terms of performance.

That's not what I mean.

What I am saying is this:

Let's assume that Pinnacle Ridge has high yield.

There simply aren't many dies that have 4 non-functioning cores.

On top of that, you want ones that are also highly binned [in addition to having 4 non-functioning cores].

There are even fewer of those.
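That scarcity is just a product of two small probabilities. A back-of-the-envelope sketch with invented numbers:

```python
# Back-of-the-envelope: how rare is a die that BOTH has exactly four
# dead cores AND bins into the top clock grade? All probabilities
# below are invented for illustration.

from math import comb

p_core_bad = 0.02    # assumed chance any one core is defective
p_top_bin  = 0.10    # assumed chance a die clocks at the top grade

# Binomial: exactly 4 of 8 cores defective (independent core defects).
p_four_dead = comb(8, 4) * p_core_bad**4 * (1 - p_core_bad)**4

# Assuming binning is independent of core defects, multiply:
p_both = p_four_dead * p_top_bin

print(f"P(4 dead cores)          = {p_four_dead:.2e}")
print(f"P(4 dead AND top-binned) = {p_both:.2e}")
```

With a healthy process, "4 dead cores" is already rare, and intersecting it with the top bin makes the pool smaller still, which is the point being made.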
 
I think you'd only need 2 defective cores on 1 CCX to be binned into the R5/R3 tier, since AMD maintains core parity between CCXs. From there, any issues with SMT or cache, as others have pointed out, would differentiate the R3s.

Now, I can see AMD filling in the low-end R3 tier with binned Raven Ridge plus iGPU, simply because at that price range having the iGPU might be the deciding factor cost-wise. A 4C/4T with most or half of the 11-CU GPU at the same price tier as a current R3 definitely helps it compete with the Pentium Gold and i3 parts in CPU performance and overall value, while the better power gating of RR could give lower-binned RR a better TDP vs. the iGPU-less 14nm R3.
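The core-parity constraint mentioned above can be sketched as a tiny function. This is the parity rule as I understand it for a 2x4-core die; the mapping is a simplification for illustration:

```python
# Ryzen keeps the same number of active cores in each CCX (core
# parity), so a die's sellable core count is set by its WORSE CCX.
# Simplified illustration for a 2x4-core (Summit/Pinnacle) die.

def sellable_cores(bad_ccx0: int, bad_ccx1: int) -> int:
    """Active cores for a 2x4-core die under the parity rule."""
    good_per_ccx = min(4 - bad_ccx0, 4 - bad_ccx1)
    return 2 * good_per_ccx

# Two dead cores on ONE CCX already forces a 4-core (2+2) config,
# i.e. an R5 1400 / R3 class part:
print(sellable_cores(2, 0))   # -> 4
# One dead core per CCX gives a 6-core (3+3) part:
print(sellable_cores(1, 1))   # -> 6
```

This is why defects don't need to kill four cores outright to produce a quad-core SKU: two dead cores on the same CCX are enough.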
 
Quick question: does anyone have any information on whether the RR die has the "full" 8MB L3 cache? Mobile RR only has 4MB L3, so if only 4MB is physically present, that might be the differentiation between SR/PR-based and RR-based R3/R5s.
 