DrMrLordX
Lifer
Press Shift+Enter to bring up the UI. This spawns a second mouse cursor for the overlay. On the bottom panel, expand the "Profiler" category to view the number of draw calls.
Hmm okay, thanks. I'll mess with it next time I get the chance.
Press Shift+Enter to bring up the UI. This spawns a second mouse cursor for the overlay. On the bottom panel, expand the "Profiler" category to view the number of draw calls.
Took another stab at measuring Fallout 4 performance, this time with ENB installed.
Video #1: Using ENB with same settings as last video (1080p medium, max draw distances):
Code: https://youtu.be/gTqV6v1xeHs
Seems like installing ENB lowered my framerates? I was able to find a slower spot a few pixels away from the one in my last vid.
Video #2: Using ENB + Head1985's benchmark settings:
Code: https://www.youtube.com/watch?v=vXH6zynVSxg&feature=youtu.be
Unsure if using godrays on an AMD card was a good idea. Anyway, it managed to be a bit slower than 1080p Medium + max draw distances. I did get maximums above his 6700k though.
Overall, I was disappointed that I had to install something that lowers framerates just to test for draw calls. Maybe I'll uninstall ENB, go back to the same spot on the top of the Corvega factory and see if I can still get framerates below 60 fps.
edit: after uninstalling ENB and comparing my in-game position to that of the Head1985 settings video (still using his settings in 1080p), I was able to get fps down to 57 on top of Corvega examining the same spot. That compares to 51.8 or so using ENB. Maximums without ENB (VSync off) were up in the 190s. Is there a way to measure draw calls that doesn't involve using ENB?
You made sure to have all the settings in ENB disabled, right?
Using godrays is very dumb, for two reasons. The first is that they're entirely GPU-dependent and have no CPU impact. The second is that Bethesda used a nonsensical way of implementing them: they use tessellation, where any sane developer would just use a compute shader.
Maximum FPS is irrelevant.
I just went with the default configuration. If there's something I need to disable, please let me know so I can try it again.
That was my thinking. Head1985 had it in his testing suite though, so when I benched against his settings, I used it. His settings weren't that great when I was running ENB, but without it, my framerates were still quite good.
It is, but it isn't. When my max fps goes from 150-ish into the 190s while viewing the same spot, something has changed. It also affects the average fps for the run (for better or worse).
Okay I got your config working.
OS: Win10 18963.rs_prerelease
CPU: R9 3900x @ 4.4 GHz
RAM: DDR4-3600 14-16-14-28 1T
GPU: AMD Radeon VII @ 2050 MHz/1200 MHz
GPU Driver: 19.8.1
First Save (Corvega)
Draw Calls: ~11690
FPS: ~57.8
Second Save (Diamond City)
Draw Calls: ~8010
FPS: ~66.0
That's a good ~12 fps over Zen+. It's still got a wee bit of distance from xLake, but it's nowhere near the 50% difference seen with earlier Ryzen CPUs. That's pretty good.
The thing that's striking to me is that with "normal" settings, playing the vanilla game (no ENB effects), I find it difficult to bring framerates quite that low except in a few weird places. Seems like Bethsoft should have been able to rejigger Gamebryo/Creation Engine to address the issue. But of course Fallout 4 is an old game by now so, why would they? Not gonna waste my time with Fallout76 to see if they've done any better, either.
If you like the settlement system, it's real easy to get way higher draw calls than that. When I was working on Sanctuary Hills, the draw calls climbed to 17,000 on my i7 6700k and I was dipping down from 30fps into the high twenties. Which is a stupidly high number of draw calls, and I wasn't even halfway done with the settlement.
The solution to the draw call performance problem is to use Direct3D 12 or Vulkan, as this is precisely what they are designed to do: issue large numbers of draw calls very quickly, and scale almost linearly in performance with additional cores dedicated to the driver. Unfortunately, that will never be backported. New Vegas, the best game they've published, and one of the greatest games ever, is going to be stuck in Direct3D 9 forever.
Good, I will test it with my 3700X and 1080 Ti. I still have the best results so far with my 6700K. Nobody was able to beat me 🙂

Here are the ENB ini files used in the benchmark I had the users run: https://mega.nz/#!3kUHTY7K!GnxkhGoqX1WjnqxUsVHMbPhfmO5zFmB5Ys8dcMva048
It's got everything disabled, so it should have no impact. When I ran ENB on my Phenom II X4 965, which was extremely limited in performance, I saw no difference in fps. I even gained two frames during Fallout 4's earlier releases.
Edit: Actually, if you want to benchmark, I collected the config files, instructions and all, on this thread which I used to collect the results first, before making this thread: https://socialtechwork.com/threads/draw-call-performance-in-fallout-4.2501467/


Ok, here are my results with the 3700X.
System:
3700X
2x16GB dual-rank RAM, DDR4-3600 CL16-17-19-36
1080 Ti
With the 6700K at 4.5 GHz I have this:
First Save (Corvega)
Draw Calls: 11703
FPS: 71.8fps
Second Save (Diamond City)
Draw Calls: 8004
FPS: 85.5fps
With the 3700X I have:
First Save (Corvega)
Draw Calls: 11776
FPS: 66.1fps
Second Save (Diamond City)
Draw Calls: 8029
FPS: 78.8fps
So the 6700K is only 8.5% faster. Not bad. It also runs at 4500 MHz vs ca. 4250 MHz on the 3700X.
https://socialtechwork.com/threads/draw-call-performance-in-fallout-4.2501467/#post-38813107
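For what it's worth, those percentages can be sanity-checked straight from the posted Corvega numbers. A quick sketch: the straight FPS ratio lands just above the quoted 8.5%, and the ~4250 MHz 3700X clock is taken from the post above.

```python
# Quick check of the posted Corvega results (numbers from this thread).
fps_6700k = 71.8   # Core i7 6700K @ 4.5 GHz
fps_3700x = 66.1   # Ryzen 7 3700X @ ~4.25 GHz boost (per the post)

speedup = fps_6700k / fps_3700x - 1
print(f"6700K advantage: {speedup:.1%}")          # ~8.6%

# Normalizing for the clock gap suggests near-parity per clock:
clock_ratio = 4500 / 4250
per_clock = (fps_6700k / clock_ratio) / fps_3700x - 1
print(f"Clock-normalized gap: {per_clock:.1%}")   # ~2.6%
```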
Huh, just running an NV dGPU got you a lot of extra performance there. Goofy as heck.
So the 6700K is only 8.5% faster. Not bad. It also runs at 4500 MHz vs ca. 4250 MHz on the 3700X.
https://socialtechwork.com/threads/draw-call-performance-in-fallout-4.2501467/#post-38813107
Anyone got a 9900K or 9700K that can test it against the Ryzen 3000 series? Planning on getting a Ryzen 5 3600, but the FO4 benchmarks got me a bit worried.
Had this on my mind before moving to the 9900KF; too bad I didn't find this thread and test it with the 3600.
OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: Gtx 1080Ti aorus waterforce
GPU Driver: 441.20
First Save (Corvega)
Draw Calls: 11700
FPS: 74
Second Save (Diamond City)
Draw Calls: 8000
FPS: 91
Will check with Vega56 tomorrow.
Moving from the 3600 to the 9900KF was a huge difference in this game in CPU-intensive places, like Swan's Pond or Faneuil Hall. My fps went from 35-40 to a solid 60. And the 3600 was a big improvement over the 1600 before that.
One caveat: I didn't reinstall Windows when going from the 1600 to the 3600. I did so with the 9900KF a week after installation, and it seemed to improve the fps in this game.
With the 6700K at 4.5 GHz I have this:
First Save (Corvega)
Draw Calls: 11703
FPS: 71.8fps
Second Save (Diamond City)
Draw Calls: 8004
FPS: 85.5fps
With the 3700X I have:
First Save (Corvega)
Draw Calls: 11776
FPS: 66.1fps
Second Save (Diamond City)
Draw Calls: 8029
FPS: 78.8fps
Why are you comparing against a year-old chip? Why not the 3700X or 3800X? The 2700X is old news and not near as good as the 3700X. The difference shouldn't be that high, as people have tested the Ryzen 7 3700X with a GTX 1080 Ti, and it should not be much faster than the Ryzen 5 3600, as Fallout 4 won't use more than 4 to 6 threads properly. Also look at your Core i9 9900KF scores against a 4.5 GHz Core i7 6700K; most of the improvement is down to your CPU running closer to 5 GHz.
That is with a Ryzen 7 3700X at roughly 4.25 GHz, and a Ryzen 5 3600 should be running somewhere between 4.0 and 4.2 GHz. Someone with a Ryzen 9 3900X was getting between 60 and 74 FPS with a GTX 1080 Ti in the Corvega plant, depending on the memory configuration they used here.
So realistically there shouldn't be more than a 10% to 20% difference at most, looking at the earlier results with Ryzen 3000. But what you are describing is more like a 50% to 60% increase.
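The 10% to 20% estimate lines up with the 3700X and 9900KF numbers posted earlier in this thread; a quick check, assuming both 1080 Ti runs are comparable:

```python
# FPS results quoted earlier in the thread, both with a GTX 1080 Ti.
results = {
    "3700X":  {"Corvega": 66.1, "Diamond City": 78.8},
    "9900KF": {"Corvega": 74.0, "Diamond City": 91.0},
}

for scene in ("Corvega", "Diamond City"):
    gap = results["9900KF"][scene] / results["3700X"][scene] - 1
    print(f"{scene}: 9900KF ahead by {gap:.1%}")
# Corvega: 9900KF ahead by 12.0%
# Diamond City: 9900KF ahead by 15.5%
```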
The thing is the FPS difference seems rather large compared to tests which have been done:
With an RTX 2080, a stock Core i5 9600K against a stock Ryzen 5 2600X shows around a 20% difference running around the Diamond City area.
Here is a Ryzen 7 2700X against a Core i9 9900K in the same scene with the same GPU:
Around a 20% to 25% difference. Here is a test around Swan's Pond done with a GTX 1080:
https://www.youtube.com/watch?v=Mr2B0RJd7Nc
The Ryzen 7 2700X is at 4.2 GHz and the Core i7 8700K at 4.4 GHz, and the Core i7 8700K is at best 10% higher.
I found the area around Diamond City to be a bit more of a performance hog than, say, immediately around Swan's Pond. Even with my Ryzen 5 2600 and a GTX 1080, I certainly didn't see such low FPS around Swan's Pond at 1440p, so something does not seem right TBH. In built-up settlements, yes, I did see FPS more like that.
Why are you comparing against a year-old chip? Why not the 3700X or 3800X? The 2700X is old news and not near as good as the 3700X.
OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: Gtx 1080Ti aorus waterforce
GPU Driver: 441.20
First Save (Corvega)
Draw Calls: 11700
FPS: 74
Second Save (Diamond City)
Draw Calls: 8000
FPS: 91
Will check with Vega56 tomorrow.
The difference shouldn't be that high, as people have tested the Ryzen 7 3700X with a GTX 1080 Ti, and it should not be much faster than the Ryzen 5 3600, as Fallout 4 won't use more than 4 to 6 threads properly. Also look at your Core i9 9900KF scores against a 4.5 GHz Core i7 6700K; most of the improvement is down to your CPU running closer to 5 GHz.
That is with a Ryzen 7 3700X at roughly 4.25 GHz, and a Ryzen 5 3600 should be running somewhere between 4.0 and 4.2 GHz. Someone with a Ryzen 9 3900X was getting between 60 and 74 FPS with a GTX 1080 Ti in the Corvega plant, depending on the memory configuration they used here.
So realistically there shouldn't be more than a 10% to 20% difference at most, looking at the earlier results with Ryzen 3000. But what you are describing is more like a 50% to 60% increase.
The thing is the FPS difference seems rather large compared to tests which have been done:
With an RTX 2080, a stock Core i5 9600K against a stock Ryzen 5 2600X shows around a 20% difference running around the Diamond City area.
Here is a Ryzen 7 2700X against a Core i9 9900K in the same scene with the same GPU:
Around a 20% to 25% difference. Here is a test around Swan's Pond done with a GTX 1080:
https://www.youtube.com/watch?v=Mr2B0RJd7Nc
The Ryzen 7 2700X is at 4.2 GHz and the Core i7 8700K at 4.4 GHz, and the Core i7 8700K is at best 10% higher.
I found the area around Diamond City to be a bit more of a performance hog than, say, immediately around Swan's Pond. Even with my Ryzen 5 2600 and a GTX 1080, I certainly didn't see such low FPS around Swan's Pond at 1440p, so something does not seem right TBH. In built-up settlements, yes, I did see FPS more like that.
Well in the Ryzen 5 3600 video I posted, there were parts on top of buildings looking over Boston (it was a modded save too), and the FPS didn't seem to dip as low as what you have seen. Here is another video from the same person at Ultra (medium shadows) using a Ryzen 5 3600 and a GTX 1080 Ti in the built-up part of central Boston:

OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: RX Vega56
GPU Driver: 19.11.1
First Save (Corvega)
Draw Calls: 11700
FPS: 66
Second Save (Diamond City)
Draw Calls: 8000
FPS: 74
It isn't that big of a difference across the board, only in some specific places I mentioned. In GTA V, I saw more modest gains of 10-20%, with performance even slightly worse in one spot. And that's with a multi-monitor setup, which increases the CPU load further. Fallout 4 I just tested at 1080p.
As for the Swan's Pond video you linked, his settings are even lower than the default High preset, while I benched on the Ultra preset, with depth of field changed to Standard.
The shadow distance setting is a huge performance hit, and I wasn't expecting the 9900KF to be so solid with it: 60 fps (RivaTuner locked) while looking down from Corvega (where you get the repair bobblehead). I double-checked that I had set everything up correctly. It dropped into the high 50s at the top of Trinity Tower.
Similarly in GTA IV, when you leave the first safe house and turn left, fps would drop into the 40s on the 3600, but it's a solid locked 60 fps on the 9900KF. With newer games, such scenarios should be far fewer, because developers will account for Ryzen's strengths and weaknesses relative to Intel, since Ryzen CPUs are going to be a huge portion of the gaming population.