=ADK= warspite posted a topic in General Tech Talk

All,

For the gearheads/enthusiasts among us, there is some very interesting discussion and experimentation by Chip Curry at Chip Reviews on CPU scaling with the Frostbite 2 engine in multiplayer. Since I just made the jump to hexacore, I thought I would weigh in as well with what I've seen for myself (I will elaborate on this post as I learn more). Keep in mind that this post is oriented toward those considering a serious hardware upgrade who want to game at 120Hz and avoid a CPU bottleneck.

We get posts on here all the time from folks seeking to upgrade their rigs, and most of them do their gaming in multiplayer. Unfortunately, the vast majority of benchmarks we use to make those critical hardware decisions are based on single-player timedemo game modes that don't correlate very well with how a given piece of hardware will perform in a multiplayer game on a populated server. Nowhere have I seen the disparity between single-player (campaign) mode and multiplayer more sharply than in Battlefield 3. I have seen tons of posts on the EA, Battlefield 3, and Nvidia forums from frustrated people who've sprung for the latest hotness or doubled their GPU power with SLI, only to get no benefit from it whatsoever on BF3's large multiplayer maps. It really sucks to see your FPS remain relatively unchanged while your card's GPU utilization hovers around 50%.

Speaking only for Battlefield 3, what I can see with Intel processors is that Frostbite 2 loves physical cores. The more you can throw at it, the better. One thing I missed at the very beginning of the article is that Chip mentions he's running Win8. At the time, I didn't think that was a big deal, until I noticed that under Win7 I saw no benefit to running the 3930K over the 3820 at the same clock speed - none. What I did see under Win7 was a lower average processor load across all CPU cores with no increase in FPS.
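The "GPU stuck around 50% while FPS stays flat" symptom above is the classic sign of a CPU bottleneck. As a rough illustration (my own sketch, not from the post - the function name and thresholds are made up), a sanity check on utilization numbers sampled from Task Manager or a GPU monitoring tool might look like:

```python
def looks_cpu_bound(per_core_loads, gpu_util, core_hot=90.0, gpu_idle=60.0):
    """Heuristic: if at least one CPU core is pegged while the GPU sits
    well below full utilization, the CPU is likely the limiter.
    All values are percentages sampled while playing on a populated server."""
    return max(per_core_loads) >= core_hot and gpu_util <= gpu_idle

# The symptom described above: one main thread maxed out on a quad-core,
# GPU hovering around 50% -> likely a CPU bottleneck.
print(looks_cpu_bound([97, 45, 40, 38], 50))  # True

# Healthy case: no core pegged, GPU near full load -> GPU-limited, which
# is what you want when you've paid for a big graphics card.
print(looks_cpu_bound([60, 55, 50, 45], 95))  # False
```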
Why this is so is a mystery to me, although I do have some guesses. After much fiddling and trying to coax Win7 into providing some kind of performance boost from the added physical cores, I gave up and did a clean install of Win8 to see if I could duplicate Chip's results. Under Win8, I'm finally seeing the benefits of the 3930K's two additional cores. My frame rates are up significantly (more data to follow), and my average GPU (GTX-780) utilization has increased from 75% to 95%. I held off upgrading to Win8 for as long as I could, but in order to get the benefit of running a hexacore CPU with Frostbite 2, I had no other choice.

So, looking to upgrade your rig for BF4 (and Frostbite 3) multiplayer? I would consider...

- Running a hexacore or better CPU. If you want to game at 120Hz on BF3's large multiplayer maps, a hexacore CPU is a must. I don't have any data on how well the AMD 8-core CPUs perform in BF3 multiplayer, but I would be very interested to know! If the performance warrants it, an 8-core AMD processor may be the most economical option.

- Running Windows 8. YMMV, but personally I could get no benefit whatsoever from adding physical CPU cores while running Windows 7. The difference in CPU scaling between the two Windows versions is that huge.

Just some preliminary observations from me as the clock ticks down on BF4 and lots of folks are considering hardware upgrades.

war