Showing results for tags 'frostbite 2'.
Found 2 results
All, I came across an article a little while back while I was tuning my system for BF3, and I thought I would share what I learned with you. Before I read the article, I didn't really know what I was looking at with respect to the performance meter, but now I've found that it can be a handy tool, providing useful insight into how your system is performing with the Frostbite 2 engine. Although I haven't seen it yet, I suspect Frostbite 3's performance meter will be similar.

Variables used to show the performance meter (entered either via the console (~ key) or by inserting the lines into your user.cfg file):

Render.PerfOverlayEnable 1
Render.PerfOverlayVisible 1

How to read the overlay:

Green line = GPU (video card) frame time
Yellow line = CPU frame time
Bottom red horizontal line = 0 ms (time zero); vertical distance from zero = increasing frame time
Red horizontal lines = 20 ms intervals
Red vertical lines = 3-second intervals

What this example chart shows is that this guy's GPU lags his CPU in average frame time by about 8 ms. In other words, his GPU is his system's bottleneck. Also, no lag spikes are present during this roughly 3.5-second window. From what I can see, based on the average GPU frame time in milliseconds, this guy is churning out a smooth 55.5 FPS (a nice picture on a 60 Hz monitor). Later I will post a screenshot of my performance meter for comparison (can't b/c I'm @ work right now ;) ).

FPS = 1 / frame time in seconds. In this case, 18.03 ms = 0.01803 s, so FPS = 1 / 0.01803 ≈ 55.5.

war
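The frame-time math above is easy to sanity-check yourself. Here's a minimal sketch in Python; the function names and the sample frame-time values are mine, chosen to match the numbers discussed in the post, not anything from the overlay itself:

```python
def fps_from_frame_time(frame_time_ms):
    """Convert an average frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms


def bottleneck(cpu_ms, gpu_ms):
    """The slower of the two pipelines sets the delivered frame rate."""
    return "GPU" if gpu_ms > cpu_ms else "CPU"


# The example chart: GPU averaging ~18.03 ms, CPU roughly 8 ms quicker.
print(round(fps_from_frame_time(18.03), 1))  # 55.5
print(bottleneck(cpu_ms=10.0, gpu_ms=18.03))  # GPU
```

Note that the overlay plots frame *times*, so lower lines mean higher frame rates, and whichever line sits higher is your bottleneck.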
=ADK= warspite posted a topic in General Tech Talk

All, for the gearheads/enthusiasts among us, there is some very interesting discussion and experimenting by Chip Curry at Chip Reviews on CPU scaling with the Frostbite 2 engine in multiplayer. Since I just made the jump to hexacore, I thought I would weigh in as well with what I've seen for myself (I will elaborate on this post as I learn more). Keep in mind that this post is oriented toward those considering a serious hardware upgrade who want to game at 120Hz and avoid a CPU bottleneck.

We get posts on here all the time from folks seeking to upgrade their rigs, and most of them do their gaming in multiplayer. Unfortunately, the vast majority of the benchmarks we use to make those critical hardware decisions are based on single-player timedemo game modes that don't correlate very well with how a given piece of hardware will perform in a multiplayer game on a populated server. Nowhere have I seen the disparity between single-player (campaign) mode and multiplayer more sharply than in Battlefield 3. I have seen tons of posts on the EA, Battlefield 3, and Nvidia forums from frustrated people who've sprung for the latest hotness or doubled their GPU power with SLI, only to get no benefit from it whatsoever on BF3's large multiplayer maps. It really sucks to see your FPS remain relatively unchanged while your card's GPU utilization hovers around 50%.

Speaking only for Battlefield 3, what I can see with Intel processors is that Frostbite 2 loves physical cores. The more you can throw at it, the better. One thing I missed at the very beginning of the article is that Chip mentions he's running Win8. At the time, I didn't think that was a big deal, until I noticed that under Win7 I saw no benefit to running the 3930K over the 3820 at the same clock speed - none. What I did see under Win7 is a lower average load across all CPU cores, with no increase in FPS.
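The symptom described above (FPS stuck while the GPU idles near 50% utilization) is the classic signature of a CPU bottleneck: the card finishes each frame and then waits for the CPU to hand it the next one. A rough diagnostic sketch in Python; the 90% threshold is my own assumption for illustration, not a figure from the post:

```python
def likely_bottleneck(gpu_utilization_pct, threshold=90.0):
    """Rough heuristic: a GPU sitting well below full utilization usually
    means the CPU can't feed it frames fast enough (CPU-bound).
    The threshold is an assumed rule of thumb, not a hard rule."""
    return "CPU-bound" if gpu_utilization_pct < threshold else "GPU-bound"


print(likely_bottleneck(50.0))  # CPU-bound (the SLI scenario above)
print(likely_bottleneck(95.0))  # GPU-bound (a healthy, fully loaded card)
```

This is also why adding a second card via SLI does nothing in that situation: you're doubling capacity on the side of the pipeline that was already waiting.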
Why this is so is a mystery to me, although I do have some guesses. After much fiddling and trying to coax Win7 into providing some kind of performance boost from the added physical cores, I gave up and did a clean install of Win8 to see if I could duplicate Chip's results. Under Win8, I'm finally seeing the benefits of the 3930K's two additional cores: my frame rates are up significantly (more data to follow), and my average GPU (GTX 780) utilization has increased from 75% to 95%. I held off upgrading to Win8 for as long as I could, but to get the benefit of running a hexacore CPU with Frostbite 2, I had no other choice.

So, looking to upgrade your rig for BF4 (and Frostbite 3) multiplayer? I would consider...

- Running a hexacore or better CPU. If you want to game at 120Hz on BF3's large multiplayer maps, a hexacore CPU is a must. I don't have any data on how well the AMD 8-core CPUs perform in BF3 multiplayer, but I would be very interested to know! If the performance warrants it, an 8-core AMD processor may be the most economical option.
- Running Windows 8. YMMV, but personally I could get no benefit whatsoever from adding physical CPU cores while running Windows 7. The difference in CPU scaling between the two Windows versions is that huge.

Just some preliminary observations from me as the clock ticks down on BF4, with lots of folks considering hardware upgrades.

war