
Ever since Gears of War Ultimate Edition came out last week, there's been a rumor floating around that one reason the game runs so poorly, with so much stuttering on AMD hardware, is because Nvidia's PhysX is actually running on the CPU. We were flagged about this possibility last Wednesday, so I installed the base game and consulted with Jason Evangelho over at Forbes, who had written the initial article on Gears of War's low performance, to check performance settings and the like.

Update (3/11/2016): I'm inserting a point of clarification here about PhysX and how it functions. Nvidia historically licensed PhysX in two distinct ways: as a general software middleware solution for handling physics that was always intended to execute on the CPU (software PhysX), and as a GeForce-specific physics solution that added in-game visual effects and was intended to execute on Nvidia GPUs (hardware PhysX).

The problem with this distinction is that hardware PhysX can be executed on the CPU as well. This is a distinct third operating case, best referred to as "hardware PhysX executing in software." Some websites have claimed that Gears of War uses this mode by default, thereby harming performance on AMD GPUs. Our results contradict this claim.

Original story below:

I used the built-in Windows performance monitoring tool, Perfmon, to grab a screenshot of what CPU utilization looked like within Gears of War when benchmarking at 4K on an AMD Radeon Fury X GPU. I also checked the Windows\Apps folder to examine the configuration files for PhysX. What I found (and I wish I had screenshots of this) was that every single game-related INI file contained the following: "bDisablePhysXHardwareSupport=True". Since I was testing on an AMD Radeon R9 Fury X, that's exactly what I wanted to see. I turned the system off and went back to working on other articles. (All tests below were run on a Haswell-E eight-core CPU.)
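If you want to run the same check on your own system, a minimal Python sketch of that INI scan might look like the one below. The folder path is an assumption for illustration (the actual Windows Store install location is locked down and varies by machine); only the flag name comes straight from the game's config files.

```python
import os

# Minimal sketch: walk a config directory and report the PhysX flag in each
# game-related INI file. CONFIG_DIR is a hypothetical path used for illustration;
# the real Windows Store install location is access-restricted and differs per system.
CONFIG_DIR = r"C:\Windows\Apps\GearsOfWar\Config"  # assumed path
FLAG = "bDisablePhysXHardwareSupport"

for root, _dirs, files in os.walk(CONFIG_DIR):
    for name in files:
        if not name.lower().endswith(".ini"):
            continue
        path = os.path.join(root, name)
        with open(path, errors="ignore") as handle:
            for line in handle:
                if line.strip().startswith(FLAG):
                    # e.g. "Engine.ini: bDisablePhysXHardwareSupport=True"
                    print(f"{name}: {line.strip()}")
```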

Data from March 2. PhysX disabled according to INI.

Fast forward to today, when reports are still surfacing of the "bDisablePhysXHardwareSupport" variable being set to False, rather than True. I fired the testbed up again, allowed the game to update, checked the same INI files, and found that the value had changed. On Wednesday, five files had defaulted that value to "True," meaning PhysX should've been disabled. On Sunday, the value had changed to "False," which implies it's now enabled.

Data from March 6. PhysX enabled according to .INI.

If you compare the CPU graphs of False versus True, however, you'll note they're more or less the same. Allowing for some variation in when the benchmark run started, you've got a pattern of high spikes and dips. The average value for the disabled/True run was 13.63%, and for the enabled/False run, 14.62%.
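Those averages come from the Perfmon captures; if you'd rather compute the figure from a logged counter yourself, a minimal sketch along these lines should work, assuming you've converted a Perfmon log to CSV with relog (the file name and exact counter header here are assumptions):

```python
import csv

# Minimal sketch, assuming a Perfmon log converted to CSV
# (e.g. "relog perf.blg -f csv -o perf.csv"). The column header is assumed
# to contain "% Processor Time" for the _Total instance; exact header text
# varies by machine name.
def average_cpu(csv_path):
    with open(csv_path, newline="") as handle:
        reader = csv.reader(handle)
        header = next(reader)
        col = next(i for i, name in enumerate(header) if "% Processor Time" in name)
        samples = [float(row[col]) for row in reader if row[col].strip()]
    return sum(samples) / len(samples)

print(f"Average CPU utilization: {average_cpu('perf.csv'):.2f}%")
```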

What about Nvidia? I dropped in a GTX 980 Ti, installed Nvidia's latest drivers, and ran the same simple test. I allowed the benchmark to run twice, then grabbed the final CPU utilization result.

Click to enlarge. Data from March 6. PhysX enabled in .INI.

The average CPU utilization on this graph isn't much lower, at 11.77%, but the shape of the graph is distinctly different. The GTX 980 Ti's frame rate is roughly double that of the R9 Fury X (we benchmarked with ambient occlusion disabled, since that mode causes continual rendering errors on the AMD platform), but the CPU utilization doesn't keep spiking the way it does with the AMD cards.

Smoking gun or poorly optimized game?

It's true that the .ini default setting for Gears of War appears to have changed between the original game and the latest update that's been pushed via the Windows Store. But there's no evidence that this actually changed anything about how the game performs on AMD cards. Nvidia's own website acknowledges that Gears of War uses HBAO+, but says nothing about hardware PhysX. Given the age of this version of the Unreal 3 engine, it's possible that this is a variable left over from when Ageia owned the PhysX API; Unreal 3 was the first game engine to feature Ageia support for hardware physics.

Right now, the situation is reminiscent of Arkham Knight. It's true that Nvidia cards generally outperformed AMD cards in that title when it shipped, but the game itself was so horrendously optimized that the vendor pulled it altogether. As of this writing, there's no evidence that hardware PhysX is active or related to this problem.

All we have is evidence that the CPU usage pattern for the AMD GPU is different from that of the Nvidia GPU. Since we already know that the game isn't handling AMD GPUs properly, even with ambient occlusion disabled, we can't draw much information from that. Our ability to gather more detailed performance information is currently curtailed by limitations on the Windows Store. (None of the game's configuration files can be altered and saved, at least not using any permission techniques I'm familiar with.)

If you're an AMD gamer, my advice is to steer clear of Gears of War Ultimate Edition for the time being. There's no evidence that hardware PhysX is causing this problem, but the game runs unacceptably on Radeon hardware.

Update (3/11/2016):

After we ran this piece, we realized that while we can't edit the INI files of a Windows Store application, we can alter how PhysX runs via the Nvidia Control Panel. Previously, the application was set to "Default," which means that if hardware PhysX was enabled, the game would execute that code on the GPU.

We retested the game in this mode and saw essentially identical results to our previous tests. The CPU utilization curve for GeForce cards remains somewhat different than it does for AMD GPUs, but it's consistent whether PhysX is forced to run on the GPU or the CPU.

If Gears of War actually used hardware PhysX, CPU utilization would increase when we offloaded that task back onto Intel's Haswell-E. The fact that we see no difference should put to rest any claim that Gears of War is using PhysX to damage AMD's performance.