Latest AMD Drivers Reportedly "Cheat" On Benchmarks
As the competition between AMD and Nvidia heats up towards the holiday shopping season, Nvidia is questioning benchmark optimization techniques found in the latest AMD Catalyst graphics drivers, accusing AMD of decreasing image quality in order to score higher in benchmarks.
Nvidia's claim is based on image quality findings recently uncovered by technology Web sites including ComputerBase, PC Games Hardware, Tweak PC, and 3DCenter.org. They all found that changes introduced in the default settings of AMD's Catalyst 10.10 driver increase performance at the cost of image quality. Nvidia claims that these changes in AMD's default settings do not permit a fair apples-to-apples comparison with Nvidia's default driver settings, and says that its GPUs provide higher image quality at default driver settings, which means comparative AMD vs. Nvidia testing methods need to be adjusted to compensate for the image quality differences.
Tech websites ComputerBase and PC Games Hardware (PCGH) report that they had to use the "High" Catalyst AI texture filtering setting for AMD 6000 series GPUs, instead of the default "Quality" setting, to achieve image quality that comes close to Nvidia's default texture filtering. 3DCenter.org and TweakPC report similar findings, and the behavior was verified in many game scenarios. According to ComputerBase, AMD gains up to a 10% performance advantage by lowering its default texture filtering quality.
According to the review sites, AMD also lowered the default AF quality of the HD 5800 series in the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get image quality closer to Nvidia's default driver settings.
ComputerBase also says that AMD drivers appear to treat games differently than the popular "AF Tester" (anisotropic filtering) benchmark tool from 3DCenter.org. They indicate that lower quality anisotropic filtering is used in actual games, while higher quality anisotropic filtering is applied when the AF Tester tool is detected and run. Essentially, the anisotropic filtering quality shown by the AF Tester tool on AMD GPUs is not representative of the lower-quality filtering seen in real games on the same hardware.
Nvidia's own driver team has verified specific behaviors in AMD's drivers that tend to affect certain anisotropic testing tools. Specifically, AMD's drivers appear to disable texture filtering optimizations when they detect the small window sizes used by tools such as the AF Tester, while enabling the optimizations for larger window sizes. The definitions of "larger" and "smaller" vary with the API and hardware used: with DX10 on 68xx boards, for example, AMD appears to disable the optimizations for windows smaller than 500 pixels on a side, while for DX9 applications like the AF Tester the limit is higher, on the order of 1000 pixels per side. Nvidia's driver team also noticed that the optimizations are more aggressive on RV840/940 than on RV870, with optimizations applied across a larger range of LODs on the RV840/940.
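To make the alleged mechanism concrete, the decision rule Nvidia describes amounts to a size check on the render window. The sketch below is purely illustrative: the function and constant names are hypothetical and not taken from AMD's driver code, and only the rough thresholds cited above (about 500 pixels per side for DX10 on 68xx boards, about 1000 for DX9) are drawn from the reports.

```c
/* Illustrative sketch only: the kind of window-size heuristic Nvidia
 * alleges, not actual AMD driver code. Names and structure are
 * hypothetical; thresholds follow the figures quoted in the article. */
#include <stdbool.h>

enum api { API_DX9, API_DX10 };

bool enable_filtering_optimizations(enum api api,
                                    int window_width,
                                    int window_height)
{
    /* Small windows (e.g. the AF Tester tool) would get full-quality
     * filtering; larger, game-sized windows would get the optimizations. */
    int threshold = (api == API_DX9) ? 1000 : 500;  /* pixels per side */
    return window_width >= threshold && window_height >= threshold;
}
```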
Nvidia also claims that for months AMD has been performing a background optimization in certain DX9 applications, demoting FP16 render targets to R11G11B10 render targets, which are half the size and less accurate. When the practice was recently exposed publicly, AMD provided a user-visible control panel setting to enable or disable it, but the demotion remains enabled by default. Nvidia suggests that reviewers and users testing DX9 applications such as Need for Speed Shift or Dawn of War 2 uncheck the "Enable Surface Format Optimization" checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
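The storage arithmetic behind the "half the size" claim is straightforward: a four-channel FP16 render target stores 64 bits per pixel, while R11G11B10 packs three reduced-precision floats into 32 bits, dropping the alpha channel and mantissa precision in the process. A minimal sketch of that comparison, using standard format sizes rather than anything measured from AMD's driver:

```c
/* Rough per-pixel storage comparison behind the FP16 demotion claim.
 * Figures are standard graphics format sizes, not driver measurements. */
#include <stdio.h>

int main(void)
{
    int fp16_rgba_bits = 4 * 16;       /* R16G16B16A16_FLOAT: 64 bits/pixel */
    int r11g11b10_bits = 11 + 11 + 10; /* R11G11B10_FLOAT:    32 bits/pixel */

    printf("FP16 RGBA : %d bits/pixel\n", fp16_rgba_bits);
    printf("R11G11B10 : %d bits/pixel\n", r11g11b10_bits);
    /* The demoted format is half the size, has no alpha channel, and
     * carries fewer mantissa bits per component (6/6/5 vs 10). */
    return 0;
}
```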
In the past, Nvidia was itself accused of similar optimizations involving the GeForce FX and 3DMark03. Since then, the company has vowed never again to perform optimizations that could compromise image quality.
AMD has not commented on the reports yet.