Will Radeons finally deliver performance on the older API?
About three months ago, we called out AMD for its poor performance in some games, the result of very weak DirectX 11 performance. At the time, the problem was evident at the launch of Dying Light 2, a game that defaults to the older API but can also run in DirectX 12, which made the performance discrepancy very clear.
But something on that front seems to have changed in the upcoming driver, the Adrenalin Software Preview Driver May: it brings performance gains in several DirectX 11 scenarios, and the release notes list an average 8% uplift in games based on this API when running on AMD Radeon RX 6000 series cards.
However, some test reports already indicate gains even greater than that, reaching up to 24% in certain games, while AMD itself says the gain can reach 17% in some titles.
To check how this plays out in our own tests, we re-ran the benchmarks from the original article, which covered Dying Light 2, God of War, Fortnite and PUBG. In the first two we followed a fixed benchmark routine, while in the other two we played similar segments to see how the experience was affected. Our test system:
– AMD Ryzen 7 3800X processor
– Noctua NH-U12S chromax.black cooler
– Aorus X570 Ultra Gaming motherboard
– Kingston Fury 2x8GB DDR4 @2400MHz CL18 memory
– Cooler Master V850 power supply
– Open test bench
– PowerColor Radeon RX 6800 XT Red Devil
– PowerColor Radeon RX 5700 XT Red Devil
We started with the games where we could run a scripted benchmark battery, moving through the same section of the game each time. The performance variation is evident in the graph below:
This scenario makes two things evident: 1) only the Radeon RX 6000 series received improvements with this driver, and 2) the improvements are real and affect both games. In Dying Light 2, the Radeon RX 6800 XT went from performing identically to the Radeon RX 5700 XT to something closer to what it delivers in DX12, though the fix still stops halfway. At the least, it is enough to put the card above the RTX 3080, its direct rival in price.
God of War is a more interesting scenario, since there is no option of switching to DX12, and here we see a relevant performance gain. Even so, there is clearly room for more: the RX 6800 XT should be capable of better results than this graph shows, as it still lags considerably behind the RTX 3080.
As for the two competitive games, where it is harder to assemble an identical test due to the large gameplay variation caused by other players, we saw two different outcomes. In Fortnite, the performance gain was not evident, and DirectX 12 remains the better way to play. In PUBG, the change is noticeable: a frame rate that used to dip to a paltry 50fps now stays above 60fps, with less stuttering, and generally runs above 100fps. That still seems like very little for an RX 6800 XT, but the game is now far more playable at this level of performance.
Note that, contrary to what the screenshots indicate, memory ran at 2400MHz in all tests.
The conclusion is a Pyrrhic victory for AMD. On the one hand, it is great that a fix is on the way. On the other, it is still not a robust solution to the problem, both because of how widely the gains vary and because it leaves out many of the company's customers from recent years: owners of RDNA (Radeon RX 5000) and Polaris (Radeon RX 500) cards, for example, get nothing. It's too little, and too late.